Presenter: Ting Zhang
Title/Affiliation: Graduate Student, Purdue University
No suitable assistive technology is currently available to help students and scientists who are blind or visually impaired (BVI) advance in careers in science, technology, engineering, and mathematics (STEM) fields. It is a challenge for them to interpret real-time visual scientific data during lab experimentation, such as light microscopy, spectrometry, and the observation of chemical reactions. To address this problem, a real-time multimodal image perception system was developed that allows individuals who are BVI to perceive blood smear images through a combination of auditory, haptic, and vibrotactile feedback. These sensory channels convey visual information through alternative perceptual pathways. Two sets of image features of interest, primary and peripheral, are used to characterize images. Causal links between these two groups were established with a Bayesian network, and two methods were then devised to optimally match primary features to sensory modalities. Experimental results confirmed that this real-time approach achieves higher accuracy in recognizing and analyzing objects within images than conventional tactile images.
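The abstract does not detail the network structure or the two matching methods, but the overall pipeline can be sketched under stated assumptions: a toy two-node Bayesian link between a hypothetical peripheral feature (cell clustering) and a hypothetical primary feature (cell shape), followed by a one-to-one assignment of primary features to sensory modalities. The feature names, probabilities, and suitability scores below are all illustrative, and the Hungarian algorithm merely stands in for whichever optimized matching methods the authors actually conceived.

```python
# Illustrative sketch only: feature names, probabilities, and scores
# are hypothetical, not taken from the presented system.
import numpy as np
from scipy.optimize import linear_sum_assignment

# --- Step 1: a toy two-node Bayesian link -----------------------------
# Peripheral feature (parent): cell clustering, with prior P(clustered).
p_clustered = 0.3
# P(shape = irregular | clustering): indexed [not clustered, clustered].
p_irregular_given_cluster = np.array([0.2, 0.7])

# Posterior P(clustered | shape = irregular) via Bayes' rule.
joint = np.array([1 - p_clustered, p_clustered]) * p_irregular_given_cluster
posterior_clustered = joint[1] / joint.sum()
print(f"P(clustered | irregular shape) = {posterior_clustered:.3f}")

# --- Step 2: matching primary features to sensory modalities ----------
# Hypothetical suitability scores (higher = better) for rendering each
# primary feature through each feedback modality.
features = ["shape", "size", "count"]
modalities = ["haptic", "auditory", "vibrotactile"]
suitability = np.array([
    [0.9, 0.4, 0.5],   # shape
    [0.6, 0.7, 0.3],   # size
    [0.2, 0.8, 0.6],   # count
])

# One plausible "optimized matching": a globally optimal one-to-one
# assignment, computed with the Hungarian algorithm on negated scores.
rows, cols = linear_sum_assignment(-suitability)
for r, c in zip(rows, cols):
    print(f"{features[r]} -> {modalities[c]} (score {suitability[r, c]:.1f})")
```

A one-to-one assignment is only one reasonable reading of "optimized matching"; a many-to-one mapping (several features sharing a modality) would use a different formulation.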