With the advent of portable devices such as smartphones and tablets, touch-screen-based assistive technologies have become significantly more affordable than traditional tactile graphics. The sensor panel in these devices allows users to receive visual and auditory feedback while interacting with the device. However, visual feedback is of little use to visually impaired individuals, who lack or have lost the ability to see. In this paper, we therefore propose a system that helps visually impaired people comprehend information on electronic devices through auditory action feedback. We develop a multimedia system that produces sound from a given image via object detection; in this study, YOLO (You Only Look Once) is used for the detection stage of the sonification pipeline. A pre-trained model is employed, so a wide range of object classes can be recognized. The system generates the corresponding sound when an object shown on the touch screen is touched. The purpose of our research is to help visually impaired people perceive the information in a picture displayed on the device by touching the detected objects. The system was evaluated with sighted participants who were blindfolded to simulate visual impairment and who then completed questionnaires on its performance. The results indicate that most users found the sound produced by the system helpful in identifying what the displayed image showed.
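To make the detect-then-sonify pipeline concrete, the following is a minimal sketch, assuming a pre-trained YOLO model loaded through the ultralytics package and offline text-to-speech via pyttsx3. The weights file, input image, and touch-event handling here are illustrative assumptions, not the authors' implementation; the paper describes sound feedback for touched objects, but its exact toolchain is not reproduced here.

```python
# Illustrative sketch (not the authors' code): detect objects with a
# pre-trained YOLO model, then speak the class name of whichever
# detected object the user touches on screen.
# Assumes the `ultralytics` and `pyttsx3` packages; a touch event is
# modeled here as an (x, y) pixel coordinate.

from ultralytics import YOLO  # pre-trained detector (assumed weights)
import pyttsx3                # offline text-to-speech engine

model = YOLO("yolov8n.pt")    # pre-trained COCO weights (assumption)
tts = pyttsx3.init()

def detect_objects(image_path):
    """Run YOLO once and return a list of (label, x1, y1, x2, y2)."""
    result = model(image_path)[0]
    detections = []
    for box in result.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()  # bounding-box corners
        label = model.names[int(box.cls)]      # class id -> class name
        detections.append((label, x1, y1, x2, y2))
    return detections

def on_touch(x, y, detections):
    """Speak the label of the touched object, if the touch point
    falls inside any detected bounding box."""
    for label, x1, y1, x2, y2 in detections:
        if x1 <= x <= x2 and y1 <= y <= y2:
            tts.say(label)
            tts.runAndWait()
            return label
    return None  # touch landed on background; no sound

if __name__ == "__main__":
    dets = detect_objects("photo.jpg")  # hypothetical input image
    on_touch(320, 240, dets)            # simulated touch event
```

A production version would register `on_touch` with the platform's touch-event callback rather than invoking it directly, and could substitute per-class sound effects for spoken labels.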
Kurniati, T., Yang, C. K., Chen, T. S., Chung, Y. F., Huang, Y. M., & Chen, C. C. (2021). Interactive sound generation to aid visually impaired people via object detection using touch screen sensor. Sensors and Materials, 33(11), 4057–4068. https://doi.org/10.18494/SAM.2021.3528