In this paper we propose an electronic travel aid for the visually impaired that uses interactive sonification of U-depth maps of the environment. The system consists of a depth sensor connected to a mobile device and a dedicated application that segments depth images and converts them into sounds in real time. An important feature of the system is that the user can interactively select the 3D scene region to be sonified with simple touch gestures on the mobile device screen. The sonification scheme encodes the azimuth of scene objects with stereo panning, their size with loudness, and their distance with frequency. Such a sonic representation of 3D scenes allows the user to identify the geometric structure of the environment and judge the distances to potential obstacles. The prototype application was tested by three visually impaired users, who successfully performed indoor mobility tasks. The system’s usefulness was evaluated quantitatively by means of system usability and task-related questionnaires.
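The abstract only names the mappings, so the following Python sketch is illustrative rather than the authors' implementation: it builds a U-depth map (a per-column histogram of depth values, a common obstacle-detection representation) and maps one segmented obstacle to pan, loudness, and frequency. All function names, bin counts, frequency ranges, and scaling directions below are assumptions not specified in the paper.

```python
import numpy as np

def u_depth_map(depth_mm, num_bins=64, max_depth_mm=5000.0):
    """Build a U-depth map: for each image column, a histogram of depth
    values. Rows index depth bins, columns match the image columns.
    (Bin count and depth range are hypothetical, not from the paper.)"""
    h, w = depth_mm.shape
    u_map = np.zeros((num_bins, w), dtype=np.int32)
    bin_width = max_depth_mm / num_bins
    for col in range(w):
        d = depth_mm[:, col]
        d = d[(d > 0) & (d < max_depth_mm)]   # drop invalid/out-of-range pixels
        idx = (d / bin_width).astype(int)     # depth -> histogram bin
        np.add.at(u_map[:, col], idx, 1)      # unbuffered in-place count
    return u_map

def sonify_segment(col_center, col_span, depth_mm, img_width,
                   f_min=200.0, f_max=2000.0, max_depth_mm=5000.0):
    """Map one segmented obstacle to (pan, loudness, frequency):
    stereo pan from azimuth (column position), loudness from size
    (column span), frequency from distance. The exact scaling laws,
    including 'nearer => higher pitch', are assumptions."""
    pan = 2.0 * col_center / img_width - 1.0   # -1 (left) .. +1 (right)
    loudness = min(1.0, col_span / img_width)  # larger object => louder
    nearness = 1.0 - min(depth_mm, max_depth_mm) / max_depth_mm
    freq = f_min + (f_max - f_min) * nearness  # nearer => higher frequency
    return pan, loudness, freq

# Example: a 200-column-wide obstacle centred at column 400 of a
# 640-pixel-wide depth image, 1.2 m away.
pan, gain, freq = sonify_segment(400, 200, 1200, 640)
```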
Skulimowski, P., Owczarek, M., Radecki, A., Bujacz, M., Rzeszotarski, D., & Strumillo, P. (2019). Interactive sonification of U-depth images in a navigation aid for the visually impaired. Journal on Multimodal User Interfaces, 13(3), 219–230. https://doi.org/10.1007/s12193-018-0281-3