A CNN-based wearable assistive system for visually impaired people walking outdoors


Abstract

In this study, we propose an assistive system for helping visually impaired people walk outdoors. The system comprises an embedded platform, a Jetson AGX Xavier (manufactured by Nvidia, Santa Clara, CA, USA), and a binocular depth camera, a ZED 2 (manufactured by Stereolabs, San Francisco, CA, USA). Using the convolutional neural network Fast-SCNN and the depth map obtained from the ZED 2, the image of the environment in front of the user is split into seven equal divisions. A walkability confidence value is computed for each division, and a voice prompt guides the user toward the most appropriate direction, so that the user can follow a safe path on the sidewalk, avoid obstacles, and cross at crosswalks safely. In addition, obstacles in front of the user are identified with the YOLOv5s network proposed by Jocher, G. et al. Finally, we provided the proposed system to a visually impaired person and conducted experiments around an MRT station in Taiwan. The participant reported that the system indeed made him feel safer when walking outdoors, and the experiments verified that the system can effectively guide a visually impaired person to walk safely on sidewalks and crosswalks.
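The division-and-prompt scheme described above can be sketched as follows. The abstract does not specify how the walkability confidence is computed or which prompts are spoken, so the walkable-pixel fraction, the 0.5 threshold, and the prompt strings below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def division_confidences(walkable_mask: np.ndarray, n_divisions: int = 7):
    """Split the frame into vertical divisions and score each one.

    walkable_mask: HxW binary array (1 = pixel classified walkable by the
    segmentation network). The fraction of walkable pixels per division is
    used here as a stand-in for the paper's confidence value.
    """
    strips = np.array_split(walkable_mask, n_divisions, axis=1)
    return [float(strip.mean()) for strip in strips]

def pick_direction(confidences, threshold: float = 0.5) -> str:
    """Return a coarse voice prompt for the highest-confidence division."""
    best = int(np.argmax(confidences))
    if confidences[best] < threshold:
        return "stop"          # no division is confidently walkable
    center = len(confidences) // 2
    if best < center:
        return "turn left"
    if best > center:
        return "turn right"
    return "go straight"

# Toy frame: only the central division (columns 6-7 of 14) is walkable.
mask = np.zeros((4, 14), dtype=np.uint8)
mask[:, 6:8] = 1
print(pick_direction(division_confidences(mask)))  # -> go straight
```

In a real pipeline the mask would come from the Fast-SCNN segmentation output fused with the ZED 2 depth map, and the chosen prompt would be played through a text-to-speech engine rather than printed.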

Citation (APA)

Hsieh, I. H., Cheng, H. C., Ke, H. H., Chen, H. C., & Wang, W. J. (2021). A CNN-based wearable assistive system for visually impaired people walking outdoors. Applied Sciences (Switzerland), 11(21). https://doi.org/10.3390/app112110026
