Knowledge driven indoor object-goal navigation aid for visually impaired people


Abstract

Aiming to improve the quality of life of visually impaired people, this paper presents a novel wearable aid, in the form of a helmet, that helps them find objects in indoor scenes. An object-goal navigation system based on a wearable device is developed, consisting of four modules: object relation prior knowledge (ORPK), perception, decision and feedback. To make the aid work well in unfamiliar environments, the ORPK is used for sub-goal inference to help the user reach the target object. A method is also proposed that learns the ORPK from unlabelled images by exploiting a scene graph and a knowledge graph. The effectiveness of the aid is demonstrated in real-world experiments.
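The sub-goal inference idea can be illustrated with a minimal sketch. The following is an assumption-laden toy, not the paper's implementation: it models the ORPK as a dictionary of pairwise relatedness weights (e.g. derived from object co-occurrence in scene graphs) and, when the target is not yet visible, selects the visible object most strongly related to the target as the sub-goal to navigate toward.

```python
# Hypothetical sketch of sub-goal inference over an object-relation prior.
# All object names and relation weights below are illustrative, not from the paper.

def infer_subgoal(target, visible_objects, orpk):
    """Pick the visible object most strongly related to the target.

    orpk: dict mapping (anchor_object, target_object) -> relatedness weight,
    e.g. learned from object co-occurrence statistics in scene graphs.
    Returns None if no visible object is related to the target.
    """
    best, best_weight = None, 0.0
    for obj in visible_objects:
        weight = orpk.get((obj, target), 0.0)
        if weight > best_weight:
            best, best_weight = obj, weight
    return best

# Toy prior: a cup is often found near a table or a sink, rarely near a sofa.
orpk = {("table", "cup"): 0.8, ("sink", "cup"): 0.5, ("sofa", "cup"): 0.1}
print(infer_subgoal("cup", ["sofa", "table"], orpk))  # -> table
```

In the full system, such a sub-goal would steer the perception and decision modules until the target itself enters the camera's field of view.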

Citation (APA)

Hou, X., Zhao, H., Wang, C., & Liu, H. (2022). Knowledge driven indoor object-goal navigation aid for visually impaired people. Cognitive Computation and Systems, 4(4), 329–339. https://doi.org/10.1049/ccs2.12061
