VES: A mixed-reality system to assist multisensory spatial perception and cognition for blind and visually impaired people

16 citations · 51 Mendeley readers

Abstract

In this paper, the Virtually Enhanced Senses (VES) system is described. It is an ARCore-based, mixed-reality system designed to assist the navigation of blind and visually impaired people. VES operates in indoor and outdoor environments without any prior in-situ installation. It delivers specific, runtime-configurable stimuli to users based on their pose, i.e., position and orientation, and on the information about the environment recorded in a virtual replica. It implements three output data modalities: wall-tracking assistance, an acoustic compass, and a novel sensory substitution algorithm, Geometry-based Virtual Acoustic Space (GbVAS). The multimodal output of this algorithm takes advantage of the way human perception naturally encodes spatial data. Preliminary experiments with GbVAS were conducted with sixteen subjects in three different scenarios; subjects demonstrated basic orientation and mobility skills after six minutes of training.
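The paper's implementation is not reproduced here, but the following Kotlin sketch illustrates the kind of pose-to-stimulus mapping the abstract describes: a point of interest taken from the virtual replica (for example, the nearest wall sample or a fixed north reference for the acoustic compass) is reduced to an azimuth/distance pair relative to the user's pose, which a spatial-audio renderer could then turn into a cue. All names and types (Pose2D, AcousticCue, cueFor) are hypothetical stand-ins, not the authors' actual data structures.

```kotlin
import kotlin.math.PI
import kotlin.math.atan2
import kotlin.math.sqrt

// Hypothetical 2-D pose: position in metres and heading (yaw) in radians,
// of the kind that could be derived from ARCore's world-space camera pose.
data class Pose2D(val x: Float, val z: Float, val yaw: Float)

// Hypothetical acoustic cue: where a virtual sound source should be rendered
// relative to the listener (positive azimuth = to the right).
data class AcousticCue(val azimuthRad: Float, val distanceM: Float)

// Wrap an angle into (-pi, pi] so left/right panning stays continuous.
fun wrapAngle(angle: Float): Float {
    var a = angle
    while (a > PI) a -= (2 * PI).toFloat()
    while (a <= -PI) a += (2 * PI).toFloat()
    return a
}

// Reduce a point of interest from the virtual replica to an azimuth/distance
// pair relative to the user's current pose.
fun cueFor(user: Pose2D, targetX: Float, targetZ: Float): AcousticCue {
    val dx = targetX - user.x
    val dz = targetZ - user.z
    val worldBearing = atan2(dx, dz)             // bearing of the target in the world frame
    val relativeBearing = wrapAngle(worldBearing - user.yaw)
    return AcousticCue(relativeBearing, sqrt(dx * dx + dz * dz))
}

fun main() {
    val user = Pose2D(x = 0f, z = 0f, yaw = 0f)  // standing at the origin, facing +z
    val cue = cueFor(user, targetX = 1f, targetZ = 1f)
    // A spatial-audio engine would pan and attenuate a sound using these values.
    println("azimuth = %.2f rad, distance = %.2f m".format(cue.azimuthRad, cue.distanceM))
}
```

In the actual system, GbVAS encodes richer scene geometry than this single-point example; the sketch only shows the basic geometric step of expressing replica data in the user's frame of reference before sonification.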

Cite

Citation style: APA

Real, S., & Araujo, A. (2020). VES: A mixed-reality system to assist multisensory spatial perception and cognition for blind and visually impaired people. Applied Sciences (Switzerland), 10(2). https://doi.org/10.3390/app10020523
