HapAR: Handy Intelligent Multimodal Haptic and Audio-Based Mobile AR Navigation for the Visually Impaired


Abstract

Visually impaired people struggle to find the right direction toward their destination. This paper presents an innovative, low-cost solution: a mobile Augmented Reality application (HapAR) that stimulates haptic and audio sensations to guide them around a campus. The walking direction is generated from the geo-location of the target building and the user's current position. Initial testing conducted on campus yielded promising results: participants found the system easy to use by pointing the mobile device, felt a vibration when they strayed off track, and heard a voice assistant correcting their route.
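The guidance step described above (deriving a direction from two geo-locations, then triggering haptic feedback when the user deviates) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function names, the 20° deviation threshold, and the use of a great-circle bearing formula are all assumptions for the sketch.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from the user's position (lat1, lon1)
    to the target building (lat2, lon2), in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return (math.degrees(math.atan2(y, x)) + 360.0) % 360.0

def guidance(user_heading_deg, target_bearing_deg, threshold_deg=20.0):
    """Compare the device heading with the target bearing.
    Returns ('on_track', dev) or ('off_track', dev), where dev is the
    signed deviation in (-180, 180]; an off-track result would trigger
    vibration and a corrective voice prompt in the app."""
    dev = (target_bearing_deg - user_heading_deg + 180.0) % 360.0 - 180.0
    status = "on_track" if abs(dev) <= threshold_deg else "off_track"
    return status, dev
```

For example, a user facing due north (heading 0°) whose destination lies due east (bearing 90°) would be reported off track with a +90° deviation, which the app could announce as "turn right".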

Citation (APA)

Basori, A. H. (2020). HapAR: Handy Intelligent Multimodal Haptic and Audio-Based Mobile AR Navigation for the Visually Impaired. In EAI/Springer Innovations in Communication and Computing (pp. 319–334). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-16450-8_13
