Characterizing and Predicting Engagement of Blind and Low-Vision People with an Audio-Based Navigation App


Abstract

Audio-based navigation technologies can support people who are blind or have low vision (BLV) in navigating, orienting, and moving about more independently. We explore how such technologies are incorporated into daily life using machine learning models trained on the engagement patterns of over 4,700 BLV people. For mobile navigation apps, we identify user engagement features, such as the duration of first-time use and engagement with spatial audio callouts, that are highly predictive of user retention. This work contributes a more holistic understanding of the features associated with user retention and strong app usage, as well as insight into exploration of the ambient surroundings as a compelling use case for assistive navigation apps. Finally, we offer design implications for improving the accessibility and usability of audio-based navigation technology.
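The retention-prediction setup described above can be sketched as a simple supervised-learning problem. The snippet below is a hypothetical illustration, not the paper's actual pipeline: the feature names (first-session duration, spatial-audio callouts heard) are borrowed from the abstract, but the data is synthetic and the choice of logistic regression is an assumption.

```python
# Hypothetical sketch of retention prediction from engagement features.
# All data here is synthetic; the paper's real features, labels, and
# model choice are not reproduced.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Invented engagement features: first-session duration (minutes) and
# number of spatial-audio callouts heard during the first week.
first_session_min = rng.exponential(scale=10, size=n)
callouts_week1 = rng.poisson(lam=20, size=n)
X = np.column_stack([first_session_min, callouts_week1])

# Synthetic retention label, loosely correlated with both features.
logits = 0.1 * first_session_min + 0.05 * callouts_week1 - 2.0
retained = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, retained, random_state=0
)
model = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

In a setup like this, the fitted coefficients indicate how strongly each engagement feature is associated with retention, which mirrors the kind of feature-importance analysis the abstract alludes to.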

Cite

APA

Liu, T., Hernandez, J., Gonzalez-Franco, M., Maselli, A., Kneisel, M., Glass, A., … Miller, A. (2022). Characterizing and Predicting Engagement of Blind and Low-Vision People with an Audio-Based Navigation App. In Conference on Human Factors in Computing Systems - Proceedings. Association for Computing Machinery. https://doi.org/10.1145/3491101.3519862
