SaSYS: A swipe gesture-based system for exploring urban environments for the visually impaired


Abstract

Exploring and learning an environment is a particularly challenging task for visually impaired people. Existing interaction techniques for learning an environment may not be useful while traveling because they often rely on dedicated hardware or require users to focus on tactile or auditory feedback. In this paper, we introduce an intuitive interaction technique for selecting areas of interest in urban environments by performing simple swipe gestures on a touchscreen. Based on this swipe-based interaction, we developed SaSYS, a location-aware system that enables users to discover points of interest (POIs) around them using off-the-shelf smartphones. Our approach can be easily implemented on handheld devices without requiring any dedicated hardware or demanding that users constantly attend to tactile or auditory feedback. SaSYS also provides fine-grained control over Text-to-Speech (TTS). Our user study shows that 9 of 11 users preferred swipe-based interaction to existing pointing-based interaction.
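The abstract describes selecting areas of interest around the user with swipe gestures. The paper's actual implementation is not reproduced here, but one plausible way to realize the idea is to map the swipe direction on screen to a compass sector relative to the device heading and return the POIs whose bearing falls in that sector. The sketch below illustrates this; all function names, thresholds, and sample data are illustrative assumptions, not the authors' code.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from (lat1, lon1) to (lat2, lon2), in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360

def pois_in_swipe_sector(user, heading, swipe_dx, swipe_dy, pois, half_width=30.0):
    """Return names of POIs whose bearing lies within +/- half_width degrees of
    the real-world direction indicated by the swipe.

    user      -- (lat, lon) of the user
    heading   -- device compass heading in degrees (0 = north)
    swipe_dx, swipe_dy -- swipe vector in screen coordinates (+x right, +y down)
    pois      -- iterable of (name, lat, lon) tuples
    """
    # An upward swipe (dy < 0) points straight ahead, i.e. along the heading.
    swipe_angle = math.degrees(math.atan2(swipe_dx, -swipe_dy)) % 360
    target = (heading + swipe_angle) % 360
    selected = []
    for name, lat, lon in pois:
        b = bearing_deg(user[0], user[1], lat, lon)
        diff = min(abs(b - target), 360 - abs(b - target))  # smallest angular difference
        if diff <= half_width:
            selected.append(name)
    return selected
```

For example, with the device facing north, an upward swipe selects POIs to the north, while a rightward swipe selects POIs to the east. A real system would additionally rank results by distance and read them out via TTS.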

Citation (APA)

Kim, J. E., Bessho, M., Koshizuka, N., & Sakamura, K. (2014). SaSYS: A swipe gesture-based system for exploring urban environments for the visually impaired. In Lecture Notes of the Institute for Computer Sciences, Social-Informatics and Telecommunications Engineering, LNICST (Vol. 130, pp. 54–71). Springer Verlag. https://doi.org/10.1007/978-3-319-05452-0_5
