Advancing In-vehicle Gesture Interactions with Adaptive Hand-Recognition and Auditory Displays

Abstract

Competition for visual attention in vehicles has increased with the integration of touch-based interfaces, leading to increased crash risk. To mitigate this visual distraction, we designed an in-vehicle gesture-based menu system with different auditory feedback types and hand-recognition systems. We are conducting a driving simulator experiment in which participants perform a secondary task of selecting a menu item. In addition to a baseline condition with no audio, three auditory feedback types are tested: auditory icons, earcons, and spearcons. For each type of auditory display, two hand-recognition systems are tested: fixed and adaptive. We expect these designs to reduce the driver's secondary-task workload while minimizing off-road glances, improving safety. Our experiment would contribute to the existing literature on multimodal signal processing by testing predictions of Multiple Resource Theory. It would also provide practical design guidelines for auditory feedback in gesture-based in-vehicle interactions.
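To make the experimental conditions concrete, the minimal Python sketch below shows one way the two independent variables (auditory feedback type and fixed vs. adaptive hand recognition) might be wired together. All names here (FeedbackType, AdaptiveRecognizer, feedback_for) and the threshold-adaptation rule are illustrative assumptions; the paper does not specify an implementation.

from enum import Enum, auto

class FeedbackType(Enum):
    NONE = auto()           # baseline condition: no audio
    AUDITORY_ICON = auto()  # everyday sound associated with the menu item
    EARCON = auto()         # abstract musical motif mapped to the item
    SPEARCON = auto()       # time-compressed speech of the item label

class AdaptiveRecognizer:
    """Hypothetical adaptive recognizer: adjusts its gesture-confidence
    threshold from recent selection accuracy, unlike a fixed-threshold
    recognizer whose threshold never changes."""
    def __init__(self, threshold=0.8, step=0.02):
        self.threshold = threshold
        self.step = step

    def update(self, was_correct: bool):
        # Assumed adaptation rule, for illustration only: relax the
        # threshold after a correct selection, tighten it after an error.
        delta = -self.step if was_correct else self.step
        self.threshold = min(0.95, max(0.5, self.threshold + delta))

    def accepts(self, confidence: float) -> bool:
        return confidence >= self.threshold

def feedback_for(item: str, mode: FeedbackType) -> str:
    # Describes the auditory cue that would be played for a selection.
    if mode is FeedbackType.SPEARCON:
        return f"play time-compressed speech: '{item}'"
    if mode is FeedbackType.EARCON:
        return f"play earcon motif mapped to '{item}'"
    if mode is FeedbackType.AUDITORY_ICON:
        return f"play real-world sound associated with '{item}'"
    return "no audio (baseline)"

# Example trial: a gesture recognized at 0.83 confidence in the
# spearcon + adaptive-recognizer condition.
recognizer = AdaptiveRecognizer()
if recognizer.accepts(confidence=0.83):
    print(feedback_for("Navigation", FeedbackType.SPEARCON))
    recognizer.update(was_correct=True)

Pairing each of the three feedback types (plus the no-audio baseline) with the fixed and adaptive recognizers yields the full set of conditions described in the abstract.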

Citation (APA)

Tabbarah, M., Cao, Y., Liu, Y., & Jeon, M. (2021). Advancing In-vehicle Gesture Interactions with Adaptive Hand-Recognition and Auditory Displays. In Adjunct Proceedings - 13th International ACM Conference on Automotive User Interfaces and Interactive Vehicular Applications, AutomotiveUI 2021 (pp. 204–206). Association for Computing Machinery, Inc. https://doi.org/10.1145/3473682.3481870
