Novel In-Vehicle Gesture Interactions: Design and Evaluation of Auditory Displays and Menu Generation Interfaces

Abstract

Using in-vehicle infotainment systems degrades driving performance and increases crash risk. To address this, we developed air-gesture interfaces paired with various auditory displays. Thirty-two participants drove a simulator while performing air-gesture menu navigation tasks. A 4x2 mixed-model design explored the effects of auditory displays as a within-subjects variable (earcons, auditory icons, spearcons, and no sound) and menu-generation interfaces as a between-subjects variable (fixed and adaptive) on driving performance, secondary task performance, eye glance behavior, and user experience. The adaptive condition centered the menu around the user's hand position at the moment of activation, whereas the fixed condition always placed the menu at the same position. Results demonstrated that spearcons produced the least visual distraction and workload, yielded the best system usability, and were favored by participants, and that fixed menu generation outperformed adaptive menu generation in driving safety and secondary task performance. These findings will inform design guidelines for in-vehicle air-gesture interaction systems.

Citation (APA)

Tabbarah, M., Cao, Y., Abu Shamat, A., Fang, Z., Li, L., & Jeon, M. (2023). Novel In-Vehicle Gesture Interactions: Design and Evaluation of Auditory Displays and Menu Generation Interfaces. In ACM International Conference Proceeding Series (pp. 224–233). Association for Computing Machinery. https://doi.org/10.1145/3580585.3607164
