If fully autonomous vehicles (FAVs) are designed inclusively and accessibly, they could transform independence for millions of people worldwide with transportation-limiting disabilities. Although FAVs promise efficient transportation without driver intervention, a truly accessible experience must still enable user input, for all people, across many driving scenarios (e.g., to alter a route or pull over during an emergency). This paper therefore explores desires for control in FAVs among people who are blind or visually impaired (n=23). Results indicate strong support for control across a battery of driving tasks, as well as a need for multimodal information. These findings inspired the design and evaluation of a novel multisensory interface leveraging mid-air gestures, audio, and haptics. All participants successfully navigated driving scenarios using our gestural-audio interface and reported high ease of use. Contributions include the first inclusively designed gesture set for FAV control and insight regarding supplemental haptic and audio cues.
Fink, P. D. S., Dimitrov, V., Yasuda, H., Chen, T. L., Corey, R. R., Giudice, N. A., & Sumner, E. S. (2023). Autonomous is Not Enough: Designing Multisensory Mid-Air Gestures for Vehicle Interactions Among People with Visual Impairments. In Proceedings of the CHI Conference on Human Factors in Computing Systems. Association for Computing Machinery. https://doi.org/10.1145/3544548.3580762