This contribution presents our work toward a system that autonomously guides the driver's visual attention to important information (e.g., the traffic situation or an in-car system status signal) in error-prone situations while driving. To this end, we use a highly accurate head-mounted eye-tracking system to estimate the driver's current focus of visual attention. Based on this data, we present our strategies for guiding the driver's attention to where it is needed. These strategies use both graphical animations, in the form of a guiding point on the graphical user interface, and auditory animations presented via headphones using a Virtual Acoustics system. At the end of this contribution, we present the results of a usability study. © 2009 Springer Berlin Heidelberg.
Poitschke, T., Laquai, F., & Rigoll, G. (2009). Guiding a driver’s visual attention using graphical and auditory animations. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5639 LNAI, pp. 424–433). https://doi.org/10.1007/978-3-642-02728-4_45