Gaze-head input: Examining potential interaction with immediate experience sampling in an autonomous vehicle

Abstract

Autonomous vehicles (AVs) increasingly allow drivers to engage in secondary tasks such as eating or working on a laptop, and thus require easy and reliable input methods to facilitate communication between the driver and the vehicle. However, drivers report feeling less in control when driving is no longer their primary task, which suggests that novel approaches are needed for assessing driver satisfaction with AV decision-making. We therefore propose an immediate experience sampling method (IESM) that learns driver preferences for AV actions. We also introduce gaze-head input (G-HI) as a novel AV input modality. G-HI is hands-free, remote, and intuitive, allowing drivers to interact with the AV while continuing to engage in non-driving-related tasks. We compare G-HI with voice and touch inputs via IESM in two simulated driving scenarios. Our results show how the three inputs differ in system usability, reaction time, and perceived workload. They also reveal that G-HI is a promising candidate for AV input interaction and could replace voice or touch inputs in situations where those modalities cannot be used. Variation in driver satisfaction and expectations for AV actions confirms the effectiveness of IESM in increasing drivers' sense of control.

Citation (APA)
Ataya, A., Kim, W., Elsharkawy, A., & Kim, S. (2020). Gaze-head input: Examining potential interaction with immediate experience sampling in an autonomous vehicle. Applied Sciences (Switzerland), 10(24), 1–17. https://doi.org/10.3390/app10249011
