WheelSense: Enabling tangible gestures on the steering wheel for in-car natural interaction

Abstract

This paper presents WheelSense, a system for non-distracting, natural interaction with the In-Vehicle Information and Communication System (IVIS). WheelSense embeds pressure sensors in the steering wheel to detect tangible gestures that the driver performs on its surface. The driver interacts through four gestures designed to allow the execution of secondary tasks without taking the hands off the steering wheel; the interface thus aims to minimize driver distraction from the primary driving task. Eight users tested the system in a three-phase evaluation: a gesture recognition test, a gesture recognition test while driving in a simulated environment, and a usability questionnaire. Recognition accuracy was 87% in the static test and 82% while driving, and the System Usability Scale score was 84 out of 100. © 2013 Springer-Verlag.
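To make the abstract's pipeline concrete, here is a minimal sketch of how pressure readings from grip zones on a wheel could be mapped to gesture events. The sensor layout, threshold value, and gesture names below are illustrative assumptions for this sketch, not the classification method described in the paper.

```python
# Hypothetical sketch: mapping one frame of steering-wheel pressure
# readings to a gesture label. Sensor zones, the threshold, and the
# gesture names are assumptions, not WheelSense's actual design.

THRESHOLD = 0.5  # normalized pressure above which a zone counts as "pressed"

def classify_frame(left: float, right: float):
    """Classify a frame of normalized (0..1) pressure readings from the
    left and right grip zones of the wheel; return a gesture label or
    None when neither zone is pressed."""
    left_on = left > THRESHOLD
    right_on = right > THRESHOLD
    if left_on and right_on:
        return "squeeze"       # both hands press at the same time
    if left_on:
        return "left-press"
    if right_on:
        return "right-press"
    return None                # no gesture in this frame
```

A real recognizer would classify gestures over a window of frames (e.g. taps vs. holds vs. swipes along the rim) rather than a single snapshot, which is where the reported 87%/82% accuracy figures would be measured.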

Citation (APA)

Angelini, L., Caon, M., Carrino, F., Carrino, S., Lalanne, D., Khaled, O. A., & Mugellini, E. (2013). WheelSense: Enabling tangible gestures on the steering wheel for in-car natural interaction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8005 LNCS, pp. 531–540). https://doi.org/10.1007/978-3-642-39262-7_60
