Speech, gaze and head motion in a face-to-face collaborative task


Abstract

In the present work we observe two subjects interacting in a collaborative task in a shared environment. One goal of the experiment is to measure how behavior with respect to gaze changes when one interactant wears dark glasses, so that his or her gaze is not visible to the other. The results show that if one subject wears dark glasses while telling the other subject the position of a certain object, the other subject needs significantly more time to locate and move that object. Hence, the eye gaze of one subject looking at a certain object, when visible, speeds up the localization of that object by the other subject. The second goal of the currently ongoing work is to collect data on the multimodal behavior of one of the subjects, by means of audio recording and eye-gaze and head-motion tracking, in order to build a model that can be used to control a robot in a comparable scenario in future experiments. © 2011 Springer.

Citation (APA)

Fagel, S., & Bailly, G. (2011). Speech, gaze and head motion in a face-to-face collaborative task. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6456 LNCS, pp. 256–264). https://doi.org/10.1007/978-3-642-18184-9_21
