Eyetracking for two-person tasks with manipulation of a virtual world

Abstract

Eyetracking facilities are typically restricted to monitoring a single person viewing static images or prerecorded video. In the present article, we describe a system that makes it possible to study visual attention in coordination with other activity during joint action. The software links two eyetracking systems in parallel and provides an on-screen task. By locating eye movements against dynamic screen regions, it permits automatic tracking of moving on-screen objects. Using existing SR technology, the system can also cross-project each participant's eyetrack and mouse location onto the other's on-screen work space. Because it keeps a complete record of eyetrack and on-screen events in the same format used for subsequent human coding, the system permits analysis across multiple modalities. The software thus offers new approaches to the study of spontaneous multimodal communication, joint action, and joint attention. These capacities are demonstrated using an experimental paradigm for cooperative on-screen assembly of a two-dimensional model. The software is available under an open source license. © 2010 The Psychonomic Society, Inc.
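
To make the core idea of "locating eye movements against dynamic screen regions" concrete, the following minimal sketch (in Python) shows one way a gaze sample could be matched to an on-screen object whose bounding box moves over time. The names DynamicRegion and region_under_gaze are hypothetical illustrations, not part of the authors' published software, which is built on SR Research eyetracking technology.

from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class DynamicRegion:
    """A named on-screen region whose bounding box can move over time."""
    name: str
    # Returns (left, top, right, bottom) of the region at time t (ms).
    bounds_at: Callable[[float], Tuple[float, float, float, float]]

def region_under_gaze(gaze_x: float, gaze_y: float, t: float,
                      regions: List[DynamicRegion]) -> Optional[str]:
    """Return the name of the first region containing the gaze point at time t."""
    for region in regions:
        left, top, right, bottom = region.bounds_at(t)
        if left <= gaze_x <= right and top <= gaze_y <= bottom:
            return region.name
    return None

# Example: a model part that slides rightward at 0.05 px/ms from x = 100.
part = DynamicRegion(
    name="part_A",
    bounds_at=lambda t: (100 + 0.05 * t, 200, 160 + 0.05 * t, 260),
)
print(region_under_gaze(155.0, 230.0, 500.0, [part]))  # -> "part_A"

In practice each gaze sample from the two parallel eyetracks would be passed through a test of this kind against the task's current object positions, so that fixations on moving objects can be logged automatically in the same event record as mouse actions.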

Citation (APA)
Carletta, J., Hill, R. L., Nicol, C., Taylor, T., de Ruiter, J. P., & Bard, E. G. (2010). Eyetracking for two-person tasks with manipulation of a virtual world. Behavior Research Methods, 42(1), 254–265. https://doi.org/10.3758/BRM.42.1.254
