Post-processing integration and semi-automated analysis of eye-tracking and motion-capture data obtained in immersive virtual reality environments to measure visuomotor integration


Abstract

Mobile eye-tracking and motion-capture techniques yield rich, precisely quantifiable data that can inform our understanding of the relationship between visual and motor processes during task performance. However, these systems are rarely used in combination, in part because of the significant time and human resources required for post-processing and analysis. Recent advances in computer vision have opened the door for more efficient processing and analysis solutions. We developed a post-processing pipeline to integrate mobile eye-tracking and full-body motion-capture data. These systems were used simultaneously to measure visuomotor integration in an immersive virtual environment. Our approach enables calculation of a 3D gaze vector that can be mapped to the participant's body position and objects in the virtual environment using a uniform coordinate system. This approach is generalizable to other configurations, and enables more efficient analysis of eye, head, and body movements together during visuomotor tasks administered in controlled, repeatable environments.
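
The abstract does not give implementation details, but a minimal sketch may help illustrate what expressing a 3D gaze vector in a uniform coordinate system can look like. The sketch below assumes the eye tracker reports a unit gaze direction in the headset's local frame and the motion-capture system reports the headset's world-frame pose as a position plus a rotation matrix; the function names (gaze_to_world, intersect_plane) and frame conventions are illustrative assumptions, not the authors' pipeline.

```python
import numpy as np

def gaze_to_world(head_pos, head_rot, gaze_dir_local):
    """Map a gaze direction from the headset's local frame into world coordinates.

    head_pos       : (3,) world-frame headset position (from motion capture)
    head_rot       : (3, 3) rotation matrix, headset-local -> world
    gaze_dir_local : (3,) gaze direction reported by the eye tracker
    Returns (origin, direction) of the world-frame gaze ray.
    """
    direction = head_rot @ np.asarray(gaze_dir_local, dtype=float)
    direction /= np.linalg.norm(direction)
    return np.asarray(head_pos, dtype=float), direction

def intersect_plane(origin, direction, plane_point, plane_normal):
    """Return the point where the gaze ray hits a planar surface in the virtual
    environment (e.g., a target screen), or None if there is no forward intersection."""
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:          # gaze ray parallel to the plane
        return None
    t = np.dot(plane_normal, plane_point - origin) / denom
    return origin + t * direction if t >= 0 else None

# Hypothetical example: headset 1.6 m above the floor, axes aligned with the world,
# gaze tilted slightly downward toward a virtual surface at y = 2 m.
origin, direction = gaze_to_world(
    head_pos=[0.0, 0.0, 1.6],
    head_rot=np.eye(3),
    gaze_dir_local=[0.0, 1.0, -0.2],
)
hit = intersect_plane(origin, direction,
                      plane_point=np.array([0.0, 2.0, 0.0]),
                      plane_normal=np.array([0.0, 1.0, 0.0]))
print(hit)  # world-frame point of regard on that surface
```

Because the mocap markers, the headset pose, and the virtual objects all live in the same world frame under these assumptions, the resulting point of regard can be compared directly with body-segment positions on a shared timeline.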

Citation (APA)

Miller, H. L., Raphael Zurutuza, I., Fears, N., Polat, S., & Nielsen, R. (2021). Post-processing integration and semi-automated analysis of eye-tracking and motion-capture data obtained in immersive virtual reality environments to measure visuomotor integration. In Eye Tracking Research and Applications Symposium (ETRA). Association for Computing Machinery. https://doi.org/10.1145/3450341.3458881
