Real-time eye-interaction system developed with eye tracking glasses and motion capture

Abstract

In industrial environments such as aircraft cockpits and train driver's cabs, we wished to acquire the eye-gaze position in real time and synchronize it with all of the controlled digital screens, so that the machine could respond dynamically to the user's current situation awareness (SA). Wearable eye-tracking glasses provide only the gaze position relative to the captured scene video, from which we obtained the eye-movement data (2 DOF). The motion-capture device provides only position and orientation data, from which we obtained the displacement and angular displacement of the head (6 DOF). We combined these two devices into a novel real-time eye-interaction system that synchronizes the user's visual point on the screens. A spatial transform algorithm is proposed to calculate the visual point on the multiple digital screens. Together with human-factors analysis, the algorithm allows the machine to strengthen its dynamic service abilities.
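The abstract does not spell out the spatial transform itself, but the general idea it describes can be sketched: rotate the gaze direction from the glasses' head frame into the world frame using the 6-DOF head pose from motion capture, then intersect the resulting gaze ray with each calibrated screen plane. The sketch below is a minimal illustration of that geometry, not the authors' implementation; the function name, the NumPy dependency, and the assumption that each screen is calibrated as a plane with a known origin and unit edge vectors in the motion-capture world frame are ours.

```python
import numpy as np

def gaze_point_on_screen(head_pos, head_rot, gaze_dir_local,
                         screen_origin, screen_x_axis, screen_y_axis):
    """Intersect the gaze ray with a planar screen.

    head_pos        : (3,) head position from motion capture, world frame
    head_rot        : (3, 3) head orientation (rotation matrix), world frame
    gaze_dir_local  : (3,) unit gaze direction in the glasses/head frame,
                      derived from the 2-DOF gaze estimate of the eye tracker
    screen_origin   : (3,) world position of the screen's reference corner
    screen_x_axis   : (3,) unit vector along the screen's horizontal edge
    screen_y_axis   : (3,) unit vector along the screen's vertical edge

    Returns (u, v) coordinates on the screen plane in metres, or None if the
    gaze ray is parallel to the screen or points away from it.
    """
    # Rotate the gaze direction into the world frame using the head pose.
    gaze_dir_world = head_rot @ gaze_dir_local

    # Screen plane normal from the two edge directions.
    normal = np.cross(screen_x_axis, screen_y_axis)

    denom = np.dot(normal, gaze_dir_world)
    if abs(denom) < 1e-9:          # gaze ray parallel to the screen plane
        return None

    # Ray/plane intersection: head_pos + t * gaze_dir_world lies on the plane.
    t = np.dot(normal, screen_origin - head_pos) / denom
    if t <= 0:                     # screen is behind the user
        return None

    hit = head_pos + t * gaze_dir_world

    # Express the hit point in the screen's own 2-D coordinates.
    offset = hit - screen_origin
    u = np.dot(offset, screen_x_axis)
    v = np.dot(offset, screen_y_axis)
    return u, v
```

In a multi-screen setup, the same ray would be tested against every screen plane and the (u, v) result kept only when it falls within that screen's physical extent; mapping metres to pixels is then a simple scaling by the screen's resolution over its size.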

Citation (APA)

Bao, H., Fang, W., Guo, B., & Wang, P. (2018). Real-time eye-interaction system developed with eye tracking glasses and motion capture. In Advances in Intelligent Systems and Computing (Vol. 608, pp. 72–81). Springer Verlag. https://doi.org/10.1007/978-3-319-60639-2_8
