Non-Intrusive Real Time Eye Tracking Using Facial Alignment for Assistive Technologies


Abstract

Most affordable eye tracking systems are either intrusive, such as head-mounted cameras, or rely on fixed cameras with infrared corneal reflections from illuminators. For assistive technologies, intrusive eye tracking systems can be a burden to wear for extended periods, and infrared-based solutions generally do not work in all environments, especially outdoors or indoors when sunlight reaches the space. We therefore propose an eye-tracking solution using state-of-the-art convolutional neural network face alignment algorithms that is both accurate and lightweight for assistive tasks such as selecting an object for use with assistive robotic arms. The solution uses a simple webcam for gaze estimation and for face position and pose estimation. We achieve a much faster computation time than the current state of the art while maintaining comparable accuracy. This paves the way for accurate appearance-based gaze estimation even on mobile devices, giving an average error of around 4.5° on the MPIIGaze dataset (Zhang et al., 2019) and state-of-the-art average errors of 3.9° and 3.3° on the UTMultiview (Sugano et al., 2014) and GazeCapture (Krafka et al., 2016; Park et al., 2019) datasets, respectively, while reducing computation time by up to 91%.
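The errors reported in the abstract (4.5°, 3.9°, 3.3°) are angular errors, the standard metric for gaze estimation benchmarks such as MPIIGaze. As a minimal sketch (function name ours, not from the paper), the angular error between a predicted and a ground-truth 3D gaze direction is the angle between the two vectors:

```python
import math

def angular_error_deg(pred, gt):
    """Angle in degrees between two 3D gaze direction vectors.

    pred, gt: 3-element sequences (need not be unit-length).
    """
    dot = sum(p * g for p, g in zip(pred, gt))
    norm_p = math.sqrt(sum(p * p for p in pred))
    norm_g = math.sqrt(sum(g * g for g in gt))
    # Clamp to guard against floating-point values slightly outside [-1, 1]
    cos_angle = max(-1.0, min(1.0, dot / (norm_p * norm_g)))
    return math.degrees(math.acos(cos_angle))

# Orthogonal directions give a 90-degree error
print(angular_error_deg([1.0, 0.0, 0.0], [0.0, 1.0, 0.0]))  # → 90.0
```

A dataset-level score such as the 4.5° figure is then the mean of this error over all test samples.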

Citation (APA)

Leblond-Menard, C., & Achiche, S. (2023). Non-Intrusive Real Time Eye Tracking Using Facial Alignment for Assistive Technologies. IEEE Transactions on Neural Systems and Rehabilitation Engineering, 31, 954–961. https://doi.org/10.1109/TNSRE.2023.3236886
