Towards End-to-End Video-Based Eye-Tracking


Abstract

Estimating eye-gaze from images alone is a challenging task, in large part due to unobservable person-specific factors. Achieving high accuracy typically requires labeled data from test users, which may not be attainable in real applications. We observe that there exists a strong relationship between what users are looking at and the appearance of the user's eyes. In response to this observation, we propose a novel dataset and accompanying method which aim to explicitly learn these semantic and temporal relationships. Our video dataset consists of time-synchronized screen recordings, user-facing camera views, and eye-gaze data, which allows for new benchmarks in temporal gaze tracking as well as label-free refinement of gaze. Importantly, we demonstrate that fusing information from visual stimuli and eye images can achieve performance similar to literature-reported figures acquired through supervised personalization. Our final method yields significant performance improvements on our proposed EVE dataset, with up to 28% improvement in Point-of-Gaze estimates (resulting in 2.49° angular error), paving the way toward high-accuracy screen-based eye tracking purely from webcam sensors. The dataset and reference source code are available at https://ait.ethz.ch/projects/2020/EVE.
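
The 2.49° figure refers to angular error, the standard metric in gaze estimation: the angle between the predicted and ground-truth 3D gaze directions. As a minimal sketch of that metric (illustrative only, not taken from the EVE reference code; vector layout and names are assumptions):

    import numpy as np

    def angular_error(pred: np.ndarray, gt: np.ndarray) -> float:
        """Angular error in degrees between two 3D gaze direction vectors."""
        pred = pred / np.linalg.norm(pred)              # normalize to unit vectors
        gt = gt / np.linalg.norm(gt)
        cos_sim = np.clip(np.dot(pred, gt), -1.0, 1.0)  # clip to guard against rounding
        return float(np.degrees(np.arccos(cos_sim)))

    # Example: a prediction 2.5 degrees off the true gaze direction
    gt = np.array([0.0, 0.0, -1.0])  # hypothetical: looking straight at the screen
    pred = np.array([np.sin(np.radians(2.5)), 0.0, -np.cos(np.radians(2.5))])
    print(f"{angular_error(pred, gt):.2f} deg")  # prints: 2.50 deg

Point-of-Gaze (PoG) error, by contrast, is typically reported as the Euclidean distance between predicted and true on-screen gaze points; the angular metric above is what the 2.49° result refers to.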

Cite (APA)

Park, S., Aksan, E., Zhang, X., & Hilliges, O. (2020). Towards End-to-End Video-Based Eye-Tracking. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12357 LNCS, pp. 747–763). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-58610-2_44
