A task-driven eye tracking dataset for visual attention analysis

Abstract

To facilitate research in visual attention analysis, we design and establish a new task-driven eye tracking dataset of 47 subjects. Inspired by psychological findings that human visual behavior strongly depends on the task being performed, we carefully design specific tasks in accordance with the contents of 111 images covering various semantic categories, such as text, facial expression, texture, pose, and gaze. This yields a dataset of 111 fixation density maps and over 5,000 scanpaths. We also provide baseline results for thirteen state-of-the-art saliency models and discuss how tasks and image contents influence human visual behavior. The task-driven eye tracking dataset, together with the fixation density maps and scanpaths, will be made publicly available.
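Fixation density maps of the kind the abstract describes are commonly built by accumulating subjects' fixation points into a 2-D histogram and smoothing it with a Gaussian kernel. The sketch below illustrates this standard construction; it is not the authors' released code, and the `sigma` value (roughly one degree of visual angle in pixels) is an illustrative assumption.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def fixation_density_map(fixations, height, width, sigma=25.0):
    """Build a fixation density map from (x, y) fixation points.

    Accumulates fixations into a 2-D histogram, smooths it with an
    isotropic Gaussian, and normalises the result to [0, 1].
    `sigma` is the smoothing bandwidth in pixels (illustrative value).
    """
    fmap = np.zeros((height, width), dtype=np.float64)
    for x, y in fixations:
        xi, yi = int(round(x)), int(round(y))
        if 0 <= yi < height and 0 <= xi < width:
            fmap[yi, xi] += 1.0          # one count per fixation
    fmap = gaussian_filter(fmap, sigma=sigma)
    if fmap.max() > 0:
        fmap /= fmap.max()               # normalise peak to 1
    return fmap

# Example: two clustered fixations and one isolated fixation
# on a 480x640 image; the cluster produces the higher density peak.
dmap = fixation_density_map([(100, 200), (100, 205), (320, 240)], 480, 640)
```

Maps like this serve as the ground truth against which saliency models are scored (e.g. with AUC or correlation metrics), which is how the thirteen baseline models mentioned above would be evaluated.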

Citation (APA)

Xu, Y., Hong, X., He, Q., Zhao, G., & Pietikäinen, M. (2015). A task-driven eye tracking dataset for visual attention analysis. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9386, pp. 637–648). Springer Verlag. https://doi.org/10.1007/978-3-319-25903-1_55
