AutoSelect: What you want is what you get: Real-time processing of visual attention and affect


Abstract

While the objects of our focus of attention ("what we are looking at") and our accompanying affective responses to those objects are part of our daily experience, little research has investigated the relation between attention and positive affective evaluation. The purpose of our research is to process users' emotion and attention in real time, with the goal of designing systems that may recognize a user's affective response to a particular visually presented stimulus in the presence of other stimuli, and respond accordingly. In this paper, we introduce the AutoSelect system, which automatically detects a user's preference based on eye movement data and physiological signals in a two-alternative forced choice task. In an exploratory study involving the selection of neckties, the system correctly classified subjects' choices in 81% of cases. In this instance of AutoSelect, the gaze 'cascade effect' played a dominant role, whereas pupil size could not be shown to be a reliable predictor of preference. © Springer-Verlag Berlin Heidelberg 2006.
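The gaze cascade effect mentioned above refers to the finding that, in a two-alternative forced choice task, gaze shifts progressively toward the item that is about to be chosen. The following is a minimal sketch of how a preference classifier might exploit that bias; the function name, input encoding, and window length are illustrative assumptions, not the authors' implementation, which also incorporates physiological signals.

```python
def classify_preference(gaze_samples, window=30):
    """Predict the preferred item ('left' or 'right') from a time-ordered
    sequence of gaze samples in a 2AFC task, using the gaze cascade
    effect: shortly before a decision, gaze dwells increasingly on the
    item that will be chosen.

    gaze_samples: list of 'left'/'right' labels, one per gaze sample,
                  ordered in time up to the moment of choice.
    window: number of final samples to inspect (illustrative value).
    """
    tail = gaze_samples[-window:]          # last samples before the choice
    left = tail.count("left")
    right = tail.count("right")
    if left == right:
        return None                        # no clear gaze bias; abstain
    return "left" if left > right else "right"

# Toy trace: gaze oscillates between the two neckties, then cascades
# toward the right-hand one just before the decision.
trace = ["left", "right"] * 10 + ["right"] * 25
print(classify_preference(trace))  # prints "right"
```

A real system would of course work on raw fixation coordinates mapped to screen regions rather than pre-labeled samples, and would fuse the gaze evidence with physiological features before deciding.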

Citation (APA)

Bee, N., Prendinger, H., Nakasone, A., André, E., & Ishizuka, M. (2006). AutoSelect: What you want is what you get: Real-time processing of visual attention and affect. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4021 LNAI, pp. 40–52). Springer Verlag. https://doi.org/10.1007/11768029_5
