Eye-Tracking Analysis for Emotion Recognition

80 citations · 130 Mendeley readers

Abstract

This article reports the results of a study on emotion recognition using eye tracking. Emotions were evoked by presenting dynamic movie material in the form of 21 video fragments. Eye-tracking signals recorded from 30 participants were used to calculate 18 features associated with eye movements (fixations and saccades) and pupil diameter. To ensure that the features were related to emotions, we investigated the influence of the luminance and dynamics of the presented movies. Three classes of emotions were considered: high arousal with low valence, low arousal with moderate valence, and high arousal with high valence. A maximum classification accuracy of 80% was obtained using a support vector machine (SVM) classifier with leave-one-subject-out validation.
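The evaluation scheme described in the abstract (an SVM trained on per-trial feature vectors, validated leave-one-subject-out) can be sketched as follows. This is a hedged illustration, not the authors' code: the feature values here are synthetic placeholders, and the kernel choice and preprocessing are assumptions; only the dimensions (30 subjects, 21 video fragments, 18 features, 3 emotion classes) come from the abstract.

```python
# Sketch of leave-one-subject-out SVM classification of eye-tracking features.
# Data is synthetic; real features would be fixation/saccade statistics and
# pupil diameter measures extracted per video fragment.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_features = 30, 21, 18

# One 18-dimensional feature vector per video fragment per participant.
X = rng.normal(size=(n_subjects * trials_per_subject, n_features))
# Three emotion classes (synthetic labels for illustration).
y = rng.integers(0, 3, size=n_subjects * trials_per_subject)
# Group labels identify the subject each trial belongs to.
groups = np.repeat(np.arange(n_subjects), trials_per_subject)

# Standardize features, then classify with an SVM (RBF kernel is an assumption).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Leave-one-subject-out: each fold holds out all trials of one participant.
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"folds: {len(scores)}, mean accuracy: {scores.mean():.2f}")
```

With real features, the per-fold accuracies would reveal how well emotion-related eye-movement patterns generalize to unseen participants, which is the stricter test compared with random trial-level splits.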

Citation (APA)

Tarnowski, P., Kołodziej, M., Majkowski, A., & Rak, R. J. (2020). Eye-Tracking Analysis for Emotion Recognition. Computational Intelligence and Neuroscience, 2020. https://doi.org/10.1155/2020/2909267
