Towards emotional interaction: Using movies to automatically learn users' emotional states

Abstract

The HCI community is actively seeking novel methodologies to gain insight into the user's experience during interaction with both the application and the content. We propose an emotion recognition engine capable of automatically recognizing a set of human emotional states from psychophysiological measures of the autonomic nervous system, including galvanic skin response, respiration, and heart rate. A novel pattern recognition system, based on discriminant analysis and support vector machine classifiers, is trained using movie scenes selected to induce emotions spanning the valence dimension from positive to negative, including happiness, anger, disgust, sadness, and fear. In this paper we introduce the emotion recognition system and evaluate its accuracy by presenting the results of an experiment conducted with three physiological sensors. © 2011 IFIP International Federation for Information Processing.
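As a rough illustration of the classification approach described in the abstract (discriminant analysis followed by a support vector machine over physiological features), the sketch below builds such a pipeline with scikit-learn. The feature matrix, labels, and parameters are hypothetical placeholders for illustration only, not the authors' implementation or data.

    # Minimal sketch (assumed setup, not the paper's code): classify emotional
    # states from physiological features with an LDA projection followed by an
    # SVM, evaluated with cross-validation.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical data: one row per movie-scene trial, columns are statistics
    # extracted from galvanic skin response, respiration, and heart rate
    # (e.g. means, variances, peak rates). Real features would come from the
    # recorded sensor signals.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 12))      # placeholder physiological features
    y = np.tile(np.arange(5), 20)       # 5 emotion labels: happiness, anger,
                                        # disgust, sadness, fear

    pipeline = make_pipeline(
        StandardScaler(),                              # normalize feature scales
        LinearDiscriminantAnalysis(n_components=4),    # discriminant projection
        SVC(kernel="rbf", C=1.0),                      # SVM on projected features
    )

    scores = cross_val_score(pipeline, X, y, cv=5)
    print("cross-validated accuracy: %.2f" % scores.mean())

On real recordings, the accuracy reported by this kind of cross-validation is what the experiment in the paper evaluates; here the random placeholder data yields only chance-level performance.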

Citation (APA)

Oliveira, E., Benovoy, M., Ribeiro, N., & Chambel, T. (2011). Towards emotional interaction: Using movies to automatically learn users’ emotional states. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6946 LNCS, pp. 152–161). https://doi.org/10.1007/978-3-642-23774-4_15
