Using noninvasive wearable computers to recognize human emotions from physiological signals

373 citations · 510 Mendeley readers · Free to access

Abstract

We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human-computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system, which aims to recognize its users' emotions and respond to them appropriately, depending on the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize the collected signals by emotion and generalize their learning to recognize emotions from new collections of signals. Finally, we discuss the possible broader impact and potential applications of emotion recognition for multimodal intelligent systems.
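The abstract does not name the three supervised learning algorithms or the feature pipeline, so the following is only an illustrative sketch of the kind of classification it describes: per-trial feature vectors of galvanic skin response, heart rate, and skin temperature, each labeled with one of the six elicited emotions and fed to an off-the-shelf supervised learner (k-nearest neighbors here). All data, feature choices, and parameter values below are hypothetical placeholders, not the paper's actual method.

```python
# Illustrative sketch only: classifying emotion labels from simple
# physiological features (GSR, heart rate, skin temperature), as the
# abstract describes. The data are synthetic placeholders; the paper's
# actual features, preprocessing, and algorithms may differ.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

EMOTIONS = ["sadness", "anger", "fear", "surprise", "frustration", "amusement"]

rng = np.random.default_rng(0)
n_trials = 120

# Hypothetical per-trial features: mean GSR (microsiemens),
# mean heart rate (bpm), mean skin temperature (deg C).
X = np.column_stack([
    rng.normal(5.0, 1.5, n_trials),    # GSR
    rng.normal(75.0, 10.0, n_trials),  # heart rate
    rng.normal(33.0, 1.0, n_trials),   # temperature
])
y = rng.choice(EMOTIONS, n_trials)  # placeholder labels

# Standardize features, then fit a k-NN classifier -- one plausible
# supervised learner for this kind of labeled-signal data.
model = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
model.fit(X, y)

# "Generalize to new collections of signals": predict an unseen trial.
new_trial = np.array([[6.2, 88.0, 32.5]])
print(model.predict(new_trial)[0])
```

Standardizing first matters here because the three channels live on very different numeric scales; a distance-based learner like k-NN would otherwise be dominated by the heart-rate axis.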

Citation (APA)

Lisetti, C. L., & Nasoz, F. (2004, September 1). Using noninvasive wearable computers to recognize human emotions from physiological signals. EURASIP Journal on Applied Signal Processing. https://doi.org/10.1155/S1110865704406192
