Single trial classification of EEG and peripheral physiological signals for recognition of emotions induced by music videos

Abstract

Recently, the field of automatic recognition of users' affective states has gained a great deal of attention. Automatic, implicit recognition of affective states has many applications, ranging from personalized content recommendation to automatic tutoring systems. In this work, we present some promising results of our research in classification of emotions induced by watching music videos. We show robust correlations between users' self-assessments of arousal and valence and the frequency powers of their EEG activity. We present methods for single trial classification using both EEG and peripheral physiological signals. For EEG, an average (maximum) classification rate of 55.7% (67.0%) for arousal and 58.8% (76.0%) for valence was obtained. For peripheral physiological signals, the results were 58.9% (85.5%) for arousal and 54.2% (78.5%) for valence. © 2010 Springer-Verlag.
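The abstract describes extracting frequency-band powers from EEG as features for single-trial classification of arousal and valence. As a rough illustration of that pipeline (not the paper's actual implementation — the sampling rate, band definitions, and the nearest-centroid classifier below are all illustrative assumptions), one can compute theta/alpha/beta band powers per trial and classify on them:

```python
import numpy as np

FS = 256  # assumed EEG sampling rate in Hz; not specified in the abstract


def band_power(signal, fs, lo, hi):
    """Power of `signal` in the [lo, hi) Hz band, via the FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()


def features(trial, fs=FS):
    """Theta/alpha/beta band powers for one single-channel EEG trial
    (band edges are common conventions, assumed here for illustration)."""
    bands = [(4, 8), (8, 12), (12, 30)]
    return np.array([band_power(trial, fs, lo, hi) for lo, hi in bands])


def fit_centroids(X, y):
    """Per-class mean feature vectors (a minimal stand-in classifier)."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}


def predict(centroids, x):
    """Assign a trial's feature vector to the nearest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
```

For example, a trial dominated by a 10 Hz oscillation yields its largest feature in the alpha band; labeling such feature vectors high/low arousal and fitting centroids gives a simple single-trial decision rule.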

Citation (APA)

Koelstra, S., Yazdani, A., Soleymani, M., Mühl, C., Lee, J. S., Nijholt, A., … Patras, I. (2010). Single trial classification of EEG and peripheral physiological signals for recognition of emotions induced by music videos. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6334 LNAI, pp. 89–100). https://doi.org/10.1007/978-3-642-15314-3_9
