Towards real-time continuous emotion recognition from body movements

Abstract

Social psychological research indicates that bodily expressions convey important affective information, although this modality is relatively neglected in the literature compared to facial expressions and speech. In this paper we propose a real-time system that continuously recognizes emotions from body movement data streams. Low-level 3D postural features and high-level kinematic and geometric features are fed, through summarization (statistical values) or aggregation (feature patches), to a random forest classifier. In a first stage, the classifier was trained on the UCLIC affective gesture motion-capture database, which led to an overall recognition rate of 78% using 10-fold (leave-one-out) cross-validation. Subsequently, the trained classifier was tested on different subjects using continuous Kinect data. A performance of 72% was reached in real time, which demonstrates the efficiency and effectiveness of the proposed system. © 2013 Springer International Publishing.
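The pipeline sketched in the abstract can be illustrated as follows: a window of 3D joint positions is summarized into statistical features (here per-dimension mean, standard deviation, min, and max), which are then fed to a random forest classifier. This is a minimal sketch under assumed parameters (window length, joint count, number of emotion classes, and synthetic data); it is not the authors' exact feature set or configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def summarize_window(window):
    """Collapse a (frames, joints*3) window of 3D postural data into
    statistical summary features: per-dimension mean, std, min, max."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

# Synthetic stand-in data: 200 movement windows of 30 frames,
# 15 joints x (x, y, z) coordinates -- all illustrative assumptions.
rng = np.random.default_rng(0)
n_windows, frames, dims = 200, 30, 15 * 3
streams = rng.normal(size=(n_windows, frames, dims))
labels = rng.integers(0, 4, size=n_windows)  # e.g. 4 emotion classes

# Summarize each window, then train and apply the random forest.
X = np.array([summarize_window(w) for w in streams])
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X, labels)
pred = clf.predict(X)
```

In a real-time setting, `summarize_window` would be applied to a sliding window over the incoming Kinect skeleton stream, with `clf.predict` called on each new window.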

APA

Wang, W., Enescu, V., & Sahli, H. (2013). Towards real-time continuous emotion recognition from body movements. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8212 LNCS, pp. 235–245). https://doi.org/10.1007/978-3-319-02714-2_20
