Affective laughter expressions from body movements

Abstract

The main goal of this study is to classify affective laughter expressions from body movements. Using a non-intrusive Kinect sensor, body-movement data from laughing participants were collected, annotated, and segmented. A feature set comprising head, torso, and shoulder movements, together with the positions of the right and left hands, was fed to a decision tree classifier to determine the type of emotion expressed in the laughter. The decision tree classifier achieved an accuracy of 71.02% using a minimal set of body-movement features.
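The pipeline described above (Kinect body-movement features classified by a decision tree) can be sketched as follows. This is a minimal illustration using synthetic data: the feature names, emotion labels, and tree depth are assumptions for demonstration, not the authors' actual dataset or configuration.

```python
# Hypothetical sketch: decision tree over Kinect-derived body-movement
# features, as in the paper's setup. All data here is synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Assumed feature layout: head/torso/shoulder movement magnitudes plus
# right- and left-hand positions (x, y, z) per laughter segment.
FEATURES = ["head", "torso", "shoulder",
            "rhand_x", "rhand_y", "rhand_z",
            "lhand_x", "lhand_y", "lhand_z"]
EMOTIONS = ["happy", "excited", "embarrassed"]  # placeholder labels

X = rng.normal(size=(300, len(FEATURES)))       # synthetic segments
y = rng.integers(0, len(EMOTIONS), size=300)    # synthetic labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

clf = DecisionTreeClassifier(max_depth=5, random_state=0)
clf.fit(X_train, y_train)

acc = accuracy_score(y_test, clf.predict(X_test))
print(f"accuracy: {acc:.2%}")
```

With real annotated Kinect segments in place of the random arrays, the same few lines reproduce the evaluation protocol; the paper's reported 71.02% accuracy comes from its own feature selection and data, not from this sketch.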

Citation (APA)

Cu, J., Luz, M. B., Nocum, M., Purganan, T. J., & Wong, W. S. (2017). Affective laughter expressions from body movements. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10004 LNAI, pp. 139–145). Springer Verlag. https://doi.org/10.1007/978-3-319-60675-0_12
