Recognizing emotion from video is an active research theme with many applications, such as human-computer interaction and affective computing. Classifying emotions from facial expressions is a common approach, but it can be difficult to differentiate genuine emotions from faked ones. In this paper, we use a remote video-based cardiac activity sensing technique to obtain physiological data for identifying emotional states. We show that emotional states can be differentiated from the remotely sensed cardiac pulse patterns alone. Specifically, we conducted an experimental study on recognizing the emotions of people watching video clips. We recorded 26 subjects who all watched the same comedy and horror video clips, and then estimated their cardiac pulse signals from the video footage. From the cardiac pulse signal alone, we were able to classify whether the subjects were watching the comedy or horror clip. We also compare against classifying for the same task using facial action units and discuss how the two modalities compare.
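The abstract does not detail the pulse-estimation method, but a common simplified form of remote photoplethysmography (rPPG) recovers the pulse from frame-to-frame variations in the mean green-channel intensity of the facial region, then locates the dominant frequency in the human heart-rate band. The sketch below illustrates that idea on a synthetic signal; the function name, band limits, and synthetic data are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def estimate_pulse_rate(green_means, fps):
    """Estimate pulse rate (BPM) from per-frame mean green-channel values.

    Uses the FFT power spectrum and picks the peak within a typical
    human heart-rate band (0.7-4.0 Hz, i.e. 42-240 BPM).
    """
    sig = green_means - np.mean(green_means)       # remove DC offset
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)  # frequency bins in Hz
    power = np.abs(np.fft.rfft(sig)) ** 2           # power spectrum
    band = (freqs >= 0.7) & (freqs <= 4.0)          # plausible pulse band
    peak_hz = freqs[band][np.argmax(power[band])]   # dominant frequency
    return peak_hz * 60.0                           # convert Hz to BPM

# Synthetic demo: 10 s of 30 fps frames with a 1.2 Hz (72 BPM) pulse plus noise.
fps = 30
t = np.arange(0, 10, 1.0 / fps)
rng = np.random.default_rng(0)
green = 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.normal(size=t.size)
print(estimate_pulse_rate(green, fps))  # close to 72 BPM
```

In practice, published rPPG pipelines add face tracking, detrending, and band-pass filtering before spectral analysis; this sketch keeps only the core frequency-domain step.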
Citation:
Das, K., Lam, A., Fukuda, H., Kobayashi, Y., & Kuno, Y. (2018). Classification of Emotions from Video Based Cardiac Pulse Estimation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10956 LNAI, pp. 296–305). Springer Verlag. https://doi.org/10.1007/978-3-319-95957-3_33