We propose a method for estimating human emotional state during communication from four micro-expressions: mouth motion, head pose, gaze direction, and blinking interval. These micro-expressions were selected through a questionnaire survey of human observers watching video-recorded conversations. We then implemented a recognition system for them: facial parts are detected with an RGB-D camera, the four expressions are measured, and a decision-tree-style classifier is applied to detect emotional states and state changes. In our experiment, we collected 30 videos of participants communicating with a friend, then trained and validated the algorithm with two-fold cross-validation. Comparing the classifier output against human examiners' observations, we confirmed precision above 70%.
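To make the pipeline concrete, the following is a minimal sketch, not the authors' implementation, of a decision-tree-style classifier over the four micro-expression features named in the abstract, together with the per-class precision metric used for evaluation. All feature units, thresholds, and state labels here are hypothetical illustrations; in the paper the tree is trained on labeled data rather than hand-specified.

```python
def classify(mouth_motion, head_pose_var, gaze_shift, blink_interval):
    """Return a hypothetical emotional-state label from four features.

    Assumed feature meanings (illustrative only):
      mouth_motion   - mouth movement magnitude per frame (pixels)
      head_pose_var  - variance of the head pose angle (degrees^2)
      gaze_shift     - gaze direction change rate (degrees/s)
      blink_interval - mean time between blinks (seconds)
    """
    # Hand-picked illustrative thresholds; a trained tree would learn these.
    if blink_interval < 2.0:          # frequent blinking
        return "tense" if gaze_shift > 10.0 else "nervous"
    if mouth_motion > 5.0:            # active mouth movement
        return "engaged"
    if head_pose_var > 15.0:          # restless head motion
        return "distracted"
    return "neutral"


def precision(predictions, labels, positive):
    """Per-class precision: TP / (TP + FP), as used to score the classifier."""
    tp = sum(1 for p, y in zip(predictions, labels)
             if p == positive and y == positive)
    fp = sum(1 for p, y in zip(predictions, labels)
             if p == positive and y != positive)
    return tp / (tp + fp) if (tp + fp) else 0.0
```

Precision (rather than accuracy) is the reported metric: of the frames the classifier flags as a given state, it measures how many the human examiners also labeled that way.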
Sumi, K., & Ueda, T. (2016). Micro-expression recognition for detecting human emotional changes. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9733, pp. 60–70). Springer Verlag. https://doi.org/10.1007/978-3-319-39513-5_6