A Novel Model for Emotion Detection from Facial Muscles Activity

Abstract

Considering human emotions in different applications and systems has received substantial attention over the last three decades. The traditional approach to emotion detection is to first extract features and then apply a classifier, such as an SVM, to predict the class. Recently proposed deep learning based models, however, outperform traditional machine learning approaches without requiring a separate feature extraction phase. This paper proposes a novel deep learning based facial emotion detection model that uses facial muscle activity as raw input to recognize the expressed emotion in real time. To this end, we first use OpenFace to extract the activation values of the facial muscles, which are then presented to a Stacked Auto Encoder (SAE) as the feature set. The SAE learns the combination of muscle activations that best describes a particular emotion, and the resulting features are fed to a Softmax layer to perform the multi-class classification task. The proposed model has been applied to the CK+, MMI, and RAVDESS datasets, achieving average accuracies of 95.63%, 95.58%, and 84.91%, respectively, for emotion detection across six classes, outperforming state-of-the-art algorithms.
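
The abstract outlines a two-stage pipeline: OpenFace produces per-frame facial action unit (AU) intensities, a Stacked Auto Encoder is trained to compress them, and a Softmax layer classifies the encoding into six emotions. The sketch below illustrates that pipeline in PyTorch as a minimal reading of the abstract, not the authors' implementation: the layer widths, depth, optimizer, and training schedule are assumptions, and the random tensors stand in for real AU data.

```python
# Minimal sketch of the OpenFace -> SAE -> Softmax pipeline described in the
# abstract. NOT the authors' exact architecture: hidden sizes, depth, and
# optimizer settings are assumptions. Input is the 17 AU intensity values
# that OpenFace writes per frame (CSV columns AU01_r ... AU45_r).
import torch
import torch.nn as nn

N_AUS = 17       # OpenFace AU intensity features per frame
N_CLASSES = 6    # six emotion classes, as in the paper

class StackedAutoEncoder(nn.Module):
    """Two-layer encoder/decoder; widths are illustrative, not from the paper."""
    def __init__(self, n_in=N_AUS, h1=32, h2=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_in, h1), nn.ReLU(),
            nn.Linear(h1, h2), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(h2, h1), nn.ReLU(),
            nn.Linear(h1, n_in),
        )

    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

class EmotionClassifier(nn.Module):
    """SAE encoder followed by a softmax output layer."""
    def __init__(self, sae, h2=16, n_classes=N_CLASSES):
        super().__init__()
        self.encoder = sae.encoder
        # CrossEntropyLoss applies the softmax internally, so the head is linear.
        self.head = nn.Linear(h2, n_classes)

    def forward(self, x):
        return self.head(self.encoder(x))

# Stage 1: unsupervised reconstruction pre-training of the SAE.
sae = StackedAutoEncoder()
opt = torch.optim.Adam(sae.parameters(), lr=1e-3)
x = torch.rand(64, N_AUS)              # stand-in for a batch of AU vectors
x_hat, _ = sae(x)
opt.zero_grad()
nn.MSELoss()(x_hat, x).backward()
opt.step()

# Stage 2: supervised fine-tuning of the encoder plus softmax head.
clf = EmotionClassifier(sae)
opt2 = torch.optim.Adam(clf.parameters(), lr=1e-3)
y = torch.randint(0, N_CLASSES, (64,)) # stand-in for emotion labels
opt2.zero_grad()
nn.CrossEntropyLoss()(clf(x), y).backward()
opt2.step()
```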

Citation (APA)

Bagheri, E., Bagheri, A., Esteban, P. G., & Vanderborght, B. (2020). A Novel Model for Emotion Detection from Facial Muscles Activity. In Advances in Intelligent Systems and Computing (Vol. 1093 AISC, pp. 237–249). Springer. https://doi.org/10.1007/978-3-030-36150-1_20
