Emotion recognition based on facial components

Abstract

Machine analysis of facial emotion recognition is a challenging and innovative research topic in human–computer interaction. Although humans can immediately recognize the expressions a face displays, it is very hard for a computer to extract and use the information these expressions contain. This paper proposes an approach to emotion recognition based on facial components. Local features are extracted in each frame using Gabor wavelets with selected scales and orientations. These features are fed to an ensemble classifier to detect the location of the face region. From the signature of each pixel on the face, the eye and mouth regions are detected using the ensemble classifier. The eye and mouth features are then extracted using normalized semi-local binary patterns. The multiclass AdaBoost algorithm selects and classifies these discriminative features to recognize the emotion of the face. The developed methods are evaluated on the RML, CK and CMU-MIT databases, where their novel features yield significant performance improvements over existing techniques.
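As a rough illustration of the first stage described in the abstract, the sketch below builds a small Gabor filter bank over a few scales and orientations and convolves an image with it to produce per-pixel feature maps. This is a generic, minimal NumPy implementation of Gabor feature extraction, not the authors' code; the kernel size, the two wavelengths, the four orientations, and the sigma-to-wavelength ratio are all illustrative assumptions.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lam, gamma=0.5, psi=0.0):
    """Real part of a Gabor kernel of shape (size, size)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates by orientation theta
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-(xr**2 + (gamma * yr) ** 2) / (2 * sigma**2))
            * np.cos(2 * np.pi * xr / lam + psi))

def gabor_features(image, wavelengths=(4.0, 8.0), n_orientations=4, size=15):
    """Stack of magnitude responses, one map per (scale, orientation) pair.

    The specific scales/orientations here are placeholders; the paper
    reports using 'selected scales and orientations' without our values.
    """
    responses = []
    for lam in wavelengths:
        for k in range(n_orientations):
            theta = k * np.pi / n_orientations
            kern = gabor_kernel(size, sigma=0.56 * lam, theta=theta, lam=lam)
            # FFT-based circular convolution, kernel zero-padded to image shape
            resp = np.real(np.fft.ifft2(np.fft.fft2(image)
                                        * np.fft.fft2(kern, image.shape)))
            responses.append(np.abs(resp))
    return np.stack(responses)  # shape: (len(wavelengths)*n_orientations, H, W)

img = np.random.rand(32, 32)
feats = gabor_features(img)
print(feats.shape)  # (8, 32, 32)
```

In the paper's pipeline, a per-pixel feature vector like the one produced here (one value per filter at each pixel) would be what the ensemble classifier consumes to locate the face, eye, and mouth regions.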

Citation (APA)

Rani, P. I., & Muneeswaran, K. (2018). Emotion recognition based on facial components. Sadhana - Academy Proceedings in Engineering Sciences, 43(3). https://doi.org/10.1007/s12046-018-0801-6
