Emotion Recognition from Periocular Features


Abstract

Image processing and machine learning approaches are used for face detection and emotion recognition. Many features can be extracted from a facial image, but this work focuses on identifying emotions by analyzing features in the periocular region of the face: the region immediately surrounding the eyes. The work is broadly divided into two major modules: facial feature extraction and selection, and classifier training and evaluation. A set of 327 labeled images is used to select seven features (periocular action units). Five classifiers are tested; the Random Forest classifier provides the highest prediction accuracy at 75.61%, with the best performance observed for the happiness emotion label. The k-Nearest Neighbor classifier follows at 72% when augmented with Neighborhood Components Analysis. Statistical tests confirm a significant difference between the performance of the Random Forest classifier and the SVM. The results of this work may serve as inputs for ensemble emotion recognition systems, and as guidelines for improving periocular-feature-based Facial Emotion Recognition systems.
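The classifier comparison described in the abstract can be sketched with scikit-learn. This is a minimal, hypothetical illustration, not the authors' code: it assumes a feature matrix of the seven periocular action-unit intensities and emotion labels, and stands in synthetic data for the 327 labeled images.

```python
# Hypothetical sketch of the two best-performing classifiers from the abstract:
# Random Forest, and k-NN augmented with Neighborhood Components Analysis (NCA).
# Synthetic data stands in for the real periocular features and emotion labels.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier, NeighborhoodComponentsAnalysis
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
X = rng.random((327, 7))            # 327 samples, 7 periocular action units (placeholder)
y = rng.integers(0, 6, size=327)    # placeholder labels for six basic emotions

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
knn_nca = make_pipeline(
    NeighborhoodComponentsAnalysis(random_state=0),  # learns a metric before k-NN
    KNeighborsClassifier(n_neighbors=5),
).fit(X_tr, y_tr)

acc_rf = accuracy_score(y_te, rf.predict(X_te))
acc_knn = accuracy_score(y_te, knn_nca.predict(X_te))
```

On the real labeled data, the paper reports roughly 75.61% for the Random Forest and 72% for the NCA-augmented k-NN; the synthetic accuracies above are meaningless and only demonstrate the pipeline shape.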

Citation (APA)

Agrawal, E., & Christopher, J. (2020). Emotion Recognition from Periocular Features. In Communications in Computer and Information Science (Vol. 1240 CCIS, pp. 194–208). Springer. https://doi.org/10.1007/978-981-15-6315-7_16
