Facial emotion recognition (FER) is an important research area in artificial intelligence (AI) with many applications, e.g., face authentication systems, e-learning, entertainment, and deepfake detection. FER remains a challenging task due to the large intra-class variations among emotions. Although prior deep learning methods have achieved good performance for FER, there is still a need for efficient and effective FER systems that are robust to conditions such as variations in illumination, face angles, gender, race, background settings, and people from diverse geographical regions. Moreover, a generalized model for classifying human emotions is required so that computer systems can interact with humans according to their emotions and improve that interaction. This work presents a novel lightweight Efficient-SwishNet model for emotion recognition that is robust to the aforementioned conditions. We introduce a low-cost, smooth Swish activation function, unbounded above and bounded below, in our model. Unboundedness helps avoid saturation, while smoothness aids optimization and generalization of the model. The performance of the proposed model is evaluated on five diverse datasets: CK+, JAFFE, FER-2013, KDEF, and FERG. We also perform a cross-corpora evaluation to show the generalizability of our model. The proposed model achieves a very high recognition rate on all datasets, which proves the merit of the proposed framework for both human facial images and stylized cartoon characters. Moreover, we conduct an ablation study with different variants of our model to demonstrate its efficiency and effectiveness for emotion identification.
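The abstract does not state the exact parameterization of the activation used in Efficient-SwishNet; the sketch below shows the standard Swish function, f(x) = x * sigmoid(beta * x), with beta = 1 assumed as a default, simply to illustrate the properties the abstract names (smooth, unbounded above, bounded below).

```python
import numpy as np

def swish(x, beta=1.0):
    """Standard Swish activation: f(x) = x * sigmoid(beta * x).

    Smooth everywhere, unbounded above (approaches x for large positive x)
    and bounded below (minimum of about -0.28 when beta = 1).
    The beta = 1 default is an assumption, not taken from the paper.
    """
    return x / (1.0 + np.exp(-beta * x))

# Illustrate the bounded-below / unbounded-above behaviour.
x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(swish(x))  # large negatives -> ~0, large positives -> ~x
```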
Citation:
Dar, T., Javed, A., Bourouis, S., Hussein, H. S., & Alshazly, H. (2022). Efficient-SwishNet Based System for Facial Emotion Recognition. IEEE Access, 10, 71311–71328. https://doi.org/10.1109/ACCESS.2022.3188730