Facial Human Emotion Recognition by Using YOLO Faces Detection Algorithm

  • Hasan, M. A.

Abstract

Facial emotion recognition has recently gained importance because facial expressions constitute a form of nonverbal interpersonal communication used in a variety of real-world contexts, including human-machine interaction, safety, and health. Predicting the correct emotional expression is difficult because the most informative features of a human face must first be extracted. In this work, we propose a new structural model for predicting human emotion from the face. The YOLO face detection algorithm locates the human face, and its features are extracted; these features are then used to classify the face image into one of seven emotions: neutral, happy, sad, angry, surprised, fear, or disgust. Experiments on the FER2013 dataset demonstrated the robustness and speed of the proposed structure, with the system achieving an accuracy of 94%.
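The two-stage pipeline described in the abstract (detect faces with YOLO, then classify each face crop into one of seven emotions) can be sketched as follows. This is a minimal illustration, not the paper's implementation: `detect_faces` and `classify` are hypothetical stand-ins for the YOLO face detector and the emotion classifier, here replaced by stubs so the sketch runs end to end.

```python
import numpy as np

# The seven emotion classes named in the abstract (FER2013 uses these labels).
EMOTIONS = ["neutral", "happy", "sad", "angry", "surprised", "fear", "disgust"]

def recognize_emotions(image, detect_faces, classify):
    """Two-stage pipeline: detect faces, then classify each face crop.

    `detect_faces` stands in for the YOLO face detector (returns x, y, w, h
    boxes); `classify` stands in for the emotion classifier (returns one
    score per emotion). Both are assumptions -- the paper publishes no code.
    """
    results = []
    for (x, y, w, h) in detect_faces(image):
        crop = image[y:y + h, x:x + w]           # extract the face region
        scores = classify(crop)                  # one score per emotion
        results.append((x, y, w, h, EMOTIONS[int(np.argmax(scores))]))
    return results

# --- Stub components so the sketch is runnable ---
def dummy_detector(image):
    return [(8, 8, 16, 16)]                      # one fixed 16x16 face box

def dummy_classifier(crop):
    scores = np.zeros(len(EMOTIONS))
    scores[1] = 1.0                              # always predict "happy"
    return scores

frame = np.zeros((48, 48), dtype=np.uint8)       # FER2013-sized grayscale frame
print(recognize_emotions(frame, dummy_detector, dummy_classifier))
# → [(8, 8, 16, 16, 'happy')]
```

In practice the detector stub would be replaced by a trained YOLO face model and the classifier stub by the CNN trained on FER2013; the pipeline structure itself stays the same.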

Citation (APA)

Hasan, M. A. (2023). Facial Human Emotion Recognition by Using YOLO Faces Detection Algorithm. JOINCS (Journal of Informatics, Network, and Computer Science), 6(2), 32–38. https://doi.org/10.21070/joincs.v6i2.1629
