Deep Learning on Facial Expression Detection: Artificial Neural Network Model Implementation

  • Kusumah H
  • Zahran M
  • Cholied P
  • et al.

Abstract

A person's moods, emotions, and even medical conditions are frequently reflected directly in their facial expressions. As a result, facial emotion detection has recently drawn growing attention in social science and human-computer interaction. This study focuses on the automatic recognition of human facial expressions using an artificial neural network (ANN) model based on a straightforward convolution technique. The dataset is self-mined, collected by web scraping Google Images with the Selenium package for Python. It covers seven categories of fundamental human expressions likely to be encountered daily, namely anger, confusion, contempt, crying, sadness, disgust, and happiness, totaling 6,016 photos. The goal of this research is to determine how accurately the ANN model can predict these expressions.
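The abstract does not specify the network architecture, so the following is only a minimal illustrative sketch of the final classification step such a model would share: a dense layer followed by a softmax over the seven expression categories named above. The 48×48 grayscale input size, the weight initialization, and all function names are assumptions for illustration, not details from the paper.

```python
import math
import random

# The seven expression categories listed in the abstract.
EXPRESSIONS = ["anger", "confusion", "contempt", "crying",
               "sadness", "disgust", "happiness"]

def softmax(z):
    """Convert raw scores into a probability distribution."""
    m = max(z)  # subtract the max for numerical stability
    e = [math.exp(v - m) for v in z]
    s = sum(e)
    return [v / s for v in e]

class DenseLayer:
    """A single fully connected layer (the ANN's classification head)."""
    def __init__(self, n_in, n_out, seed=0):
        rng = random.Random(seed)
        # Small random weights; a trained model would learn these.
        self.w = [[rng.uniform(-0.05, 0.05) for _ in range(n_in)]
                  for _ in range(n_out)]
        self.b = [0.0] * n_out

    def forward(self, x):
        return [sum(wi * xi for wi, xi in zip(row, x)) + b
                for row, b in zip(self.w, self.b)]

def predict(x, layer):
    """Return the most probable expression label and all class probabilities."""
    probs = softmax(layer.forward(x))
    best = max(range(len(probs)), key=probs.__getitem__)
    return EXPRESSIONS[best], probs

# A flattened 48x48 grayscale face (assumed size) as a stand-in input.
layer = DenseLayer(n_in=48 * 48, n_out=len(EXPRESSIONS))
x = [0.5] * (48 * 48)
label, probs = predict(x, layer)
```

In a real pipeline, convolutional layers would extract features from the face image before this dense layer, and the weights would be fitted on the 6,016-photo dataset rather than initialized randomly.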

Citation (APA)

Kusumah, H., Zahran, M. S., Cholied, P. R., Alkusna, M. S., & Hafidhi, N. A. (2022). Deep Learning on Facial Expression Detection : Artificial Neural Network Model Implementation. CCIT Journal, 16(1), 39–53. https://doi.org/10.33050/ccit.v16i1.2518
