Real Time Facial Emotion Recognition based on Image Processing and Machine Learning

  • Halder R
  • Sengupta S
  • Pal A
  • Ghosh S
  • Kundu D
Citations of this article: N/A
Readers (Mendeley users with this article in their library): 24

Abstract

Behaviors, actions, poses, facial expressions, and speech are channels that convey human emotions, and extensive research has been carried out to explore the relationships between these channels and emotions. This paper proposes a prototype system that automatically recognizes the emotion expressed on a face. A neural-network-based solution combined with image processing is used to classify the six universal emotions: happiness, sadness, anger, disgust, surprise, and fear. Colored frontal face images are given as input to the prototype system. After the face is detected, an image-processing-based feature point extraction method is used to extract a set of selected feature points. Finally, a set of values obtained by processing those extracted feature points is given as input to the neural network to recognize the emotion.
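The final stage of the pipeline described above can be sketched as a small feed-forward neural network that maps values derived from the extracted facial feature points to one of the six universal emotions. The abstract does not specify the network architecture, weights, or feature encoding, so the layer sizes, random weights, and 10-element feature vector below are illustrative placeholders only, not the paper's actual parameters.

```python
# Hedged sketch of the classification stage: a one-hidden-layer network
# scoring six emotion classes from feature-point-derived values.
# All dimensions and weights here are assumptions for illustration.
import numpy as np

EMOTIONS = ["Happiness", "Sadness", "Anger", "Disgust", "Surprise", "Fear"]

def classify_emotion(features, w1, b1, w2, b2):
    """One hidden layer (sigmoid) followed by a softmax over six emotions."""
    h = 1.0 / (1.0 + np.exp(-(features @ w1 + b1)))  # hidden activations
    logits = h @ w2 + b2
    probs = np.exp(logits - logits.max())            # stable softmax
    probs /= probs.sum()
    return EMOTIONS[int(np.argmax(probs))], probs

# Hypothetical input: 10 normalized distances/angles between feature points.
rng = np.random.default_rng(0)
features = rng.random(10)
w1, b1 = rng.standard_normal((10, 8)), np.zeros(8)   # 10 -> 8 hidden units
w2, b2 = rng.standard_normal((8, 6)), np.zeros(6)    # 8 -> 6 emotion scores
label, probs = classify_emotion(features, w1, b1, w2, b2)
```

In a real system the weights would be learned from labeled face images, and the feature vector would come from the detected face's extracted feature points rather than random values.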

Citation (APA)

Halder, R., Sengupta, S., Pal, A., Ghosh, S., & Kundu, D. (2016). Real Time Facial Emotion Recognition based on Image Processing and Machine Learning. International Journal of Computer Applications, 139(11), 16–19. https://doi.org/10.5120/ijca2016908707
