A fully automated, multi-stage architecture for emotion recognition is presented. Faces are located using a tracker based on the ratio template algorithm [1]. Optical flow across the face is then computed using a multi-channel gradient model [2]. The resulting speed and direction information is averaged over different regions of the face, and ratios between regions are taken to determine how facial parts are moving relative to one another. These features are fed into multi-layer perceptrons trained with back-propagation, and the system assigns each facial expression to one of four categories: happiness, sadness, surprise, or disgust. All three key stages of the architecture are inspired by biological systems. The emotion recognition system runs in real time and has a range of applications in human-computer interaction. © Springer-Verlag 2003.
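The ratio-based motion features described above can be sketched as follows. This is an illustrative assumption of the approach, not the paper's exact implementation: the region names (`mouth`, `eyes`, `forehead`) and the particular ratios are hypothetical choices, and the optical-flow vectors would in practice come from the multi-channel gradient model rather than be supplied by hand.

```python
from math import hypot

def average_speed(flow_vectors):
    """Mean magnitude of (dx, dy) optical-flow vectors for one face region."""
    if not flow_vectors:
        return 0.0
    return sum(hypot(dx, dy) for dx, dy in flow_vectors) / len(flow_vectors)

def ratio_features(region_flows, eps=1e-6):
    """Ratios of regional average speeds, capturing how facial parts
    move relative to one another; these would feed the MLP classifier."""
    speeds = {name: average_speed(v) for name, v in region_flows.items()}
    return {
        "mouth_vs_forehead": speeds["mouth"] / (speeds["forehead"] + eps),
        "eyes_vs_mouth": speeds["eyes"] / (speeds["mouth"] + eps),
    }

# Example: a smile moves the mouth far more than the forehead.
flows = {
    "mouth": [(2.0, 0.5), (1.5, 0.0)],
    "eyes": [(0.2, 0.1)],
    "forehead": [(0.1, 0.0)],
}
feats = ratio_features(flows)
```

Using ratios rather than raw speeds makes the features insensitive to overall motion magnitude, which varies with head distance and frame rate.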
Anderson, K., & McOwan, P. W. (2003). Real-time emotion recognition using biologically inspired models. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2688, 119–127. https://doi.org/10.1007/3-540-44887-x_15