Automatic analysis of affective states: Visual attention based approach

Computing environments are moving from computer-centered to human-centered designs. Humans tend to communicate a wealth of information through affective states or expressions, so the automatic analysis of user affective states has become essential for the computer vision community. In this paper we first focus on understanding how the human visual system (HVS) decodes or recognizes facial expressions. To this end, we conducted a psycho-visual experimental study with an eye-tracker to find which facial region is perceptually most attractive, or salient, for a particular expression. Secondly, based on the results of this study, we propose a novel framework for the automatic analysis of affective states. The framework creates a discriminative feature space by processing only the salient facial regions to extract Pyramid Histogram of Orientation Gradients (PHOG) features. The proposed framework achieved an automatic expression recognition accuracy of 95.3% on the extended Cohn-Kanade (CK+) facial expression database for the six universal facial expressions. We also discuss the generalization capabilities of the proposed framework on unseen data. Finally, the paper discusses the effectiveness of the proposed framework on low-resolution image sequences.
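To illustrate the PHOG descriptor the abstract refers to, the following is a minimal NumPy-only sketch of computing a Pyramid Histogram of Orientation Gradients over one cropped facial region. The function name, the number of bins, and the number of pyramid levels are illustrative assumptions, not the authors' exact configuration; the paper's framework applies such a descriptor only to the salient facial regions identified in the eye-tracking study.

```python
import numpy as np

def phog(region, bins=8, levels=3):
    """Sketch of a PHOG descriptor for a single grayscale region.

    `region` is a 2-D array, e.g. a crop around a salient facial region
    such as the mouth or eyes. At pyramid level l the region is divided
    into a 2^l x 2^l grid; a gradient-magnitude-weighted orientation
    histogram is built per cell, and all histograms are concatenated.
    (Bin count and level count here are illustrative choices.)
    """
    gy, gx = np.gradient(region.astype(float))
    mag = np.hypot(gx, gy)                       # gradient magnitude
    ang = np.mod(np.arctan2(gy, gx), np.pi)      # unsigned orientation in [0, pi)

    features = []
    for level in range(levels):
        n = 2 ** level
        for rows in np.array_split(np.arange(region.shape[0]), n):
            for cols in np.array_split(np.arange(region.shape[1]), n):
                cell_mag = mag[np.ix_(rows, cols)].ravel()
                cell_ang = ang[np.ix_(rows, cols)].ravel()
                hist, _ = np.histogram(cell_ang, bins=bins,
                                       range=(0.0, np.pi), weights=cell_mag)
                features.append(hist)

    feat = np.concatenate(features)
    return feat / (np.linalg.norm(feat) + 1e-12)  # L2-normalize

# With 8 bins and 3 levels the descriptor has 8 * (1 + 4 + 16) = 168 entries.
example = phog(np.random.rand(32, 32))
```

Descriptors computed this way for each salient region can be concatenated into the discriminative feature space the framework feeds to a classifier.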




Khan, R. A., Meyer, A., Konik, H., & Bouakaz, S. (2013). Automatic analysis of affective states: Visual attention based approach. Communications in Computer and Information Science, 414, 98–108.
