An affective user interface based on facial expression recognition and eye-gaze tracking

Abstract

This paper describes a pipeline in which the user's facial expression and eye gaze are tracked, and 3D facial animation is then synthesized at a remote site based on timing information from the facial and eye movements. The system first detects the facial area within a given image and classifies the facial expression into seven emotional weightings. This weighting information, transmitted to a PDA over a mobile network, drives a non-photorealistic facial expression animation. Facial expression animation using emotional curves turns out to be more effective at expressing the timing of an expression than the linear interpolation method. The emotional avatar embedded on a mobile platform shows potential for conveying emotion between people over the Internet. © Springer-Verlag Berlin Heidelberg 2005.
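The abstract contrasts animation driven by emotional curves with plain linear interpolation of the seven emotion weightings. A minimal sketch of that idea is shown below; the function names, the smoothstep-style curve, and the weight dictionaries are assumptions for illustration, not the authors' actual implementation:

```python
# Sketch (assumed names and curve shape): interpolate between two
# 7-dimensional emotion weight vectors either linearly or through a
# non-linear "emotional curve" that eases the onset and settling of
# an expression, which the paper reports conveys timing better.

EMOTIONS = ["neutral", "happy", "sad", "angry", "surprised", "fearful", "disgusted"]

def linear(t: float) -> float:
    """Linear timing: constant-speed transition."""
    return t

def emotional_curve(t: float) -> float:
    """Smoothstep ease-in/ease-out, standing in for the paper's
    emotion-specific timing curves (an assumption)."""
    return t * t * (3.0 - 2.0 * t)

def blend_weights(src: dict, dst: dict, t: float, curve=emotional_curve) -> dict:
    """Blend two emotion weight vectors at normalized time t in [0, 1]."""
    s = curve(t)
    return {e: (1.0 - s) * src.get(e, 0.0) + s * dst.get(e, 0.0) for e in EMOTIONS}

# Example: transitioning from a neutral face to a happy one. Early in the
# transition (t = 0.25) the eased curve keeps the face closer to neutral
# than linear blending would, mimicking a slower expression onset.
neutral = {"neutral": 1.0}
happy = {"happy": 1.0}
eased = blend_weights(neutral, happy, 0.25)
straight = blend_weights(neutral, happy, 0.25, curve=linear)
```

The design point is that only the timing function changes; the per-frame weighting vector sent to the animation layer keeps the same 7-emotion format either way.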

Citation (APA)

Choi, S. M., & Kim, Y. G. (2005). An affective user interface based on facial expression recognition and eye-gaze tracking. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 3784 LNCS, pp. 907–914). Springer Verlag. https://doi.org/10.1007/11573548_116
