Performance driven facial animation by appearance based tracking


Abstract

We present a method that estimates high-level animation parameters (muscle contractions, eye movements, eyelid opening, jaw motion and lip contractions) from a marker-less face image sequence. We use an efficient appearance-based tracker to stabilise images of the upper (eyes and eyebrows) and lower (mouth) face. Using a set of stabilised images with known animation parameters, we learn a re-animation matrix that allows us to estimate the parameters of a new image. The system is able to re-animate a 32-DOF 3D face model in real time. © Springer-Verlag Berlin Heidelberg 2005.
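
The re-animation matrix described in the abstract amounts to a linear map from the appearance of a stabilised face region to the animation parameters. The sketch below is not the authors' code: it only illustrates, under stated assumptions, how such a map could be fitted by least squares on training pairs and applied to a new stabilised image; the function names, array shapes and synthetic data are hypothetical.

# Minimal sketch (assumed, not from the paper): fit a "re-animation matrix"
# mapping vectorized stabilised face images to animation parameters.
import numpy as np

def learn_reanimation_matrix(stabilised_images, parameters):
    # stabilised_images: (n_samples, h, w) stabilised grey-level patches
    # parameters:        (n_samples, n_params) known animation parameters
    X = stabilised_images.reshape(len(stabilised_images), -1)   # vectorize patches
    X = np.hstack([X, np.ones((X.shape[0], 1))])                # append bias term
    # Least-squares fit: find B with X @ B ~= parameters
    B, *_ = np.linalg.lstsq(X, parameters, rcond=None)
    return B.T                                                  # (n_params, n_pixels + 1)

def estimate_parameters(R, stabilised_image):
    # Apply the learned matrix to a new stabilised image.
    x = np.append(stabilised_image.ravel(), 1.0)
    return R @ x

# Hypothetical usage with synthetic data, for shape illustration only:
rng = np.random.default_rng(0)
imgs = rng.random((200, 24, 32))     # 200 stabilised patches of 24x32 pixels
params = rng.random((200, 32))       # 32 animation parameters (32-DOF model)
R = learn_reanimation_matrix(imgs, params)
print(estimate_parameters(R, imgs[0]).shape)   # -> (32,)

In practice the appearance vectors would likely be reduced (e.g. by a subspace projection) before the regression, but the linear-map idea is the same.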

Citation (APA)

Buenaposada, J. M., Muñoz, E., & Baumela, L. (2005). Performance driven facial animation by appearance based tracking. In Lecture Notes in Computer Science (Vol. 3522, pp. 476–483). Springer Verlag. https://doi.org/10.1007/11492429_58
