We present a method that estimates high-level animation parameters (muscle contractions, eye movements, eyelid opening, jaw motion, and lip contractions) from a markerless face image sequence. We use an efficient appearance-based tracker to stabilise images of the upper (eyes and eyebrows) and lower (mouth) face. From a set of stabilised images with known animation parameters, we learn a re-animation matrix that lets us estimate the parameters of a new image. The system is able to re-animate a 32 DOF 3D face model in real time. © Springer-Verlag Berlin Heidelberg 2005.
CITATION STYLE
Buenaposada, J. M., Muñoz, E., & Baumela, L. (2005). Performance driven facial animation by appearance based tracking. In Lecture Notes in Computer Science (Vol. 3522, pp. 476–483). Springer Verlag. https://doi.org/10.1007/11492429_58