Realtime Performance-Based Facial Animation

Abstract

This paper presents a system for performance-based character animation that enables any user to control the facial expressions of a digital avatar in realtime. The user is recorded in a natural environment using a non-intrusive, commercially available 3D sensor. The simplicity of this acquisition device comes at the cost of high noise levels in the acquired data. To effectively map low-quality 2D images and 3D depth maps to realistic facial expressions, we introduce a novel face tracking algorithm that combines geometry and texture registration with pre-recorded animation priors in a single optimization. Formulated as a maximum a posteriori estimation in a reduced parameter space, our method implicitly exploits temporal coherence to stabilize the tracking. We demonstrate that compelling 3D facial dynamics can be reconstructed in realtime without the use of face markers, intrusive lighting, or complex scanning hardware. This makes our system easy to deploy and facilitates a range of new applications, e.g. in digital gameplay or social interactions.
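For intuition, the per-frame optimization the abstract describes can be sketched in LaTeX as follows. This is a hedged reconstruction, not the paper's exact notation: the expression parameters x_t (e.g., blendshape-style weights in the reduced parameter space), the frame observations D_t (depth map plus 2D image), and the energy names E_geo and E_tex are labels introduced here for illustration.

\hat{\mathbf{x}}_t
  = \arg\max_{\mathbf{x}_t} \; p(D_t \mid \mathbf{x}_t)\,
      p(\mathbf{x}_t \mid \mathbf{x}_{t-1}, \ldots, \mathbf{x}_{t-k})
  = \arg\min_{\mathbf{x}_t} \; E_{\mathrm{geo}}(\mathbf{x}_t; D_t)
      + E_{\mathrm{tex}}(\mathbf{x}_t; D_t)
      - \log p(\mathbf{x}_t \mid \mathbf{x}_{t-1}, \ldots, \mathbf{x}_{t-k})

Here the negative log-likelihood is split into a geometry registration energy against the depth map and a texture registration energy against the image, while the prior term is learned from pre-recorded animation sequences. Because the prior conditions each frame's parameters on the preceding frames, temporal coherence is exploited implicitly, which is what stabilizes the tracking against the sensor's high noise levels.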

Citation (APA)

Weise, T., Bouaziz, S., Li, H., & Pauly, M. (2011). Realtime Performance-Based Facial Animation. ACM Transactions on Graphics, 30(4), 1–10. https://doi.org/10.1145/2010324.1964972
