Re-mapping animation parameters between multiple types of facial model

Abstract

In this paper we describe a method for re-mapping animation parameters between multiple types of facial model for performance-driven animation. A facial performance can be analysed automatically in terms of a set of facial action trajectories using a modified appearance model whose modes of variation encode specific facial actions. These parameters can then be used to animate other appearance models or 3D facial models. Thus, the animation parameters analysed from the video performance may be re-used to animate multiple types of facial model. We demonstrate the effectiveness of our approach by measuring its ability to extract action parameters from performances and by displaying frames from example animations. © Springer-Verlag Berlin Heidelberg 2007.
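
The abstract suggests a linear appearance-model pipeline: project each video frame onto action-specific modes to obtain parameter trajectories, then drive a second model with the same parameters. The sketch below illustrates that idea only; the `AppearanceModel` class, its `analyse`/`synthesise` methods, and all data are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of the re-mapping idea, assuming each facial model is a
# linear appearance model (mean vector plus a matrix of action modes).
import numpy as np

class AppearanceModel:
    """Linear model: instance = mean + modes @ params.

    `modes` is a (D, K) matrix whose K columns each encode one specific
    facial action (e.g. mouth-open, brow-raise)."""
    def __init__(self, mean, modes):
        self.mean = mean    # (D,) mean appearance vector
        self.modes = modes  # (D, K) orthonormal action modes

    def analyse(self, frame):
        """Project a frame onto the action modes -> K action parameters."""
        return self.modes.T @ (frame - self.mean)

    def synthesise(self, params):
        """Reconstruct an appearance vector from action parameters."""
        return self.mean + self.modes @ params

# Example: extract action-parameter trajectories from a source performance
# and re-use them to drive a second (target) model with corresponding modes.
rng = np.random.default_rng(0)
D, K, n_frames = 100, 4, 25
source = AppearanceModel(rng.normal(size=D),
                         np.linalg.qr(rng.normal(size=(D, K)))[0])
target = AppearanceModel(rng.normal(size=D),
                         np.linalg.qr(rng.normal(size=(D, K)))[0])

video = [source.synthesise(rng.normal(size=K)) for _ in range(n_frames)]
trajectories = np.stack([source.analyse(f) for f in video])  # (n_frames, K)
animation = [target.synthesise(p) for p in trajectories]     # re-targeted frames
```

The key assumption in this sketch is that the two models' modes are aligned, so that the k-th mode encodes the same facial action in each; the modified appearance model described in the abstract is what makes such a shared action parameterisation possible.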

Citation (APA)

Cosker, D., Roy, S., Rosin, P. L., & Marshall, D. (2007). Re-mapping animation parameters between multiple types of facial model. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4418 LNCS, pp. 365–376). Springer Verlag. https://doi.org/10.1007/978-3-540-71457-6_33
