Automating the transfer of a generic set of behaviors onto a virtual character


Abstract

Humanoid 3D models can be easily acquired from various sources, including online repositories. Using such a model within a game or simulation environment normally requires human intervention to associate it with a relevant set of motions and control mechanisms. In this paper, we demonstrate a pipeline where humanoid 3D models can be incorporated within seconds into an animation system and infused with a wide range of capabilities, such as locomotion, object manipulation, gazing, speech synthesis, and lip syncing. We offer a set of heuristics that can associate arbitrary joint names with canonical ones, and describe a fast retargeting algorithm that enables us to instill a set of behaviors onto an arbitrary humanoid skeleton. We believe that such a system will vastly increase the use of 3D interactive characters due to the ease with which new models can be animated. © 2012 Springer-Verlag Berlin Heidelberg.
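
The joint-name mapping heuristic mentioned in the abstract can be illustrated with a short sketch. The canonical joint names, keyword lists, and side-detection rules below are illustrative assumptions made for this example, not the mapping tables or algorithm actually used by the authors.

```python
# Minimal sketch of mapping arbitrary rig joint names to canonical ones.
# All canonical names and keyword lists here are hypothetical examples.
import re
from typing import Optional

# Each canonical joint: (substring keywords, required side or None).
CANONICAL_JOINTS = {
    "l_hip":      (["hip", "upleg", "thigh"], "left"),
    "r_hip":      (["hip", "upleg", "thigh"], "right"),
    "l_shoulder": (["shoulder", "upperarm", "clavicle"], "left"),
    "r_shoulder": (["shoulder", "upperarm", "clavicle"], "right"),
    "spine":      (["spine", "chest", "torso"], None),
    "head":       (["head"], None),
}

def side_of(name: str) -> Optional[str]:
    """Guess handedness from common markers such as 'Left', 'l_', '_L'."""
    n = name.lower()
    if re.search(r"(^|[_\.])l($|[_\.])|left", n):
        return "left"
    if re.search(r"(^|[_\.])r($|[_\.])|right", n):
        return "right"
    return None

def map_to_canonical(joint_name: str) -> Optional[str]:
    """Return the canonical joint that best matches an arbitrary joint name."""
    n = joint_name.lower()
    joint_side = side_of(joint_name)
    for canonical, (keywords, side) in CANONICAL_JOINTS.items():
        if side is not None and side != joint_side:
            continue  # handedness mismatch, skip this candidate
        if any(k in n for k in keywords):
            return canonical
    return None  # unmapped joints would need user input or further heuristics

if __name__ == "__main__":
    for raw in ["LeftUpLeg", "R_Shoulder", "Spine01", "mixamorig:Head"]:
        print(raw, "->", map_to_canonical(raw))
```

Once every relevant joint of the source skeleton is associated with a canonical name, a retargeting step can transfer the canonical motion set onto the new skeleton; the sketch above only covers the name-matching portion.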


Citation (APA)

Feng, A., Huang, Y., Xu, Y., & Shapiro, A. (2012). Automating the transfer of a generic set of behaviors onto a virtual character. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7660 LNCS, pp. 134–145). Springer Verlag. https://doi.org/10.1007/978-3-642-34710-8_13
