A controller-based animation system for synchronizing and realizing human-like conversational behaviors

Abstract

Embodied Conversational Agents (ECAs) are an application of virtual characters that is the subject of considerable ongoing research. An essential prerequisite for creating believable ECAs is the ability to describe and visually realize multimodal conversational behaviors. The recently developed Behavior Markup Language (BML) addresses this requirement by providing a means to specify the physical realization of multimodal behaviors through human-readable scripts. In this paper we present an approach to implementing a behavior realizer compatible with the BML language. The system's architecture is based on hierarchical controllers which apply preprocessed behaviors to body modalities. The animation database is easily extensible and contains behavior examples built upon existing lexicons and the theory of gestures. Furthermore, we describe a novel solution to the problem of synchronizing gestures with synthesized speech using neural networks, and propose improvements to the BML specification. © 2010 Springer-Verlag.
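The hierarchical-controller idea from the abstract can be illustrated with a minimal sketch: a root controller delegates each preprocessed behavior to a per-modality sub-controller, which would drive the character's animation. All names here (Controller, Behavior, apply, and their fields) are hypothetical, invented for illustration; the paper does not publish this API, and the actual system's design may differ.

```python
# Minimal sketch of a hierarchical controller architecture for a BML
# realizer. Hypothetical API, not the paper's actual implementation.

from dataclasses import dataclass, field


@dataclass
class Behavior:
    """A preprocessed behavior with its target modality and timing."""
    modality: str        # e.g. "gesture", "head", "speech"
    start: float         # start time (seconds), resolved from BML sync points
    duration: float
    payload: dict = field(default_factory=dict)


class Controller:
    """A controller responsible for one body modality."""

    def __init__(self, modality: str):
        self.modality = modality
        self.children: list["Controller"] = []

    def add_child(self, child: "Controller") -> None:
        self.children.append(child)

    def apply(self, behavior: Behavior, t: float) -> None:
        # Dispatch to the child that owns the behavior's modality;
        # a leaf controller would animate the skeleton here.
        for child in self.children:
            if child.modality == behavior.modality:
                child.apply(behavior, t)


# Usage: the root controller delegates to per-modality sub-controllers.
root = Controller("body")
root.add_child(Controller("gesture"))
root.add_child(Controller("head"))

beat = Behavior(modality="gesture", start=0.4, duration=1.2,
                payload={"lexeme": "beat"})
root.apply(beat, t=0.4)
```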

Citation (APA)

Čereković, A., Pejša, T., & Pandžić, I. S. (2010). A controller-based animation system for synchronizing and realizing human-like conversational behaviors. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5967 LNCS, pp. 80–91). Springer-Verlag. https://doi.org/10.1007/978-3-642-12397-9_6
