Closing the loop: Towards tightly synchronized robot gesture and speech

Abstract

To engage in natural interactions with humans, social robots should produce speech-accompanying non-verbal behaviors such as hand and arm gestures. Given the special constraints imposed by the physical properties of a humanoid robot, successful multimodal synchronization is difficult to achieve. Introducing the first closed-loop approach to speech and gesture generation for humanoid robots, we propose a multimodal scheduler for improved synchronization based on two novel features, namely an experimentally fitted forward model and a feedback-based adaptation mechanism. Technical results obtained with the implemented scheduler demonstrate the feasibility of our approach; empirical results from an evaluation study highlight the implications of the present work. © Springer International Publishing 2013.
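The closed-loop idea described in the abstract — a forward model that predicts gesture timing, corrected by feedback from executed motion — can be illustrated with a minimal sketch. Everything below is a hypothetical toy illustration, not the authors' implementation: the class name, the linear forward model, and the learning rate `alpha` are all assumptions made for clarity.

```python
# Toy sketch of a closed-loop multimodal scheduler: a forward model
# predicts gesture duration, and feedback from executed motion adapts
# future predictions so speech and gesture stay synchronized.
# All coefficients and names are illustrative assumptions.

class ClosedLoopScheduler:
    def __init__(self, alpha=0.5):
        self.alpha = alpha            # feedback learning rate (assumed)
        self.correction = {}          # learned per-gesture correction (s)

    def forward_model(self, gesture, distance):
        # Toy forward model: predicted stroke duration grows linearly
        # with the distance the hand must travel (made-up coefficients).
        return 0.3 + 0.8 * distance

    def predict(self, gesture, distance):
        # Prediction = open-loop forward model + learned correction.
        return self.forward_model(gesture, distance) + self.correction.get(gesture, 0.0)

    def schedule(self, gesture, distance, affiliate_onset):
        # Start the gesture early enough that its stroke is predicted
        # to land on the onset of the affiliated word.
        return affiliate_onset - self.predict(gesture, distance)

    def feedback(self, gesture, distance, observed_duration):
        # Feedback-based adaptation: nudge the correction toward the
        # observed prediction error, closing the loop.
        error = observed_duration - self.predict(gesture, distance)
        c = self.correction.get(gesture, 0.0)
        self.correction[gesture] = c + self.alpha * error
```

With repeated executions, `feedback` shifts each gesture's predicted duration toward what the robot's body actually achieves, which is the essence of the closed-loop adaptation the abstract contrasts with fixed, open-loop scheduling.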

Citation (APA)

Salem, M., Kopp, S., & Joublin, F. (2013). Closing the loop: Towards tightly synchronized robot gesture and speech. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8239 LNAI, pp. 381–391). https://doi.org/10.1007/978-3-319-02675-6_38
