Embodied Gesture Processing: Motor-Based Integration of Perception and Action in Social Artificial Agents

Abstract

A close coupling of perception and action processes is assumed to play an important role in basic capabilities of social interaction, such as guiding attention and observing others' behavior, coordinating the form and functions of one's own behavior, and grounding the understanding of others' behavior in one's own experience. In an attempt to endow artificial embodied agents with similar abilities, we present a probabilistic model that integrates the perception and generation of hand-arm gestures via a hierarchy of shared motor representations, allowing for combined bottom-up and top-down processing. Results from human-agent interactions are reported that demonstrate the model's performance in learning, observation, imitation, and generation of gestures.
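To illustrate the kind of hierarchical, probabilistic coupling of perception and action the abstract describes, the following is a minimal Python sketch, not the authors' model: it assumes a toy two-level hierarchy of hypothetical "motor programs" and "gesture schemas" and combines bottom-up evidence with top-down priors via Bayes' rule. All names, states, and probabilities are illustrative assumptions.

import numpy as np

class Level:
    """One level of a toy motor hierarchy (e.g. motor programs, gesture schemas)."""

    def __init__(self, states, likelihood):
        # states: hypothesis labels at this level.
        # likelihood[i, j] = P(evidence j from the level below | state i); rows sum to 1.
        self.states = states
        self.likelihood = np.asarray(likelihood, dtype=float)
        self.belief = np.full(len(states), 1.0 / len(states))  # uniform initial belief

    def update(self, evidence_index, top_down_prior=None):
        """Bayesian update: combine bottom-up evidence with an optional top-down prior."""
        prior = self.belief if top_down_prior is None else np.asarray(top_down_prior, dtype=float)
        posterior = prior * self.likelihood[:, evidence_index]
        self.belief = posterior / posterior.sum()
        return self.belief

# Hypothetical two-level hierarchy: motor programs explain observed trajectory
# features; gesture schemas explain motor programs.
programs = Level(["wave", "point"], [[0.8, 0.2], [0.3, 0.7]])
schemas = Level(["greeting", "deixis"], [[0.9, 0.1], [0.2, 0.8]])

# Bottom-up pass: an observed feature (index 0, e.g. oscillatory hand motion)
# updates the program level; the program belief then serves as evidence for the
# schema level (here reduced to its most likely program for simplicity).
programs.update(evidence_index=0)
schemas.update(evidence_index=int(np.argmax(programs.belief)))

# Top-down pass: the schema belief is mapped back to a prior over programs
# (P(program) = sum over schemas of P(schema) * P(program | schema)), so expectations
# from higher levels bias the interpretation of the next, possibly ambiguous, observation.
top_down_prior = schemas.belief @ schemas.likelihood
programs.update(evidence_index=1, top_down_prior=top_down_prior)

print("program belief:", dict(zip(programs.states, programs.belief.round(2))))
print("schema belief:", dict(zip(schemas.states, schemas.belief.round(2))))

In this sketch, perception (inferring which gesture is being observed) and generation (sampling or selecting a gesture from the same beliefs) would operate over the same shared representations, which is the point of the motor-based integration described above.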

Citation (APA)

Sadeghipour, A., & Kopp, S. (2011). Embodied Gesture Processing: Motor-Based Integration of Perception and Action in Social Artificial Agents. Cognitive Computation, 3(3), 419–435. https://doi.org/10.1007/s12559-010-9082-z
