Analysis of articulated motion for social signal processing

Abstract

Companion technologies aim to develop sustained long-term relationships with their users by employing non-verbal communication (NVC) skills. Visual NVC signals can be conveyed over a variety of non-verbal channels, such as facial expressions, gestures, or spatio-temporal behavior. It remains a challenge to equip technical systems with human-like abilities to detect and analyze such social signals reliably and effortlessly. In this chapter, we focus our investigation on modeling visual mechanisms for processing and analyzing human articulated motion and posture information at intermediate to remote viewing distances. From a modeling perspective, we investigate how visual features, and their integration over several stages of a processing hierarchy, contribute to the formation of articulated motion representations. We build upon known structures and mechanisms in cortical networks of primates and emphasize how generic processing principles might realize the building blocks for such network-based distributed processing through learning. We demonstrate how feature representations in segregated pathways, and their convergence, lead to integrated form and motion representations, using artificially generated articulated motion sequences.
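The abstract outlines an architecture of two segregated feature pathways (form and motion) whose responses converge into a joint representation after several hierarchical processing stages. As a rough illustration only, the following NumPy sketch shows that overall shape; the layer sizes, random weights, and half-wave rectification nonlinearity are assumptions for illustration, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def pathway(x, weights):
    """One feedforward pathway: linear filtering followed by half-wave
    rectification at each stage (a common simplification of cortical
    feature-processing stages; an assumption here, not the chapter's model)."""
    for W in weights:
        x = np.maximum(W @ x, 0.0)
    return x

# Hypothetical inputs -- stand-ins for a flattened image frame (form)
# and a field of motion responses (e.g., optic-flow magnitudes).
frame = rng.random(256)
motion = rng.random(256)

# Two segregated pathways, each a small two-stage hierarchy.
# All dimensions are arbitrary choices for this sketch.
form_weights = [rng.standard_normal((128, 256)) * 0.05,
                rng.standard_normal((64, 128)) * 0.05]
motion_weights = [rng.standard_normal((128, 256)) * 0.05,
                  rng.standard_normal((64, 128)) * 0.05]

form_rep = pathway(frame, form_weights)
motion_rep = pathway(motion, motion_weights)

# Convergence stage: responses from both pathways are combined into an
# integrated form-and-motion representation.
W_conv = rng.standard_normal((32, 128)) * 0.05
integrated = np.maximum(W_conv @ np.concatenate([form_rep, motion_rep]), 0.0)
print(integrated.shape)  # (32,) -- the integrated representation
```

In the setting the abstract describes, the weights would be acquired through learning rather than drawn at random, and the motion input would come from spatio-temporal filtering of articulated motion sequences rather than random values.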

Citation (APA)

Layher, G., Glodek, M., & Neumann, H. (2017). Analysis of articulated motion for social signal processing. In Cognitive Technologies (pp. 345–364). Springer. https://doi.org/10.1007/978-3-319-43665-4_17
