Auditory Modulation of Multisensory Representations


Abstract

Motor control, motor learning, and interpersonal coordination are based on motor perception and emergent perceptuomotor representations. In early stages, motor learning and interpersonal coordination rely heavily on visual information: learners observe others and transform that information into internal representations that guide their own behavior. As learning progresses and a new motor pattern is established through repeated physical practice, other perceptual modalities are added. In contrast to the vast majority of publications on motor learning and interpersonal coordination, which refer to a single perceptual modality, here we treat the perceptual system as a unitary system that coordinates and unifies the information of all involved perceptual modalities. The main focus of this contribution is the relation between perceptual streams of different modalities and the intermodal processing and multisensory integration of information as a basis for motor control and learning. Multi-/intermodal processing of perceptual streams yields multimodal representations and opens up new approaches to supporting motor learning and interpersonal coordination: by creating an additional perceptual stream, auditory movement information can be generated that is suitable for integration with information from other modalities, thereby modulating the resulting perceptuomotor representations without demands on attention or higher cognition. Here, the concept of movement-defined real-time acoustics is used to serve the auditory system with an additional movement-auditory stream. Before the computational approach of kinematic real-time sonification is described, special attention is given to the level of the adaptation modules of the internal models, and the concept is compared with other approaches to providing additional acoustic movement information.
Moreover, we outline the perspective of this approach across a broad spectrum of new applications supporting motor control and learning in sports and motor rehabilitation, as well as joint action and interpersonal coordination between humans and in human-robot interaction.
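The core idea of kinematic real-time sonification can be illustrated with a minimal sketch (not the authors' implementation): a kinematic parameter, here joint angular velocity, is mapped continuously onto an auditory parameter such as pitch, so that each motion-capture sample updates the sound in real time. The ranges and the linear mapping below are illustrative assumptions.

```python
# Illustrative sketch of kinematic sonification: mapping angular
# velocity (rad/s) linearly onto an oscillator frequency band (Hz).

def velocity_to_pitch(velocity, v_max=10.0, f_min=220.0, f_max=880.0):
    """Map the magnitude of an angular velocity onto [f_min, f_max]."""
    v = max(0.0, min(abs(velocity), v_max))  # clamp to the expected range
    return f_min + (v / v_max) * (f_max - f_min)

# Each incoming motion-capture sample would update the oscillator
# frequency, producing a continuous movement-auditory stream.
stream = [velocity_to_pitch(v) for v in (0.0, 2.5, 5.0, 10.0)]
print(stream)  # [220.0, 385.0, 550.0, 880.0]
```

In a real system the resulting frequency would drive an audio synthesis engine at the motion-capture sampling rate; the linear pitch mapping is only one of many possible parameter mappings (loudness, timbre, and spatial position are common alternatives).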

Citation (APA)

Effenberg, A. O., Hwang, T. H., Ghai, S., & Schmitz, G. (2018). Auditory Modulation of Multisensory Representations. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 11265 LNCS, pp. 284–311). Springer Verlag. https://doi.org/10.1007/978-3-030-01692-0_20
