Audiotactile multisensory interactions in human information processing


Abstract

The last few years have seen a rapid growth of interest in how signals from different sensory modalities are integrated in the brain to form the unified percepts that fill our daily lives. Research on multisensory interactions between vision, touch, and proprioception has revealed the existence of multisensory spatial representations that code the location of external events relative to our own bodies. In this review, we highlight recent converging evidence from both human and animal studies showing that spatially modulated multisensory interactions also occur between hearing and touch, especially in the space immediately surrounding the head. These spatial audiotactile interactions for stimuli presented close to the head can affect not only the spatial aspects of perception, but also various other non-spatial aspects of audiotactile information processing. Finally, we highlight some of the most important questions for future research in this area. © 2006 Japanese Psychological Association.

Citation (APA)

Kitagawa, N., & Spence, C. (2006, September). Audiotactile multisensory interactions in human information processing. Japanese Psychological Research. https://doi.org/10.1111/j.1468-5884.2006.00317.x
