Hearing in a “Moving” Visual World: Coordinate Transformations Along the Auditory Pathway

  • Willett, S. M.
  • Groh, J. M.
  • Maddox, R. K.

Abstract

This chapter reviews the literature on how auditory signals are transformed into a coordinate system that facilitates interactions with the visual system. Sound location is deduced from cues that depend on the position of the sound with respect to the head, whereas visual location is deduced from the pattern of light falling on the retina, yielding an eye-centered code. Connecting sights and sounds that originate from the same position in the physical world therefore requires the brain to incorporate information about the position of the eyes with respect to the head. Eye position has been found to interact with auditory signals at every level of the auditory pathway that has been tested, but the result is usually a hybrid reference frame that is neither purely head- nor eye-centered. Because computing a coordinate transformation is, in principle, easy, the looseness of the computational constraints may be what permits such hybrid coding. The chapter concludes with a review of the behavioral literature on the effects of eye gaze on auditory spatial perception and a discussion of its consistency with the physiological observations.
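As an illustration only (not drawn from the chapter itself), the head-to-eye-centered transformation described above can be sketched in one dimension: the eye-centered azimuth of a sound is its head-centered azimuth minus the horizontal eye-in-head position, and a hybrid code can be modeled as a weighted mix of the two frames. The variable names and the weighting parameter w below are assumptions made for illustration.

    # Minimal sketch of a 1-D (azimuth) reference-frame transformation.
    def eye_centered_azimuth(sound_az_re_head, eye_az_re_head):
        """Shift a head-centered sound azimuth into eye-centered coordinates."""
        return sound_az_re_head - eye_az_re_head

    def hybrid_azimuth(sound_az_re_head, eye_az_re_head, w=0.5):
        """Weighted mix of head- and eye-centered coordinates (w=0 head, w=1 eye)."""
        eye_centered = eye_centered_azimuth(sound_az_re_head, eye_az_re_head)
        return (1 - w) * sound_az_re_head + w * eye_centered

    # Example: a sound 20 deg right of the head midline, with the eyes deviated
    # 10 deg to the right, lies 10 deg right of the line of sight.
    print(eye_centered_azimuth(20.0, 10.0))   # 10.0
    print(hybrid_azimuth(20.0, 10.0, w=0.5))  # 15.0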

Cite

Willett, S. M., Groh, J. M., & Maddox, R. K. (2019). Hearing in a “Moving” Visual World: Coordinate Transformations Along the Auditory Pathway (pp. 85–104). https://doi.org/10.1007/978-3-030-10461-0_5
