A neural model of auditory space compatible with human perception under simulated echoic conditions


Abstract

In a typical auditory scene, sounds from different sources and reflective surfaces summate in the ears, causing spatial cues to fluctuate. Prevailing hypotheses of how spatial locations may be encoded and represented across auditory neurons generally disregard these fluctuations and must therefore invoke additional mechanisms for detecting and representing them. Here, we consider a different hypothesis in which spatial perception corresponds to an intermediate or sub-maximal firing probability across spatially selective neurons within each hemisphere. The precedence or Haas effect presents an ideal opportunity for examining this hypothesis, since the temporal superposition of an acoustical reflection with sounds arriving directly from a source can cause otherwise stable cues to fluctuate. Our findings suggest that subjects' experiences may simply reflect the spatial cues that momentarily arise under various acoustical conditions and how these cues are represented. We further suggest that auditory objects may acquire "edges" under conditions when interaural time differences are broadly distributed.

Citation (APA)
Nelson, B. S., Donovan, J. M., & Takahashi, T. T. (2015). A neural model of auditory space compatible with human perception under simulated echoic conditions. PLoS ONE, 10(9). https://doi.org/10.1371/journal.pone.0137900
