Our perceptual experience is formed by combining incoming sensory information with prior knowledge and expectations. When speech is not fully intelligible, non-acoustic information may be particularly important. Predictions about a degraded acoustic signal can be provided extrinsically (for example, by presenting a written cue) or intrinsically (if the speech is still partially intelligible). Here I review two studies in which the neural response to speech was measured using magnetoencephalography (MEG), with speech clarity parametrically manipulated using noise vocoding. In a study of isolated word processing, accurate predictions provided by written text enhanced subjective clarity and changed the response in early auditory processing regions of temporal cortex. In a separate study examining connected speech, the phase of ongoing cortical oscillations was matched to that of the acoustic speech envelope in the syllable-rate range (4-8 Hz). Critically, this phase-locking was enhanced in left temporal cortex when speech was intelligible. Both experiments thus highlight neural responses in brain regions associated with relatively low-level speech perception. Together, these findings support the ability of linguistic information to provide predictions that shape auditory processing of spoken language, particularly when acoustic clarity is compromised. © 2013 Acoustical Society of America.
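As an illustration of the syllable-rate phase-locking described above, the following Python sketch shows one common way to quantify phase alignment between a cortical signal and the speech amplitude envelope: band-pass both signals at 4-8 Hz, extract instantaneous phase with the Hilbert transform, and compute the phase-locking value (PLV). This is an assumed, illustrative analysis; the function names and surrogate data are mine, and it is not necessarily the pipeline used in the studies reviewed.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    # Zero-phase Butterworth band-pass filter (hypothetical helper)
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def phase_locking_value(meg, audio, fs, band=(4.0, 8.0)):
    # PLV between a cortical signal and the speech amplitude envelope,
    # restricted to the syllable-rate band (4-8 Hz by default).
    envelope = np.abs(hilbert(audio))          # broadband amplitude envelope
    env_band = bandpass(envelope, *band, fs)   # envelope fluctuations at 4-8 Hz
    meg_band = bandpass(meg, *band, fs)        # cortical signal in the same band
    phi_env = np.angle(hilbert(env_band))      # instantaneous phase of envelope
    phi_meg = np.angle(hilbert(meg_band))      # instantaneous phase of MEG signal
    # Magnitude of the mean phase-difference vector: 0 = no locking, 1 = perfect
    return np.abs(np.mean(np.exp(1j * (phi_meg - phi_env))))

# Surrogate data sampled at 1 kHz: noise with a 5 Hz amplitude modulation
# paired with a noisy 5 Hz oscillation standing in for the cortical signal.
fs = 1000
t = np.arange(0, 10, 1 / fs)
audio = np.random.randn(t.size) * (1 + 0.5 * np.sin(2 * np.pi * 5 * t))
meg = np.sin(2 * np.pi * 5 * t + 0.3) + 0.5 * np.random.randn(t.size)
print(f"PLV (4-8 Hz): {phase_locking_value(meg, audio, fs):.2f}")

A PLV near 1 indicates that the cortical signal maintains a consistent phase relationship with the speech envelope in this band; a value near 0 indicates no consistent alignment, which is the contrast the intelligibility effect in left temporal cortex rests on.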
CITATION STYLE
Peelle, J. E. (2013). Cortical responses to degraded speech are modulated by linguistic predictions. In Proceedings of Meetings on Acoustics (Vol. 19). https://doi.org/10.1121/1.4798652