Auditory to visual cross-modal adaptation for emotion: Psychophysical and neural correlates


Abstract

Adaptation is fundamental in sensory processing and has been studied extensively within the same sensory modality. However, little is known about adaptation across sensory modalities, especially in the context of high-level processing, such as the perception of emotion. Previous studies have shown that prolonged exposure to a face exhibiting one emotion, such as happiness, leads to contrastive biases in the perception of subsequently presented faces toward the opposite emotion, such as sadness. Such work has shown the importance of adaptation in calibrating face perception based on prior visual exposure. In the present study, we showed for the first time that emotion-laden sounds, like laughter, adapt the visual perception of emotional faces; that is, subjects more frequently perceived faces as sad after listening to a happy sound. Furthermore, via electroencephalography recordings and event-related potential analysis, we showed that there was a neural correlate underlying the perceptual bias: after exposure to a happy sound, the response to happy test faces at ~400 ms was attenuated, and the response to sad test faces was quickened. Our results provide the first direct evidence for a behavioral cross-modal adaptation effect on the perception of facial emotion, and its neural correlate.

Citation (APA)

Wang, X., Guo, X., Chen, L., Liu, Y., Goldberg, M. E., & Xu, H. (2017). Auditory to visual cross-modal adaptation for emotion: Psychophysical and neural correlates. Cerebral Cortex, 27(2), 1337–1346. https://doi.org/10.1093/cercor/bhv321
