The emotional expression of the face provides an important social signal that allows humans to make inferences about other people's states of mind. However, the underlying brain mechanisms are complex and still not completely understood. Using magnetoencephalography (MEG), we analyzed the spatiotemporal structure of regional electrical brain activity in human adults during a categorization task (faces or hands) and an emotion discrimination task (happy faces or neutral faces). Brain regions specifically involved in different aspects of processing emotional facial expressions showed distinct hemispheric dominance patterns. The dorsal brain regions showed a right predominance when participants paid attention to facial expressions: the right parietofrontal regions, including the somatosensory, motor/premotor, and inferior frontal cortices, showed significantly increased activation in the emotion discrimination task compared with the categorization task at latencies of 350 to 550 ms, while no such activation was found in their left hemispheric counterparts. Furthermore, within the emotion discrimination task, the ventral brain regions showed a left predominance for happy faces compared with neutral faces at latencies of 350 to 550 ms. Thus, the present data suggest that the right and left hemispheres play different roles in the recognition of facial expressions depending on cognitive context. © 2014 Nakamura et al.
Nakamura, A., Maess, B., Knösche, T. R., & Friederici, A. D. (2014). Different hemispheric roles in recognition of happy expressions. PLoS ONE, 9(2), e88628. https://doi.org/10.1371/journal.pone.0088628