Faces provide a wealth of information essential to social interaction, including both static features, such as identity, and dynamic features, such as emotional state. Classic models of face perception propose separate neural-processing routes for identity and facial expression (Bruce & Young, 1986), but more recent models suggest that these routes are not independent of each other (Calder & Young, 2005). In the present study, we used a perceptual adaptation paradigm to further examine the relation between the neural representations of identity and emotional expression. In Experiment 1, adaptation to the basic emotions of anger, surprise, disgust, and fear biased perception significantly away from the adapting expression. This aftereffect was significantly reduced when the adapting and test faces differed in identity. Using a statistical model that separated surface texture and reflectance from underlying expression geometry, Experiment 2 showed a similar decrease in adaptation when the face stimuli shared identical underlying prototypical geometry but differed in the static surface features supporting identity. These results provide evidence that expression adaptation depends on perceptual features important for identity processing and thus suggest at least partly overlapping neural processing of identity and facial expression. Copyright 2008 Psychonomic Society, Inc.
Ellamil, M., Susskind, J. M., & Anderson, A. K. (2008). Examinations of identity invariance in facial expression adaptation. Cognitive, Affective, & Behavioral Neuroscience, 8(3), 273–281. https://doi.org/10.3758/CABN.8.3.273