Specific EEG/ERP responses to dynamic facial expressions in virtual reality environments


Abstract

Visual event-related potentials elicited by facial expressions (FEs) have typically been studied using static stimuli presented after a nonspecific black screen as a baseline. However, when studying social events, the low ecological validity of the environment and stimuli can introduce bias. Virtual reality offers a way to improve ecological validity while retaining stimulus control. We propose a new approach to studying responses to FEs: a human avatar in a virtual environment (a plaza) performs the six universal FEs over time. The setup consisted of a 3D projection system coupled with a precision position tracker. Subjects (N=6, mean age=25.6y) wore a 32-channel EEG cap together with 3D glasses and two infrared emitters for position tracking. The environment adapted in real time to each subject's position, producing a feeling of immersion. Each animation consisted of an instantaneous morph to the target FE, which was maintained for one second before 'unmorphing' back to the neutral expression over another second. The inter-trial interval was set to three seconds, with the neutral facial expression kept as the baseline for one second before the morph onset of any facial expression. Over the occipito-temporal region, we found a right-asymmetrical negativity 150-350 ms after stimulus onset. Time-frequency analysis showed a significant difference in the beta frequency band (20-25 Hz) around 350 ms in the temporal lobe for the processing of the different facial expressions. This result suggests an important role of the temporal lobe in discriminating facial expressions. Furthermore, this study provides a proof of concept for coupling a complex virtual reality setup with an EEG system to study dynamic, ecological social stimuli.
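The trial timing and the band-power measure described above can be sketched as follows. This is a minimal illustration under assumed names (`trial_schedule`, `beta_band_power` are hypothetical helpers, not the authors' code), and it uses a simple FFT-based band-power estimate in place of the wavelet-style time-frequency decomposition typically used in such analyses:

```python
import numpy as np

# Trial timing from the abstract: instantaneous morph to the target
# expression, held for 1 s, then a 1 s 'unmorph' back to neutral;
# 3 s inter-trial interval, with the final 1 s of the neutral face
# serving as the pre-stimulus baseline.
MORPH_HOLD_S = 1.0
UNMORPH_S = 1.0
ITI_S = 3.0
BASELINE_S = 1.0

def trial_schedule(n_trials):
    """Return (morph_onset, morph_offset) times in seconds per trial,
    assuming back-to-back trials (a simplifying assumption)."""
    trial_len = MORPH_HOLD_S + UNMORPH_S + ITI_S
    return [(i * trial_len, i * trial_len + MORPH_HOLD_S)
            for i in range(n_trials)]

def beta_band_power(signal, fs, band=(20.0, 25.0)):
    """Fraction of spectral power in the given band (Hz),
    estimated from the FFT power spectrum of one epoch."""
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum() / psd.sum()
```

In practice, condition differences in the 20-25 Hz band around 350 ms would be assessed per channel and time window across epochs; the fraction returned here is only a per-epoch summary of how much power falls in the beta band.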


Simões, M., Amaral, C., Carvalho, P., & Castelo-Branco, M. (2014). Specific EEG/ERP responses to dynamic facial expressions in virtual reality environments. In IFMBE Proceedings (Vol. 42, pp. 331–334). Springer Verlag. https://doi.org/10.1007/978-3-319-03005-0_84
