How do we perceive emotion from a 3-D virtual talking head: Evidence from functional near-infrared spectroscopy


Abstract

Emotional interaction plays an important role in human-computer interaction. Although more and more virtual heads are endowed with a range of facial expressions in conversational scenarios, the similarities and differences in the neural mechanisms underlying emotional perception of virtual heads versus humans remain largely unknown. In the present study, we used functional near-infrared spectroscopy (fNIRS) to investigate activation in the dorsolateral prefrontal cortex (DLPFC) while humans perceived emotions from 3-D virtual talking heads. Twenty participants viewed dynamic emotional stimuli consisting of 3-D virtual talking heads and human faces, presented either muted or with voice. The behavioral results showed that participants had lower accuracy when watching muted 3-D virtual talking head videos (3DMute) than when watching muted human face videos (HFMute), especially for anger and happiness stimuli, but not for neutral emotion. The fNIRS results showed no difference in DLPFC activity between observing the 3-D virtual talking head and the human face. However, stronger DLPFC activation was observed for 3-D virtual talking head videos with voice (3DVoice) than for 3DMute. In addition, females showed stronger DLPFC activation for angry videos, whereas no gender difference was found for happiness videos. The present work provides preliminary physiological evidence on the mechanisms of emotional perception in human-computer interaction.

Citation (APA)

Wang, J., Chen, J., Yan, N., Wang, L., & Ng, L. (2019). How do we perceive emotion from a 3-D virtual talking head: Evidence from functional near-infrared spectroscopy. In Multi Conference on Computer Science and Information Systems, MCCSIS 2019 - Proceedings of the International Conferences on Interfaces and Human Computer Interaction 2019, Game and Entertainment Technologies 2019 and Computer Graphics, Visualization, Computer Vision and Image Processing 2019 (pp. 115–122). IADIS Press. https://doi.org/10.33965/ihci2019_201906l015
