Recognition of Empathy from Synchronization between Brain Activity and Eye Movement

Abstract

In the era of user-generated content (UGC) and virtual interactions within the metaverse, empathic digital content has become increasingly important. This study aimed to quantify human empathy levels during exposure to digital media. To assess empathy, we analyzed brain wave activity and eye movements in response to emotional videos. Forty-seven participants watched eight emotional videos while their brain activity and eye movement data were recorded, and after each video session they provided subjective evaluations. Our analysis focused on the relationship between brain activity and eye movement in recognizing empathy. The findings revealed the following: (1) Participants were more inclined to empathize with videos depicting pleasant-arousal and unpleasant-relaxed emotions. (2) Saccades and fixations, key components of eye movement, occurred in synchrony with activity in specific channels of the prefrontal and temporal lobes. (3) Eigenvalues of brain activity and pupil changes showed synchronization between the right pupil and certain channels in the prefrontal, parietal, and temporal lobes during empathic responses. These results suggest that eye movement characteristics can serve as an indicator of the cognitive empathic process when engaging with digital content. Furthermore, the observed changes in pupil size appear to result from a combination of emotional and cognitive empathy elicited by the videos.
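The abstract does not specify how the synchronization between brain activity and pupil changes was computed. Purely as a hedged illustration of one way such coupling could be quantified, the Python sketch below correlates an EEG channel's alpha-band power envelope with a pupil-diameter series at a range of lags, using synthetic data. All signal names, sampling rates, and the lagged-correlation measure are assumptions for illustration, not the authors' actual pipeline.

```python
# Hypothetical sketch: coupling between one EEG channel's alpha-band power
# envelope and right-pupil diameter. Sampling rates, signal names, and the
# lagged Pearson correlation are illustrative assumptions, not the paper's method.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert, resample
from scipy.stats import pearsonr

fs_eeg, fs_eye, duration = 256, 60, 60            # Hz, Hz, seconds (assumed)
t_eeg = np.arange(0, duration, 1 / fs_eeg)
t_eye = np.arange(0, duration, 1 / fs_eye)

rng = np.random.default_rng(0)
# Synthetic stand-ins: an amplitude-modulated 10 Hz EEG signal and a pupil
# trace that shares the same slow (0.1 Hz) fluctuation.
amp = 1.0 + 0.5 * np.sin(2 * np.pi * 0.1 * t_eeg)
eeg = amp * np.sin(2 * np.pi * 10 * t_eeg) + 0.3 * rng.standard_normal(t_eeg.size)
pupil = 3.0 + 0.2 * np.sin(2 * np.pi * 0.1 * t_eye) + 0.05 * rng.standard_normal(t_eye.size)

# 1) Alpha-band (8-13 Hz) power envelope of the EEG channel.
b, a = butter(4, [8, 13], btype="bandpass", fs=fs_eeg)
envelope = np.abs(hilbert(filtfilt(b, a, eeg)))

# 2) Resample the envelope to the eye tracker's rate so both series align.
envelope_eye_rate = resample(envelope, t_eye.size)

# 3) Lagged Pearson correlation as a simple synchronization index.
def lagged_corr(x, y, max_lag_s, fs):
    lags = range(-int(max_lag_s * fs), int(max_lag_s * fs) + 1)
    return max(
        ((lag, pearsonr(x[max(0, lag):len(x) + min(0, lag)],
                        y[max(0, -lag):len(y) + min(0, -lag)])[0]) for lag in lags),
        key=lambda pair: abs(pair[1]),
    )

lag, r = lagged_corr(envelope_eye_rate, pupil, max_lag_s=2.0, fs=fs_eye)
print(f"peak |r| = {r:.2f} at lag {lag / fs_eye:+.2f} s")
```

Other coupling measures (e.g., coherence or mutual information) could be substituted for the correlation step without changing the overall structure of this sketch.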

Citation (APA)

Zhang, J., Park, S., Cho, A., & Whang, M. (2023). Recognition of Empathy from Synchronization between Brain Activity and Eye Movement. Sensors, 23(11). https://doi.org/10.3390/s23115162
