Identifying Gender Differences in Multimodal Emotion Recognition Using Bimodal Deep AutoEncoder

Abstract

This paper investigates the differences between males and females in emotion recognition using electroencephalography (EEG) and eye movement data. Four basic emotions are considered: happy, sad, fearful, and neutral. The Bimodal Deep AutoEncoder (BDAE) and a fuzzy-integral-based method are applied to fuse the EEG and eye movement data. Our experimental results indicate that gender differences do exist in the neural patterns underlying emotion recognition; that eye movement data is less informative than EEG data for examining these differences; and that brain activation in females is generally lower than in males across most frequency bands and brain areas, especially for the fearful emotion. From the confusion matrices, we observe that responses to fearful emotion are more diverse among women than among men, whereas men respond more diversely to sad emotion than women. Additionally, for females, individual differences in fear are more pronounced than in the other three emotions.
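To make the fusion strategy concrete, the sketch below shows one common way a bimodal deep autoencoder can fuse EEG and eye movement features through a shared hidden layer. It is a minimal PyTorch illustration under stated assumptions, not the authors' implementation: the feature dimensions (310 EEG differential-entropy features, 33 eye movement features, as in typical SEED-style setups), layer sizes, and activations are all assumptions for the example.

```python
# Minimal sketch of bimodal autoencoder-style fusion (assumed dimensions,
# not the paper's exact configuration).
import torch
import torch.nn as nn

class BimodalDeepAutoEncoder(nn.Module):
    def __init__(self, eeg_dim=310, eye_dim=33, hidden_dim=100, shared_dim=50):
        super().__init__()
        # Modality-specific encoders map each input to a hidden representation.
        self.eeg_enc = nn.Sequential(nn.Linear(eeg_dim, hidden_dim), nn.Sigmoid())
        self.eye_enc = nn.Sequential(nn.Linear(eye_dim, hidden_dim), nn.Sigmoid())
        # A shared layer fuses the two hidden representations into one code.
        self.shared = nn.Sequential(nn.Linear(2 * hidden_dim, shared_dim), nn.Sigmoid())
        # Decoders reconstruct each modality from the shared code.
        self.eeg_dec = nn.Linear(shared_dim, eeg_dim)
        self.eye_dec = nn.Linear(shared_dim, eye_dim)

    def forward(self, eeg, eye):
        h = torch.cat([self.eeg_enc(eeg), self.eye_enc(eye)], dim=1)
        z = self.shared(h)  # fused multimodal feature
        return z, self.eeg_dec(z), self.eye_dec(z)

model = BimodalDeepAutoEncoder()
eeg = torch.randn(8, 310)  # batch of EEG feature vectors
eye = torch.randn(8, 33)   # batch of eye movement feature vectors
z, eeg_hat, eye_hat = model(eeg, eye)
# Train with a reconstruction loss on both modalities; the fused code z can
# then feed a downstream classifier for the four emotion classes.
loss = nn.functional.mse_loss(eeg_hat, eeg) + nn.functional.mse_loss(eye_hat, eye)
```

The fuzzy-integral-based method mentioned in the abstract is a decision-level alternative: instead of learning a shared representation, it weights and combines the per-modality classifier outputs via a fuzzy integral over the set of modalities.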

Citation (APA)

Yan, X., Zheng, W. L., Liu, W., & Lu, B. L. (2017). Identifying Gender Differences in Multimodal Emotion Recognition Using Bimodal Deep AutoEncoder. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10637 LNCS, pp. 533–542). Springer Verlag. https://doi.org/10.1007/978-3-319-70093-9_56
