Emotion recognition based on convolutional gated recurrent units with attention

22 citations · 16 Mendeley readers
This article is free to access.

Abstract

Studying brain activity and deciphering the information in electroencephalogram (EEG) signals has become an emerging research field, and substantial advances have been made in EEG-based emotion classification. However, exploiting different EEG features and their complementarity to discriminate different emotions remains challenging. Most existing models extract a single temporal feature from the EEG signal while ignoring crucial temporal dynamic information, which, to a certain extent, constrains the classification capability of the model. To address this issue, we propose an Attention-Based Depthwise Parameterized Convolutional Gated Recurrent Unit (AB-DPCGRU) model and validate it through mixed experiments on the SEED and SEED-IV datasets. The experimental results show that the model outperforms existing state-of-the-art methods in accuracy, confirming the superiority of our approach over currently popular emotion recognition models.
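The abstract names three building blocks: a depthwise (per-channel) convolution over the EEG signal, a gated recurrent unit over the resulting sequence, and attention pooling before classification. The following is a minimal NumPy sketch of that general pipeline, not the authors' implementation; all dimensions (62 channels as in SEED, 200 time steps, hidden size 32, 3 emotion classes) and parameter shapes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy dimensions: 62 EEG channels (as in SEED), 200 time steps,
# GRU hidden size 32, 3 emotion classes (SEED: positive/neutral/negative).
C, T, H, K = 62, 200, 32, 3

def depthwise_conv1d(x, kernels):
    """Depthwise temporal convolution: one independent kernel per channel."""
    n_ch, _ = x.shape
    k = kernels.shape[1]
    out = np.empty((n_ch, x.shape[1] - k + 1))
    for c in range(n_ch):
        out[c] = np.convolve(x[c], kernels[c][::-1], mode="valid")
    return out

def gru_forward(seq, Wz, Uz, Wr, Ur, Wh, Uh):
    """Minimal GRU over a (time, features) sequence; returns all hidden states."""
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    h = np.zeros(Uz.shape[0])
    hs = np.empty((seq.shape[0], Uz.shape[0]))
    for t in range(seq.shape[0]):
        z = sigmoid(Wz @ seq[t] + Uz @ h)          # update gate
        r = sigmoid(Wr @ seq[t] + Ur @ h)          # reset gate
        h_tilde = np.tanh(Wh @ seq[t] + Uh @ (r * h))
        h = (1 - z) * h + z * h_tilde
        hs[t] = h
    return hs

def attention_pool(hs, w):
    """Additive attention over time: softmax-weight hidden states, then sum."""
    scores = hs @ w
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()
    return alpha @ hs

# Random toy EEG segment and parameters (illustration only, untrained).
x = rng.standard_normal((C, T))
kernels = rng.standard_normal((C, 5)) * 0.1
W = [rng.standard_normal((H, C)) * 0.1 for _ in range(3)]
U = [rng.standard_normal((H, H)) * 0.1 for _ in range(3)]

feat = depthwise_conv1d(x, kernels)               # (C, T-4) per-channel features
hs = gru_forward(feat.T, W[0], U[0], W[1], U[1], W[2], U[2])
context = attention_pool(hs, rng.standard_normal(H))
logits = rng.standard_normal((K, H)) @ context
probs = np.exp(logits - logits.max())
probs /= probs.sum()                              # class probabilities, shape (3,)
```

With trained weights, `probs` would give the predicted emotion distribution for the segment; here the point is only the data flow (channel-wise convolution → recurrence over time → attention-weighted summary → classifier) that the AB-DPCGRU name describes.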

Citation (APA)

Ye, Z., Jing, Y., Wang, Q., Li, P., Liu, Z., Yan, M., … Gao, D. (2023). Emotion recognition based on convolutional gated recurrent units with attention. Connection Science, 35(1). https://doi.org/10.1080/09540091.2023.2289833
