Graph Emotion Decoding from Visually Evoked Neural Responses

Abstract

Brain signal-based affective computing has recently drawn considerable attention due to its potential for widespread applications. Most existing efforts exploit emotion similarities or brain region similarities to learn emotion representations. However, the relationships between emotions and brain regions are not explicitly incorporated into the representation learning process. Consequently, the learned representations may not be informative enough to benefit downstream tasks, e.g., emotion decoding. In this work, we propose a novel neural decoding framework, Graph Emotion Decoding (GED), which integrates the relationships between emotions and brain regions into the neural decoding process via a bipartite graph structure. Further analysis shows that exploiting such relationships helps learn better representations, verifying the rationality and effectiveness of GED. Comprehensive experiments on visually evoked emotion datasets demonstrate the superiority of our model. The code is publicly available at https://github.com/zhongyu1998/GED.
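The bipartite structure described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the adjacency matrix, feature dimensions, and normalized propagation rule below are illustrative assumptions. The idea it shows is that brain-region nodes and emotion nodes exchange messages over a bipartite graph, so each emotion representation aggregates the features of the regions linked to it (and vice versa).

```python
import numpy as np

rng = np.random.default_rng(0)

n_regions, n_emotions, dim = 6, 3, 4

# Hypothetical bipartite adjacency: A[i, j] = 1 if region i is linked to emotion j.
A = rng.integers(0, 2, size=(n_regions, n_emotions)).astype(float)

# Hypothetical node features (e.g., fMRI responses projected to `dim` dimensions).
H_regions = rng.normal(size=(n_regions, dim))
H_emotions = rng.normal(size=(n_emotions, dim))

def bipartite_propagate(A, H_regions, H_emotions):
    """One round of degree-normalized message passing on the bipartite graph."""
    deg_e = A.sum(axis=0, keepdims=True) + 1e-8   # emotion-node degrees, shape (1, n_emotions)
    deg_r = A.sum(axis=1, keepdims=True) + 1e-8   # region-node degrees, shape (n_regions, 1)
    new_e = (A.T @ H_regions) / deg_e.T           # emotions average their linked regions
    new_r = (A @ H_emotions) / deg_r              # regions average their linked emotions
    return new_r, new_e

H_regions, H_emotions = bipartite_propagate(A, H_regions, H_emotions)
print(H_emotions.shape)  # (3, 4): one updated representation per emotion
```

Stacking several such rounds (with learned weight matrices and nonlinearities, as in standard graph neural networks) would let higher-order region-emotion relationships flow into the learned representations, which is the kind of coupling the abstract argues is missing from similarity-only approaches.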

APA

Huang, Z., Du, C., Wang, Y., & He, H. (2022). Graph Emotion Decoding from Visually Evoked Neural Responses. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 13438 LNCS, pp. 396–405). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-031-16452-1_38
