Multi-hypergraph Neural Networks for Emotion Recognition in Multi-party Conversations

Abstract

Emotion recognition in multi-party conversations (ERMC) is an increasingly popular research topic in natural language processing. Although previous work exploited both inter-dependency and self-dependency among participants, it focused mainly on specific-speaker contexts. Specific-speaker context modeling captures a speaker's self-dependency well, but it leaves inter-dependency underexploited. In this paper, two hypergraphs are designed to model the specific-speaker context and the non-specific-speaker context respectively, so as to handle both self-dependency and inter-dependency among participants. To this end, we design a multi-hypergraph neural network for ERMC, namely ERMC-MHGNN. In particular, we combine average aggregation and attention aggregation to generate hyperedge features, which makes better use of utterance information. Extensive experiments are conducted on two ERC benchmarks, with state-of-the-art models employed as baselines. The empirical results demonstrate the superiority of the new model and confirm that further exploiting inter-dependency is of great value for ERMC. In addition, we achieve good results on the emotional-shift issue.
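
The abstract's central architectural idea is fusing average aggregation with attention aggregation when forming hyperedge features from utterance (node) features. The sketch below shows one plausible way such a fusion could be implemented in PyTorch; the module name HyperedgeAggregator, the concatenate-then-project mixing scheme, and the tensor shapes are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn

class HyperedgeAggregator(nn.Module):
    """Sketch: aggregate the utterance features grouped by one hyperedge."""

    def __init__(self, d_model: int):
        super().__init__()
        self.attn_score = nn.Linear(d_model, 1)     # scores each utterance on the hyperedge
        self.mix = nn.Linear(2 * d_model, d_model)  # fuses the two aggregated vectors

    def forward(self, node_feats: torch.Tensor) -> torch.Tensor:
        # node_feats: (num_utterances_on_edge, d_model)
        avg = node_feats.mean(dim=0)                                  # average aggregation
        weights = torch.softmax(self.attn_score(node_feats), dim=0)   # (n, 1) attention weights
        attn = (weights * node_feats).sum(dim=0)                      # attention aggregation
        return self.mix(torch.cat([avg, attn], dim=-1))               # fused hyperedge feature

# Usage: five utterances on one hyperedge, 128-dimensional features.
agg = HyperedgeAggregator(d_model=128)
edge_feat = agg(torch.randn(5, 128))
print(edge_feat.shape)  # torch.Size([128])

In this sketch the average branch treats all utterances on the hyperedge equally, while the attention branch lets more informative utterances contribute more; concatenating and projecting the two is one simple way to combine them.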

Citation (APA)

Zheng, C., Xu, H., & Sun, X. (2023). Multi-hypergraph Neural Networks for Emotion Recognition in Multi-party Conversations. In Communications in Computer and Information Science (Vol. 1765 CCIS, pp. 44–58). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-981-99-2401-1_4
