DualGATs: Dual Graph Attention Networks for Emotion Recognition in Conversations


Abstract

Capturing complex contextual dependencies plays a vital role in Emotion Recognition in Conversations (ERC). Previous studies have predominantly focused on speaker-aware context modeling, overlooking the discourse structure of the conversation. In this paper, we introduce Dual Graph ATtention networks (DualGATs) to concurrently consider the complementary aspects of discourse structure and speaker-aware context, aiming for more precise ERC. Specifically, we devise a Discourse-aware GAT (DisGAT) module to incorporate discourse structural information by analyzing the discourse dependencies between utterances. Additionally, we develop a Speaker-aware GAT (SpkGAT) module to incorporate speaker-aware contextual information by considering the speaker dependencies between utterances. Furthermore, we design an interaction module that facilitates the integration of the DisGAT and SpkGAT modules, enabling the effective interchange of relevant information between the two modules. We extensively evaluate our method on four datasets, and experimental results demonstrate that our proposed DualGATs surpass state-of-the-art baselines on the majority of the datasets.
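The paper's DisGAT and SpkGAT modules both build on standard graph attention: each utterance node aggregates its neighbours under edge sets drawn from discourse dependencies or speaker dependencies, and an interaction module merges the two views. The toy sketch below is not the authors' implementation; it shows a single plain-Python graph-attention head (in the style of Velickovic et al.'s GAT) applied to two hypothetical adjacency views of a three-utterance conversation, with a simple element-wise sum standing in for the paper's interaction module. All names (`gat_layer`, `adj_discourse`, `adj_speaker`) and the example graphs are illustrative assumptions.

```python
import math

def leaky_relu(x, slope=0.2):
    return x if x > 0.0 else slope * x

def matvec(W, x):
    return [sum(w * v for w, v in zip(row, x)) for row in W]

def gat_layer(feats, adj, W, a):
    """One graph-attention head: each node aggregates itself and its
    neighbours, weighted by a softmax over LeakyReLU attention scores
    e_ij = LeakyReLU(a^T [W h_i || W h_j])."""
    h = [matvec(W, x) for x in feats]              # shared linear projection
    out = []
    for i in range(len(h)):
        nbrs = sorted(set(adj[i]) | {i})           # add a self-loop
        scores = [leaky_relu(sum(c * v for c, v in zip(a, h[i] + h[j])))
                  for j in nbrs]
        m = max(scores)                            # numerically stable softmax
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        out.append([sum((e / z) * h[j][k] for e, j in zip(exps, nbrs))
                    for k in range(len(h[i]))])
    return out

# Toy conversation of three utterances with 2-d features (hypothetical).
feats = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
W = [[1.0, 0.0], [0.0, 1.0]]                       # identity projection
a = [0.1, -0.2, 0.3, 0.05]                         # attention vector, len 2*dim

# Two views of the same conversation: edges from discourse dependencies
# vs. edges between utterances of the same speaker (both made up here).
adj_discourse = [[1], [0, 2], [1]]
adj_speaker = [[2], [], [0]]

dis_out = gat_layer(feats, adj_discourse, W, a)    # DisGAT-style view
spk_out = gat_layer(feats, adj_speaker, W, a)      # SpkGAT-style view
# Crude stand-in for the paper's interaction module: merge the two views.
fused = [[x + y for x, y in zip(u, v)] for u, v in zip(dis_out, spk_out)]
```

With a zero attention vector the softmax degenerates to uniform neighbour averaging, which is a quick way to sanity-check the layer; the actual DualGATs use learned parameters and a richer cross-module exchange than the sum shown here.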

Cite (APA)

Zhang, D., Chen, F., & Chen, X. (2023). DualGATs: Dual Graph Attention Networks for Emotion Recognition in Conversations. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 1, pp. 7395–7408). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.acl-long.408
