DVDGCN: Modeling Both Context-Static and Speaker-Dynamic Graph for Emotion Recognition in Multi-speaker Conversations

Abstract

Emotion recognition in conversation has become a hot topic in natural language processing (NLP). Speaker information plays an important role in dialogue systems, and speaker state in particular is closely related to emotion. As the number of speakers grows, modeling speaker state in multi-speaker conversations becomes more challenging than in two-speaker conversations. In this paper, we focus on emotion detection in multi-speaker conversation, a more general conversational emotion task, and address two main problems. First, with more speakers, it becomes harder to model speaker interactions and derive speaker states. Second, because conversations vary over time, it is necessary to model each speaker's dynamic state as the conversation unfolds. For the first problem, we adopt a graph structure, whose expressive power suits modeling speaker interactions and speaker state. For the second problem, we use a dynamic graph neural network to model speakers' dynamic states. We therefore propose the Dual View Dialogue Graph Neural Network (DVDGCN), a graph neural network that models both a context-static and a speaker-dynamic graph. Experimental results on a multi-speaker conversational emotion recognition corpus demonstrate the effectiveness of the proposed approach.
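The abstract does not specify how the two graph views are constructed, so the following is only an illustrative sketch, not the authors' implementation: one plausible reading is that the context-static view links utterances within a fixed context window, while the speaker-dynamic view links each utterance to the same speaker's previous turn. All function names and the window parameter are assumptions for illustration.

```python
# Illustrative sketch (NOT the paper's actual construction): building two
# graph views over a toy multi-speaker dialogue. Node i = utterance i;
# edges are (src, dst) index pairs.

def context_static_edges(num_utts, window=1):
    """Static view: connect each utterance to its neighbors within a
    fixed context window (window size is an assumed hyperparameter)."""
    edges = []
    for i in range(num_utts):
        for j in range(max(0, i - window), min(num_utts, i + window + 1)):
            if i != j:
                edges.append((i, j))
    return edges

def speaker_dynamic_edges(speakers):
    """Dynamic view: connect each utterance to the same speaker's previous
    turn, so a speaker's state can be tracked over the conversation."""
    last_turn = {}
    edges = []
    for i, spk in enumerate(speakers):
        if spk in last_turn:
            edges.append((last_turn[spk], i))
        last_turn[spk] = i
    return edges

# A 5-utterance dialogue among three speakers A, B, C.
speakers = ["A", "B", "C", "A", "B"]
static_view = context_static_edges(len(speakers), window=1)
dynamic_view = speaker_dynamic_edges(speakers)
```

In this toy example, the dynamic view yields edges such as (0, 3) and (1, 4), chaining each speaker's successive turns; a GNN operating on such edges could propagate a speaker's earlier state into its later utterances.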

APA

Zhao, S., & Liu, P. (2020). DVDGCN: Modeling Both Context-Static and Speaker-Dynamic Graph for Emotion Recognition in Multi-speaker Conversations. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 12430 LNAI, pp. 104–115). Springer Science and Business Media Deutschland GmbH. https://doi.org/10.1007/978-3-030-60450-9_9
