Knowledge Aware Emotion Recognition in Textual Conversations via Multi-Task Incremental Transformer

58 citations · 84 Mendeley readers

Abstract

Emotion recognition in textual conversations (ERTC) plays an important role in a wide range of applications, such as opinion mining and recommender systems. ERTC, however, is a challenging task. For one thing, speakers often rely on context and commonsense knowledge to express emotions; for another, most utterances in conversations carry a neutral emotion, so confusion between the few non-neutral utterances and the far more numerous neutral ones limits emotion recognition performance. In this paper, we propose a novel Knowledge Aware Incremental Transformer with Multi-task Learning (KAITML) to address these challenges. First, we devise a dual-level graph attention mechanism that leverages commonsense knowledge to augment the semantic information of each utterance. We then apply the Incremental Transformer to encode multi-turn contextual utterances. Moreover, we are the first to introduce multi-task learning to alleviate the aforementioned confusion and thus further improve emotion recognition performance. Extensive experimental results show that our KAITML model outperforms state-of-the-art models across five benchmark datasets.
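The abstract does not spell out how the multi-task objective combines the two tasks, but a common formulation for this kind of setup pairs the fine-grained emotion classifier with an auxiliary neutral vs. non-neutral head and sums their losses. The sketch below illustrates that idea only; the function names and the weighting hyperparameter `lam` are assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    shifted = np.exp(logits - logits.max(axis=-1, keepdims=True))
    return shifted / shifted.sum(axis=-1, keepdims=True)

def cross_entropy(logits, labels):
    # Mean negative log-likelihood of the gold labels.
    probs = softmax(logits)
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-12))

def multitask_loss(emotion_logits, emotion_labels,
                   coarse_logits, coarse_labels, lam=0.5):
    """Hypothetical joint objective: fine-grained emotion classification
    plus an auxiliary neutral/non-neutral task, weighted by `lam`."""
    main_loss = cross_entropy(emotion_logits, emotion_labels)
    aux_loss = cross_entropy(coarse_logits, coarse_labels)
    return main_loss + lam * aux_loss
```

Weighting the auxiliary loss lets the coarse neutral/non-neutral signal regularize the main classifier without dominating it; the actual KAITML loss may differ.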

Citation (APA)

Zhang, D., Chen, X., Xu, S., & Xu, B. (2020). Knowledge Aware Emotion Recognition in Textual Conversations via Multi-Task Incremental Transformer. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Conference (pp. 4429–4440). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-main.392
