Exploiting unsupervised data for emotion recognition in conversations


Abstract

Emotion Recognition in Conversations (ERC) aims to predict the emotional state of speakers in a conversation, which is essentially a text classification task. Unlike sentence-level text classification, the supervised data available for ERC is limited, which potentially prevents models from reaching their full capacity. In this paper, we propose a novel approach that leverages unsupervised conversation data, which is more accessible. Specifically, we propose the Conversation Completion (ConvCom) task, which attempts to select the correct answer from a set of candidates to fill a masked utterance in a conversation. We then Pre-train a basic COntext-Dependent Encoder (PRE-CODE) on the ConvCom task, and finally fine-tune PRE-CODE on the ERC datasets. Experimental results demonstrate that pre-training on unsupervised data yields significant performance improvements on the ERC datasets, particularly on the minority emotion classes.
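To make the ConvCom objective described above concrete, the following is a minimal PyTorch sketch of candidate selection for a masked utterance: a context encoder produces a representation at the masked slot, candidates are scored against it, and a cross-entropy loss prefers the true utterance. The class and function names (ContextDependentEncoder, convcom_loss), the GRU encoder, and the dot-product scoring are illustrative assumptions, not the paper's PRE-CODE implementation.

```python
# Illustrative sketch of a ConvCom-style objective, assuming precomputed utterance
# embeddings, a GRU context encoder, and dot-product scoring of candidates.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextDependentEncoder(nn.Module):
    """Encodes the conversation context around a masked utterance position."""
    def __init__(self, utt_dim: int, hidden_dim: int):
        super().__init__()
        self.context_rnn = nn.GRU(utt_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden_dim, utt_dim)

    def forward(self, utt_embs: torch.Tensor, mask_pos: torch.Tensor) -> torch.Tensor:
        # utt_embs: (batch, num_utts, utt_dim) with the masked slot zeroed out
        # mask_pos: (batch,) index of the masked utterance in each conversation
        states, _ = self.context_rnn(utt_embs)            # (batch, num_utts, 2*hidden)
        idx = mask_pos.view(-1, 1, 1).expand(-1, 1, states.size(-1))
        ctx = states.gather(1, idx).squeeze(1)            # context state at the masked slot
        return self.proj(ctx)                             # (batch, utt_dim)

def convcom_loss(encoder, utt_embs, mask_pos, candidates):
    """Cross-entropy over candidate utterances; index 0 is the true (masked) utterance."""
    # candidates: (batch, num_candidates, utt_dim), candidate 0 is the gold utterance
    ctx = encoder(utt_embs, mask_pos)                     # (batch, utt_dim)
    scores = torch.einsum("bd,bkd->bk", ctx, candidates)  # matching scores per candidate
    target = torch.zeros(scores.size(0), dtype=torch.long)
    return F.cross_entropy(scores, target)

# Toy usage with random embeddings standing in for a sentence encoder.
if __name__ == "__main__":
    batch, num_utts, num_cands, dim = 4, 8, 5, 32
    enc = ContextDependentEncoder(utt_dim=dim, hidden_dim=64)
    utts = torch.randn(batch, num_utts, dim)
    pos = torch.randint(0, num_utts, (batch,))
    utts[torch.arange(batch), pos] = 0.0                  # mask the target utterance
    cands = torch.randn(batch, num_cands, dim)
    loss = convcom_loss(enc, utts, pos, cands)
    loss.backward()
    print(f"ConvCom loss: {loss.item():.4f}")
```

In this sketch the pre-trained encoder would afterwards be fine-tuned on labeled ERC data by replacing the candidate-scoring head with an emotion classifier, mirroring the pre-train/fine-tune pipeline the abstract describes.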

Citation (APA)

Jiao, W., Lyu, M. R., & King, I. (2020). Exploiting unsupervised data for emotion recognition in conversations. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 4839–4846). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.435
