eMLM: A New Pre-training Objective for Emotion Related Tasks


Abstract

Bidirectional Encoder Representations from Transformers (BERT) have been shown to be extremely effective on a wide variety of natural language processing tasks, including sentiment analysis and emotion detection. However, the proposed pre-training objectives of BERT do not induce any sentiment or emotion-specific biases into the model. In this paper, we present Emotion Masked Language Modeling, a variation of Masked Language Modeling, aimed at improving the BERT language representation model for emotion detection and sentiment analysis tasks. Using the same pre-training corpora as the original BERT model, Wikipedia and BookCorpus, our BERT variation manages to improve the downstream performance on 4 tasks for emotion detection and sentiment analysis by an average of 1.2% F1. Moreover, our approach shows an increased performance in our task-specific robustness tests. We make our code and pre-trained model available at https://github.com/tsosea2/eMLM.
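The abstract does not spell out how eMLM chooses which tokens to mask. Below is a minimal Python sketch, assuming the objective up-weights the masking probability of emotion-associated words while keeping BERT's usual 15% masking budget; the lexicon `EMOTION_WORDS`, the weight `EMOTION_BOOST`, and the helper `emlm_mask` are hypothetical illustrations, not the authors' implementation (see the paper and repository for the actual procedure).

```python
import random

# Hypothetical mini emotion lexicon; the abstract does not specify how
# emotion-bearing words are identified.
EMOTION_WORDS = {"happy", "sad", "angry", "fear", "joy", "love", "hate", "worried"}

MASK_TOKEN = "[MASK]"
MASK_RATE = 0.15        # standard BERT masking budget
EMOTION_BOOST = 3.0     # assumed up-weighting for emotion-associated tokens


def emlm_mask(tokens, rng=random):
    """Return a masked copy of `tokens`, preferring emotion-associated words.

    Illustrative sketch of an emotion-weighted masking step, not the
    authors' exact procedure.
    """
    weights = [EMOTION_BOOST if t.lower() in EMOTION_WORDS else 1.0 for t in tokens]
    n_mask = max(1, round(MASK_RATE * len(tokens)))

    # Sample mask positions without replacement, proportional to the weights.
    positions = set()
    candidates = list(range(len(tokens)))
    while len(positions) < n_mask and candidates:
        pick = rng.choices(candidates, weights=[weights[i] for i in candidates], k=1)[0]
        positions.add(pick)
        candidates.remove(pick)

    return [MASK_TOKEN if i in positions else t for i, t in enumerate(tokens)]


if __name__ == "__main__":
    sentence = "I was so happy and full of joy when the results came in".split()
    print(emlm_mask(sentence))
```

Because emotion-bearing tokens are masked more often than under uniform MLM, the model is pushed to predict affective words from context during pre-training, which is the kind of emotion-specific bias the abstract argues standard BERT pre-training lacks.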

Citation (APA)

Sosea, T., & Caragea, C. (2021). eMLM: A New Pre-training Objective for Emotion Related Tasks. In ACL-IJCNLP 2021 - 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 286–293). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.acl-short.38
