Continuing Pre-trained Model with Multiple Training Strategies for Emotional Classification


Abstract

Emotion is an essential attribute of human beings. Perceiving and understanding emotions in a human-like manner is central to developing emotional intelligence. This paper describes the LingJing team's contribution to the Workshop on Computational Approaches to Subjectivity, Sentiment & Social Media Analysis (WASSA) 2022 shared task on Emotion Classification. Participants are required to predict one of seven emotions from empathic responses to news or stories that caused harm to individuals, groups, or others. We continually pre-train the DeBERTa language model with the masked language modeling (MLM) objective to adapt it to the task domain. Several training strategies are designed to further improve downstream performance, including data augmentation with supervised transfer, child-tuning, and late fusion. Extensive experiments on the emotion classification dataset show that the proposed method outperforms other state-of-the-art methods, demonstrating its effectiveness. Moreover, our submission ranked first on all metrics in the evaluation phase of the Emotion Classification task.
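The abstract names two techniques whose mechanics a reader may want to see concretely. The first sketch below shows what continual MLM pre-training of DeBERTa can look like with the HuggingFace Transformers Trainer; it is a minimal illustration, not the paper's reported pipeline, and the checkpoint name, corpus file, and hyperparameters are assumptions.

# Continual masked-language-model (MLM) pre-training of DeBERTa.
# Checkpoint, corpus path, and hyperparameters are illustrative assumptions.
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-v3-large")
model = AutoModelForMaskedLM.from_pretrained("microsoft/deberta-v3-large")

# Unlabeled in-domain text (e.g., the task essays), one example per line.
corpus = load_dataset("text", data_files={"train": "task_corpus.txt"})
tokenized = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

# Randomly mask 15% of tokens; the model learns to reconstruct them.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="deberta-continual-mlm",
        num_train_epochs=3,
        per_device_train_batch_size=8,
        learning_rate=5e-5,
    ),
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()

Child-tuning, in its task-free variant (Xu et al., 2021), updates only a random "child" subset of parameters at each step: gradients are masked with a Bernoulli draw and rescaled so their expectation is preserved. The helper below is a hedged sketch of that idea; the function name and the reserve_p value are assumptions.

import torch

def child_tuning_step(model, optimizer, loss, reserve_p=0.3):
    # Backpropagate, then keep each gradient entry with probability reserve_p.
    optimizer.zero_grad()
    loss.backward()
    for p in model.parameters():
        if p.grad is not None:
            mask = torch.bernoulli(torch.full_like(p.grad, reserve_p))
            p.grad.mul_(mask / reserve_p)  # rescale to preserve expected gradient
    optimizer.step()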

Citation (APA)

Li, B., Weng, Y., Song, Q., Sun, B., & Li, S. (2022). Continuing Pre-trained Model with Multiple Training Strategies for Emotional Classification. In WASSA 2022 - 12th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, Proceedings of the Workshop (pp. 233–238). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.wassa-1.22
