Intermediate-Task Transfer Learning with BERT for Sarcasm Detection

53 Citations · 78 Readers (Mendeley)

Abstract

Sarcasm detection plays an important role in natural language processing as it can impact the performance of many applications, including sentiment analysis, opinion mining, and stance detection. Despite substantial progress on sarcasm detection, the research results are scattered across datasets and studies. In this paper, we survey the current state-of-the-art and present strong baselines for sarcasm detection based on BERT pre-trained language models. We further improve our BERT models by fine-tuning them on related intermediate tasks before fine-tuning them on our target task. Specifically, relying on the correlation between sarcasm and (implied negative) sentiment and emotions, we explore a transfer learning framework that uses sentiment classification and emotion detection as individual intermediate tasks to infuse knowledge into the target task of sarcasm detection. Experimental results on three datasets that have different characteristics show that the BERT-based models outperform many previous models.
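The transfer framework the abstract describes can be summarized in two stages: fine-tune a pre-trained BERT encoder on an intermediate task (sentiment classification or emotion detection), then continue fine-tuning the same encoder, with a fresh classification head, on the sarcasm target task. The schematic below is a minimal illustration of that flow only; all class and function names are hypothetical stand-ins, not the paper's actual implementation.

```python
# Schematic sketch of intermediate-task transfer learning.
# All names here are hypothetical; the paper fine-tunes real BERT models.

class Encoder:
    """Stands in for a pre-trained BERT encoder whose weights are shared
    across tasks and carried over from one fine-tuning stage to the next."""

    def __init__(self):
        # Record which tasks have updated the shared weights so far.
        self.tuned_on = []

    def fine_tune(self, task):
        # Fine-tuning on `task` updates the shared encoder weights,
        # infusing task knowledge that later stages inherit.
        self.tuned_on.append(task)


def intermediate_task_transfer(intermediate_task, target_task):
    encoder = Encoder()                   # start from pre-trained BERT
    encoder.fine_tune(intermediate_task)  # stage 1: sentiment or emotion
    encoder.fine_tune(target_task)        # stage 2: sarcasm detection,
                                          # reusing the tuned encoder
    return encoder


model = intermediate_task_transfer("sentiment_classification",
                                   "sarcasm_detection")
print(model.tuned_on)
```

The key design choice is that the target-task head is trained on top of an encoder that has already seen a correlated task, rather than directly on the raw pre-trained checkpoint.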

Citation (APA):

Savini, E., & Caragea, C. (2022). Intermediate-Task Transfer Learning with BERT for Sarcasm Detection. Mathematics, 10(5). https://doi.org/10.3390/math10050844
