COUNT: COntrastive UNlikelihood Text Style Transfer for Text Detoxification


Abstract

Offensive and toxic text on social media platforms can lead to polarization and divisiveness within online communities and hinder constructive dialogue. Text detoxification is a crucial natural language processing task for ensuring the generation of non-toxic, safe text. It is a special case of the Text Style Transfer (TST) problem, in which an input text is rephrased into an output text that preserves its content while modifying its style (here, to a more neutral, non-toxic style). State-of-the-art detoxification methods train encoder-decoder models in a supervised fashion to produce gold-standard outputs under a standard likelihood-based objective. However, such models can struggle to deviate from their pretrained auto-encoder identity mapping. While previous methods have used unlikelihood-based losses to penalize input-to-output copying of toxic content, these losses also penalize non-toxic content in the input that would be fine to preserve in the output. To address these issues, we introduce a novel contrastive unlikelihood objective (COUNT) that directly contrasts the gold-standard rephrasing with the identity input-to-output mapping to effectively isolate and focus learning on non-toxic style transfer. We benchmark COUNT on two parallel datasets, ParaDetox and APPDIA, showing that it achieves significant improvements in jointly combined fluency, content preservation, and detoxification (i.e., the highest “J” score).
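To make the idea concrete, below is a minimal illustrative sketch (not the authors' implementation) of how a contrastive unlikelihood term might contrast a gold rephrasing with the identity copy of the input: the likelihood of each gold token is maximized, and an unlikelihood penalty is applied to the copied input token only at positions where the gold target and the input differ. The function name, the dict-based probability representation, and the exact loss form are all hypothetical simplifications for exposition.

```python
import math

def count_style_loss(probs, gold_ids, input_ids):
    """Hypothetical sketch of a contrastive unlikelihood loss.

    probs:     per-position dicts mapping token id -> model probability
    gold_ids:  tokens of the gold (detoxified) rephrasing
    input_ids: tokens of the input, i.e. the identity mapping

    At each position we maximize the likelihood of the gold token.
    Only where the gold token differs from the copied input token do we
    add an unlikelihood term -log(1 - p(input token)), so toxic copying
    is penalized while shared non-toxic content is left untouched.
    """
    loss = 0.0
    for p, g, x in zip(probs, gold_ids, input_ids):
        loss -= math.log(p[g])                     # likelihood of gold token
        if g != x:                                 # contrast only where they differ
            loss -= math.log(1.0 - p.get(x, 0.0))  # unlikelihood of copied input token
    return loss / len(gold_ids)
```

This differs from a plain unlikelihood loss, which would penalize every input token, including the non-toxic content the output should preserve.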

Citation (APA)

Pour, M. M. A., Farinneya, P., Bharadwaj, M., Verma, N., Pesaranghader, A., & Sanner, S. (2023). COUNT: COntrastive UNlikelihood Text Style Transfer for Text Detoxification. In Findings of the Association for Computational Linguistics: EMNLP 2023 (pp. 8658–8666). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2023.findings-emnlp.579
