SupCL-Seq: Supervised Contrastive Learning for Downstream Optimized Sequence Representations

Abstract

While contrastive learning is proven to be an effective training strategy in computer vision, Natural Language Processing (NLP) has only recently adopted it as a self-supervised alternative to Masked Language Modeling (MLM) for improving sequence representations. This paper introduces SupCL-Seq, which extends supervised contrastive learning from computer vision to the optimization of sequence representations in NLP. By altering the dropout mask probability in standard Transformer architectures (e.g., BERTbase), we generate augmented altered views for every representation (anchor). A supervised contrastive loss is then utilized to maximize the system's capability of pulling together similar samples (e.g., anchors and their altered views) and pushing apart samples belonging to other classes. Despite its simplicity, SupCL-Seq leads to large gains in many sequence classification tasks on the GLUE benchmark compared to a standard BERTbase, including 6% absolute improvement on CoLA, 5.4% on MRPC, 4.7% on RTE and 2.6% on STS-B. We also show consistent gains over self-supervised contrastively learned representations, especially in non-semantic tasks. Finally, we show that these gains are not solely due to augmentation, but rather to a downstream-optimized sequence representation. Code: https://github.com/hooman650/SupCL-Seq.
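The following is a minimal PyTorch sketch of the idea described in the abstract, not the authors' released implementation. It assumes a Hugging Face BERT-style encoder and labelled class ids; the function names encode_views and supcon_loss and the chosen dropout probabilities are illustrative assumptions.

import torch
import torch.nn.functional as F

def encode_views(texts, model, tokenizer, dropout_probs=(0.1, 0.2, 0.3)):
    # Encode each sentence once per dropout probability, so that every anchor
    # receives several augmented views. `model` and `tokenizer` are assumed to
    # be a Hugging Face BertModel and its matching tokenizer.
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    model.train()  # keep dropout active while encoding
    views = []
    for p in dropout_probs:
        for module in model.modules():
            if isinstance(module, torch.nn.Dropout):
                module.p = p  # alter the dropout mask probability
        cls = model(**batch).last_hidden_state[:, 0]  # [CLS] representation
        views.append(F.normalize(cls, dim=-1))
    return torch.stack(views, dim=1)  # shape: (batch, n_views, hidden)

def supcon_loss(features, labels, temperature=0.1):
    # Supervised contrastive loss in the spirit of Khosla et al. (2020):
    # pull together all samples sharing a label (including each anchor's own
    # augmented views) and push apart samples from other classes.
    b, n_views, d = features.shape
    feats = features.reshape(b * n_views, d)
    labels = labels.repeat_interleave(n_views)
    sim = feats @ feats.T / temperature
    self_mask = torch.eye(b * n_views, dtype=torch.bool, device=feats.device)
    sim = sim.masked_fill(self_mask, -1e9)  # exclude self-similarity
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    loss = -(log_prob * pos_mask).sum(1) / pos_mask.sum(1).clamp(min=1)
    return loss.mean()

In a training loop one would compute supcon_loss(encode_views(batch_texts, model, tokenizer), batch_labels) and back-propagate; the contrastively trained encoder can then be fine-tuned with a standard classification head for the downstream task.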

Citation (APA)

Sedghamiz, H., Raval, S., Santus, E., Alhanai, T., & Ghassemi, M. (2021). SupCL-Seq: Supervised Contrastive Learning for Downstream Optimized Sequence Representations. In Findings of the Association for Computational Linguistics: EMNLP 2021 (pp. 3398–3403). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-emnlp.289
