Structure-aware Sentence Encoder in BERT-Based Siamese Network

Abstract

Recently, impressive performance on various natural language understanding tasks has been achieved by explicitly incorporating syntax and semantic information into pre-trained models such as BERT and RoBERTa. However, this approach relies on problem-specific fine-tuning, and, as widely noted, BERT-like models are both ineffective and inefficient when applied directly to unsupervised similarity comparison tasks. Sentence-BERT (SBERT) has been proposed as a general-purpose sentence embedding method suited to both similarity comparison and downstream tasks. In this work, we show that by incorporating structural information into SBERT, the resulting model outperforms SBERT and previous general-purpose sentence encoders on unsupervised semantic textual similarity (STS) datasets and transfer classification tasks.
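
For readers unfamiliar with the SBERT setup this work builds on, the sketch below illustrates the baseline Siamese pipeline: each sentence is encoded independently by the same BERT encoder, token representations are mean-pooled into a fixed-size embedding, and sentence pairs are scored with cosine similarity for unsupervised STS. This is a minimal illustration under stated assumptions, not the structure-aware model proposed in the paper, and the Hugging Face checkpoint name is only an assumed example.

```python
# Minimal sketch of SBERT-style Siamese sentence comparison (baseline, not the
# paper's structure-aware model): encode each sentence with the same BERT
# encoder, mean-pool token embeddings, and compare with cosine similarity.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "sentence-transformers/bert-base-nli-mean-tokens"  # assumed checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)

def embed(sentences):
    """Mean-pool the last hidden states over non-padding tokens."""
    batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state           # (batch, seq, dim)
    mask = batch["attention_mask"].unsqueeze(-1).float()       # (batch, seq, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)        # (batch, dim)

a, b = embed(["A man is playing a guitar.", "Someone plays an instrument."])
score = torch.nn.functional.cosine_similarity(a, b, dim=0)
print(f"cosine similarity: {score.item():.3f}")
```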

Cite

APA

Peng, Q., Weir, D., & Weeds, J. (2021). Structure-aware Sentence Encoder in BERT-Based Siamese Network. In RepL4NLP 2021 - 6th Workshop on Representation Learning for NLP, Proceedings of the Workshop (pp. 57–63). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.repl4nlp-1.7
