tBERT: Topic models and BERT joining forces for semantic similarity detection


Abstract

Semantic similarity detection is a fundamental task in natural language understanding. Adding topic information has been useful for previous feature-engineered semantic similarity models as well as neural models for other tasks. There is currently no standard way of combining topics with pretrained contextual representations such as BERT. We propose a novel topic-informed BERT-based architecture for pairwise semantic similarity detection and show that our model improves performance over strong neural baselines across a variety of English language datasets. We find that the addition of topics to BERT helps particularly with resolving domain-specific cases.
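
The architecture the abstract describes combines BERT's contextual representation of a sentence pair with topic-model features for each sentence before classification. The following PyTorch sketch illustrates that idea under stated assumptions: the class name TopicInformedBert, the topic count, the tanh activation, and the single hidden layer sized to the combined feature vector are illustrative choices, not the authors' released implementation.

import torch
import torch.nn as nn
from transformers import BertModel

class TopicInformedBert(nn.Module):
    """Sketch of a topic-informed BERT classifier for sentence pairs.

    Assumes topic distributions (e.g. from LDA) are precomputed per
    sentence; dimensions and layer choices are illustrative.
    """

    def __init__(self, num_topics=80, hidden_size=768, num_labels=2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # BERT's pooled [CLS] vector plus one topic distribution
        # per sentence in the pair.
        combined = hidden_size + 2 * num_topics
        self.hidden = nn.Linear(combined, combined)
        self.classifier = nn.Linear(combined, num_labels)

    def forward(self, input_ids, attention_mask, token_type_ids,
                topics_s1, topics_s2):
        # Encode the sentence pair jointly and take the pooled
        # [CLS] output as the contextual representation.
        cls = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask,
                        token_type_ids=token_type_ids).pooler_output
        # Concatenate contextual and topic features, then classify.
        features = torch.cat([cls, topics_s1, topics_s2], dim=-1)
        return self.classifier(torch.tanh(self.hidden(features)))

In use, topics_s1 and topics_s2 would each be a batch of topic-probability vectors of length num_topics, produced by a topic model fit on the task's training corpus, so the classifier can draw on domain-level signals that the contextual encoder alone may miss.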

Citation (APA)

Peinelt, N., Nguyen, D., & Liakata, M. (2020). tBERT: Topic models and BERT joining forces for semantic similarity detection. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 7047–7055). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.630
