Triplet-Trained Vector Space and Sieve-Based Search Improve Biomedical Concept Normalization


Abstract

Concept normalization, the task of linking textual mentions of concepts to concepts in an ontology, is critical for mining and analyzing biomedical texts. We propose a vector-space model for concept normalization, where mentions and concepts are encoded via transformer networks trained with a triplet objective and online hard triplet mining. The transformer networks refine existing pre-trained models, and the online triplet mining keeps training efficient even with hundreds of thousands of concepts by sampling training triplets within each mini-batch. We introduce a variety of strategies for searching with the trained vector-space model, including approaches that incorporate domain-specific synonyms at search time with no model retraining. Across five datasets, our models trained only once on their corresponding ontologies come within 3 points of state-of-the-art models that are retrained for each new domain. Our models can also be trained per domain, achieving a new state of the art on multiple datasets.
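To make the recipe concrete, below is a minimal sketch, not the authors' released code, of the two pieces the abstract describes: online hard ("batch-hard") triplet mining over transformer embeddings, and a sieve-style search that tries exact matching (optionally through a synonym table supplied at search time) before falling back to cosine nearest neighbour. The encoder checkpoint, the `encode`/`batch_hard_triplet_loss`/`sieve_search` names, and the synonym-table format are all illustrative assumptions.

```python
# Sketch of triplet-trained concept normalization with sieve-based search.
# Assumes PyTorch + Hugging Face transformers; the checkpoint below is a
# plausible biomedical encoder, not necessarily the one used in the paper.
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

MODEL = "microsoft/BiomedNLP-PubMedBERT-base-uncased-abstract"  # assumed
tokenizer = AutoTokenizer.from_pretrained(MODEL)
encoder = AutoModel.from_pretrained(MODEL)

def encode(texts):
    """Mean-pool the final hidden states into one vector per string."""
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state            # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1).float()   # (B, T, 1)
    return (hidden * mask).sum(1) / mask.sum(1)            # (B, H)

def batch_hard_triplet_loss(embeddings, labels, margin=1.0):
    """Online hard mining: within a mini-batch, each anchor is paired with
    its farthest positive (same concept ID) and nearest negative, so no
    triplets are enumerated over the full ontology up front."""
    dist = torch.cdist(embeddings, embeddings)             # pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)      # (B, B) label match
    self_mask = torch.eye(len(labels), dtype=torch.bool)
    hardest_pos = dist.masked_fill(~same | self_mask, float("-inf")).max(1).values
    hardest_neg = dist.masked_fill(same, float("inf")).min(1).values
    return F.relu(hardest_pos - hardest_neg + margin).mean()

def sieve_search(mention, concept_names, concept_vecs, synonyms=None):
    """Sieve-style lookup: exact string match first (optionally routed
    through a domain synonym table added at search time, with no model
    retraining), then cosine nearest neighbour in the trained space."""
    key = mention.lower()
    if synonyms and key in synonyms:
        key = synonyms[key].lower()                        # sieve 0: synonyms
    for i, name in enumerate(concept_names):
        if name.lower() == key:
            return i                                       # sieve 1: exact match
    q = F.normalize(encode([mention]), dim=-1)             # sieve 2: vector search
    sims = q @ F.normalize(concept_vecs, dim=-1).T         # (1, N) cosine sims
    return sims.argmax().item()
```

Because the hard positives and negatives are mined inside each mini-batch, training cost scales with batch size rather than with the number of candidate triplets in the ontology, which is what keeps the approach tractable at hundreds of thousands of concepts.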

Citation (APA)

Xu, D., & Bethard, S. (2021). Triplet-Trained Vector Space and Sieve-Based Search Improve Biomedical Concept Normalization. In Proceedings of the 20th Workshop on Biomedical Language Processing, BioNLP 2021 (pp. 11–22). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.bionlp-1.2
