LATTE: Latent type modeling for biomedical entity linking


Abstract

Entity linking is the task of linking mentions of named entities in natural language text to entities in a curated knowledge base. This is of significant importance in the biomedical domain, where it could be used to semantically annotate a large volume of clinical records and biomedical literature with standardized concepts described in an ontology such as the Unified Medical Language System (UMLS). We observe that with precise type information, entity disambiguation becomes a straightforward task. However, fine-grained type information is usually not available in the biomedical domain. Thus, we propose LATTE, a LATent Type Entity linking model that improves entity linking by modeling the latent fine-grained type information about mentions and entities. Unlike previous methods that perform entity linking directly between mentions and entities, LATTE jointly performs entity disambiguation and latent fine-grained type learning, without direct supervision. We evaluate our model on two biomedical datasets: MedMentions, a large-scale public dataset annotated with UMLS concepts, and a de-identified corpus of dictated doctor's notes annotated with ICD concepts. Extensive experimental evaluation shows that our model achieves significant performance improvements over several state-of-the-art techniques.
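The joint scoring idea described in the abstract can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual architecture: the function names, the dot-product similarity, the linear type projection, and the mixing weight `alpha` are all assumptions introduced here for illustration.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def latte_score(mention_vec, entity_vec, type_proj, alpha=0.5):
    """Hypothetical combined score for a mention-entity pair.

    mention_vec, entity_vec: d-dimensional embeddings.
    type_proj: (k, d) matrix projecting an embedding onto k latent
    fine-grained types (learned without direct type supervision in the
    paper; fixed here for illustration).
    """
    # Direct semantic similarity between mention and candidate entity.
    direct = float(mention_vec @ entity_vec)
    # Latent fine-grained type distributions for mention and entity.
    m_types = softmax(type_proj @ mention_vec)
    e_types = softmax(type_proj @ entity_vec)
    # Agreement between the two type distributions.
    type_match = float(m_types @ e_types)
    # Mix the direct and type-based evidence (alpha is illustrative).
    return alpha * direct + (1 - alpha) * type_match

def link(mention_vec, candidates, type_proj):
    """Disambiguate: pick the candidate with the highest combined score."""
    scores = [latte_score(mention_vec, c, type_proj) for c in candidates]
    return int(np.argmax(scores))
```

The point of the sketch is the abstract's observation: when the mention's latent type distribution agrees with a candidate's, the type term reinforces the direct similarity, so disambiguation among surface-similar candidates becomes easier.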

Citation (APA)

Zhu, M., Celikkaya, B., Bhatia, P., & Reddy, C. K. (2020). LATTE: Latent type modeling for biomedical entity linking. In AAAI 2020 - 34th AAAI Conference on Artificial Intelligence (pp. 9757–9764). AAAI press. https://doi.org/10.1609/aaai.v34i05.6526
