A Sequence Learning Method for Domain-Specific Entity Linking


Abstract

Recent collective Entity Linking studies usually promote global coherence of all the mapped entities in the same document by using semantic embeddings and graph-based approaches. Although graph-based approaches achieve remarkable results, they are computationally expensive on general datasets. Moreover, semantic embeddings only indicate relatedness between entity pairs and do not take sequence information into account. In this paper, we address these problems with a two-fold neural model. First, we match easy mention-entity pairs and use the domain information of these pairs to filter the candidate entities of nearby mentions. Second, we resolve the more ambiguous pairs with bidirectional Long Short-Term Memory and CRF models for entity disambiguation. Our proposed system outperforms state-of-the-art systems on the generated domain-specific evaluation dataset.
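To make the second step more concrete, the sketch below illustrates the kind of BiLSTM-based disambiguation the abstract describes: a bidirectional LSTM runs over the sequence of mention representations in a document and scores each mention's candidate entities against its contextual state. This is not the authors' code; the class name, embedding dimensions, and the greedy decoding in the usage example are illustrative assumptions, and the paper couples the BiLSTM with a CRF layer to decode the whole mention sequence jointly rather than picking candidates independently.

```python
# Minimal sketch (not the authors' implementation) of a BiLSTM that scores
# candidate entities for each mention in a document. All names and sizes
# are hypothetical; the paper additionally uses a CRF for joint decoding.
import torch
import torch.nn as nn


class MentionSequenceDisambiguator(nn.Module):
    def __init__(self, mention_dim=300, entity_dim=300, hidden_dim=256):
        super().__init__()
        # Encode the sequence of mention (context) embeddings in both directions.
        self.bilstm = nn.LSTM(mention_dim, hidden_dim,
                              bidirectional=True, batch_first=True)
        # Project BiLSTM states into the candidate-entity embedding space.
        self.project = nn.Linear(2 * hidden_dim, entity_dim)

    def forward(self, mention_embs, candidate_embs):
        # mention_embs:   (batch, num_mentions, mention_dim)
        # candidate_embs: (batch, num_mentions, num_candidates, entity_dim)
        states, _ = self.bilstm(mention_embs)          # (batch, M, 2*hidden_dim)
        query = self.project(states).unsqueeze(2)      # (batch, M, 1, entity_dim)
        # Score each candidate by dot product with the contextual mention state.
        scores = (query * candidate_embs).sum(dim=-1)  # (batch, M, num_candidates)
        return scores


if __name__ == "__main__":
    model = MentionSequenceDisambiguator()
    mentions = torch.randn(2, 5, 300)        # 2 documents, 5 mentions each
    candidates = torch.randn(2, 5, 10, 300)  # 10 candidate entities per mention
    scores = model(mentions, candidates)
    # Greedy per-mention choice shown here for brevity; the paper instead
    # decodes the entire sequence jointly with a CRF over candidate positions.
    print(scores.argmax(dim=-1))
```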

Citation (APA)

Inan, E., & Dikenelli, O. (2018). A Sequence Learning Method for Domain-Specific Entity Linking. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 14–21). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w18-2403
