Zero-shot entity linking by reading entity descriptions

Abstract

We present the zero-shot entity linking task, where mentions must be linked to unseen entities without in-domain labeled data. The goal is to enable robust transfer to highly specialized domains, and so no metadata or alias tables are assumed. In this setting, entities are only identified by text descriptions, and models must rely strictly on language understanding to resolve the new entities. First, we show that strong reading comprehension models pre-trained on large unlabeled data can be used to generalize to unseen entities. Second, we propose a simple and effective adaptive pre-training strategy, which we term domain-adaptive pre-training (DAP), to address the domain shift problem associated with linking unseen entities in a new domain. We present experiments on a new dataset that we construct for this task and show that DAP improves over strong pre-training baselines, including BERT. The data and code are available at https://github.com/lajanugen/zeshel.
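The approach described in the abstract can be pictured as a cross-attention reading model: the mention in its context and each candidate entity's text description are read jointly by a pre-trained Transformer, and candidates are ranked by a score derived from the pooled representation. Below is a minimal sketch of that idea using the Hugging Face `transformers` library; it is not the authors' released code (see the linked repository for that), and the model name, the untrained linear scoring head, the `[M]`/`[/M]` mention markers, and the toy strings are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): rank candidate entity descriptions for a
# mention by jointly encoding (mention context, description) pairs with BERT and
# scoring the [CLS] representation. The scoring head below is untrained and only
# illustrates where a learned ranking layer would sit.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
encoder = BertModel.from_pretrained("bert-base-uncased")
score_head = torch.nn.Linear(encoder.config.hidden_size, 1)  # illustrative, untrained

mention_context = "He landed the ship at [M] Mos Eisley [/M] to find a pilot."
candidate_descriptions = [
    "Mos Eisley is a spaceport town on the planet Tatooine.",
    "The Millennium Falcon is a light freighter piloted by Han Solo.",
]

scores = []
for description in candidate_descriptions:
    # Encode the mention context and the entity description as one sequence pair,
    # so self-attention can compare them token by token ("reading" the description).
    inputs = tokenizer(mention_context, description,
                       return_tensors="pt", truncation=True, max_length=256)
    with torch.no_grad():
        outputs = encoder(**inputs)
    cls_vector = outputs.last_hidden_state[:, 0]  # [CLS] summary of the pair
    scores.append(score_head(cls_vector).item())

# The highest-scoring description is the predicted entity for the mention.
best = max(range(len(scores)), key=lambda i: scores[i])
print(candidate_descriptions[best])
```

In the paper, this reading model is complemented by domain-adaptive pre-training: before fine-tuning on the source-domain linking data, the encoder receives an additional masked-language-model pre-training pass on unlabeled text from the target domain, which is how the domain shift mentioned in the abstract is addressed.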

Citation (APA)

Logeswaran, L., Chang, M.-W., Lee, K., Toutanova, K., Devlin, J., & Lee, H. (2019). Zero-shot entity linking by reading entity descriptions. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019) (pp. 3449–3460). Association for Computational Linguistics. https://doi.org/10.18653/v1/p19-1335
