Highly Parallel Autoregressive Entity Linking with Discriminative Correction

22 citations · 74 Mendeley readers

Abstract

Generative approaches have recently been shown to be effective for both Entity Disambiguation and Entity Linking (i.e., joint mention detection and disambiguation). However, the previously proposed autoregressive formulation for EL suffers from i) high computational cost due to a complex (deep) decoder, ii) non-parallelizable decoding that scales with the source sequence length, and iii) the need for training on a large amount of data. In this work, we propose a very efficient approach that parallelizes autoregressive linking across all potential mentions and relies on a shallow and efficient decoder. Moreover, we augment the generative objective with an extra discriminative component, i.e., a correction term which lets us directly optimize the generator's ranking. Taken together, these techniques tackle all the above issues: our model is >70 times faster and more accurate than the previous generative method, outperforming state-of-the-art approaches on the standard English dataset AIDA-CoNLL.
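To make the abstract's training objective concrete, the sketch below shows one plausible way to combine a per-mention generative loss with a discriminative correction term over candidate entities. This is a minimal illustration under assumed inputs, not the authors' implementation; all names, the `alpha` weight, and the tensor shapes are hypothetical.

```python
# Minimal sketch (assumed, not the authors' code): generative loss on the gold
# entity name plus a discriminative "correction" term that directly optimizes
# the ranking of the gold entity among candidates, computed per mention.
import torch
import torch.nn.functional as F


def combined_loss(gen_logits, gold_ids, candidate_scores, gold_index, alpha=1.0):
    """Loss for a single mention (hypothetical interface).

    gen_logits:       (target_len, vocab_size) logits from a shallow decoder
                      generating the gold entity name for this mention.
    gold_ids:         (target_len,) token ids of the gold entity name.
    candidate_scores: (num_candidates,) sequence log-likelihoods the decoder
                      assigns to each candidate entity name; these can be
                      computed in parallel across mentions and candidates.
    gold_index:       position of the gold entity among the candidates.
    alpha:            assumed weight for the discriminative correction term.
    """
    # Generative objective: token-level cross-entropy on the gold entity name.
    gen_loss = F.cross_entropy(gen_logits, gold_ids)

    # Discriminative correction: softmax cross-entropy over the candidates'
    # sequence scores, pushing the gold candidate above its competitors.
    target = torch.tensor([gold_index], device=candidate_scores.device)
    disc_loss = F.cross_entropy(candidate_scores.unsqueeze(0), target)

    return gen_loss + alpha * disc_loss
```

Because each mention is scored against its own candidate set independently of the others, losses of this form can be evaluated in parallel over all potential mentions, rather than through a single left-to-right decoding pass over the source sequence.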

Cite

APA

De Cao, N., Aziz, W., & Titov, I. (2021). Highly Parallel Autoregressive Entity Linking with Discriminative Correction. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 7662–7669). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.604
