Cross-Lingual Transfer in Zero-Shot Cross-Language Entity Linking

Abstract

Cross-language entity linking grounds mentions written in multiple languages to a monolingual knowledge base. To explore how well a transformer model performs on this task, we use a simple neural ranking architecture that takes multilingual BERT representations of both the mention and its context as input. We find that the multilingual ability of BERT leads to good performance in monolingual and multilingual settings. Furthermore, we explore zero-shot language transfer and find surprisingly robust performance. We conduct several analyses to identify the sources of performance degradation in the zero-shot setting. Results indicate that while multilingual transformer models transfer well between languages, issues remain in disambiguating similar entities unseen in training.
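The abstract does not spell out the scoring details of the ranking architecture, but the general shape of a multilingual-BERT ranker can be illustrated. The following is a minimal sketch, not the authors' implementation: it assumes a bi-encoder-style setup using the Hugging Face transformers library and the bert-base-multilingual-cased checkpoint, encoding the mention with its context and each candidate knowledge-base entity name, then ranking candidates by cosine similarity. All function names and the similarity scoring choice are illustrative assumptions.

```python
# Minimal sketch (not the paper's exact model): rank candidate KB entities for a
# mention by comparing multilingual BERT embeddings of the mention-in-context
# and of each candidate entity name.
import torch
from transformers import AutoTokenizer, AutoModel

MODEL = "bert-base-multilingual-cased"  # multilingual BERT, as referenced in the abstract
tokenizer = AutoTokenizer.from_pretrained(MODEL)
encoder = AutoModel.from_pretrained(MODEL)

def embed(text: str) -> torch.Tensor:
    """Mean-pool the final-layer token representations into a single vector."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state  # shape: (1, seq_len, 768)
    return hidden.mean(dim=1).squeeze(0)              # shape: (768,)

def rank_candidates(mention_in_context: str, candidates: list[str]) -> list[tuple[str, float]]:
    """Score each candidate entity name against the mention plus its context."""
    query = embed(mention_in_context)
    scored = [
        (name, torch.cosine_similarity(query, embed(name), dim=0).item())
        for name in candidates
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Example: a Spanish mention linked against an English-language knowledge base.
print(rank_candidates(
    "El presidente [Washington] cruzó el río Delaware.",
    ["George Washington", "Washington, D.C.", "Washington (state)"],
))
```

In practice a trained ranker would learn the scoring function rather than rely on off-the-shelf cosine similarity, but the sketch shows why multilingual BERT enables zero-shot language transfer: mention and entity representations share one multilingual encoder, so a model trained on one language can score mentions in another.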

Citation (APA)

Schumacher, E., Mayfield, J., & Dredze, M. (2021). Cross-Lingual Transfer in Zero-Shot Cross-Language Entity Linking. In Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021 (pp. 583–595). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.findings-acl.52
