Abstract
This paper considers the problem of zero-shot entity linking, in which entities linked at test time may not have been seen during training. Following the prevailing BERT-based research efforts, we find that a simple yet effective improvement is to expand long-range sequence modeling. Unlike many previous methods, our approach does not require expensive pre-training of BERT with long position embeddings. Instead, we propose an efficient position-embedding initialization method called Embedding-repeat, which initializes a larger position-embedding table from BERT-Base. On Wikia's zero-shot EL dataset, our method improves the SOTA from 76.06% to 79.08%, and on its long-context subset from 74.57% to 82.14%. Our experiments demonstrate the effectiveness of long-range sequence modeling without retraining the BERT model.
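The abstract only names the Embedding-repeat idea; the PyTorch sketch below shows one plausible reading of it, assuming the HuggingFace transformers API. The function name expand_position_embeddings and the target length of 1024 are illustrative, not from the paper: the pretrained 512-position table is tiled to fill a longer table, and all other weights are copied unchanged.

```python
import torch
from transformers import BertConfig, BertModel

def expand_position_embeddings(model_name: str = "bert-base-uncased",
                               new_max_positions: int = 1024) -> BertModel:
    """Build a BERT whose position-embedding table is tiled ("repeated")
    from the pretrained 512-position table; a sketch of Embedding-repeat."""
    base = BertModel.from_pretrained(model_name)
    old_table = base.embeddings.position_embeddings.weight.data  # (512, hidden)
    old_max, _ = old_table.shape

    # Fresh model configured for a longer maximum sequence length.
    config = BertConfig.from_pretrained(model_name)
    config.max_position_embeddings = new_max_positions
    long_model = BertModel(config)

    # Copy every pretrained tensor whose shape still matches; the
    # position-embedding table (and related buffers) are re-initialized below.
    target = long_model.state_dict()
    state = {k: v for k, v in base.state_dict().items()
             if k in target and v.shape == target[k].shape}
    long_model.load_state_dict(state, strict=False)

    # Embedding-repeat: tile the 512 pretrained vectors until the new
    # table is covered, then truncate to the exact target length.
    repeats = (new_max_positions + old_max - 1) // old_max  # ceil division
    tiled = old_table.repeat(repeats, 1)[:new_max_positions]
    with torch.no_grad():
        long_model.embeddings.position_embeddings.weight.copy_(tiled)
    return long_model
```

Tiling pretrained vectors, rather than randomly initializing the new positions, is what lets the longer model start from sensible position representations without the expensive long-position pre-training the abstract says the method avoids.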
Citation
Yao, Z., Cao, L., & Pan, H. (2020). Zero-shot entity linking with efficient long range sequence modeling. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 2517–2522). Association for Computational Linguistics. https://doi.org/10.18653/v1/2020.findings-emnlp.228