Cross-Lingual Training of Dense Retrievers for Document Retrieval

Abstract

Dense retrieval has shown great success for passage ranking in English. However, its effectiveness for non-English languages remains unexplored due to the limited availability of training resources. In this work, we explore different transfer techniques for document ranking from English annotations to non-English languages. Our experiments reveal that zero-shot model-based transfer using mBERT improves search quality. We find that weakly-supervised target-language transfer is competitive with generation-based target-language transfer, which requires translation models.
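
To make the zero-shot model-based transfer concrete, here is a minimal sketch of a dual encoder built on mBERT that scores non-English query-passage pairs by dot product. The checkpoint name, the [CLS] pooling, and the German example texts are illustrative assumptions, not the paper's exact training setup.

```python
# Sketch: zero-shot cross-lingual dense retrieval with an mBERT dual encoder.
# The encoder would be fine-tuned on English relevance data only, then
# applied directly to a non-English language at inference time.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL = "bert-base-multilingual-cased"  # assumed mBERT checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL)
encoder = AutoModel.from_pretrained(MODEL)
encoder.eval()

def encode(texts):
    """Encode a list of texts into dense vectors via the [CLS] token."""
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=256, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state
    return hidden[:, 0]  # [CLS] pooling (one of several common choices)

# German query and candidate passages, never seen with relevance labels.
query = encode(["Wie funktioniert Dense Retrieval?"])
passages = encode(["Dense Retrieval bildet Texte auf dichte Vektoren ab.",
                   "Berlin ist die Hauptstadt Deutschlands."])
scores = query @ passages.T  # dot-product relevance scores
print(scores)
```

In this setting, cross-lingual transfer comes entirely from mBERT's shared multilingual representation space: no target-language relevance labels and no translation models are required, which is what distinguishes it from the generation-based transfer the abstract compares against.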

Cite

APA

Shi, P., Zhang, R., Bai, H., & Lin, J. (2021). Cross-Lingual Training of Dense Retrievers for Document Retrieval. In MRL 2021 - 1st Workshop on Multilingual Representation Learning, Proceedings of the Conference (pp. 251–253). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.mrl-1.24
