Existing methods for open-retrieval question answering in low-resource languages (LRLs) lag significantly behind English. They not only suffer from the shortcomings of non-English document retrieval, but also rely on language-specific supervision for either the task or translation. We formulate a task setup, more realistic given available resources, that circumvents document retrieval to reliably transfer knowledge from English to low-resource languages. Assuming a strong English question-answering model or database, we compare and analyze methods that pivot through English: mapping foreign queries to English and then English answers back to target-language answers. Within this setup we propose Reranked Multilingual Maximal Inner Product Search (RM-MIPS), akin to semantic-similarity retrieval over the English training set with reranking, which outperforms the strongest baselines by 2.7% on XQuAD and 6.2% on MKQA. Analysis demonstrates the particular efficacy of this strategy over state-of-the-art alternatives in challenging settings: low-resource languages, extensive distractor data, and query-distribution misalignment. By circumventing retrieval, our analysis shows this approach offers rapid answer generation for many other languages off-the-shelf, without requiring additional training data in the target language.
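The retrieve-then-rerank pattern named above (embed the target-language query, retrieve the nearest English training questions by inner product, then reorder the candidates with a second scoring pass) can be sketched as below. This is a minimal illustration, not the paper's implementation: the embeddings and the cosine reranker are toy placeholders standing in for a multilingual encoder and a stronger cross-lingual scorer.

```python
import numpy as np

def mips_topk(index: np.ndarray, query: np.ndarray, k: int = 5) -> np.ndarray:
    """Maximal Inner Product Search: indices of the k index rows
    with the highest inner product against the query vector."""
    scores = index @ query
    return np.argsort(-scores)[:k]

def rm_mips(index, query, rerank_score, k: int = 5):
    """Retrieve k candidates by MIPS, then reorder them with a
    (typically more expensive) reranking score."""
    candidates = mips_topk(index, query, k)
    return sorted(candidates, key=lambda i: -rerank_score(query, index[i]))

# Toy vectors standing in for multilingual sentence embeddings of
# English training questions (hypothetical data, not from the paper).
index = np.array([[1.0, 0.0], [0.0, 1.0], [0.9, 0.1]])
query = np.array([1.0, 0.0])  # embedded target-language query

# Placeholder reranker: cosine similarity. A real system would plug in
# a stronger cross-lingual model at this step.
def cosine(q, c):
    return float(q @ c / (np.linalg.norm(q) * np.linalg.norm(c) + 1e-9))

print(rm_mips(index, query, cosine, k=2))  # closest English questions first
```

In practice the first stage would run over a precomputed index of the full English training set, with the reranker applied only to the small candidate pool it returns.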
Montero, I., Longpre, S., Lao, N., Frank, A. J., & DuBois, C. (2022). Pivot Through English: Reliably Answering Multilingual Questions without Document Retrieval. In MIA 2022 - Workshop on Multilingual Information Access, Proceedings of the Workshop (pp. 16–28). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.mia-1.3