A multi-task architecture on relevance-based neural query translation

10 citations · 119 Mendeley readers

Abstract

We describe a multi-task learning approach that trains a Neural Machine Translation (NMT) model with a Relevance-based Auxiliary Task (RAT) for search query translation. In Cross-lingual Information Retrieval (CLIR), translation is usually treated as a black box and performed as an independent step; as a result, an NMT model trained on sentence-level parallel data is not aware of the vocabulary distribution of the retrieval corpus. Our multi-task learning architecture addresses this problem and achieves a 16% improvement over a strong NMT baseline on an Italian-English query-document dataset. Both quantitative and qualitative analyses show that the model generates balanced and precise translations, thanks to the regularization effect of the multi-task learning paradigm.
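The abstract does not spell out how the RAT is wired into the NMT model, so the sketch below is only a rough illustration of the general multi-task idea: one shared encoder feeding two heads, a translation decoder trained with cross-entropy and an auxiliary relevance scorer, combined in a weighted joint loss. The GRU layout, the binary relevance head, and the weighting hyperparameter `lam` are all assumptions made for illustration; the paper's actual RAT formulation and architecture may differ.

```python
import torch
import torch.nn as nn

class MultiTaskQueryTranslator(nn.Module):
    """Shared-encoder multi-task sketch: NMT decoding + relevance scoring.

    Hypothetical illustration only; the relevance head and model layout
    are assumptions, not taken from the paper.
    """
    def __init__(self, src_vocab, tgt_vocab, d_model=256):
        super().__init__()
        self.src_embed = nn.Embedding(src_vocab, d_model)
        self.tgt_embed = nn.Embedding(tgt_vocab, d_model)
        self.encoder = nn.GRU(d_model, d_model, batch_first=True)
        self.decoder = nn.GRU(d_model, d_model, batch_first=True)
        self.generator = nn.Linear(d_model, tgt_vocab)  # translation head
        self.relevance = nn.Linear(d_model, 1)          # auxiliary (RAT-style) head

    def forward(self, src, tgt_in):
        enc_out, h = self.encoder(self.src_embed(src))
        dec_out, _ = self.decoder(self.tgt_embed(tgt_in), h)
        logits = self.generator(dec_out)                # (B, T, tgt_vocab)
        rel_score = self.relevance(h[-1]).squeeze(-1)   # (B,) query-level score
        return logits, rel_score

def joint_loss(logits, tgt_out, rel_score, rel_label, lam=0.5, pad_idx=0):
    # L = L_NMT + lam * L_aux; lam is an assumed task-weighting hyperparameter.
    nmt = nn.functional.cross_entropy(
        logits.reshape(-1, logits.size(-1)), tgt_out.reshape(-1),
        ignore_index=pad_idx)
    aux = nn.functional.binary_cross_entropy_with_logits(rel_score, rel_label)
    return nmt + lam * aux

# Minimal usage on random toy tensors:
model = MultiTaskQueryTranslator(src_vocab=8000, tgt_vocab=8000)
src = torch.randint(1, 8000, (4, 7))      # batch of 4 source-language queries
tgt_in = torch.randint(1, 8000, (4, 9))   # shifted target tokens (decoder input)
tgt_out = torch.randint(1, 8000, (4, 9))  # gold target tokens
rel = torch.tensor([1., 0., 1., 1.])      # hypothetical relevance labels
logits, score = model(src, tgt_in)
loss = joint_loss(logits, tgt_out, score, rel)
loss.backward()
```

Because both heads backpropagate through the same encoder, the auxiliary relevance signal acts as a regularizer on the translation task, which is the effect the abstract attributes to the multi-task paradigm.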

Cite (APA)

Sarwar, S. M., Bonab, H., & Allan, J. (2019). A multi-task architecture on relevance-based neural query translation. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019) (pp. 6339–6344). Association for Computational Linguistics. https://doi.org/10.18653/v1/P19-1639
