Massively multilingual transfer for NER

Abstract

In cross-lingual transfer, NLP models over one or more source languages are applied to a low-resource target language. While most prior work has used a single source model or a few carefully selected models, here we consider a “massive” setting with many such models. This setting raises the problem of poor transfer, particularly from distant languages. We propose two techniques for modulating the transfer, suitable for zero-shot or few-shot learning, respectively. Evaluating on named entity recognition, we show that our techniques are much more effective than strong baselines, including standard ensembling, and our unsupervised method rivals oracle selection of the single best individual model.
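To make the setting concrete, below is a minimal Python sketch, not the paper's method: it contrasts the standard-ensembling baseline mentioned in the abstract (an unweighted majority vote over per-token predictions from many source-language NER models) with a hypothetical few-shot variant that weights each source model by its token accuracy on a small labelled target sample, so that distant, poorly transferring sources are down-weighted. All names here (`majority_vote`, `weighted_vote`, `estimate_weights`) are illustrative assumptions, not identifiers from the paper.

```python
# Illustrative sketch of multi-source transfer for token-level NER.
# predictions: list over source models; each entry is a list of BIO
# labels, one per token of the target sentence.
from collections import Counter

def majority_vote(predictions):
    """Zero-shot baseline: unweighted vote over all source models."""
    n_tokens = len(predictions[0])
    return [Counter(model[i] for model in predictions).most_common(1)[0][0]
            for i in range(n_tokens)]

def estimate_weights(predictions, gold):
    """Few-shot: estimate each source model's transfer quality as its
    token accuracy on a small labelled target sample (gold labels)."""
    return [sum(p == g for p, g in zip(model, gold)) / len(gold)
            for model in predictions]

def weighted_vote(predictions, weights):
    """Vote in which each source model's label counts with its weight,
    down-weighting sources that transfer poorly."""
    n_tokens = len(predictions[0])
    output = []
    for i in range(n_tokens):
        scores = Counter()
        for labels, w in zip(predictions, weights):
            scores[labels[i]] += w
        output.append(scores.most_common(1)[0][0])
    return output

# Two hypothetical source models tagging a 4-token target sentence:
preds = [["B-PER", "O", "O", "B-LOC"],
         ["B-ORG", "O", "O", "B-LOC"]]
gold = ["B-PER", "O", "O", "B-LOC"]   # tiny few-shot sample
w = estimate_weights(preds, gold)     # -> [1.0, 0.75]
print(weighted_vote(preds, w))        # -> ['B-PER', 'O', 'O', 'B-LOC']
```

The accuracy-weighted vote is only one simple way to modulate transfer; the paper's actual zero-shot technique needs no labelled target data at all, which is what allows it to rival oracle selection of the single best source model.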

Cite

APA

Rahimi, A., Li, Y., & Cohn, T. (2019). Massively multilingual transfer for NER. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (pp. 151–164). Association for Computational Linguistics. https://doi.org/10.18653/v1/p19-1015
