Abstract
Adapter modules enable modular and efficient zero-shot cross-lingual transfer, where current state-of-the-art adapter-based approaches learn specialized language adapters (LAs) for individual languages. In this work, we show that it is more effective to learn bilingual language pair adapters (BAs) when the goal is to optimize performance for a particular source-target transfer direction. Our novel BAD-X adapter framework trades off some modularity of dedicated LAs for improved transfer performance: we demonstrate consistent gains in three standard downstream tasks, and for the majority of evaluated low-resource languages.
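To make the contrast between language adapters (LAs) and bilingual adapters (BAs) concrete, here is a minimal plain-PyTorch sketch, not the authors' code: it implements a standard bottleneck adapter and shows, under the MAD-X-style setup this work builds on, how per-language adapters would be replaced by a single adapter dedicated to one source-target pair, with a task adapter stacked on top. All names here (`BottleneckAdapter`, `reduction_factor`, the adapter dictionaries) are hypothetical illustrations.

```python
import torch
import torch.nn as nn

class BottleneckAdapter(nn.Module):
    """Minimal bottleneck adapter: down-project, non-linearity, up-project,
    plus a residual connection (hypothetical sketch, not the paper's code)."""
    def __init__(self, hidden_size: int, reduction_factor: int = 16):
        super().__init__()
        bottleneck = hidden_size // reduction_factor
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)
        self.act = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        return hidden_states + self.up(self.act(self.down(hidden_states)))

hidden_size = 768  # hidden size of the frozen multilingual Transformer body

# Dedicated language adapters (LAs): one adapter per language, trained on
# monolingual data; at inference the source LA is swapped for the target LA.
language_adapters = {
    "en": BottleneckAdapter(hidden_size),
    "sw": BottleneckAdapter(hidden_size),
}

# Bilingual adapter (BA): a single adapter trained on unlabelled data from both
# the source and target language of one fixed transfer direction (e.g. en->sw).
bilingual_adapters = {("en", "sw"): BottleneckAdapter(hidden_size)}

# A task adapter is stacked on top of the language/bilingual adapter; only the
# adapters are trained while the pretrained Transformer weights stay frozen.
task_adapter = BottleneckAdapter(hidden_size)

x = torch.randn(2, 10, hidden_size)  # (batch, seq_len, hidden) dummy activations
y = task_adapter(bilingual_adapters[("en", "sw")](x))
print(y.shape)  # torch.Size([2, 10, 768])
```

The trade-off the abstract describes is visible here: a BA is trained for one source-target direction and cannot be reused across arbitrary language pairs the way dedicated LAs can, but it specializes the representation for exactly the transfer direction being evaluated.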
Citation
Parović, M., Glavaš, G., Vulić, I., & Korhonen, A. (2022). BAD-X: Bilingual Adapters Improve Zero-Shot Cross-Lingual Transfer. In NAACL 2022 - 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Proceedings of the Conference (pp. 1791–1799). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.naacl-main.130