Word Reordering for Zero-shot Cross-lingual Structured Prediction

6 citations · 51 Mendeley readers

Abstract

Adapting word order from one language to another is a key problem in cross-lingual structured prediction. Current sentence encoders (e.g., RNNs, Transformers with position embeddings) are usually word-order sensitive. Even with uniform word-form representations (MUSE, mBERT), word-order discrepancies may still hurt model adaptation. This paper builds structured prediction models with bag-of-words inputs and introduces a new reordering module that arranges words into the source-language order, learning task-specific reordering strategies from a general-purpose order-predictor model. Experiments on zero-shot cross-lingual dependency parsing, POS tagging, and morphological tagging show that the model significantly improves target-language performance, especially for languages distant from the source language.
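To make the idea concrete, here is a minimal sketch of the kind of order predictor the abstract describes. Everything below is an illustrative assumption, not the paper's actual architecture: each word's order-agnostic embedding is scored by a linear scorer, and words are sorted by score to produce a predicted source-language order.

```python
import random

def predict_order(embeddings, weights):
    """Hypothetical order-predictor sketch (not the paper's model):
    score each order-agnostic word embedding with a linear scorer,
    then sort words by ascending score to obtain a predicted
    source-language word order (a permutation of input positions)."""
    scores = [sum(w * x for w, x in zip(weights, e)) for e in embeddings]
    # argsort: indices of words in predicted order
    return sorted(range(len(embeddings)), key=lambda i: scores[i])

# Toy usage with random embeddings and scorer weights.
random.seed(0)
embeds = [[random.gauss(0, 1) for _ in range(4)] for _ in range(5)]
weights = [random.gauss(0, 1) for _ in range(4)]
perm = predict_order(embeds, weights)  # a permutation of 0..4
```

In the paper's setting, such a general-purpose predictor would then be adapted per task; this sketch only shows the reordering interface (bag of words in, permutation out).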

Citation (APA)

Ji, T., Jiang, Y., Wang, T., Huang, Z., Huang, F., Wu, Y., & Wang, X. (2021). Word Reordering for Zero-shot Cross-lingual Structured Prediction. In EMNLP 2021 - 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings (pp. 4109–4120). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2021.emnlp-main.338
