Scalable Cross-lingual Treebank Synthesis for Improved Production Dependency Parsers

Abstract

We present scalable Universal Dependency (UD) treebank synthesis techniques that exploit advances in language representation modeling, which leverage vast amounts of unlabeled, general-purpose multilingual text. We introduce a data augmentation technique that uses synthetic treebanks to improve production-grade parsers. The synthetic treebanks are generated with a state-of-the-art biaffine parser adapted to pretrained Transformer models such as Multilingual BERT (M-BERT). The new parser improves LAS by up to two points on seven languages. The production models’ LAS improves as the augmented treebanks grow in size, surpassing that of production models trained only on the originally annotated UD treebanks.
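The augmentation recipe the abstract describes (parse unlabeled text with a strong teacher parser, then mix the synthetic trees into the production training set) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `teacher_parse` callable and the `max_ratio` cap are hypothetical names introduced here.

```python
def augment_treebank(gold_sents, unlabeled_sents, teacher_parse, max_ratio=4.0):
    """Build an augmented training set of CoNLL-U sentences.

    gold_sents:      sentences from the originally annotated UD treebank.
    unlabeled_sents: raw sentences to be annotated by the teacher parser
                     (e.g. a biaffine parser over M-BERT, as in the paper).
    teacher_parse:   hypothetical callable mapping a raw sentence to a
                     CoNLL-U parse produced by the teacher.
    max_ratio:       cap on synthetic data relative to gold data, so the
                     synthetic portion can be scaled up gradually.
    """
    # Synthesize treebank entries with the teacher parser.
    synthetic = [teacher_parse(s) for s in unlabeled_sents]
    # Cap the synthetic portion at max_ratio times the gold data.
    cap = int(len(gold_sents) * max_ratio)
    return list(gold_sents) + synthetic[:cap]
```

A production parser would then be trained on the returned list instead of the gold treebank alone; the paper reports that LAS keeps improving as the synthetic portion scales up.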

Citation (APA)

El-Kurdi, Y., Kanayama, H., Kayi, E. S., Ward, T., Castelli, V., & Florian, R. (2020). Scalable Cross-lingual Treebank Synthesis for Improved Production Dependency Parsers. In COLING 2020 - 28th International Conference on Computational Linguistics, Proceedings of the Industry Track (pp. 172–178). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.coling-industry.16
