Unsupervised cross-lingual adaptation of dependency parsers using CRF autoencoders

Abstract

We consider the task of cross-lingual adaptation of dependency parsers without annotated target corpora and parallel corpora. Previous work either directly applies a discriminative source parser to the target language, ignoring unannotated target corpora, or employs an unsupervised generative parser that can leverage unannotated target data but has weaker representational power than discriminative parsers. In this paper, we propose to utilize unsupervised discriminative parsers based on the CRF autoencoder framework for this task. We train a source parser and use it to initialize and regularize a target parser that is trained on unannotated target data. We conduct experiments that transfer an English parser to 20 target languages. The results show that our method significantly outperforms previous methods.
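The "initialize and regularize" transfer scheme described above can be sketched generically: start the target parser at the source parser's parameters, then train on the unsupervised objective over unannotated target data while penalizing drift away from the source parameters. The following is a minimal illustration with a toy quadratic stand-in for the unsupervised loss; the L2 penalty form and the weight `lam` are assumptions for illustration, not necessarily the exact regularizer used in the paper.

```python
import numpy as np

def adapt(theta_src, grad_unsup, lam=0.1, lr=0.05, steps=200):
    """Gradient descent on: unsupervised_loss(theta) + lam * ||theta - theta_src||^2.

    theta_src:   parameters of the trained source-language parser
    grad_unsup:  gradient of the unsupervised target-language objective
    lam:         regularization weight pulling theta back toward theta_src
    """
    theta = theta_src.copy()  # initialization: target parser starts at source parameters
    for _ in range(steps):
        g = grad_unsup(theta) + 2 * lam * (theta - theta_src)
        theta -= lr * g
    return theta

# Toy stand-in for the CRF-autoencoder gradient on unannotated target data:
# pretend the unsupervised objective is a quadratic minimized at theta_tgt_opt.
theta_src = np.array([1.0, -2.0])
theta_tgt_opt = np.array([0.5, -1.0])
grad = lambda th: 2 * (th - theta_tgt_opt)

theta = adapt(theta_src, grad, lam=0.1)
# The solution interpolates between the unsupervised optimum and the source
# parameters: theta* = (theta_tgt_opt + lam * theta_src) / (1 + lam).
```

The design point this sketch captures is that `lam` trades off fitting the unannotated target data against staying close to the source parser's inductive bias; with `lam = 0` the target model is free to drift, while large `lam` effectively freezes it at the source parameters.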

Cite

APA

Li, Z., & Tu, K. (2020). Unsupervised cross-lingual adaptation of dependency parsers using CRF autoencoders. In Findings of the Association for Computational Linguistics: EMNLP 2020 (pp. 2127–2133). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.findings-emnlp.193
