A simple yet effective joint training method for cross-lingual universal dependency parsing


Abstract

This paper describes Fudan's submission to the CoNLL 2018 Shared Task on Universal Dependency Parsing. We jointly train models for pairs of languages that are similar according to linguistic typology, and then ensemble the resulting models with a simple re-parse algorithm. Our system outperforms the baseline by 4.4% and 2.1% on the development and test sets of the CoNLL 2018 UD Shared Task, respectively. Our code is available at https://github.com/taineleau/FudanParser.
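The re-parse ensemble described above can be sketched roughly as follows. This is an illustrative simplification, not the authors' implementation: the function name and the greedy head selection are my assumptions, and a full re-parser would run MST decoding (e.g. Chu-Liu/Edmonds) over the combined scores to guarantee a well-formed tree.

```python
import numpy as np

def reparse_ensemble(score_matrices):
    """Combine arc-score matrices from several parsers and re-decode.

    score_matrices: list of (n+1, n+1) arrays where entry [h, d] is one
    parser's score for attaching dependent d to head h (index 0 = ROOT).
    Returns an array of head indices (entry 0, for ROOT, is unused).
    """
    # Average the parsers' arc scores into a single matrix.
    avg = np.mean(score_matrices, axis=0)
    n = avg.shape[0] - 1
    heads = np.zeros(n + 1, dtype=int)
    for d in range(1, n + 1):
        # Greedy head choice per token; a full re-parser would instead
        # decode a maximum spanning tree over avg to rule out cycles.
        scores = avg[:, d].copy()
        scores[d] = -np.inf  # forbid self-attachment
        heads[d] = int(np.argmax(scores))
    return heads
```

For example, averaging two parsers' score matrices over a two-token sentence and re-decoding yields a single shared analysis, rather than trusting either parser alone.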

Citation (APA)
Chen, D., Lin, M., Hu, Z., & Qiu, X. (2018). A simple yet effective joint training method for cross-lingual universal dependency parsing. In CoNLL 2018 - SIGNLL Conference on Computational Natural Language Learning, Proceedings of the CoNLL 2018 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies (pp. 256–263). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/K18-2026
