Adaptation of Multilingual Transformer Encoder for Robust Enhanced Universal Dependency Parsing

Abstract

This paper presents our enhanced dependency parsing approach using transformer encoders, coupled with a simple yet powerful ensemble algorithm that takes advantage of both tree and graph dependency parsing. Two types of transformer encoders are compared: a multilingual encoder and language-specific encoders. Our dependency tree parsing (DTP) approach generates only primary dependencies to form trees, whereas our dependency graph parsing (DGP) approach handles both primary and secondary dependencies to form graphs. Since DGP does not guarantee that the generated graphs are acyclic, the ensemble algorithm is designed to add secondary arcs predicted by DGP to primary arcs predicted by DTP. Our results show that models using the multilingual encoder outperform ones using the language-specific encoders for most languages. Moreover, the ensemble models generally show a higher labeled attachment score on enhanced dependencies (ELAS) than the DTP and DGP models. As a result, our best parsing models rank third on the macro-average ELAS over 17 languages.
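The ensemble idea in the abstract — keep the acyclic tree from DTP as the backbone and add only the extra (secondary) arcs that DGP predicts — can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code; the arc representation and function name are assumptions.

```python
# Hedged sketch of the DTP/DGP ensemble described in the abstract.
# An arc is a (head, dependent, label) triple; indices and the
# function name `ensemble` are illustrative assumptions.

def ensemble(dtp_arcs, dgp_arcs):
    """Combine the primary arcs from tree parsing (DTP) with the
    secondary arcs from graph parsing (DGP).

    The DTP arcs are kept as-is, since they form a valid tree;
    any DGP arc not already among them is treated as a candidate
    secondary (enhanced) dependency and added on top.
    """
    merged = set(dtp_arcs)  # primary arcs: guaranteed acyclic
    for arc in dgp_arcs:
        if arc not in merged:  # secondary arc predicted only by DGP
            merged.add(arc)
    return merged
```

Because the tree backbone is never altered, the result keeps DTP's well-formedness on primary dependencies while gaining DGP's coverage of secondary ones — which matches the reported outcome that the ensemble generally improves ELAS over either model alone.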

Cite

APA

He, H., & Choi, J. D. (2020). Adaptation of Multilingual Transformer Encoder for Robust Enhanced Universal Dependency Parsing. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 181–191). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.iwpt-1.19
