Sequence labeling parsing by learning across representations


Abstract

We use parsing as sequence labeling as a common framework to learn across constituency and dependency syntactic abstractions. To do so, we cast the problem as multitask learning (MTL). First, we show that adding a parsing paradigm as an auxiliary loss consistently improves performance on the other paradigm. Second, we explore an MTL sequence labeling model that parses both representations, at almost no cost in performance or speed. The results across the board show that, on average, MTL models with auxiliary losses outperform single-task ones by 1.05 F1 points for constituency parsing and by 0.62 UAS points for dependency parsing.
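The auxiliary-loss setup described above can be sketched as a weighted sum of the main-task loss and the auxiliary-task loss over a shared encoder. This is a minimal illustration, not the authors' implementation; the weighting hyperparameter `alpha` and the function names are assumptions.

```python
# Minimal sketch of MTL training with an auxiliary parsing loss.
# Assumption: the two sequence-labeling tasks (e.g. dependency labels as
# main task, constituency labels as auxiliary) share an encoder, and the
# auxiliary loss is down-weighted by a hyperparameter `alpha`.

def mtl_loss(main_loss: float, aux_loss: float, alpha: float = 0.1) -> float:
    """Combine the main-task loss with a down-weighted auxiliary loss."""
    return main_loss + alpha * aux_loss

# Example: main task = dependency parsing, auxiliary = constituency parsing.
total = mtl_loss(main_loss=2.0, aux_loss=1.5, alpha=0.1)
```

Gradients from `total` would update both the shared encoder and the task-specific output layers, so the auxiliary signal regularizes the representation used by the main task.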

Citation (APA)

Strzyz, M., Vilares, D., & Gómez-Rodríguez, C. (2019). Sequence labeling parsing by learning across representations. In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics (ACL 2019) (pp. 5350–5357). Association for Computational Linguistics. https://doi.org/10.18653/v1/p19-1531
