Abstract
To accomplish the shared task on dependency parsing, we explore the use of a linear transition-based neural dependency parser as well as a combination of three such parsers by means of a linear tree combination algorithm. We train separate models for each language on the shared task data. We compare our base parser with two biaffine parsers and also present an ensemble combination of all five parsers, which achieves an average UAS 1.88 points lower than the top official submission. To produce the enhanced dependencies, we exploit a hybrid approach, coupling an algorithmic graph transformation of the dependency tree with predictions made by a multitask machine learning model.