Abstract
Code-switching dependency parsing is a challenging task due to both the scarcity of necessary resources and the structural difficulties inherent in code-switched language. In this study, we introduce novel sequence labeling models to be used as auxiliary tasks for dependency parsing of code-switched text in a semi-supervised scheme. We show that using auxiliary tasks enhances the performance of an LSTM-based dependency parsing model and leads to better results than an XLM-R-based model with significantly lower computational and space complexity. As the first study focusing on multiple code-switching language pairs for dependency parsing, we achieve state-of-the-art scores on all of the studied languages. Our best models outperform the previous work by 7.4 LAS points on average.
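To make the multi-task setup concrete, the following is a minimal sketch (in PyTorch, not the authors' released code) of the general idea the abstract describes: a shared BiLSTM encoder whose states feed both a dependency-arc scorer and an auxiliary sequence-labeling head, so the parser loss and the auxiliary loss can be trained jointly. All class names, dimensions, and the choice of a bilinear arc scorer are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class MultiTaskParser(nn.Module):
    """Shared BiLSTM encoder with a parsing head and an auxiliary head."""

    def __init__(self, vocab_size, n_aux_labels, emb_dim=100, hidden=200):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.LSTM(emb_dim, hidden, batch_first=True,
                               bidirectional=True)
        # Arc scorer: score every (head, dependent) token pair bilinearly.
        self.arc_bilinear = nn.Bilinear(2 * hidden, 2 * hidden, 1)
        # Auxiliary sequence-labeling head (e.g., per-token language-ID tags).
        self.aux_head = nn.Linear(2 * hidden, n_aux_labels)

    def forward(self, tokens):
        h, _ = self.encoder(self.embed(tokens))       # (B, T, 2H)
        B, T, D = h.shape
        heads = h.unsqueeze(2).expand(B, T, T, D)     # candidate heads
        deps = h.unsqueeze(1).expand(B, T, T, D)      # candidate dependents
        arc_scores = self.arc_bilinear(
            heads.reshape(-1, D), deps.reshape(-1, D)).view(B, T, T)
        aux_logits = self.aux_head(h)                 # (B, T, n_aux_labels)
        return arc_scores, aux_logits

# Usage: both heads are scored from one encoding pass; in training, the
# total loss would be parser cross-entropy over gold head indices plus a
# weighted auxiliary cross-entropy (the weight is a tunable hyperparameter).
model = MultiTaskParser(vocab_size=5000, n_aux_labels=3)
arc_scores, aux_logits = model(torch.randint(0, 5000, (2, 8)))
```

Sharing the encoder is what lets the auxiliary sequence-labeling signal act as a regularizer for the parser, which is the semi-supervised benefit the abstract reports.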
Citation
Ozates, S. B., Ozgur, A., Gungor, T., & Cetinoglu, O. (2022). Improving Code-Switching Dependency Parsing with Semi-Supervised Auxiliary Tasks. In Findings of the Association for Computational Linguistics: NAACL 2022 (pp. 1159–1171). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2022.findings-naacl.87