Abstract
Recent developments in deep learning have prompted a surge of interest in applying multitask and transfer learning to NLP problems. In this study, we explore, for the first time, the application of transfer learning (TRL) and multitask learning (MTL) to the identification of Multiword Expressions (MWEs). For MTL, we exploit the syntactic information shared between MWE identification and dependency parsing to jointly train a single model on both tasks, predicting two types of labels: MWE tags and dependency parses. Our neural MTL architecture applies the supervision of dependency parsing in its lower layers and predicts MWE tags in its upper layers. In the TRL scenario, we overcome data scarcity by training a model on a larger MWE dataset and transferring the knowledge to a resource-poor setting in another language. In both scenarios, the resulting models achieve higher performance than standard neural approaches.
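The hierarchical multitask setup described above, in which a lower shared layer is supervised with dependency labels and an upper layer predicts MWE tags, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the layer sizes, label counts, and class name `HierarchicalMTL` are all assumptions, and the forward pass uses toy dense layers in place of the paper's actual neural components.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    """Row-wise softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

class HierarchicalMTL:
    """Toy sketch of layer-wise multitask supervision (names/sizes assumed).

    The dependency-label head reads the shared *lower* representation;
    the MWE-tag head reads the *upper* representation stacked on top of it,
    mirroring the lower-layer / upper-layer supervision described above.
    """

    def __init__(self, d_in=8, d_hid=16, n_dep_labels=4, n_mwe_tags=3):
        self.W_lower = rng.normal(0, 0.1, (d_in, d_hid))        # shared lower layer
        self.W_dep = rng.normal(0, 0.1, (d_hid, n_dep_labels))  # dependency head (lower supervision)
        self.W_upper = rng.normal(0, 0.1, (d_hid, d_hid))       # task-specific upper layer
        self.W_mwe = rng.normal(0, 0.1, (d_hid, n_mwe_tags))    # MWE tagging head (upper supervision)

    def forward(self, x):
        h_lower = np.tanh(x @ self.W_lower)         # shared representation
        dep_probs = softmax(h_lower @ self.W_dep)   # dependency labels from the lower layer
        h_upper = np.tanh(h_lower @ self.W_upper)
        mwe_probs = softmax(h_upper @ self.W_mwe)   # MWE tags from the upper layer
        return dep_probs, mwe_probs

# Toy usage: 5 tokens with 8-dimensional embeddings.
tokens = rng.normal(size=(5, 8))
dep_probs, mwe_probs = HierarchicalMTL().forward(tokens)
print(dep_probs.shape, mwe_probs.shape)  # (5, 4) (5, 3)
```

In training, a combined loss over both heads would let the dependency signal shape the shared lower layer that the MWE head builds on.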
Citation
Taslimipoor, S., Rohanian, O., & Ha, L. A. (2019). Cross-lingual transfer learning and multitask learning for capturing multiword expressions. In ACL 2019 - Joint Workshop on Multiword Expressions and WordNet, MWE-WN 2019 - Proceedings of the Workshop (pp. 155–161). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/w19-5119