Syntax Aware LSTM Model for Semantic Role Labeling

Abstract

In the Semantic Role Labeling (SRL) task, tree-structured dependency relations are rich in syntactic information, but they are not well handled by existing models. In this paper, we propose the Syntax Aware Long Short-Term Memory (SA-LSTM). The structure of SA-LSTM changes according to the dependency structure of each sentence, so that SA-LSTM can model the whole tree structure of the dependency relations through architecture engineering rather than feature engineering. Experiments demonstrate that on Chinese Proposition Bank (CPB) 1.0, SA-LSTM improves F1 by 2.06% over an ordinary bi-LSTM with feature-engineered dependency relation information, achieving a state-of-the-art F1 of 79.92%. On the English CoNLL 2005 dataset, SA-LSTM brings a 2.1% improvement over the bi-LSTM model and a further slight improvement (0.3%) when added to the state-of-the-art model.
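
To make the architectural idea concrete, below is a minimal PyTorch sketch of a syntax-aware recurrent cell in this spirit. It assumes a deliberately simplified variant: at each step, the cell receives, in addition to the previous hidden state, the hidden state of the word's dependency head when that head precedes it in the sentence (zeros otherwise). The names SyntaxAwareLSTMCell and run_sa_lstm, and the convention that heads[t] holds the index of word t's head (-1 for the root), are illustrative assumptions; the paper's exact gating and weighting of syntactically related states differ from this sketch.

    # Illustrative sketch only: a simplified syntax-aware LSTM cell that,
    # besides the previous hidden state, consumes the hidden state of the
    # word's dependency head (zeros if the head lies to the right).
    import torch
    import torch.nn as nn

    class SyntaxAwareLSTMCell(nn.Module):
        def __init__(self, input_size, hidden_size):
            super().__init__()
            # Standard LSTM gates, widened to also take the syntax input h_head.
            self.gates = nn.Linear(input_size + 2 * hidden_size, 4 * hidden_size)
            self.hidden_size = hidden_size

        def forward(self, x_t, h_prev, c_prev, h_head):
            z = self.gates(torch.cat([x_t, h_prev, h_head], dim=-1))
            i, f, o, g = z.chunk(4, dim=-1)
            c_t = torch.sigmoid(f) * c_prev + torch.sigmoid(i) * torch.tanh(g)
            h_t = torch.sigmoid(o) * torch.tanh(c_t)
            return h_t, c_t

    def run_sa_lstm(cell, embeddings, heads):
        # embeddings: (seq_len, input_size); heads[t] = index of word t's
        # dependency head, or -1 for the root of the parse tree.
        hidden = []
        h = c = torch.zeros(cell.hidden_size)
        for t, x_t in enumerate(embeddings):
            head = heads[t]
            # Use the head's hidden state only if it was already computed.
            h_head = hidden[head] if 0 <= head < t else torch.zeros(cell.hidden_size)
            h, c = cell(x_t, h, c, h_head)
            hidden.append(h)
        return torch.stack(hidden)

    # Toy usage: a 5-word sentence with hypothetical dependency heads.
    cell = SyntaxAwareLSTMCell(input_size=8, hidden_size=16)
    emb = torch.randn(5, 8)
    out = run_sa_lstm(cell, emb, heads=[1, -1, 1, 1, 3])
    print(out.shape)  # torch.Size([5, 16])

Because the recurrence consults heads, the information flow of the cell follows each sentence's parse tree rather than a fixed chain, which is the architecture-engineering contrast with feeding dependency labels in as ordinary features.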

Citation (APA)

Qian, F., Sha, L., Chang, B., Liu, L. C., & Zhang, M. (2017). Syntax aware LSTM model for semantic role labeling. In EMNLP 2017 - Conference on Empirical Methods in Natural Language Processing, Proceedings of the 2nd Workshop on Structured Prediction (pp. 27–32). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/W17-4305
