Graph-based dependency parsing with bidirectional LSTM


Abstract

In this paper, we propose a neural network model for graph-based dependency parsing which utilizes a Bidirectional LSTM (BLSTM) to capture richer contextual information instead of relying on high-order factorization, enabling our model to use far fewer features than previous work. In addition, we propose an effective way to learn sentence-level segment embeddings based on an extra forward LSTM network. Although our model uses only first-order factorization, experiments on the English Penn Treebank and the Chinese Penn Treebank show that our model is competitive with previous higher-order graph-based dependency parsing models and state-of-the-art models.
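The segment-embedding idea from the abstract can be sketched concretely: run an extra forward LSTM over the sentence and represent the segment of tokens i..j as the difference between its hidden states at positions j and i-1 (a "subtraction" trick over LSTM states). The sketch below is illustrative only; dimensions, weights, and function names are assumptions, not the paper's actual configuration.

```python
import numpy as np

# Sketch of sentence-level segment embeddings via an extra forward LSTM:
# the embedding of segment [i, j] is h_j - h_{i-1}, where h_t are the
# forward LSTM's hidden states (h_0 = 0 before the first token).
# All sizes and weight initializations here are illustrative assumptions.

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_states(xs, Wx, Wh, b):
    """Run a forward LSTM, returning hidden states [h_0, h_1, ..., h_n]."""
    H = Wh.shape[1]
    h = np.zeros(H)
    c = np.zeros(H)
    hs = [h]                               # h_0: the pre-sentence state
    for x in xs:
        z = Wx @ x + Wh @ h + b            # all four gates in one matmul
        i = sigmoid(z[:H])                 # input gate
        f = sigmoid(z[H:2 * H])            # forget gate
        o = sigmoid(z[2 * H:3 * H])        # output gate
        g = np.tanh(z[3 * H:])             # candidate cell value
        c = f * c + i * g
        h = o * np.tanh(c)
        hs.append(h)
    return hs

def segment_embedding(hs, i, j):
    """Embedding of tokens i..j (1-indexed, inclusive): h_j - h_{i-1}."""
    return hs[j] - hs[i - 1]

# Toy sentence: 5 tokens with 8-dim word vectors, 6-dim hidden states.
D, H, n = 8, 6, 5
Wx = rng.normal(0, 0.1, (4 * H, D))
Wh = rng.normal(0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
tokens = [rng.normal(size=D) for _ in range(n)]

hs = lstm_states(tokens, Wx, Wh, b)
seg = segment_embedding(hs, 2, 4)          # embedding of tokens 2..4
```

A convenient property of this construction is that adjacent segments compose additively: `segment_embedding(hs, 1, 2) + segment_embedding(hs, 3, 5)` equals `segment_embedding(hs, 1, 5)`, since the intermediate hidden state cancels.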

Citation (APA)

Wang, W., & Chang, B. (2016). Graph-based dependency parsing with bidirectional LSTM. In 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers (Vol. 4, pp. 2306–2315). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p16-1218
