Generative incremental dependency parsing with neural networks

Citations: 14
Readers: 109 (Mendeley)

Abstract

We propose a neural network model for scalable generative transition-based dependency parsing. A probability distribution over both sentences and transition sequences is parameterised by a feedforward neural network. The model surpasses the accuracy and speed of previous generative dependency parsers, reaching 91.1% UAS. Perplexity results show a strong improvement over n-gram language models, opening the way to the efficient integration of syntax into neural models for language generation.
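The abstract describes a generative transition-based dependency parser. As a minimal sketch (not the authors' implementation), the arc-standard transition system underlying such parsers builds a tree by applying SHIFT / LEFT-ARC / RIGHT-ARC actions to a stack and buffer; the generative model would additionally assign a probability to each transition and each generated word, which the sketch below omits:

```python
def parse(words, transitions):
    """Apply arc-standard transitions to recover dependency arcs.

    Arcs are (head_index, dependent_index) pairs over token positions.
    """
    stack, buffer, arcs = [], list(range(len(words))), []
    for t in transitions:
        if t == "SHIFT":          # move the next buffer word onto the stack
            stack.append(buffer.pop(0))
        elif t == "LEFT-ARC":     # second-top of stack is dependent of top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif t == "RIGHT-ARC":    # top of stack is dependent of second-top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# "she saw stars": "saw" (index 1) heads both "she" and "stars"
arcs = parse(["she", "saw", "stars"],
             ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC"])
# arcs == [(1, 0), (1, 2)]
```

In the paper's model, a feedforward network conditioned on features of the stack and buffer parameterises the distribution over these actions, making the parser both a syntactic analyser and a language model.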

Citation (APA)

Buys, J., & Blunsom, P. (2015). Generative incremental dependency parsing with neural networks. In ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 863–869). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p15-2142

Readers' Seniority

PhD / Postgrad / Masters / Doc: 43 (68%)
Researcher: 12 (19%)
Professor / Associate Prof.: 6 (10%)
Lecturer / Post doc: 2 (3%)

Readers' Discipline

Computer Science: 59 (82%)
Linguistics: 8 (11%)
Engineering: 3 (4%)
Business, Management and Accounting: 2 (3%)
