Abstract
We suggest a compositional vector representation of parse trees that relies on a recursive combination of recurrent-neural network encoders. To demonstrate its effectiveness, we use the representation as the backbone of a greedy, bottom-up dependency parser, achieving very strong accuracies for English and Chinese, without relying on external word embeddings. The parser’s implementation is available for download at the first author’s webpage.
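The abstract's core idea — building a vector for each tree node by recursively running a sequence encoder over the node and its children — can be sketched as follows. This is an illustrative toy, not the authors' Hierarchical Tree LSTM: it uses a plain tanh RNN cell with random weights, a made-up dimension `DIM`, and hypothetical names (`Node`, `compose`, `encode`), purely to show the recursive composition pattern.

```python
import math
import random

random.seed(0)
DIM = 4  # toy vector dimension (illustration only)

# Toy RNN cell: h_t = tanh(W x_t + U h_{t-1}); weights are random stand-ins,
# where a real parser would learn them (and would use an LSTM cell).
W = [[random.uniform(-0.5, 0.5) for _ in range(DIM)] for _ in range(DIM)]
U = [[random.uniform(-0.5, 0.5) for _ in range(DIM)] for _ in range(DIM)]

def rnn_step(h, x):
    # One recurrent step over input vector x with previous state h.
    return [math.tanh(sum(W[i][j] * x[j] + U[i][j] * h[j] for j in range(DIM)))
            for i in range(DIM)]

def encode(vectors):
    """Run the toy RNN over a sequence of vectors; return the final state."""
    h = [0.0] * DIM
    for v in vectors:
        h = rnn_step(h, v)
    return h

class Node:
    """A parse-tree node: a word vector plus zero or more child subtrees."""
    def __init__(self, word_vec, children=()):
        self.word_vec = word_vec
        self.children = list(children)

def compose(node):
    """Recursively encode a subtree: each node's vector is the RNN encoding
    of its own word vector followed by its children's composed vectors."""
    child_vecs = [compose(c) for c in node.children]
    return encode([node.word_vec] + child_vecs)
```

A greedy bottom-up parser in this style would call something like `compose` on each partially built subtree and score candidate attachments from the resulting vectors; the recursion is what makes the representation compositional.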
Kiperwasser, E., & Goldberg, Y. (2016). Easy-First Dependency Parsing with Hierarchical Tree LSTMs. Transactions of the Association for Computational Linguistics, 4, 445–461. https://doi.org/10.1162/tacl_a_00110