Steps to excellence: Simple inference with refined scoring of dependency trees


Abstract

Much of the recent work on dependency parsing has been focused on solving inherent combinatorial problems associated with rich scoring functions. In contrast, we demonstrate that highly expressive scoring functions can be used with substantially simpler inference procedures. Specifically, we introduce a sampling-based parser that can easily handle arbitrary global features. Inspired by SampleRank, we learn to take guided stochastic steps towards a high scoring parse. We introduce two samplers for traversing the space of trees, Gibbs and Metropolis-Hastings with Random Walk. The model outperforms state-of-the-art results when evaluated on 14 languages of non-projective CoNLL datasets. Our sampling-based approach naturally extends to joint prediction scenarios, such as joint parsing and POS correction. The resulting method outperforms the best reported results on the CATiB dataset, approaching performance of parsing with gold tags. © 2014 Association for Computational Linguistics.
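As a rough illustration of the second sampler mentioned in the abstract, the sketch below runs random-walk Metropolis-Hastings over head-assignment vectors for a sentence (token 0 is the artificial root). The scoring function here is a toy stand-in that prefers short arcs; the paper instead learns a rich global scorer with SampleRank, and all function names below are ours, not the authors'.

```python
import math
import random

def creates_cycle(heads, dep, new_head):
    """Walk up from new_head toward the root; attaching dep under
    new_head creates a cycle iff that walk passes through dep."""
    node = new_head
    while node != 0:
        if node == dep:
            return True
        node = heads[node]
    return False

def toy_score(heads):
    """Hypothetical stand-in for a learned global scoring function:
    rewards short dependency arcs."""
    return -sum(abs(d - h) for d, h in enumerate(heads) if d != 0)

def mh_random_walk(heads, score, steps=1000, seed=0):
    """Random-walk MH over trees: re-sample one token's head per step
    and accept with probability min(1, exp(score_new - score_old)).
    A sketch only; proposals that would break treeness are skipped,
    which makes the walk only approximately symmetric."""
    rng = random.Random(seed)
    heads = list(heads)
    current = score(heads)
    n = len(heads)
    for _ in range(steps):
        dep = rng.randrange(1, n)       # token whose head we re-sample
        cand = rng.randrange(0, n)      # proposed new head
        if cand == dep or creates_cycle(heads, dep, cand):
            continue                    # invalid proposal, skip
        old = heads[dep]
        heads[dep] = cand
        new = score(heads)
        if math.log(rng.random() + 1e-12) < new - current:
            current = new               # accept
        else:
            heads[dep] = old            # reject: revert
    return heads, current
```

A chain like this can take guided stochastic steps toward a high-scoring parse because arbitrary global features only ever need to be evaluated, never decoded over exactly.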

Citation (APA)

Zhang, Y., Lei, T., Barzilay, R., Jaakkola, T., & Globerson, A. (2014). Steps to excellence: Simple inference with refined scoring of dependency trees. In 52nd Annual Meeting of the Association for Computational Linguistics, ACL 2014 - Proceedings of the Conference (Vol. 1, pp. 197–207). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p14-1019
