Statistical parsing with probabilistic symbol-refined tree substitution grammars

ISSN: 1045-0823

Abstract

We present probabilistic Symbol-Refined Tree Substitution Grammars (SR-TSG) for statistical parsing of natural language sentences. An SR-TSG is an extension of the conventional TSG model in which each nonterminal symbol can be refined (subcategorized) to fit the training data. Our probabilistic model is consistently based on the hierarchical Pitman-Yor Process, which encodes backoff smoothing from fine-grained SR-TSG rules to simpler CFG rules, so that all grammar rules can be learned from training data in a fully automatic fashion. Our SR-TSG parser achieves state-of-the-art performance on the Wall Street Journal (WSJ) portion of the English Penn Treebank.
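(The abstract names the smoothing scheme without giving its form. As a rough illustration only, not the paper's exact formulation, the predictive probability of a rule r under a Pitman-Yor process with a backoff base distribution is typically written as

\[
P(r \mid \mathbf{z}) \;=\; \frac{n_r - d\, t_r}{n + \theta} \;+\; \frac{\theta + d\, t}{n + \theta}\, P_{\text{base}}(r),
\]

where n_r and t_r are the customer and table counts for r in the Chinese-restaurant representation, n and t are their totals, d is the discount, \theta is the concentration parameter, and P_{\text{base}} is the next-coarser distribution in the hierarchy, e.g. refined rules backing off to simpler CFG rules as described above. The symbol names here are generic Pitman-Yor notation and are not taken from the paper.)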

Citation (APA)

Shindo, H., Miyao, Y., Fujino, A., & Nagata, M. (2013). Statistical parsing with probabilistic symbol-refined tree substitution grammars. In IJCAI International Joint Conference on Artificial Intelligence (pp. 3082–3086).
