Abstract
We present a generative distributional model for the unsupervised induction of natural language syntax which explicitly models constituent yields and contexts. Parameter search with EM produces higher quality analyses than previously exhibited by unsupervised systems, giving the best published unsupervised parsing results on the ATIS corpus. Experiments on Penn treebank sentences of comparable length show an even higher F1 of 71% on non-trivial brackets. We compare distributionally induced and actual part-of-speech tags as input data, and examine extensions to the basic model. We discuss errors made by the system, compare the system to previous models, and discuss upper bounds, lower bounds, and stability for this task.
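The abstract compresses the method into one sentence, so a toy illustration may help. The sketch below is a deliberate simplification, not the paper's algorithm: the actual constituent-context model sums over tree-consistent bracketings with an inside-outside-style dynamic program, whereas this sketch labels each span of a tag sequence independently as constituent or distituent and runs a naive-Bayes-style EM over the span's yield (the tags inside it) and its context (the tags immediately adjacent). The toy corpus, smoothing constant, and length-based initializer are all invented for illustration.

from collections import defaultdict

# Invented toy corpus of POS-tag sequences.
corpus = [
    ["DT", "NN", "VBD", "DT", "NN"],
    ["DT", "JJ", "NN", "VBD"],
    ["NN", "VBD", "DT", "NN"],
]

def spans(sent):
    """Every contiguous span, as (yield, context): the tags inside the
    span and the single tags just outside it on each side."""
    n = len(sent)
    for i in range(n):
        for j in range(i + 1, n + 1):
            y = tuple(sent[i:j])
            c = (sent[i - 1] if i > 0 else "<s>",
                 sent[j] if j < n else "</s>")
            yield y, c

all_spans = [yc for sent in corpus for yc in spans(sent)]
SMOOTH = 0.1  # invented add-lambda smoothing constant

# Break symmetry: shorter spans start out more constituent-like
# (an invented initializer; without asymmetry EM would leave the
# constituent and distituent classes identical).
post = [1.0 / len(y) for y, c in all_spans]

for _ in range(20):
    # M-step: expected-count multinomials over yields and contexts,
    # one pair per class (constituent vs. distituent).
    cy = {"const": defaultdict(float), "dist": defaultdict(float)}
    cc = {"const": defaultdict(float), "dist": defaultdict(float)}
    n_const = sum(post)
    n_dist = len(all_spans) - n_const
    for (y, c), p in zip(all_spans, post):
        cy["const"][y] += p
        cy["dist"][y] += 1 - p
        cc["const"][c] += p
        cc["dist"][c] += 1 - p
    prior = n_const / len(all_spans)

    def prob(counts, key, total):
        vocab = len(counts) + 1
        return (counts.get(key, 0.0) + SMOOTH) / (total + SMOOTH * vocab)

    # E-step: posterior that a span is a constituent is proportional to
    # prior * P(yield | class) * P(context | class).
    new_post = []
    for y, c in all_spans:
        pc = prior * prob(cy["const"], y, n_const) * prob(cc["const"], c, n_const)
        pd = (1 - prior) * prob(cy["dist"], y, n_dist) * prob(cc["dist"], c, n_dist)
        new_post.append(pc / (pc + pd))
    post = new_post

# Report the spans the model now considers most constituent-like.
for (y, c), p in sorted(zip(all_spans, post), key=lambda t: -t[1])[:5]:
    print(f"P(constituent)={p:.2f}  yield={' '.join(y)}  context={c}")

The sketch only shows how yields and contexts enter the parameter updates; the full model's tree constraint, which forces competing spans to trade off probability mass, is what makes the induced bracketings coherent.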
Citation
Klein, D., & Manning, C. D. (2002). A generative constituent-context model for improved grammar induction. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (Vol. 2002-July, pp. 128–135). Association for Computational Linguistics (ACL). https://doi.org/10.3115/1073083.1073106