We introduce a globally normalized transition-based neural network model that achieves state-of-the-art results in part-of-speech tagging, dependency parsing, and sentence compression. Our model is a simple feed-forward neural network that operates on a task-specific transition system, yet achieves accuracies comparable to or better than those of recurrent models. We discuss the importance of global as opposed to local normalization: a key insight is that the label bias problem implies that globally normalized models can be strictly more expressive than locally normalized models.
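A minimal sketch of the local-versus-global distinction, using hand-picked illustrative scores (not the paper's model, features, or training procedure): a locally normalized model applies a softmax over decisions at each transition and multiplies the per-step probabilities, while a globally normalized model sums raw scores over a whole derivation and normalizes once by a partition function over all complete derivations. The history-dependent step-2 scores below are chosen so that later evidence favors one prefix, the situation where the label bias problem constrains the local model.

```python
import math

# Hypothetical scores (illustration only): step-1 score s1[d1], and a
# step-2 score s2[d1][d2] that depends on the step-1 decision d1.
s1 = {"a": 1.0, "b": 1.0}
s2 = {
    "a": {"x": 0.0, "y": 0.0},   # after "a", step 2 is uninformative
    "b": {"x": 5.0, "y": -5.0},  # after "b", step 2 strongly prefers "x"
}

def local_log_prob(d1, d2):
    # Locally normalized: a softmax at every step, conditioned on the
    # history; the full-path probability is the product of the steps.
    z1 = math.log(sum(math.exp(v) for v in s1.values()))
    z2 = math.log(sum(math.exp(v) for v in s2[d1].values()))
    return (s1[d1] - z1) + (s2[d1][d2] - z2)

def global_log_prob(d1, d2):
    # Globally normalized (CRF-style): sum raw scores along the path and
    # normalize once over all complete paths.
    def path_score(a, b):
        return s1[a] + s2[a][b]
    log_z = math.log(sum(math.exp(path_score(a, b))
                         for a in s1 for b in s2[a]))
    return path_score(d1, d2) - log_z
```

In this toy, the local model can give the path ("b", "x") at most probability 0.5, because step 1 must split its mass before seeing step 2's evidence; the global model can assign that same path almost all of the mass. This is the sense in which global normalization is strictly more expressive.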
Andor, D., Alberti, C., Weiss, D., Severyn, A., Presta, A., Ganchev, K., … Collins, M. (2016). Globally normalized transition-based neural networks. In 54th Annual Meeting of the Association for Computational Linguistics, ACL 2016 - Long Papers (Vol. 4, pp. 2442–2452). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p16-1231