Recent work on supertagging using a feed-forward neural network achieved significant improvements for CCG supertagging and parsing (Lewis and Steedman, 2014). However, their architecture is limited to considering local contexts and does not naturally model sequences of arbitrary length. In this paper, we show how directly capturing sequence information using a recurrent neural network leads to further accuracy improvements for both supertagging (up to 1.9%) and parsing (up to 1% F1), on CCGBank, Wikipedia and biomedical text.
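The key contrast in the abstract is between a feed-forward tagger, which sees only a fixed window of context, and a recurrent tagger, whose hidden state can carry information from the whole prefix of the sentence. The following is a minimal illustrative sketch of that idea, not the authors' actual model: a toy Elman-style RNN scoring one supertag per token, with all dimensions, weights, and the tag-set size chosen arbitrarily for demonstration.

```python
import numpy as np

# Toy Elman-style RNN supertagger sketch (illustrative only, not the
# paper's architecture). The recurrent hidden state h summarises all
# previous tokens, so no fixed context window is needed.

rng = np.random.default_rng(0)
embed_dim, hidden_dim, num_tags = 8, 16, 4   # arbitrary toy sizes

# Randomly initialised parameters stand in for trained weights.
W_xh = rng.normal(scale=0.1, size=(hidden_dim, embed_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
W_hy = rng.normal(scale=0.1, size=(num_tags, hidden_dim))

def rnn_supertag(embeddings):
    """Return one vector of unnormalised tag scores per token."""
    h = np.zeros(hidden_dim)
    scores = []
    for x in embeddings:                  # process tokens left to right
        h = np.tanh(W_xh @ x + W_hh @ h)  # hidden state depends on full history
        scores.append(W_hy @ h)           # score each supertag for this token
    return np.array(scores)

sentence = rng.normal(size=(5, embed_dim))  # embeddings for a 5-token sentence
tag_scores = rnn_supertag(sentence)
print(tag_scores.shape)                     # (5, 4): one score row per token
```

A fixed-window feed-forward tagger would instead concatenate a few neighbouring embeddings per position; the recurrence above is what lets the model condition on arbitrarily long left context.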
Xu, W., Auli, M., & Clark, S. (2015). CCG supertagging with a recurrent neural network. In ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 250–255). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p15-2041