CCG supertagging with a recurrent neural network

47 citations · 118 Mendeley readers

Abstract

Recent work on supertagging using a feed-forward neural network achieved significant improvements for CCG supertagging and parsing (Lewis and Steedman, 2014). However, their architecture is limited to considering local contexts and does not naturally model sequences of arbitrary length. In this paper, we show how directly capturing sequence information using a recurrent neural network leads to further accuracy improvements for both supertagging (up to 1.9%) and parsing (up to 1% F1), on CCGBank, Wikipedia and biomedical text.
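The key idea in the abstract is that a recurrent network conditions each supertag prediction on the entire preceding sequence via a hidden state, rather than on a fixed local window as in a feed-forward tagger. A minimal sketch of that mechanism, not the authors' implementation, is below; all dimensions, weights, and the toy setup are illustrative assumptions.

```python
import numpy as np

# Illustrative Elman-style recurrent tagger: each word embedding updates a
# hidden state, which is projected to a distribution over supertags.
# All sizes and weights are toy assumptions; real models are far larger.
rng = np.random.default_rng(0)
EMB, HID, TAGS = 4, 5, 3  # embedding, hidden, and tag-set sizes (toy)

W_xh = rng.normal(scale=0.1, size=(HID, EMB))   # input -> hidden
W_hh = rng.normal(scale=0.1, size=(HID, HID))   # hidden -> hidden (recurrence)
W_hy = rng.normal(scale=0.1, size=(TAGS, HID))  # hidden -> supertag scores

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def supertag(embeddings):
    """Return one supertag distribution per token; each prediction is
    conditioned on the full left context through the hidden state h."""
    h = np.zeros(HID)
    dists = []
    for x in embeddings:
        # Unlike a fixed-window feed-forward tagger, h summarizes all
        # prior words, so the context length is unbounded.
        h = np.tanh(W_xh @ x + W_hh @ h)
        dists.append(softmax(W_hy @ h))
    return dists

sentence = rng.normal(size=(6, EMB))  # six toy "word embeddings"
dists = supertag(sentence)
```

The recurrence `W_hh @ h` is what lets the model handle sequences of arbitrary length with a fixed parameter count, which is the limitation of the feed-forward architecture that the paper addresses.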

Citation (APA)

Xu, W., Auli, M., & Clark, S. (2015). CCG supertagging with a recurrent neural network. In ACL-IJCNLP 2015 - 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing of the Asian Federation of Natural Language Processing, Proceedings of the Conference (Vol. 2, pp. 250–255). Association for Computational Linguistics (ACL). https://doi.org/10.3115/v1/p15-2041
