The unstoppable rise of computational linguistics in deep learning

20 citations · 205 Mendeley readers

Abstract

In this paper, we trace the history of neural networks applied to natural language understanding tasks, and identify key contributions which the nature of language has made to the development of neural network architectures. We focus on the importance of variable binding and its instantiation in attention-based models, and argue that the Transformer is not a sequence model but an induced-structure model. This perspective leads to predictions of the challenges facing research in deep learning architectures for natural language understanding.

Citation (APA)

Henderson, J. (2020). The unstoppable rise of computational linguistics in deep learning. In Proceedings of the Annual Meeting of the Association for Computational Linguistics (pp. 6294–6306). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/2020.acl-main.561
