SoPa: Bridging CNNs, RNNs, and weighted finite-state machines

Abstract

Recurrent and convolutional neural networks comprise two distinct families of models that have proven to be useful for encoding natural language utterances. In this paper we present SoPa, a new model that aims to bridge these two approaches. SoPa combines neural representation learning with weighted finite-state automata (WFSAs) to learn a soft version of traditional surface patterns. We show that SoPa is an extension of a one-layer CNN, and that such CNNs are equivalent to a restricted version of SoPa, and accordingly, to a restricted form of WFSA. Empirically, on three text classification tasks, SoPa performs comparably to or better than both a BiLSTM (RNN) baseline and a CNN baseline, and is particularly useful in small-data settings.
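To illustrate the soft-pattern idea, the sketch below scores a token sequence with a linear-chain WFSA in the max-product (Viterbi) semiring: each transition consumes one token, with a weight computed from that token's embedding. The sigmoid-of-linear transition scoring and the parameter names (`W`, `b`, `num_states`) are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def soft_pattern_score(token_vecs, W, b, num_states):
    """Best log-score of any span matching a linear-chain soft pattern.

    The pattern is a WFSA whose states 0..num_states-1 must be
    traversed left to right, consuming one token per transition.
    Each transition's log-weight is log-sigmoid of a linear function
    of the current token (an assumption for illustration).
    """
    neg_inf = -np.inf
    # viterbi[s] = best log-score of a partial match ending in state s
    viterbi = np.full(num_states, neg_inf)
    viterbi[0] = 0.0  # start state
    best = neg_inf
    for x in token_vecs:
        # log-weights for consuming x on each transition s -> s+1
        trans = np.log(1.0 / (1.0 + np.exp(-(W @ x + b))))
        new_v = np.full(num_states, neg_inf)
        new_v[0] = 0.0           # a match may start at any position
        new_v[1:] = viterbi[:-1] + trans
        viterbi = new_v
        best = max(best, viterbi[-1])  # pattern completed here?
    return best
```

With the self-loops and epsilon transitions of the full model removed, this reduces to sliding a fixed-width window over the text and max-pooling the window scores, which is the sense in which a one-layer CNN is a restricted SoPa.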

Citation (APA)

Schwartz, R., Thomson, S., & Smith, N. A. (2018). SoPa: Bridging CNNs, RNNs, and weighted finite-state machines. In ACL 2018 - 56th Annual Meeting of the Association for Computational Linguistics, Proceedings of the Conference (Long Papers) (Vol. 1, pp. 295–305). Association for Computational Linguistics (ACL). https://doi.org/10.18653/v1/p18-1028
