Synthesizing Context-free Grammars from Recurrent Neural Networks

Abstract

We present an algorithm for extracting a subclass of the context-free grammars (CFGs) from a trained recurrent neural network (RNN). We develop a new framework, pattern rule sets (PRSs), which describe sequences of deterministic finite automata (DFAs) that approximate a non-regular language. We present an algorithm for recovering the PRS behind a sequence of such automata, and apply it to the sequences of automata extracted from trained RNNs using the L∗ algorithm. We then show how the PRS may be converted into a CFG, enabling a familiar and useful presentation of the learned language. Extracting the learned language of an RNN is important both to facilitate understanding of the RNN and to verify its correctness. Furthermore, the extracted CFG can augment the RNN in classifying correct sentences, as the RNN's predictive accuracy decreases when the recursion depth and the distance between matching delimiters of its input sequences increase.
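
To make the last claim concrete, here is a minimal, self-contained sketch of how an extracted CFG could back up an RNN on deeply nested inputs. The grammar below is a hand-written Chomsky-normal-form grammar for balanced parentheses, standing in for a grammar synthesized by the paper's method; the rule tables and the CYK membership test are illustrative assumptions, not the authors' implementation.

```python
from itertools import product

# CNF grammar for non-empty balanced parentheses (Dyck-1), used here as a
# stand-in for a CFG extracted from an RNN trained on the same language.
UNARY = {"(": {"L"}, ")": {"R"}}          # terminal rules: L -> ( , R -> )
BINARY = {                                 # binary rules A -> B C
    ("L", "R"): {"S"},   # S -> ( )
    ("L", "T"): {"S"},   # S -> ( S )   via helper T
    ("S", "S"): {"S"},   # S -> S S
    ("S", "R"): {"T"},   # T -> S )
}

def cyk_accepts(word: str, start: str = "S") -> bool:
    """Standard CYK membership test for a grammar in Chomsky normal form."""
    n = len(word)
    if n == 0:
        return False
    # table[i][j] holds the nonterminals deriving word[i : i + j + 1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        table[i][0] = set(UNARY.get(ch, ()))
    for span in range(2, n + 1):            # substring length
        for i in range(n - span + 1):       # start index
            for split in range(1, span):    # split into left/right parts
                left = table[i][split - 1]
                right = table[i + split][span - split - 1]
                for b, c in product(left, right):
                    table[i][span - 1] |= BINARY.get((b, c), set())
    return start in table[0][n - 1]

# Deeply nested or long-distance inputs, where RNN accuracy typically degrades:
for w in ["((((()))))", "(()(()))", "(()", "())("]:
    print(w, cyk_accepts(w))
```

Unlike the RNN, the CYK check is exact at any nesting depth, which is what makes the extracted grammar useful as a verifier alongside the network.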

Citation (APA)

Yellin, D. M., & Weiss, G. (2021). Synthesizing context-free grammars from recurrent neural networks. In Lecture Notes in Computer Science (Vol. 12651 LNCS, pp. 351–369). Springer. https://doi.org/10.1007/978-3-030-72016-2_19
