Learning the dynamics of embedded clauses

Abstract

Recent work by Siegelmann has shown that the computational power of recurrent neural networks matches that of Turing Machines. One important implication is that complex language classes (infinite languages with embedded clauses) can be represented in neural networks. Proofs are based on a fractal encoding of states to simulate the memory and operations of stacks. In the present work, it is shown that similar stack-like dynamics can be learned in recurrent neural networks from simple sequence prediction tasks. Two main types of network solutions are found and described qualitatively as dynamical systems: damped oscillation and entangled spiraling around fixed points. The potential and limitations of each solution type are established in terms of generalization on two different context-free languages. Both solution types constitute novel stack implementations - generally in line with Siegelmann's theoretical work - which supply insights into how embedded structures of languages can be handled in analog hardware.
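
The fractal stack encoding mentioned in the abstract is easy to make concrete. The sketch below is an illustration under standard assumptions, not code from the paper: it implements the usual base-4 Siegelmann–Sontag encoding for a binary stack alphabet, in which the entire stack is packed into one rational number in [0, 1), a push contracts the value into a quarter-interval, and the top symbol is read off from which quarter-interval the value occupies. Exact fractions stand in for the unbounded precision that the abstract's closing remark about analog hardware alludes to; the function names are illustrative.

```python
from fractions import Fraction

# Base-4 fractal stack encoding (Siegelmann/Sontag style, binary alphabet):
# a stack b1 b2 ... bk (b1 on top) is stored as the single number
#   x = sum_i (2*b_i + 1) / 4**i,
# so x = 0 means empty, x in [1/4, 1/2) means top = 0,
# and x in [3/4, 1) means top = 1.

def push(x, bit):
    """Push a bit: x' = (x + 2*bit + 1) / 4."""
    return (x + 2 * bit + 1) / 4

def top(x):
    """Read the top bit from the quarter-interval x falls in."""
    return 1 if x >= Fraction(3, 4) else 0

def pop(x):
    """Undo a push: x' = 4*x - (2*top + 1)."""
    return 4 * x - (2 * top(x) + 1)

def is_empty(x):
    return x == 0

# Push 1, then 0, then 0; pop everything back in LIFO order.
x = Fraction(0)
for b in (1, 0, 0):
    x = push(x, b)

popped = []
while not is_empty(x):
    popped.append(top(x))
    x = pop(x)

print(popped)  # -> [0, 0, 1]
```

Note that popping is exactly the affine inverse of pushing. This is the reason such encodings matter for the paper's setting: a recurrent network whose state update can approximate these contracting and expanding affine maps can, in principle, simulate a stack, which is what is needed to predict the next symbol in a context-free sequence task such as a^n b^n.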

Citation (APA)

Bodén, M., & Blair, A. (2003). Learning the dynamics of embedded clauses. Applied Intelligence, 19(1–2), 51–63. https://doi.org/10.1023/A:1023816706954
