Improving the state space organization of untrained recurrent networks

Abstract

Recurrent neural networks are frequently used in the cognitive science community for modeling linguistic structures. A more or less intensive training process is usually performed, but several works have shown that untrained recurrent networks initialized with small weights can also be used successfully for such tasks. In this work we demonstrate that the state space organization of an untrained recurrent neural network can be significantly improved by choosing appropriate input representations. We support this claim experimentally on several linguistic time series.
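As a rough illustration of the idea behind the abstract (the so-called architectural bias of small-weight recurrent networks), the following is a minimal NumPy sketch, not the authors' implementation: it drives the same untrained tanh network with one symbol sequence under two hypothetical input representations and compares a crude cluster-separation score of the resulting hidden states. All names (run_states, separation), the alphabet size, weight scales, and the two candidate codes are assumptions made for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical parameters, not from the paper): a 3-symbol
# alphabet and a random symbolic sequence driving the network.
n_symbols, n_hidden, seq_len = 3, 50, 200
sequence = rng.integers(n_symbols, size=seq_len)

# Untrained weights initialized small: with small recurrent weights the
# dynamics are contractive, so states cluster by recent input history
# without any training.
W_rec = rng.normal(scale=0.1, size=(n_hidden, n_hidden))
W_in = rng.normal(scale=0.5, size=(n_hidden, n_symbols))

def run_states(inputs):
    """Drive the untrained tanh network and collect hidden states.
    inputs[s] is the n_hidden-dim drive applied when symbol s occurs."""
    h = np.zeros(n_hidden)
    states = []
    for s in sequence:
        h = np.tanh(W_rec @ h + inputs[s])
        states.append(h.copy())
    return np.array(states)

def separation(states):
    """Crude organization score: mean distance between cluster centroids
    (states grouped by the most recent symbol) over mean in-cluster spread."""
    centroids, spreads = [], []
    for s in range(n_symbols):
        group = states[sequence == s]
        c = group.mean(axis=0)
        centroids.append(c)
        spreads.append(np.linalg.norm(group - c, axis=1).mean())
    between = np.mean([np.linalg.norm(a - b)
                       for i, a in enumerate(centroids)
                       for b in centroids[i + 1:]])
    return between / np.mean(spreads)

# Two candidate input representations (illustrative choices, not the
# paper's exact codes): well-separated one-hot codes versus nearly
# collinear codes that make different symbols almost indistinguishable.
codes_onehot = np.eye(n_symbols)
base = rng.normal(size=n_symbols)
codes_collinear = np.stack([base + 0.05 * rng.normal(size=n_symbols)
                            for _ in range(n_symbols)])

for name, codes in [("one-hot", codes_onehot),
                    ("nearly collinear", codes_collinear)]:
    states = run_states(codes @ W_in.T)  # per-symbol input vectors
    print(f"{name:>16} codes: separation = {separation(states):.2f}")
```

The sketch only shows that, with the recurrent weights held fixed and untrained, the choice of per-symbol input vectors changes how cleanly the hidden states cluster by recent input, which is the notion of state space organization the paper targets.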

Citation (APA)

Čerňanský, M., Makula, M., & Beňušková, Ľ. (2009). Improving the state space organization of untrained recurrent networks. In Lecture Notes in Computer Science (Vol. 5506, pp. 671–678). Springer. https://doi.org/10.1007/978-3-642-02490-0_82
