On capacity of memory in chaotic neural networks with incremental learning

Abstract

Neural networks can learn more patterns with incremental learning than with correlative learning. Incremental learning is a method for composing an associative memory using a chaotic neural network. Earlier work found that the capacity of the network increases with its size (the number of neurons in the network) up to some threshold size and decreases beyond it, and that the threshold size and the capacity differed between two values of the learning parameter. In this paper, the capacity of the networks is investigated while varying the learning parameter. Computer simulations show that the capacity also increases in proportion to the network size at larger sizes, and that the capacity of a network with incremental learning is more than 11 times larger than that of one with correlative learning. © 2008 Springer-Verlag Berlin Heidelberg.
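The paper's chaotic-network incremental learning rule is not reproduced here, but the correlative-learning baseline it is compared against is conventionally the Hebbian outer-product rule of a Hopfield-style associative memory. The following is a minimal sketch of how that baseline's capacity can be measured, assuming random ±1 patterns and counting a pattern as stored while it remains a fixed point of the recall dynamics; the function names and the fixed-point criterion are illustrative choices, not the paper's own protocol.

import numpy as np

def correlative_learning(patterns):
    # Hebbian outer-product rule: W = (1/n) * sum_p x_p x_p^T, zero diagonal.
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def is_stable(W, x):
    # A pattern counts as stored if it is a fixed point of the sign update.
    y = np.sign(W @ x)
    y[y == 0] = 1
    return np.array_equal(y, x)

def capacity(n, rng, max_patterns=None):
    # Store random +/-1 patterns one at a time; return the largest number
    # for which every stored pattern is still a fixed point.
    max_patterns = max_patterns or n
    patterns = rng.choice([-1.0, 1.0], size=(max_patterns, n))
    for p in range(1, max_patterns + 1):
        W = correlative_learning(patterns[:p])
        if not all(is_stable(W, x) for x in patterns[:p]):
            return p - 1
    return max_patterns

rng = np.random.default_rng(0)
for n in (50, 100, 200):
    print(n, capacity(n, rng))

For this Hebbian baseline the measured capacity grows roughly linearly with the network size n (classically about 0.14n for random patterns), which is the kind of baseline against which the paper's 11-fold improvement for incremental learning is reported.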

Citation (APA)

Deguchi, T., Matsuno, K., & Ishii, N. (2008). On capacity of memory in chaotic neural networks with incremental learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5178 LNAI, pp. 919–925). Springer Verlag. https://doi.org/10.1007/978-3-540-85565-1_114
