On learning parameters of incremental learning in chaotic neural network

Abstract

Incremental learning is a method for composing an associative memory with a chaotic neural network; it provides a larger capacity than correlative learning at the cost of a large amount of computation. A chaotic neuron computes a spatio-temporal sum of its inputs, and the temporal sum makes the learning robust against input noise. When the input contains no noise, the neuron may not need the temporal sum. In this paper, to reduce the amount of computation, a simplified network without the temporal sum is introduced and investigated through computer simulations, in comparison with the conventional network. Then, to shorten the learning steps, the learning parameters are varied during learning according to three functions.
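As a rough illustration of the spatio-temporal sum mentioned above, the sketch below contrasts a chaotic-neuron update that keeps exponentially decayed temporal sums with a simplified update that uses only the current spatial sum. It is a minimal sketch assuming an Aihara-style chaotic neuron; the function names, decay factors kf and kr, refractoriness scaling alpha, and the sigmoid steepness are illustrative placeholders, not values taken from the paper.

import numpy as np

def output(u, epsilon=0.015):
    # Sigmoid output function; the steepness epsilon is an assumed value.
    return 1.0 / (1.0 + np.exp(-u / epsilon))

def chaotic_step(x, eta, zeta, W, a, kf=0.2, kr=0.9, alpha=1.0):
    # One update with spatio-temporal sums (Aihara-style neuron).
    # eta and zeta accumulate the exponentially decayed temporal sums of
    # the feedback and refractoriness terms; all parameter values here
    # are placeholders, not the ones used in the paper.
    eta = kf * eta + W @ x        # decayed temporal sum of feedback inputs
    zeta = kr * zeta - alpha * x  # decayed temporal sum of refractoriness
    return output(eta + zeta + a), eta, zeta

def simplified_step(x, W, a, alpha=1.0):
    # Simplified neuron without the temporal sum: only the current
    # spatial sum and the present refractoriness term are used.
    return output(W @ x - alpha * x + a)

# Minimal usage: iterate a small random network for a few steps.
rng = np.random.default_rng(0)
n = 10
W = rng.normal(scale=0.5, size=(n, n))
a = rng.normal(scale=0.1, size=n)
x = rng.random(n)
eta = np.zeros(n)
zeta = np.zeros(n)
for _ in range(5):
    x, eta, zeta = chaotic_step(x, eta, zeta, W, a)
print(x)

Dropping the temporal sums removes the per-step recursion in eta and zeta, which is the computational saving the paper investigates for noise-free inputs.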

CITATION STYLE

APA

Deguchi, T., & Ishii, N. (2016). On learning parameters of incremental learning in chaotic neural network. In Communications in Computer and Information Science (Vol. 629, pp. 241–252). Springer Verlag. https://doi.org/10.1007/978-3-319-44188-7_18
