A new stochastic learning algorithm for neural networks

Abstract

A new stochastic learning algorithm using a Gaussian white noise sequence, referred to as Subconscious Noise Reaction (SNR), is proposed for a class of discrete-time neural networks with time-dependent connection weights. Unlike the back-propagation-through-time (BTT) algorithm, SNR does not require the synchronous transmission of information backward along connection weights; instead, it uses only ubiquitous noise and local signals, correlated against a single performance functional, to achieve simple sequential (chronologically ordered) updating of the connection weights. The algorithm is derived and analyzed on the basis of a functional-derivative formulation of gradient descent, in conjunction with stochastic sensitivity analysis techniques based on the variational approach.
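
To make the idea concrete, the sketch below illustrates the general noise-correlation principle the abstract describes: Gaussian white noise is injected into the weights, the resulting change in a single scalar performance functional is measured, and each weight is updated in proportion to the correlation between that change and its own local noise sample, so no backward pass is needed. This is a minimal illustration, not the authors' SNR formulation; the network, loss, step sizes, and all function names are assumptions introduced for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def forward(w, x):
    """Single hidden-layer tanh network; w = (W1, W2). Illustrative only."""
    W1, W2 = w
    return np.tanh(x @ W1) @ W2

def loss(w, x, y):
    """Mean squared error as a stand-in for the performance functional."""
    return np.mean((forward(w, x) - y) ** 2)

def noise_correlation_step(w, x, y, sigma=1e-3, lr=1e-2):
    """One noise-correlation update: perturb the weights with Gaussian white
    noise, measure the change in the performance functional, and move each
    weight against the noise-loss correlation (a stochastic estimate of the
    gradient that uses only the scalar loss and local noise samples)."""
    base = loss(w, x, y)
    noise = [sigma * rng.standard_normal(W.shape) for W in w]
    perturbed = [W + n for W, n in zip(w, noise)]
    delta = loss(perturbed, x, y) - base
    # Correlate the scalar performance change with each local noise sample.
    return [W - lr * (delta / sigma**2) * n for W, n in zip(w, noise)]

# Toy usage: fit y = sin(pi * x) on a small batch without any backward pass.
x = np.linspace(-1, 1, 32).reshape(-1, 1)
y = np.sin(np.pi * x)
w = [0.1 * rng.standard_normal((1, 8)), 0.1 * rng.standard_normal((8, 1))]
for _ in range(2000):
    w = noise_correlation_step(w, x, y)
print(f"final loss: {loss(w, x, y):.4f}")
```

In expectation, the correlation term (delta / sigma**2) * n approximates the gradient of the performance functional up to a bias that vanishes with the noise amplitude, which is why only ubiquitous noise and locally available signals are required.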

Citation (APA)

Koda, M., & Okano, H. (2000). A new stochastic learning algorithm for neural networks. Journal of the Operations Research Society of Japan, 43(4), 469–485. https://doi.org/10.15807/jorsj.43.469
