A new stochastic learning algorithm using a Gaussian white noise sequence, referred to as Subconscious Noise Reaction (SNR), is proposed for a class of discrete-time neural networks with time-dependent connection weights. Unlike the back-propagation-through-time (BPTT) algorithm, SNR does not require the synchronous backward transmission of information along connection weights; instead, it uses only ubiquitous noise and local signals, correlated with a single performance functional, to achieve simple sequential (chronologically ordered) updating of the connection weights. The algorithm is derived and analyzed on the basis of a functional derivative formulation of the gradient descent method, in conjunction with stochastic sensitivity analysis techniques based on the variational approach.
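To illustrate the kind of update rule the abstract describes, the following is a minimal Python sketch of a noise-correlation (weight-perturbation) estimator: Gaussian white noise is injected into the time-dependent weights, the resulting change in a single scalar performance functional is measured, and each weight is moved against the noise-weighted performance difference, with no backward pass. This is a generic forward-perturbation scheme written for illustration only, not the authors' exact SNR derivation; the function names (`performance`, `snr_like_update`) and parameters (`sigma`, `lr`, horizon length, network size) are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def performance(W_seq, x0, targets):
    """Single performance functional J: summed squared error of the
    discrete-time network x[t+1] = tanh(W[t] @ x[t]) over the horizon."""
    x, J = x0, 0.0
    for t, W in enumerate(W_seq):
        x = np.tanh(W @ x)
        J += 0.5 * np.sum((x - targets[t]) ** 2)
    return J

def snr_like_update(W_seq, x0, targets, sigma=1e-3, lr=0.1):
    """One noise-correlation step: perturb every time-dependent weight
    matrix with Gaussian white noise, measure the change in J, and update
    each weight in proportion to the noise times the performance change."""
    J0 = performance(W_seq, x0, targets)
    noise = [sigma * rng.standard_normal(W.shape) for W in W_seq]
    J1 = performance([W + n for W, n in zip(W_seq, noise)], x0, targets)
    # Correlating the noise with the scalar change in J gives a stochastic
    # estimate of the gradient w.r.t. each W[t]; no backward transmission
    # of information along the connection weights is needed.
    return [W - lr * (J1 - J0) / sigma**2 * n for W, n in zip(W_seq, noise)]

# Tiny usage example: a 3-unit network over a 5-step horizon.
T, n = 5, 3
W_seq = [0.1 * rng.standard_normal((n, n)) for _ in range(T)]
x0 = rng.standard_normal(n)
targets = [np.zeros(n)] * T
for _ in range(200):
    W_seq = snr_like_update(W_seq, x0, targets)
print("final J:", performance(W_seq, x0, targets))
```

In expectation, the noise-weighted performance difference recovers the gradient of J with respect to each weight matrix, which is why only the injected noise and the scalar performance signal are needed; the paper's SNR algorithm develops this idea rigorously through functional derivatives and stochastic sensitivity analysis.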
Koda, M., & Okano, H. (2000). A new stochastic learning algorithm for neural networks. Journal of the Operations Research Society of Japan, 43(4), 469–485. https://doi.org/10.15807/jorsj.43.469