Dynamic search trajectory methods for neural network training

Abstract

Training multilayer feedforward neural networks corresponds to the global minimization of the network error function. To address this problem we utilize the Snyman and Fatti approach by considering a system of second-order differential equations of the form ẍ = -∇E(x), where x is the vector of network weights and ∇E is the gradient of the network error function E. Equilibrium points of this system of differential equations correspond to optimizers of the network error function. The proposed approach is described and experimental results are discussed.
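The sketch below illustrates the general idea described in the abstract: treat the weight vector x as a particle obeying ẍ = -∇E(x) and follow its trajectory, recording the best point visited. This is a minimal illustration only, assuming a tiny 2-2-1 tanh network on XOR data, a finite-difference gradient, and a plain leapfrog integrator; it is not the authors' exact Snyman-Fatti scheme, which includes energy monitoring and trajectory interruption/restart strategies not shown here.

```python
import numpy as np

# Minimal sketch (not the authors' exact algorithm): integrate the trajectory
# x''(t) = -grad E(x(t)) for the weights x of a small 2-2-1 feedforward network
# trained on XOR data, using a simple leapfrog integrator.

rng = np.random.default_rng(0)

# Toy data: XOR
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
T = np.array([0., 1., 1., 0.])

def unpack(x):
    """Split the flat weight vector into the layers of a 2-2-1 network."""
    W1 = x[:4].reshape(2, 2)   # input -> hidden weights
    b1 = x[4:6]                # hidden biases
    W2 = x[6:8]                # hidden -> output weights
    b2 = x[8]                  # output bias
    return W1, b1, W2, b2

def error(x):
    """Network error function E(x): sum-of-squares error over the data set."""
    W1, b1, W2, b2 = unpack(x)
    H = np.tanh(X @ W1 + b1)
    Y = np.tanh(H @ W2 + b2)
    return 0.5 * np.sum((Y - T) ** 2)

def grad_error(x, eps=1e-6):
    """Central-difference approximation of grad E (an analytic gradient works too)."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = eps
        g[i] = (error(x + e) - error(x - e)) / (2 * eps)
    return g

# Leapfrog integration of x'' = -grad E(x), starting from rest.
x = rng.normal(scale=0.5, size=9)   # initial weights
v = np.zeros_like(x)                # initial velocity, x'(0) = 0
dt = 0.05

best_x, best_E = x.copy(), error(x)
for step in range(2000):
    v -= 0.5 * dt * grad_error(x)
    x += dt * v
    v -= 0.5 * dt * grad_error(x)
    E = error(x)
    if E < best_E:                  # record the best point seen along the trajectory
        best_x, best_E = x.copy(), E

print("best error along trajectory:", best_E)
```

Because the trajectory carries momentum, it can pass through shallow local minima of E rather than stopping at the first stationary point; the best point recorded along the way serves as the candidate minimizer.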

Cite

APA

Petalas, Y. G., Tasoulis, D. K., & Vrahatis, M. N. (2004). Dynamic search trajectory methods for neural network training. In Lecture Notes in Artificial Intelligence (Subseries of Lecture Notes in Computer Science) (Vol. 3070, pp. 241–246). Springer Verlag. https://doi.org/10.1007/978-3-540-24844-6_32
