A regularized line search tunneling for efficient neural network learning

Abstract

A novel two-phase training algorithm with regularization is proposed for multilayer perceptrons, both to address the local minima problem in network training and to improve the generalization of the trained networks. The first phase is a trust region-based local search for fast training of the network. The second phase is a regularized line search tunneling that escapes the local minimum and moves toward a weight vector from which descent can resume. These two phases are repeated alternately in weight space until a target training error is reached. Benchmark results demonstrate a significant performance improvement of the proposed algorithm over existing training algorithms. © Springer-Verlag 2004.
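As a rough illustration of the alternation the abstract describes, the sketch below pairs a trust-region local search with a simple tunneling line search over a regularized error. The concrete tunneling objective, the step schedule, the random search direction, and the use of SciPy's `trust-constr` solver are assumptions made for illustration only, not the authors' formulation.

```python
# Minimal two-phase sketch (assumptions: tunneling objective, step schedule,
# random direction, and SciPy solver choice are illustrative, not the paper's).
import numpy as np
from scipy.optimize import minimize


def two_phase_train(error_fn, grad_fn, w0, target_error=1e-3,
                    reg=1e-4, max_cycles=20, rng=None):
    """Alternate a trust-region local search with a tunneling line search."""
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(w0, dtype=float)

    for _ in range(max_cycles):
        # Phase 1: trust-region local search for fast descent to a minimum.
        res = minimize(error_fn, w, jac=grad_fn, method="trust-constr")
        w_star, e_star = res.x, res.fun
        if e_star <= target_error:
            return w_star, e_star

        # Phase 2: tunneling line search on the regularized error.  Walk along
        # a random direction and accept the first point whose regularized error
        # is no worse than the current minimum's, i.e. a point of next descent.
        d = rng.standard_normal(w_star.size)
        d /= np.linalg.norm(d)
        reg_star = e_star + reg * np.sum(w_star ** 2)
        w = w_star
        for step in np.linspace(0.1, 5.0, 50):
            cand = w_star + step * d
            dist2 = np.sum((cand - w_star) ** 2)
            # Tunneling value: regularized error gap, de-emphasized with distance.
            tunnel = (error_fn(cand) + reg * np.sum(cand ** 2) - reg_star) / (1.0 + dist2)
            if tunnel <= 0.0:
                w = cand  # restart phase 1 from here
                break

    return w_star, e_star
```

If no acceptable tunneling point is found within the step schedule, this sketch simply re-enters the local search from the last minimum; the paper's method governs that transition with its regularized line search rather than a fixed grid of steps.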

Citation (APA)

Lee, D. W., Choi, H. J., & Lee, J. (2004). A regularized line search tunneling for efficient neural network learning. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 3173, 239–243. https://doi.org/10.1007/978-3-540-28647-9_41
