Neural network learning using low-discrepancy sequence

Abstract

Backpropagation (BP) is one of the most frequently used practical methods for supervised training of artificial neural networks. During the learning process, BP may get stuck in local minima, producing a suboptimal solution and thus limiting the effectiveness of the training. This work addresses the problem of avoiding local minima and introduces a new learning technique that replaces the gradient descent algorithm in BP with an optimization method for global search in a multi-dimensional parameter (weight) space. For this purpose, a low-discrepancy LPτ sequence is used. The proposed method is discussed and tested on common benchmark problems.
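The abstract gives no implementation details, but the core idea (evaluating the network error at points of a low-discrepancy sequence spread evenly over the weight space, instead of following the local gradient) can be sketched as below. This is a minimal illustration, not the authors' algorithm: it assumes a tiny 2-2-1 tanh network on the XOR benchmark, uses SciPy's Sobol generator as a stand-in for the LPτ sequence (Sobol sequences are LPτ sequences in Sobol's terminology), and picks arbitrary weight bounds and a sample budget.

```python
# Minimal sketch: global search over the weights of a small feedforward
# network using a low-discrepancy sequence instead of gradient descent.
# The 2-2-1 architecture, the [-5, 5] weight bounds, and the 2^12 sample
# budget are illustrative assumptions, not the paper's settings.
import numpy as np
from scipy.stats import qmc

N_IN, N_HIDDEN = 2, 2
DIM = N_IN * N_HIDDEN + N_HIDDEN + N_HIDDEN + 1  # 9 weights for a 2-2-1 net

def forward(w, X):
    """Evaluate a 2-2-1 tanh network whose weights are packed in the flat vector w."""
    W1 = w[: N_IN * N_HIDDEN].reshape(N_IN, N_HIDDEN)
    b1 = w[N_IN * N_HIDDEN : N_IN * N_HIDDEN + N_HIDDEN]
    W2 = w[-(N_HIDDEN + 1) : -1]
    b2 = w[-1]
    h = np.tanh(X @ W1 + b1)
    return np.tanh(h @ W2 + b2)

def mse(w, X, y):
    return np.mean((forward(w, X) - y) ** 2)

# XOR: a classic benchmark on which plain BP is known to stall in local minima.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1], dtype=float)  # targets inside tanh's range

# Sobol points cover [0, 1)^DIM far more evenly than random sampling;
# scale them to the assumed weight box [-5, 5]^DIM.
sampler = qmc.Sobol(d=DIM, scramble=False)
points = qmc.scale(sampler.random_base2(m=12),
                   np.full(DIM, -5.0), np.full(DIM, 5.0))

errors = np.array([mse(w, X, y) for w in points])
best = points[np.argmin(errors)]
print(f"best MSE over {len(points)} low-discrepancy points: {errors.min():.4f}")
print("outputs at best point:", np.round(forward(best, X), 2))
```

A practical version would typically refine the best candidate further (for example with a local search around it); the sketch stops at the coverage step, which is the part the low-discrepancy sequence contributes.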

Citation (APA)

Jordanov, I., & Brown, R. (1999). Neural network learning using low-discrepancy sequence. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1747, pp. 255–267). Springer-Verlag. https://doi.org/10.1007/3-540-46695-9_22
