Backpropagation (BP) is one of the most widely used practical methods for supervised training of artificial neural networks. During learning, BP may get stuck in local minima, producing a suboptimal solution and thus limiting the effectiveness of the training. This work addresses the problem of avoiding local minima and introduces a new learning technique that substitutes the gradient descent algorithm in BP with an optimization method for a global search in a multi-dimensional parameter (weight) space. For this purpose, a low-discrepancy LPτ sequence is used. The proposed method is discussed and then tested on common benchmark problems.
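As a rough illustration of the idea (not the authors' implementation), the sketch below performs a quasi-random global search over the weight space of a tiny 2-2-1 network on the XOR benchmark. Sobol' sequences are the best-known family of LPτ sequences, so SciPy's qmc.Sobol generator stands in for the paper's LPτ generator; the network size, the weight box [-5, 5], and the helper name mlp_loss are illustrative assumptions.

```python
# Sketch: low-discrepancy (LPtau/Sobol') search over MLP weights on XOR.
# Illustrative only -- the paper's actual algorithm may differ in detail.
import numpy as np
from scipy.stats import qmc

def mlp_loss(w, X, y, n_hidden=2):
    """Mean squared error of a 2-n_hidden-1 MLP with tanh hidden units."""
    n_in = X.shape[1]
    W1 = w[: n_in * n_hidden].reshape(n_in, n_hidden)
    b1 = w[n_in * n_hidden : n_in * n_hidden + n_hidden]
    off = n_in * n_hidden + n_hidden
    W2 = w[off : off + n_hidden]
    b2 = w[off + n_hidden]
    h = np.tanh(X @ W1 + b1)          # hidden layer activations
    out = h @ W2 + b2                 # linear output unit
    return np.mean((out - y) ** 2)

# XOR benchmark data
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

dim = 2 * 2 + 2 + 2 + 1               # 9 weights and biases for a 2-2-1 net
sampler = qmc.Sobol(d=dim, scramble=False)
points = sampler.random(2 ** 12)      # low-discrepancy points in [0, 1]^9
weights = qmc.scale(points, [-5.0] * dim, [5.0] * dim)  # assumed weight box

losses = np.array([mlp_loss(w, X, y) for w in weights])
best = losses.argmin()
print(f"best MSE over {len(weights)} quasi-random points: {losses[best]:.4f}")
```

Unlike gradient descent, this search needs no derivatives, and the low-discrepancy points cover the weight box more evenly than pseudo-random samples would, which is what makes such sequences attractive for global search over the parameter space.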
Jordanov, I., & Brown, R. (1999). Neural network learning using low-discrepancy sequence. In Lecture Notes in Computer Science (Vol. 1747, pp. 255–267). Springer. https://doi.org/10.1007/3-540-46695-9_22