Adding chaotic sequences to a neural network that solves combinatorial optimization problems improves its performance far more than adding random number sequences. A previous study showed that a specific autocorrelation of the chaotic noise has a positive effect on this high performance: the autocorrelation of such effective chaotic noise takes a negative value at lag 1 and decays with damped oscillation as the lag increases. In this paper, we generate a stochastic noise whose autocorrelation is C(τ) ≈ C × (−r)^τ, similar to the effective chaotic noise, and evaluate the performance of the neural network with this stochastic noise. First, we show that the appropriate amplitude of the additive noise depends on the negative autocorrelation parameter r. We also show that the performance with negative autocorrelation noise is better than that with white Gaussian noise or positive autocorrelation noise, and almost the same as that with chaotic noise. From these results, it can be concluded that the high solving performance of additive chaotic noise is due to its negative autocorrelation. © 2008 Springer-Verlag Berlin Heidelberg.
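The abstract does not specify how the stochastic noise is generated; a minimal sketch, assuming a first-order autoregressive (AR(1)) process x_t = −r·x_{t−1} + ε_t, reproduces the stated autocorrelation C(τ) ≈ C × (−r)^τ, which is negative at lag 1 and decays with damped oscillation:

```python
import numpy as np

rng = np.random.default_rng(0)

def ar1_noise(n, r, sigma=1.0):
    """Generate AR(1) noise x_t = -r * x_{t-1} + e_t.

    Its autocorrelation is (-r)**tau: negative at lag 1,
    positive at lag 2, decaying with damped oscillation.
    (Illustrative; the paper's actual generator may differ.)
    """
    x = np.empty(n)
    x[0] = rng.normal(0.0, sigma)
    for t in range(1, n):
        x[t] = -r * x[t - 1] + rng.normal(0.0, sigma)
    return x

def autocorr(x, tau):
    """Sample autocorrelation of x at lag tau."""
    x = x - x.mean()
    return np.dot(x[:-tau], x[tau:]) / np.dot(x, x)

noise = ar1_noise(100_000, r=0.7)
# Lag-1 autocorrelation is close to -r = -0.7,
# lag-2 close to (-r)**2 = +0.49, and so on.
```

With r between 0 and 1, this gives the alternating-sign, geometrically decaying autocorrelation the abstract describes; such noise would be added to the neuron dynamics in place of the chaotic sequence.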
CITATION STYLE
Hasegawa, M., & Umeno, K. (2008). Solvable performances of optimization neural networks with chaotic noise and stochastic noise with negative autocorrelation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4984 LNCS, pp. 693–702). https://doi.org/10.1007/978-3-540-69158-7_72