A new kind of Hopfield networks for finding global optimum

Abstract

The Hopfield network has been applied to solve optimization problems for decades. However, it still has many limitations in accomplishing this task, most of them inherited from the optimization algorithms it implements. The computation of a Hopfield network, defined by a set of difference equations, can easily be trapped in a local optimum and is sensitive to initial conditions, perturbations, and the order in which neurons are updated. There is no guarantee of how long it will take to converge, nor of whether the final solution is a global optimum. In this paper, we present a Hopfield network with a new set of difference equations to address these problems. The difference equations directly implement a new, powerful optimization algorithm. © 2005 IEEE.
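
For orientation, below is a minimal sketch of the classic discrete Hopfield dynamics the abstract criticizes, namely asynchronous sign updates that descend the network energy and can stall in a local minimum depending on the starting state and update order. It is not the paper's new set of difference equations, which are not reproduced here; the weight matrix W, thresholds theta, and the toy driver are illustrative assumptions.

```python
import numpy as np

def hopfield_descend(W, theta, s0, max_sweeps=100, seed=None):
    """Asynchronous updates of a classic discrete Hopfield net from state s0 in {-1,+1}^n.

    W is symmetric with zero diagonal; each update lowers (or keeps) the energy
    E(s) = -0.5 * s^T W s + theta^T s, so the run ends at a fixed point that is
    only guaranteed to be a *local* minimum of E.
    """
    rng = np.random.default_rng(seed)
    s = s0.copy()
    n = len(s)
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):                 # random neuron update order
            new_si = 1 if W[i] @ s - theta[i] >= 0 else -1
            if new_si != s[i]:
                s[i] = new_si
                changed = True
        if not changed:                              # fixed point reached
            break
    return s

def energy(W, theta, s):
    return -0.5 * s @ W @ s + theta @ s

if __name__ == "__main__":
    # Toy instance: different random initial states often land in different
    # local minima, illustrating the sensitivity the abstract points out.
    rng = np.random.default_rng(0)
    W = rng.standard_normal((8, 8))
    W = (W + W.T) / 2
    np.fill_diagonal(W, 0.0)
    theta = np.zeros(8)
    for trial in range(3):
        s0 = rng.choice([-1, 1], size=8)
        s = hopfield_descend(W, theta, s0, seed=trial)
        print(f"trial {trial}: energy = {energy(W, theta, s):.3f}, state = {s}")
```

Running the toy driver typically prints different final energies across trials, which is exactly the local-optimum and initial-condition sensitivity that motivates the paper's alternative dynamics.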

Citation (APA)

Huang, X. (2005). A new kind of Hopfield networks for finding global optimum. In Proceedings of the International Joint Conference on Neural Networks (Vol. 2, pp. 764–769). https://doi.org/10.1109/IJCNN.2005.1555948
