A New Activation Function in the Hopfield Network for Solving Optimization Problems

  • Zeng X
  • Martinez T

Abstract

This paper shows that the performance of the Hopfield network for solving optimization problems can be improved by using a new activation (output) function. The effects of the activation function on the performance of the Hopfield network are analyzed. It is shown that the sigmoid activation function in the Hopfield network is sensitive to noise in the neurons, because the sigmoid function is most sensitive in exactly the range where noise is most predominant. A new activation function that is more robust against noise is proposed. The new activation function amplifies the signals between neurons while suppressing noise. The performance of the new activation function is evaluated through simulation. Compared with the sigmoid function, the new activation function reduces the error rate of tour length by 30.6% and increases the percentage of valid tours by 38.6% in simulations on 200 randomly generated city distributions of the 10-city traveling salesman problem.
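
For context, the sketch below contrasts the standard sigmoid used in the continuous Hopfield network with a hypothetical noise-suppressing activation that stays nearly flat for small inputs (where noise dominates) and steepens for larger inputs. The abstract does not give the exact form of the proposed function, so `noise_suppressing_activation`, its `dead_zone` and `gain` parameters, and the generic `hopfield_step` update are illustrative assumptions, not the authors' actual formulation.

```python
import numpy as np

def sigmoid(u, gain=1.0):
    """Standard sigmoid output function of the continuous Hopfield network."""
    return 1.0 / (1.0 + np.exp(-gain * u))

def noise_suppressing_activation(u, dead_zone=0.2, gain=5.0):
    """Hypothetical activation for illustration only (not the paper's function):
    soft-thresholds small |u| so that low-level noise is suppressed, then applies
    a steep sigmoid so stronger signals are amplified."""
    shifted = np.sign(u) * np.maximum(np.abs(u) - dead_zone, 0.0)
    return 1.0 / (1.0 + np.exp(-gain * shifted))

def hopfield_step(u, W, b, activation, dt=0.01, tau=1.0):
    """One Euler step of the generic continuous Hopfield dynamics
    du/dt = -u/tau + W @ g(u) + b, with neuron outputs v = g(u).
    For the TSP, W and b would be derived from the energy function."""
    v = activation(u)
    u_next = u + dt * (-u / tau + W @ v + b)
    return u_next, activation(u_next)

if __name__ == "__main__":
    # Toy run with a random symmetric weight matrix (zero diagonal).
    rng = np.random.default_rng(0)
    n = 10
    W = rng.normal(size=(n, n))
    W = (W + W.T) / 2.0
    np.fill_diagonal(W, 0.0)
    b = rng.normal(size=n)
    u = rng.normal(scale=0.1, size=n)
    for _ in range(1000):
        u, v = hopfield_step(u, W, b, noise_suppressing_activation)
    print(v.round(2))
```

The dead-zone idea above is only one way to realize "suppress noise, amplify signal"; the paper's own construction should be consulted for the actual function and its analysis.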

Citation (APA)

Zeng, X., & Martinez, T. R. (1999). A New Activation Function in the Hopfield Network for Solving Optimization Problems. In Artificial Neural Nets and Genetic Algorithms (pp. 73–77). Springer Vienna. https://doi.org/10.1007/978-3-7091-6384-9_13
