Adaptive Hopfield network

Citations: 2 · Readers: 8

Abstract

This paper proposes an innovative enhancement of the classical Hopfield network algorithm (and potentially its stochastic derivatives) with an "adaptation mechanism" that guides the neural search process towards high-quality solutions for large-scale static optimization problems. Specifically, it formulates a novel methodology that employs gradient descent in the error space to adapt weights and constraint weight parameters so as to guide the network dynamics towards solutions. In doing so, a creative algebraic approach is adopted to define error values for each neuron without knowing that neuron's desired output. © Springer-Verlag Berlin Heidelberg 2003.
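The sketch below is a minimal, hypothetical illustration of the kind of adaptation mechanism the abstract describes, not the paper's formulation: a continuous Hopfield network relaxes on a small quadratic cost with one penalty-weighted constraint, a heuristic per-neuron error is defined without known target outputs, and the constraint weight parameter is nudged by gradient-descent-style updates on that error. All symbols and choices (the constraint, the error definition, A, eta, tau-free dynamics) are illustrative assumptions.

```python
# Minimal sketch (not the article's exact method): continuous Hopfield dynamics
# plus adaptation of a constraint penalty weight driven by a heuristic error.
import numpy as np

rng = np.random.default_rng(0)
n = 8                                   # number of neurons

# Quadratic cost coupling and bias (illustrative problem, not from the paper)
W_cost = rng.standard_normal((n, n))
W_cost = 0.5 * (W_cost + W_cost.T)      # symmetric coupling
np.fill_diagonal(W_cost, 0.0)
b = rng.standard_normal(n)
A = 1.0                                 # adaptable constraint weight parameter

def constraint_violation(v):
    """Hypothetical constraint: activations should sum to n/2."""
    return v.sum() - n / 2.0

def step(v, u, A, dt=0.05):
    """One continuous Hopfield update with a penalty-weighted constraint term."""
    g = constraint_violation(v)
    du = W_cost @ v + b - A * g * np.ones(n) - u    # network dynamics
    u = u + dt * du
    v = 0.5 * (1.0 + np.tanh(u))                    # sigmoid activation
    return v, u

u = 0.1 * rng.standard_normal(n)
v = 0.5 * (1.0 + np.tanh(u))
eta = 0.01                              # adaptation rate (assumed)

for t in range(500):
    v, u = step(v, u, A)
    # Heuristic per-neuron "error" without known targets: distance of each
    # neuron from a saturated 0/1 state, scaled by the constraint violation.
    err = v * (1.0 - v) * abs(constraint_violation(v))
    # Gradient-descent-style adaptation: strengthen the constraint weight in
    # proportion to the total error while the constraint remains violated.
    A += eta * err.sum()

print("final activations:", np.round(v, 2))
print("constraint violation:", round(constraint_violation(v), 3))
print("adapted constraint weight A:", round(A, 3))
```

In this sketch the penalty weight grows only while neurons are unsaturated and the constraint is violated, which loosely mirrors the idea of steering the network dynamics towards feasible, high-quality fixed points rather than hand-tuning the constraint weights in advance.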

Citation (APA)

Serpen, G. (2003). Adaptive Hopfield network. Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 2714, 3–10. https://doi.org/10.1007/3-540-44989-2_1
