Tailoring the performance of attractor neural networks

  • Wong, K. Y. M.
  • Sherrington, D.

Abstract

First, we study the effects of introducing training noise on the retrieval behaviours of dilute attractor neural networks. We find that, in general, training noise enhances associativity but also reduces the attractor overlap. In a narrow range of storage levels, however, the system exhibits re-entrant retrieval behaviour on increasing training noise. Second, we consider the optimization of network performance, and subsequently of the storage capacity, in the presence of retrieval noise (temperature). This is achieved by adapting the network to an appropriate training overlap, which is determined self-consistently by the optimal attractor overlap. The maximum storage capacity deviates from that of the maximally stable network as the temperature increases, and in the high-temperature regime (T \ge 0.38 for Gaussian noise) the Hebb-rule network yields the maximum storage capacity. Our analysis demonstrates the principles of specialization and adaptation in neural networks.
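
To make the setup concrete, here is a minimal numerical sketch of the kind of experiment the abstract describes: a dilutely connected network with Hebb-rule couplings learned from noise-corrupted training examples (training overlap m_t), retrieved with stochastic Glauber dynamics at temperature T. All choices below (network size, fixed in-degree dilution, the specific parameter values, and the use of the Hebb rule as the learning rule) are illustrative assumptions for orientation, not the authors' formulation, which treats the couplings analytically.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500    # neurons (illustrative size)
P = 10     # stored patterns; storage level alpha = P / C = 0.1
C = 100    # inputs per neuron (dilute connectivity, C << N)
m_t = 0.9  # training overlap: a noisy training example agrees with
           # its pattern at each site with probability (1 + m_t) / 2
T = 0.1    # retrieval noise (temperature) for Glauber dynamics

# Random +/-1 patterns and their noise-corrupted training examples.
xi = rng.choice([-1, 1], size=(P, N))
flips = rng.random((P, N)) < (1.0 - m_t) / 2.0
xi_train = np.where(flips, -xi, xi)

# Dilute Hebb-rule couplings: neuron i receives input from a random
# set of C other neurons, trained on the noisy examples.
J = np.zeros((N, N))
for i in range(N):
    inputs = rng.choice(np.delete(np.arange(N), i), size=C, replace=False)
    J[i, inputs] = xi_train[:, i] @ xi_train[:, inputs] / C

def overlap(s, pattern):
    return float(s @ pattern) / N

def retrieve(s, sweeps=20):
    """Asynchronous Glauber dynamics: P(s_i = +1) = 1 / (1 + exp(-2 h_i / T))."""
    for _ in range(sweeps):
        for i in rng.permutation(N):
            h = J[i] @ s
            s[i] = 1.0 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * h / T)) else -1.0
    return s

# Start from a corrupted version of pattern 0; retrieval should raise the
# overlap toward the attractor, but training noise caps it below 1.
s = np.where(rng.random(N) < 0.2, -xi[0], xi[0]).astype(float)
print(f"initial overlap: {overlap(s, xi[0]):.3f}")
s = retrieve(s)
print(f"final overlap:   {overlap(s, xi[0]):.3f}")
```

Sweeping m_t in this sketch illustrates the trade-off in the first result: lowering m_t (more training noise) widens the basin from which corrupted states are pulled in, while the final overlap the attractor can reach drops, since a fraction (1 - m_t)/2 of the learned bits point the wrong way.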

Cite (APA)

Wong, K. Y. M., & Sherrington, D. (2008). Tailoring the performance of attractor neural networks. In Statistical Mechanics of Neural Networks (pp. 105–119). Springer Berlin Heidelberg. https://doi.org/10.1007/3540532676_44
