Neuron-less neural-like networks with exponential association capacity at Tabula Rasa

Abstract

Artificial neural networks have been used as models of associative memory, but their storage capacity is severely limited. Alternative machine-learning approaches perform better in classification tasks but require long learning sessions to build an optimized representational space. Here we present a radically new approach to classification, based on the fact that networks associated with random hard constraint-satisfaction problems naturally display an exponentially large number of attractor clusters. We introduce a warning-propagation dynamics that selectively maps arbitrary input vectors onto these well-separated clusters of states, without any need for training. Finally, the potential of such networks with exponential capacity to handle inputs with a combinatorially complex structure is explored with a toy example. © 2009 Springer Berlin Heidelberg.
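The paper's own mapping dynamics is defined on its specific constraint-satisfaction networks; as a rough illustration of the general idea only (not the author's exact scheme), the sketch below runs standard warning propagation on a toy CNF formula: each clause sends a warning to a variable exactly when every other variable in the clause is pushed, by its other incoming warnings, toward the value that violates the clause. At a fixed point, the summed warnings give local fields that force some variables and leave others free. The clause format and the tiny example formula are assumptions made up for this sketch.

```python
import random
from collections import defaultdict

def warning_propagation(clauses, n_vars, max_iters=200, seed=0):
    """Warning propagation on a CNF formula.

    clauses: list of clauses, each a tuple of signed 1-indexed ints
    (+i means x_i, -i means NOT x_i).  Returns {var: local field} at
    the fixed point, or None if the messages did not converge.
    field > 0 forces x_i True, < 0 forces it False, 0 leaves it free.
    """
    rng = random.Random(seed)
    var_clauses = defaultdict(list)   # var -> indices of clauses containing it
    sign = {}                         # (clause, var) -> +1 / -1
    for a, cl in enumerate(clauses):
        for lit in cl:
            var_clauses[abs(lit)].append(a)
            sign[(a, abs(lit))] = 1 if lit > 0 else -1
    # u[(a, i)] = 1 means "clause a warns variable i to satisfy it"
    u = {edge: rng.randint(0, 1) for edge in sign}

    for _ in range(max_iters):
        changed = False
        for a, cl in enumerate(clauses):
            for lit in cl:
                i = abs(lit)
                # a warns i iff every OTHER variable j in a is pushed,
                # by its other incoming warnings, to the value that
                # violates clause a (empty product: a unit clause warns)
                new = 1
                for lj in cl:
                    j = abs(lj)
                    if j == i:
                        continue
                    h = sum(u[(b, j)] * sign[(b, j)]
                            for b in var_clauses[j] if b != a)
                    if h * sign[(a, j)] >= 0:  # j is not forced against a
                        new = 0
                        break
                if u[(a, i)] != new:
                    u[(a, i)], changed = new, True
        if not changed:  # fixed point reached: read off local fields
            return {i: sum(u[(a, i)] * sign[(a, i)]
                           for a in var_clauses[i])
                    for i in range(1, n_vars + 1)}
    return None

# Toy formula: (x1 OR x2) AND (NOT x1) AND (NOT x2 OR x3)
clauses = [(1, 2), (-1,), (-2, 3)]
fields = warning_propagation(clauses, 3)
# x1 is forced False, x2 and x3 are forced True
```

Clamping a subset of warnings to encode an input, and letting the remaining messages relax, is one way to picture how an input could be steered toward one attractor cluster; the paper's actual construction should be consulted for the real dynamics.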

Citation (APA)

Battaglia, D. (2009). Neuron-less neural-like networks with exponential association capacity at Tabula Rasa. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 5601 LNCS, pp. 184–194). https://doi.org/10.1007/978-3-642-02264-7_20
