Learning structure with many-take-all networks


Abstract

It is shown that by restricting the number of active neurons in a layer of a Boltzmann machine, a sparse distributed coding of the input data can be learned. Unlike Winner-Take-All coding, this coding reveals the distance structure of the training data and thus introduces proximity into the learned code. As in the standard Radial Basis Boltzmann Machine, the network uses an annealing schedule to avoid local minima; annealing is terminated when generalization performance deteriorates. The network exhibits symmetry breaking at a critical temperature that depends on the data distribution and on the number of winners. The learned structure is independent of the details of the architecture, provided that the number of neurons and the number of active neurons are chosen sufficiently large.
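
To make the many-take-all idea concrete, below is a minimal sketch of a layer in which exactly k units are active, with a temperature parameter that is lowered over an annealing schedule. This is an illustration only: the function name many_take_all, the feedforward net inputs, and the softmax-based sampling of winners are assumptions made here, not the paper's formulation, which places the competition inside a Boltzmann machine.

```python
# Minimal sketch of many-take-all (k-winners-take-all) coding, assuming a
# softmax-style competition with a temperature parameter. Illustrative only:
# the paper formulates the competition inside a Boltzmann machine, and the
# names and sampling scheme below are not taken from it.
import numpy as np

def many_take_all(x, W, k, temperature, rng):
    """Return a binary hidden code with exactly k active units."""
    net = W @ x                                # net input per hidden unit
    logits = net / temperature                 # low T -> near-deterministic
    p = np.exp(logits - logits.max())
    p /= p.sum()
    # Sample k distinct winners, favoring units with large net input.
    winners = rng.choice(len(net), size=k, replace=False, p=p)
    h = np.zeros_like(net)
    h[winners] = 1.0
    return h

rng = np.random.default_rng(0)
W = rng.normal(size=(20, 5))                   # 20 hidden units, 5 inputs
x = rng.normal(size=5)

# Crude annealing loop: lower the temperature stepwise. In the paper the
# schedule is stopped when generalization performance deteriorates; that
# stopping criterion is not modeled here.
for T in (10.0, 1.0, 0.1):
    code = many_take_all(x, W, k=3, temperature=T, rng=rng)
    print(f"T={T}: active units {sorted(np.nonzero(code)[0])}")
```

Sampling the winners stochastically (rather than taking a hard top-k) keeps the role of temperature visible: at high temperature the winning set is nearly random, while at low temperature the k units with the largest net inputs win almost surely, mirroring the annealing described in the abstract.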

Citation (APA)

Tax, D., & Kappen, H. J. (1996). Learning structure with many-take-all networks. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1112 LNCS, pp. 95–100). Springer Verlag. https://doi.org/10.1007/3-540-61510-5_20
