Hebbian Learning

  • Choe, Y.

Abstract

Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and its capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. As a framework it has a number of useful properties: it provides a general measure sensitive to any relationship, not only linear effects; its quantities have meaningful units, which in many cases allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single experimental trials, rather than in averages over multiple trials. A variety of information-theoretic quantities are in common use in neuroscience, including the Shannon entropy, the Kullback-Leibler divergence, and the mutual information. In this entry, we introduce and define these quantities. Further details on how these quantities can be estimated in practice are provided in the entry "Estimation of Information-Theoretic Quantities", and examples of how these techniques are applied in neuroscience can be found in the entry "Applications of Information-Theoretic Quantities in Neuroscience".
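For reference, a minimal sketch of the standard definitions of the three quantities named above, for discrete random variables; the stimulus/response notation S and R is an assumption for illustration, not necessarily the entry's own:

H(R) = -\sum_{r} p(r) \log_2 p(r)                                  (Shannon entropy)

D_{\mathrm{KL}}(p \,\|\, q) = \sum_{x} p(x) \log_2 \frac{p(x)}{q(x)}        (Kullback-Leibler divergence)

I(S; R) = \sum_{s, r} p(s, r) \log_2 \frac{p(s, r)}{p(s)\, p(r)}            (mutual information)

With base-2 logarithms all three are measured in bits, which is what permits the direct comparison between experiments mentioned in the abstract. Note also that I(S; R) is exactly the Kullback-Leibler divergence between the joint distribution p(s, r) and the product of its marginals p(s) p(r).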

Cite

APA

Choe, Y. (2014). Hebbian Learning. In Encyclopedia of Computational Neuroscience (pp. 1–5). Springer New York. https://doi.org/10.1007/978-1-4614-7320-6_672-1
