Applications of Information Theory to Analysis of Neural Data

  • Schultz S. R.
  • Ince R. A. A.
  • Panzeri S.

Abstract

Information theory is a practical and theoretical framework developed for the study of communication over noisy channels. Its probabilistic basis and capacity to relate statistical structure to function make it ideally suited for studying information flow in the nervous system. It has a number of useful properties: it is a general measure sensitive to any relationship, not only linear effects; it has meaningful units, which in many cases allow direct comparison between different experiments; and it can be used to study how much information can be gained by observing neural responses in single trials, rather than in averages over multiple trials. A variety of information-theoretic quantities are commonly used in neuroscience (see entry "Definitions of Information-Theoretic Quantities"). In this entry we review some applications of information theory in neuroscience to study encoding of information in both single neurons and neuronal populations.
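To make the central quantity concrete, the sketch below estimates the mutual information I(S; R) in bits between a discrete stimulus and a discrete neural response from paired observations across trials, using the plug-in (maximum-likelihood) estimator. This is an illustrative example, not code from the entry itself; the function name and the simulated data are ours, and note that plug-in estimates are biased upward when trial counts are limited, which is why bias-correction methods are common in neural data analysis.

```python
import numpy as np

def mutual_information(stimuli, responses):
    """Plug-in estimate of I(S; R) in bits from paired discrete
    observations (one stimulus label and one response per trial)."""
    stimuli = np.asarray(stimuli)
    responses = np.asarray(responses)
    # Map each distinct stimulus/response value to an integer index.
    s_vals, s_idx = np.unique(stimuli, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    # Estimate the joint probability table p(s, r) from trial counts.
    joint = np.zeros((len(s_vals), len(r_vals)))
    np.add.at(joint, (s_idx, r_idx), 1.0)
    joint /= joint.sum()
    # Marginals p(s) and p(r).
    p_s = joint.sum(axis=1, keepdims=True)
    p_r = joint.sum(axis=0, keepdims=True)
    # Sum p(s,r) * log2( p(s,r) / (p(s) p(r)) ) over nonzero cells.
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (p_s @ p_r)[nz])))

# A response that perfectly encodes an equiprobable binary stimulus
# carries exactly 1 bit.
stims = [0, 0, 1, 1] * 25
resps = list(stims)  # deterministic, noiseless coding
print(mutual_information(stims, resps))  # → 1.0
```

The same function returns 0 bits when the response is statistically independent of the stimulus (for instance, a constant response), matching the property that information is sensitive to any stimulus–response relationship, linear or not.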

Citation (APA)
Schultz, S. R., Ince, R. A. A., & Panzeri, S. (2014). Applications of Information Theory to Analysis of Neural Data. In Encyclopedia of Computational Neuroscience (pp. 1–6). Springer New York. https://doi.org/10.1007/978-1-4614-7320-6_280-1
