A large deviation principle for networks of rate neurons with correlated synaptic weights

  • Faugeras O
  • MacLaurin J

Abstract

We study the asymptotic law of a network of interacting neurons as the number of neurons becomes infinite. Given a completely connected network of firing-rate neurons whose synaptic weights are correlated Gaussian random variables, we describe the limiting law of the network. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network, together with the averaged law (with respect to the synaptic weights) of these trajectories. The main result of this article is that the image law through the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories. This is potentially very useful for applications in neuroscience, since the Gaussian measure is completely characterized by its mean and spectral density. It also facilitates the assessment of the probability of finite-size effects.
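
To make the objects in the abstract concrete, here is a minimal sketch of this type of model; the discrete-time form and the symbols \(\gamma\), \(f\), \(\theta\), \(B\) are illustrative assumptions and may differ from the authors' exact formulation:

\[ V^j_{t+1} = \gamma V^j_t + \sum_{i=1}^{N} J^N_{ji}\, f(V^i_t) + \theta + B^j_t, \qquad j = 1, \dots, N, \]

where the synaptic weights \(J^N_{ji}\) are correlated Gaussian random variables and the \(B^j_t\) are independent noise terms. The process-level empirical measure of the trajectories is

\[ \hat{\mu}_N = \frac{1}{N} \sum_{j=1}^{N} \delta_{(V^j_t)_{t \ge 0}}, \]

and a large deviation principle with good rate function \(H\) means, informally,

\[ \mathbb{P}\big( \hat{\mu}_N \in A \big) \approx \exp\Big( -N \inf_{\mu \in A} H(\mu) \Big) \]

for suitable sets \(A\) of measures. The unique global minimum of \(H\) then identifies the limit law, and the rate function quantifies the probability of finite-size deviations from it.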

Citation (APA)
Faugeras, O., & MacLaurin, J. (2013). A large deviation principle for networks of rate neurons with correlated synaptic weights. BMC Neuroscience, 14(S1). https://doi.org/10.1186/1471-2202-14-s1-p252
