Multiscale analysis of slow-fast neuronal learning models with noise

  • Galtier M
  • Wainrib G

Abstract

This paper deals with the application of temporal averaging methods to recurrent networks of noisy neurons undergoing a slow, unsupervised modification of their connectivity matrix called learning. Three time scales arise in these models: (i) the fast neuronal dynamics, (ii) the intermediate external input to the system, and (iii) the slow learning mechanism. Based on this time-scale separation, we apply an extension of the mathematical theory of stochastic averaging with periodic forcing to derive a reduced deterministic model for the connectivity dynamics. We focus on a class of models with linear activity in order to understand the specificity of several learning rules (Hebbian, trace, or anti-symmetric learning). In a weakly connected regime, we study the equilibrium connectivity, which gathers the entire 'knowledge' of the network about the inputs, and develop an asymptotic method to approximate it. We show that the symmetric part of the connectivity post-learning encodes the correlation structure of the inputs, whereas the anti-symmetric part corresponds to the cross-correlation between the inputs and their time derivative. Moreover, the ratio of time scales appears as an important parameter revealing the inputs' temporal correlations.
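The slow-fast structure described in the abstract can be sketched numerically. The following is a minimal, hypothetical illustration (not the authors' model or derivation): a linear rate network driven by noise and a periodic input on an intermediate time scale, with slow Hebbian learning plus a linear decay term. All parameter values, the input signal, and the decay term are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, dt, T = 3, 1e-3, 20.0
eps = 1e-2      # small learning rate: enforces the slow/fast separation
sigma = 0.1     # noise amplitude on the fast neuronal dynamics

def u(t):
    # periodic external input on an intermediate time scale (illustrative choice)
    return np.array([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t), 0.0])

x = np.zeros(n)         # fast variable: neuronal activity
W = np.zeros((n, n))    # slow variable: connectivity matrix

for k in range(int(T / dt)):
    t = k * dt
    # fast linear dynamics with noise (Euler-Maruyama step)
    x = x + dt * (-x + W @ x + u(t)) + sigma * np.sqrt(dt) * rng.standard_normal(n)
    # slow Hebbian learning with linear decay (hypothetical rule for illustration)
    W = W + eps * dt * (np.outer(x, x) - W)

# decomposition discussed in the abstract: symmetric vs. anti-symmetric parts
W_sym = 0.5 * (W + W.T)
W_anti = 0.5 * (W - W.T)
```

After learning, `W_sym` would reflect the correlation structure of the inputs and `W_anti` the input/derivative cross-correlations, in the sense made precise by the paper's averaged equations; this sketch only shows the time-scale mechanics, not the asymptotic analysis.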

Citation (APA)

Galtier, M., & Wainrib, G. (2012). Multiscale analysis of slow-fast neuronal learning models with noise. The Journal of Mathematical Neuroscience, 2(1), 13. https://doi.org/10.1186/2190-8567-2-13
