Descriptive models of the retina have been essential for understanding how retinal neurons convert visual stimuli into neural responses. With recent advances in neuroimaging techniques, the growing availability of physiological data, and current computational capabilities, we now have powerful resources for developing biologically more realistic models of the brain. In this work, we implemented a two-dimensional network model of the primate retina that uses conductance-based neurons. The model aims to provide neuroscientists who work on visual areas beyond the retina with a realistic retinal model whose parameters have been carefully tuned to data from the primate fovea and whose response at every stage adequately reproduces neuronal behavior. We exhaustively benchmarked the model against well-established visual stimuli, showing spatial and temporal responses of the model neurons to light flashes, which can be disk- or ring-shaped, and to sine-wave gratings of varying spatial frequency. The model describes the red-green and blue-yellow color opponency of retinal cells that connect to parvocellular and koniocellular cells in the Lateral Geniculate Nucleus (LGN), respectively. The model was implemented in the widely used neural simulation tool NEST, and the code has been released as open-source software.
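The benchmark stimuli mentioned above (disk- and ring-shaped flashes and sine-wave gratings) can be generated as simple two-dimensional luminance maps. The sketch below is an illustrative reconstruction, not the paper's actual stimulus code; all function names and parameter conventions (e.g., spatial frequency in cycles per image) are assumptions for this example.

```python
import numpy as np

def disk_flash(size, radius, center=None):
    """Binary disk-shaped light flash on a square image grid (hypothetical helper)."""
    if center is None:
        center = (size // 2, size // 2)
    y, x = np.ogrid[:size, :size]
    return ((x - center[1]) ** 2 + (y - center[0]) ** 2 <= radius ** 2).astype(float)

def ring_flash(size, r_inner, r_outer, center=None):
    """Ring (annulus) flash: lit only between the inner and outer radii."""
    if center is None:
        center = (size // 2, size // 2)
    y, x = np.ogrid[:size, :size]
    d2 = (x - center[1]) ** 2 + (y - center[0]) ** 2
    return ((d2 >= r_inner ** 2) & (d2 <= r_outer ** 2)).astype(float)

def sine_grating(size, spatial_freq, phase=0.0, contrast=1.0):
    """Horizontal sine-wave grating; spatial_freq is in cycles per image width.

    Luminance is normalized to [0, 1] around a mean of 0.5.
    """
    x = np.arange(size)
    row = 0.5 + 0.5 * contrast * np.sin(2 * np.pi * spatial_freq * x / size + phase)
    return np.tile(row, (size, 1))
```

Such maps would then be fed, frame by frame, to the photoreceptor stage of a retinal network; varying `spatial_freq` while recording the model's response is the standard way to probe its spatial frequency tuning.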
Citation:
Martínez-Cañada, P., Morillas, C., & Pelayo, F. (2017). A conductance-based neuronal network model for color coding in the primate foveal retina. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 10337 LNCS, pp. 63–74). Springer Verlag. https://doi.org/10.1007/978-3-319-59740-9_7