Looking is one of the most basic goal-directed behaviors. The neural circuitry that generates gaze shifts toward target objects is adaptive and compensates for changes in the sensorimotor plant. Here, we present a neural-dynamic architecture that enables an embodied agent to direct its gaze toward salient objects in its environment. The sensorimotor mapping needed to plan gaze shifts accurately is learned initially and then continually updated by a gain adaptation mechanism. We implemented the architecture in a simulated robotic agent and demonstrated autonomous map learning and adaptation in an embodied setting. © 2014 Springer International Publishing Switzerland.
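The gain adaptation idea mentioned in the abstract can be illustrated with a minimal sketch: a gaze shift is planned as a gain times the target's retinal eccentricity, and the residual post-saccadic error is used to correct the gain. All names, the delta-rule update, and the learning rate here are illustrative assumptions, not details taken from the paper's architecture.

```python
# Minimal sketch of saccadic gain adaptation (delta-rule style).
# The update rule and learning rate are illustrative assumptions,
# not the mechanism described in the paper.

def adapt_gain(gain, target_eccentricity, learning_rate=0.1):
    """Execute one simulated gaze shift and update the gain from the residual error."""
    motor_command = gain * target_eccentricity           # planned saccade amplitude
    landing_error = target_eccentricity - motor_command  # post-saccadic error
    # Correct the gain in proportion to the normalized landing error.
    gain += learning_rate * landing_error / target_eccentricity
    return gain, landing_error

# Start with an undershooting gain and adapt over repeated gaze shifts.
gain = 0.7
for _ in range(50):
    gain, err = adapt_gain(gain, target_eccentricity=10.0)

print(round(gain, 3))  # gain converges toward 1.0 (accurate shifts)
```

Repeated shifts drive the gain toward 1.0, mimicking how the adaptation mechanism compensates for a mismatch between the planned command and the actual movement of the plant.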
CITATION STYLE
Bell, C., Storck, T., & Sandamirskaya, Y. (2014). Learning to look: A dynamic neural fields architecture for gaze shift generation. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8681 LNCS, pp. 699–706). Springer Verlag. https://doi.org/10.1007/978-3-319-11179-7_88