In this paper, we propose a bio-inspired and developmental neural model that allows a robot, after learning its own dynamics during a babbling phase, to gain imitative and shape-recognition abilities leading to early attempts at physical and social interactions. We use a motor controller based on oscillators. During the babbling step, the robot learns to associate its motor primitives (oscillators) with the visual optical flow induced by its own arm. It also learns to recognize its arm statically by selecting moving local views (feature points) in the visual field. In real indoor experiments we demonstrate that, using the same model, early physical (reaching objects) and social (immediate imitation) interactions can emerge through visual ambiguities induced by external visual stimuli. © 2014 Springer International Publishing Switzerland.
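The sensorimotor association described above, where an oscillator-driven babbling motion is linked to the optical flow it induces, can be illustrated with a minimal sketch. This is not the authors' model: the sinusoidal primitive, the flow gain, and the delta-rule (LMS) update below are all illustrative assumptions.

```python
import math

def babble(steps=1000, dt=0.01, freq=1.0, lr=0.05):
    """Hypothetical sketch: learn the mapping from an oscillator motor
    primitive to the optical flow its own arm motion induces."""
    w = 0.0  # learned association: oscillator velocity -> predicted flow
    for t in range(steps):
        phase = 2 * math.pi * freq * t * dt
        velocity = math.cos(phase)          # derivative of sin(phase): arm motion
        flow = 0.8 * velocity               # simulated self-induced optical flow
        pred = w * velocity                 # current flow prediction
        w += lr * (flow - pred) * velocity  # delta-rule (LMS) weight update
    return w

print(round(babble(), 2))  # converges toward the assumed flow gain, ~0.8
```

After babbling, such a learned mapping lets the system predict the flow its own motion should cause; flow that the prediction cannot account for (an ambiguity from an external stimulus) is the kind of signal the paper exploits to trigger reaching and immediate imitation.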
CITATION STYLE
Braud, R., Mostafaoui, G., Karaouzene, A., & Gaussier, P. (2014). Simulating the emergence of early physical and social interactions: A developmental route through low level visuomotor learning. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8575 LNAI, pp. 154–165). Springer Verlag. https://doi.org/10.1007/978-3-319-08864-8_15