Unsupervised learning of sensory primitives from optical flow fields

Abstract

Adaptive behaviour of animats largely depends on the processing of their sensory information. In this paper, we examine the estimation of robot egomotion from visual input by unsupervised online learning. The input is a sparse optical flow field constructed from discrete motion detectors. The global flow field properties depend on the robot motion, the spatial distribution of motion detectors with respect to the robot body and the visual environment. We show how online linear Principal Component Analysis can be applied to this problem to enable a robot to continuously adapt to a changing environment. © 2014 Springer International Publishing Switzerland.
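The abstract describes extracting egomotion-related structure from a sparse optical flow field with online linear PCA. The sketch below illustrates that general idea only; it is not the authors' implementation. It uses Sanger's generalized Hebbian algorithm as one standard online PCA rule, and the detector count, learning rate, and the simulated flow generator (a random mixing of three latent egomotion parameters) are all illustrative assumptions.

```python
# Minimal sketch: online linear PCA (Sanger's generalized Hebbian algorithm)
# applied to simulated sparse optical-flow vectors. All constants and the
# flow simulator are assumptions made for illustration.
import numpy as np

rng = np.random.default_rng(0)

N_DETECTORS = 32        # assumed number of discrete motion detectors
DIM = 2 * N_DETECTORS   # each detector contributes a 2-D flow vector
N_COMPONENTS = 3        # number of flow "primitives" to extract
ETA = 1e-3              # learning rate

# A fixed random mixing matrix stands in for detector geometry and the visual
# environment: three latent egomotion parameters (e.g. forward speed, yaw,
# lateral drift) drive the whole flow field.
MIX = rng.normal(size=(DIM, 3))

def flow_sample():
    """Simulate one sparse optical-flow measurement with sensor noise."""
    egomotion = rng.normal(size=3)
    return MIX @ egomotion + 0.1 * rng.normal(size=DIM)

W = 0.01 * rng.normal(size=(N_COMPONENTS, DIM))  # rows converge to principal components
mean = np.zeros(DIM)

for t in range(1, 20001):
    x = flow_sample()
    mean += (x - mean) / t      # running mean, so inputs are approximately centered
    xc = x - mean
    y = W @ xc                  # projection onto the current components
    # Sanger's rule: Hebbian update, deflated by previously learned components.
    W += ETA * (np.outer(y, xc) - np.tril(np.outer(y, y)) @ W)

print("learned component norms:", np.linalg.norm(W, axis=1))
```

Because the update runs one sample at a time, the learned components can track slow changes in the flow statistics, which is the property that lets a robot keep adapting as its environment changes.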

Citation (APA)

Berthold, O., & Hafner, V. V. (2014). Unsupervised learning of sensory primitives from optical flow fields. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8575 LNAI, pp. 188–197). Springer Verlag. https://doi.org/10.1007/978-3-319-08864-8_18
