Computational models of visual processing often use frame-based image acquisition to process temporally changing stimuli. This approach differs from biological mechanisms, which are spike-based and independent of individual frames. The neuromorphic Dynamic Vision Sensor (DVS) [Lichtsteiner et al., 2008] provides a stream of independent visual events that signal local illumination changes, resembling spiking neurons at the retinal level. We introduce a new approach to modelling cortical mechanisms of motion detection along the dorsal pathway using this type of representation. Our model combines filters with spatio-temporal tunings also found in visual cortex to yield spatio-temporal and direction specificity. We probe the model with recordings of test stimuli, articulated motion, and ego-motion. We show that our approach robustly estimates optic flow and demonstrate how this output can be used for classification purposes.
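The abstract describes processing a DVS address-event stream with direction-tuned spatio-temporal filters. The sketch below is not the authors' model; it is a minimal illustration of the event-based idea, assuming events are `(x, y, t)` tuples. It builds an exponentially decaying time surface from the most recent event per pixel and correlates its spatial gradient with a bank of preferred directions, a crude stand-in for the cortical spatio-temporal filters the paper uses.

```python
import numpy as np

def time_surface(events, shape, t_now, tau=0.05):
    """Exponentially decaying surface of most-recent event times.

    events: iterable of (x, y, t) tuples, e.g. a DVS address-event stream.
    Pixels that never fired decay to 0.
    """
    last_t = np.full(shape, -np.inf)
    for x, y, t in events:
        last_t[y, x] = max(last_t[y, x], t)
    return np.exp(-(t_now - last_t) / tau)

def direction_responses(surface, n_dirs=8):
    """Correlate the time surface's spatial gradient with n_dirs
    preferred directions; recent events lie ahead along the motion
    direction, so the gradient points the way the stimulus moves."""
    gy, gx = np.gradient(surface)
    angles = np.linspace(0.0, 2 * np.pi, n_dirs, endpoint=False)
    return {a: float(np.sum(np.cos(a) * gx + np.sin(a) * gy))
            for a in angles}

# A vertical edge sweeping rightward: later events at larger x.
events = [(x, y, 0.01 * x) for x in range(10) for y in range(10)]
surface = time_surface(events, (10, 10), t_now=0.1)
resp = direction_responses(surface)
best = max(resp, key=resp.get)  # preferred angle 0.0 = rightward motion
```

Running the toy stimulus, the rightward-tuned unit (angle 0) wins, matching the direction of the simulated edge; per-pixel flow, as in the paper, would require local rather than global pooling of the filter responses.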
Tschechne, S., Sailer, R., & Neumann, H. (2014). Bio-inspired optic flow from event-based neuromorphic sensor input. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8774, pp. 171–182). Springer Verlag. https://doi.org/10.1007/978-3-319-11656-3_16