Biologically inspired event-driven silicon retinas, so-called dynamic vision sensors (DVS), enable efficient solutions for various visual perception tasks, e.g. surveillance, tracking, or motion detection. Similar to retinal photoreceptors, any perceived change in light intensity generates an event at the corresponding DVS pixel. The DVS thereby emits a stream of spatiotemporal events that encodes visually perceived objects and, in contrast to the output of conventional frame-based cameras, is largely free of redundant background information. The DVS offers multiple additional advantages, but requires the development of radically new asynchronous, event-based information processing algorithms. In this paper we present a fully event-based disparity matching algorithm for reliable 3D depth perception using a dynamic cooperative neural network. The interaction between cooperative cells applies cross-disparity uniqueness constraints and within-disparity continuity constraints to asynchronously extract the disparity for each new event, without any need to buffer individual events. We have investigated the algorithm's performance in several experiments; our results demonstrate smooth disparity maps computed in a purely event-based manner, even in scenes with temporally overlapping stimuli.
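The per-event cooperative update described above can be made concrete with a small sketch. The following Python code is not the authors' implementation; it is a minimal illustration, assuming a Marr–Poggio-style cooperative cell volume, of how a single left-sensor event could trigger a purely local update combining within-disparity excitation (continuity constraint) and cross-disparity inhibition (uniqueness constraint). All array sizes, gains, and the temporal matching window are hypothetical choices for illustration.

```python
# Minimal sketch (not the paper's reference code) of an event-driven
# cooperative stereo update: each incoming left-sensor event is matched
# against recent right-sensor events along the epipolar line, and a
# volume of cooperative cells C[y, x, d] is updated locally. Only the
# cells touched by the event are modified, so no events are buffered.
import numpy as np

WIDTH, HEIGHT, MAX_D = 64, 48, 16    # assumed sensor and disparity ranges
TAU = 20e-3                          # assumed temporal matching window (s)
ALPHA_EXC, BETA_INH = 0.2, 0.4       # assumed excitation/inhibition gains

C = np.zeros((HEIGHT, WIDTH, MAX_D))            # cooperative cell activities
last_right = np.full((HEIGHT, WIDTH), -np.inf)  # timestamp of last right event

def on_right_event(x, y, t):
    """Record a right-sensor event; only its timestamp is needed here."""
    last_right[y, x] = t

def on_left_event(x, y, t):
    """Update the cooperative network for one left-sensor event and
    return the current disparity estimate at (x, y), or None."""
    # 1) Temporal matching evidence for each candidate disparity d:
    #    a recent right event at (x - d, y) is a plausible match.
    d_max = min(MAX_D, x + 1)
    dt = t - last_right[y, x - np.arange(d_max)]
    score = np.exp(-dt / TAU)                    # recency-weighted evidence

    # 2) Within-disparity excitation (continuity constraint):
    #    support from spatially neighbouring cells at the same disparity.
    y0, y1 = max(0, y - 1), min(HEIGHT, y + 2)
    x0, x1 = max(0, x - 1), min(WIDTH, x + 2)
    support = C[y0:y1, x0:x1, :d_max].mean(axis=(0, 1))

    # 3) Cross-disparity inhibition (uniqueness constraint):
    #    each cell is suppressed by rival disparities at the same pixel.
    rivals = C[y, x, :d_max].sum() - C[y, x, :d_max]

    # Asynchronous update of the cells touched by this event only.
    C[y, x, :d_max] += score + ALPHA_EXC * support - BETA_INH * rivals
    np.clip(C[y, x, :d_max], 0.0, 1.0, out=C[y, x, :d_max])

    d = int(np.argmax(C[y, x, :d_max]))
    return d if C[y, x, d] > 0 else None

# Example: a right event followed 5 ms later by a left event 3 px away.
on_right_event(10, 20, 0.000)
print(on_left_event(13, 20, 0.005))   # likely reports disparity 3
```

In this sketch the uniqueness and continuity constraints act as opposing forces on the cell volume, so repeated events let a single consistent disparity surface win over spurious matches; the actual network dynamics in the paper may of course differ in detail.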
Firouzi, M., & Conradt, J. (2016). Asynchronous Event-based Cooperative Stereo Matching Using Neuromorphic Silicon Retinas. Neural Processing Letters, 43(2), 311–326. https://doi.org/10.1007/s11063-015-9434-5