Event-Based Moving Object Detection and Tracking

Abstract

Event-based vision sensors, such as the Dynamic Vision Sensor (DVS), are ideally suited for real-time motion analysis. The unique properties of such sensors' readings provide high temporal resolution, superior sensitivity to light, and low latency. These properties provide the grounds to estimate motion efficiently and reliably even in the most sophisticated scenarios, but these advantages come at a price: modern event-based vision sensors have extremely low resolution, produce significant noise, and require the development of novel algorithms to handle the asynchronous event stream. This paper presents a new, efficient approach to object tracking with asynchronous cameras. We present a novel event stream representation which enables us to utilize information about the dynamic (temporal) component of the event stream. The 3D geometry of the event stream is approximated with a parametric model to motion-compensate for the camera (without feature tracking or explicit optical flow computation), and then moving objects that do not conform to the model are detected in an iterative process. We demonstrate our framework on the task of independent motion detection and tracking, where we use the temporal model inconsistencies to locate differently moving objects in challenging situations of very fast motion.
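The core idea — warping the (x, y, t) event cloud under a global parametric motion model and scoring how well the warp "deblurs" the stream in time — can be illustrated with a minimal sketch. This is not the authors' implementation: the global-flow model, the per-pixel average-timestamp image, the variance cost, and the grid search over candidate velocities are all simplifying assumptions chosen for brevity; the paper fits a richer parametric model and detects objects iteratively from the residual inconsistencies.

```python
import numpy as np

def warp_events(events, vx, vy, t_ref=0.0):
    """Warp events (columns x, y, t) back to t_ref under a global-flow
    model (vx, vy) — an assumed stand-in for the paper's parametric model."""
    x, y, t = events[:, 0], events[:, 1], events[:, 2]
    return x - vx * (t - t_ref), y - vy * (t - t_ref), t

def time_image(xs, ys, ts, shape):
    """Per-pixel average timestamp: a simple temporal representation of
    the event stream. Returns the image and the per-pixel event count."""
    img = np.zeros(shape)
    cnt = np.zeros(shape)
    xi = np.clip(np.round(xs).astype(int), 0, shape[1] - 1)
    yi = np.clip(np.round(ys).astype(int), 0, shape[0] - 1)
    np.add.at(img, (yi, xi), ts)   # unbuffered accumulation per pixel
    np.add.at(cnt, (yi, xi), 1.0)
    avg = np.divide(img, cnt, out=np.zeros(shape), where=cnt > 0)
    return avg, cnt

def fit_global_flow(events, shape, candidates):
    """Pick the candidate flow that best motion-compensates the camera:
    when the warp is right, events from one scene point collapse onto one
    pixel, so the spread of per-pixel mean timestamps is minimal."""
    best, best_cost = None, np.inf
    for vx, vy in candidates:
        xs, ys, ts = warp_events(events, vx, vy)
        avg, cnt = time_image(xs, ys, ts, shape)
        cost = np.var(avg[cnt > 0])
        if cost < best_cost:
            best, best_cost = (vx, vy), cost
    return best
```

After compensation, pixels whose timestamps remain inconsistent with the fitted model are candidates for independently moving objects — in the paper this residual check drives the iterative detection step.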

Citation (APA)

Mitrokhin, A., Fermuller, C., Parameshwara, C., & Aloimonos, Y. (2018). Event-Based Moving Object Detection and Tracking. In IEEE International Conference on Intelligent Robots and Systems (pp. 6895–6902). Institute of Electrical and Electronics Engineers Inc. https://doi.org/10.1109/IROS.2018.8593805
