Event-Based Line Fitting and Segment Detection Using a Neuromorphic Visual Sensor

Citations: 23 · Mendeley readers: 46

Abstract

This paper introduces an event-based, luminance-free algorithm for line and segment detection from the output of asynchronous event-based neuromorphic retinas. These recent biomimetic vision sensors are composed of autonomous pixels, each asynchronously generating visual events that encode relative changes in illumination at high temporal resolution. This frame-free approach results in increased energy efficiency and real-time operation, making these sensors especially suitable for applications such as autonomous robotics. The proposed algorithm is based on an iterative event-based weighted least squares fit and is consequently well suited to the high temporal resolution and asynchronous acquisition of neuromorphic cameras: the parameters of a current line are updated for each event attributed to it (i.e., spatio-temporally close to it), while the contribution of older events is implicitly forgotten according to a speed-tuned exponentially decaying function. A detection occurs if a measure of activity, i.e., an implicit count of the contributing events computed with the same decay function, exceeds a given threshold. The speed-tuned decay is based on a measure of the apparent motion, i.e., the optical flow computed around each event, which ensures that the algorithm behaves independently of the edges' dynamics. Line segments are then extracted from the lines, allowing the corresponding endpoints to be tracked. We provide experiments showing the accuracy of the algorithm and study the influence of the apparent velocity and relative orientation of the observed edges. Finally, evaluations of its computational efficiency show that the algorithm can be envisioned for high-speed applications, such as vision-based robotic navigation.
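As a rough illustration of the incremental, exponentially decaying weighted least squares fit described in the abstract, the sketch below maintains decayed spatial moments per event and recovers the line as the principal axis of the weighted event cloud (a total-least-squares fit, which handles all orientations including vertical edges). The class name, the fixed decay constant `tau`, the unit event weights, and the toy data are illustrative assumptions, not the authors' implementation; in the paper the decay is tuned from the optical flow measured around each event.

```python
import numpy as np


class EventLineFitter:
    """Minimal sketch of an event-driven, exponentially weighted line fit.

    Running weighted moments are decayed by exp(-dt / tau) at every new
    event, so older events are forgotten implicitly. Here `tau` is a fixed
    assumption; the paper derives it from the local apparent speed
    (roughly tau ~ length_scale / optical_flow_speed).
    """

    def __init__(self, tau, activity_threshold):
        self.tau = tau                              # decay time constant (s)
        self.activity_threshold = activity_threshold
        self.t_last = None
        # Decayed moments: sum of weights, first- and second-order sums.
        self.s0 = 0.0
        self.sx = self.sy = 0.0
        self.sxx = self.sxy = self.syy = 0.0

    def update(self, x, y, t):
        """Attribute one event (x, y, t) to the line and refresh its fit."""
        if self.t_last is not None:
            decay = np.exp(-(t - self.t_last) / self.tau)
            self.s0 *= decay
            self.sx *= decay; self.sy *= decay
            self.sxx *= decay; self.sxy *= decay; self.syy *= decay
        self.t_last = t
        # Add the new event with unit weight.
        self.s0 += 1.0
        self.sx += x; self.sy += y
        self.sxx += x * x; self.sxy += x * y; self.syy += y * y

    @property
    def activity(self):
        """Implicit (decayed) count of contributing events."""
        return self.s0

    def line(self):
        """Return (cx, cy, theta): a point on the line and its orientation.

        Uses a total-least-squares fit of the decayed moments; returns None
        until the activity exceeds the detection threshold.
        """
        if self.s0 < self.activity_threshold:
            return None
        cx, cy = self.sx / self.s0, self.sy / self.s0
        # Weighted covariance of the event cloud around its centroid.
        cxy = self.sxy / self.s0 - cx * cy
        cov = np.array([[self.sxx / self.s0 - cx * cx, cxy],
                        [cxy, self.syy / self.s0 - cy * cy]])
        eigvals, eigvecs = np.linalg.eigh(cov)
        direction = eigvecs[:, np.argmax(eigvals)]   # principal axis
        theta = np.arctan2(direction[1], direction[0])
        return cx, cy, theta


# Toy usage: noisy events drawn along a slanted edge, 0.1 ms apart.
fitter = EventLineFitter(tau=5e-3, activity_threshold=10)
rng = np.random.default_rng(0)
for k in range(200):
    s = rng.uniform(0, 50)
    x, y = 10 + s, 20 + 0.5 * s + rng.normal(0, 0.3)
    fitter.update(x, y, t=k * 1e-4)
print(fitter.line())
```

The principal-axis formulation is one convenient way to realize an event-wise weighted least squares update; any equivalent recursive estimator over the same decayed moments would serve the same illustrative purpose.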




Citation (APA)

Reverter Valeiras, D., Clady, X., Ieng, S. H., & Benosman, R. (2019). Event-Based Line Fitting and Segment Detection Using a Neuromorphic Visual Sensor. IEEE Transactions on Neural Networks and Learning Systems, 30(4), 1218–1230. https://doi.org/10.1109/TNNLS.2018.2807983

Readers' Seniority

PhD / Post grad / Masters / Doc: 16 (62%)
Researcher: 6 (23%)
Professor / Associate Prof.: 3 (12%)
Lecturer / Post doc: 1 (4%)

Readers' Discipline

Engineering: 13 (52%)
Computer Science: 9 (36%)
Neuroscience: 2 (8%)
Chemistry: 1 (4%)

Article Metrics

Blog Mentions: 1
