A Novel Multi-vision Sensor Dataset for Insect-Inspired Outdoor Autonomous Navigation

Abstract

Insects have, over millions of years of evolution, perfected many of the capabilities that roboticists strive to replicate: they can swiftly and robustly navigate through diverse environments under varying conditions while remaining highly energy efficient. To reach this level of performance and efficiency, it is natural to study, and take inspiration from, how insects achieve these feats. Currently, no dataset exists that allows bio-inspired navigation models to be evaluated over long (>100 m) real-life routes. We present a novel dataset containing omnidirectional event-based vision, frame-based vision, depth frames, inertial measurement unit (IMU) readings, and centimeter-accurate GNSS positioning over kilometer-long stretches in and around the TU Delft campus. The dataset is used to evaluate familiarity-based insect-inspired neural navigation models on their performance over longer sequences. It demonstrates that current scene-familiarity models are not suited for long-range navigation, at least not in their current form.
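For readers unfamiliar with the class of models evaluated here, the sketch below illustrates the basic idea behind a "perfect memory" scene-familiarity model from the insect-navigation literature: store views along a training route, then steer toward the heading whose view looks most familiar. This is a minimal illustrative sketch, not the authors' evaluation code; it assumes grayscale panoramic views as 2-D NumPy arrays, and all function names are hypothetical.

    import numpy as np

    def familiarity(view: np.ndarray, memory: np.ndarray) -> float:
        # memory: (N, H, W) stack of training views; view: (H, W) test view.
        # Familiarity is the negative of the smallest pixel-wise RMS
        # difference between the current view and any stored view.
        diffs = np.sqrt(((memory - view) ** 2).mean(axis=(1, 2)))
        return -float(diffs.min())

    def best_heading(pano: np.ndarray, memory: np.ndarray,
                     n_headings: int = 36) -> float:
        # Rotational scan: shift an omnidirectional panorama (H, W) in
        # azimuth and return the heading (radians) that maximizes familiarity.
        h, w = pano.shape
        shifts = np.linspace(0, w, n_headings, endpoint=False).astype(int)
        scores = [familiarity(np.roll(pano, -s, axis=1), memory)
                  for s in shifts]
        return 2 * np.pi * shifts[int(np.argmax(scores))] / w

One plausible reason such models degrade over long routes, consistent with the abstract's finding, is that the stored memory grows with route length while visually similar scenes become harder to disambiguate, so familiarity responses alias between distant route segments.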

Citation (APA)

Verheyen, J. K. N., Dupeyroux, J., & de Croon, G. C. H. E. (2022). A Novel Multi-vision Sensor Dataset for Insect-Inspired Outdoor Autonomous Navigation. In Lecture Notes in Computer Science (Vol. 13548 LNAI, pp. 279–291). Springer. https://doi.org/10.1007/978-3-031-20470-8_28
