Lightweight visual odometry for autonomous mobile robots

Citations: 32 · Mendeley readers: 65

Abstract

Vision-based motion estimation is an effective means of mobile robot localization and is often used in conjunction with other sensors for navigation and path planning. This paper presents a low-overhead, real-time ego-motion estimation (visual odometry) system based on either a stereo or an RGB-D sensor. The algorithm outperforms typical frame-to-frame approaches in accuracy by maintaining a limited local map, while requiring significantly less memory and computational power than the global maps common in full visual SLAM methods. The algorithm is evaluated on publicly available datasets spanning different use cases, and its performance is compared to that of comparable open-source systems in terms of accuracy, frame rate, and memory requirements. This paper accompanies the release of the source code as a modular software package for the robotics community, compatible with the Robot Operating System (ROS).
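The key idea in the abstract, keeping a bounded local map rather than a growing global one, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the class name, the 1-D pose, and the externally supplied motion estimate are all hypothetical stand-ins (a real system would estimate motion by matching image features against the local map).

```python
from collections import deque

class LocalMapVO:
    """Sketch of bounded-memory visual odometry: only a sliding window
    of recent keyframes is retained, so memory use stays constant
    regardless of trajectory length (illustrative, not the paper's API)."""

    def __init__(self, window_size=5):
        # Bounded local map: appending beyond maxlen evicts the oldest entry.
        self.local_map = deque(maxlen=window_size)
        self.pose = 0.0  # 1-D stand-in for a full SE(3) camera pose

    def process_frame(self, motion_estimate):
        # Accumulate ego-motion and store the new keyframe pose.
        self.pose += motion_estimate
        self.local_map.append(self.pose)
        return self.pose

vo = LocalMapVO(window_size=3)
for step in [1.0, 0.5, 0.25, 0.25]:
    vo.process_frame(step)
# The local map now holds only the 3 most recent keyframe poses,
# unlike a full-SLAM global map, which would keep all of them.
```

The `deque(maxlen=...)` eviction is what distinguishes this from a global-map approach: the cost per frame is constant, at the price of being unable to close large loops.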

Citation (APA)

Aladem, M., & Rawashdeh, S. A. (2018). Lightweight visual odometry for autonomous mobile robots. Sensors (Switzerland), 18(9). https://doi.org/10.3390/s18092837
