Real-time visual odometry covariance estimation for unmanned air vehicle navigation


Abstract

Demand is growing for unmanned air vehicles (UAVs) with greater autonomy, including the ability to navigate without GPS information, such as indoors. In this work, a novel visual odometry algorithm is developed and flight tested. It uses sequential pairs of red-green-blue-depth (RGBD) camera images to estimate the UAV's change in position (delta pose), which can be used to aid a navigation filter. Unlike existing related techniques, it uses a novel perturbation approach to estimate the uncertainty of the odometry measurement dynamically in real time, a technique that is applicable to a wide range of sensor preprocessing tasks aimed at generating navigation-relevant measurements. Real-time estimates of the delta pose and its covariance allow these estimates to be efficiently fused with other sensors in a navigation filter. Indoor flight testing was performed with motion capture, which demonstrated that the odometry and covariance estimates are accurate when appropriately scaled. Flights also demonstrated the algorithm used in a navigation filter to improve a velocity estimate, which represents a significant improvement over the state of the art for RGBD odometry.
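The abstract describes the perturbation approach only at a high level: perturb the inputs to the odometry solver, re-solve, and take the sample statistics of the resulting delta-pose estimates as the measurement covariance. The sketch below illustrates that general idea only; the `estimate_delta_pose` solver (a toy translation-only estimate from matched 3D points), the noise level `sigma`, and the sample count are hypothetical stand-ins, not the authors' actual algorithm or parameters.

```python
import numpy as np

def estimate_delta_pose(points_a, points_b):
    """Toy odometry solver: translation as the mean displacement
    between matched 3D point sets (stand-in for a real RGBD solver)."""
    return (points_b - points_a).mean(axis=0)

def perturbation_covariance(points_a, points_b, sigma=0.01, n_samples=30, rng=None):
    """Estimate a delta-pose measurement and its covariance by
    perturbing the solver inputs and re-solving (Monte Carlo style).

    sigma      -- assumed std. dev. of per-point measurement noise (meters)
    n_samples  -- number of perturbed re-solves
    Returns (mean_delta_pose, 3x3 sample covariance).
    """
    rng = np.random.default_rng(0) if rng is None else rng
    samples = []
    for _ in range(n_samples):
        # Perturb both point sets with synthetic sensor noise and re-solve.
        noisy_a = points_a + rng.normal(0.0, sigma, points_a.shape)
        noisy_b = points_b + rng.normal(0.0, sigma, points_b.shape)
        samples.append(estimate_delta_pose(noisy_a, noisy_b))
    samples = np.asarray(samples)
    # Sample mean is the fused measurement; sample covariance is its
    # uncertainty, ready to hand to a navigation filter.
    return samples.mean(axis=0), np.cov(samples, rowvar=False)
```

A navigation filter (e.g., an EKF) would then use the returned mean as the delta-pose measurement and the returned matrix as its measurement noise covariance, rather than a hand-tuned constant.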

Citation (APA)
Anderson, M. L., Brink, K. M., & Willis, A. R. (2019). Real-time visual odometry covariance estimation for unmanned air vehicle navigation. Journal of Guidance, Control, and Dynamics, 42(6), 1272–1288. https://doi.org/10.2514/1.G004000
