Combined visual and inertial navigation for an unmanned aerial vehicle

Abstract

We describe a UAV navigation system which combines stereo visual odometry with inertial measurements from an IMU. Our approach fuses the motion estimates from both sensors in an extended Kalman filter to determine vehicle position and attitude. We present results using data from a robotic helicopter, in which the visual and inertial system produced a final position estimate within 1% of the measured GPS position, over a flight distance of more than 400 meters. Our results show that the combination of visual and inertial sensing reduced overall positioning error by nearly an order of magnitude compared to visual odometry alone. © 2008 Springer-Verlag Berlin Heidelberg.
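The abstract summarizes the architecture (stereo visual odometry fused with IMU data in an extended Kalman filter) without giving the filter equations. As an illustration only, the Python/NumPy sketch below shows one common way a loosely coupled fusion step of this kind can be structured: the IMU drives the prediction step and a visual-odometry position estimate serves as the measurement update. The class name, the six-dimensional position/velocity state, and the noise values are assumptions made for this sketch; the paper's filter also estimates attitude, which is omitted here for brevity, and this is not the authors' implementation.

# Minimal loosely coupled EKF sketch (illustrative only, not the paper's filter).
# State x = [p (3), v (3)]; attitude and IMU biases are omitted for brevity.
import numpy as np

class SimpleVioEkf:
    def __init__(self, accel_noise=0.5, vo_noise=0.1):
        self.x = np.zeros(6)                    # [px, py, pz, vx, vy, vz]
        self.P = np.eye(6) * 1e-3               # state covariance
        self.q_a = accel_noise ** 2             # assumed accelerometer noise variance
        self.r_vo = vo_noise ** 2               # assumed visual-odometry position noise variance

    def predict(self, accel_world, dt):
        """Propagate with a world-frame, gravity-compensated acceleration (assumption)."""
        F = np.eye(6)
        F[0:3, 3:6] = np.eye(3) * dt            # position integrates velocity
        B = np.vstack([0.5 * dt**2 * np.eye(3), dt * np.eye(3)])
        self.x = F @ self.x + B @ accel_world
        Q = B @ B.T * self.q_a                  # process noise driven by accelerometer noise
        self.P = F @ self.P @ F.T + Q

    def update_vo_position(self, p_meas):
        """Correct with a position estimate derived from stereo visual odometry."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        R = np.eye(3) * self.r_vo
        y = p_meas - H @ self.x                 # innovation
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(6) - K @ H) @ self.P

# Typical usage: high-rate IMU prediction, lower-rate visual-odometry correction.
ekf = SimpleVioEkf()
ekf.predict(np.array([0.1, 0.0, 0.0]), dt=0.01)       # IMU step, e.g. 100 Hz
ekf.update_vo_position(np.array([0.001, 0.0, 0.0]))   # slower stereo VO update

In this sketch the prediction and update rates are decoupled, which matches the general idea of fusing a fast inertial stream with slower vision-based motion estimates; the actual state parameterization and noise models in the paper are not reproduced here.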

Cite (APA)

Kelly, J., Saripalli, S., & Sukhatme, G. S. (2008). Combined visual and inertial navigation for an unmanned aerial vehicle. In Springer Tracts in Advanced Robotics (Vol. 42, pp. 255–264). https://doi.org/10.1007/978-3-540-75404-6_24
