Abstract
This paper presents a vision/LiDAR integrated navigation system that provides accurate relative navigation performance over general ground surfaces in GNSS-denied environments. The ground surface overflown during flight is approximated by a piecewise continuous model composed of flat and sloped surface profiles. The system consists of a strapdown IMU and an aiding sensor block comprising a vision sensor and a LiDAR mounted on a stabilized gimbal platform. Two-dimensional optical flow vectors from the vision sensor and the LiDAR range to the ground are used to overcome the performance limit of a tactical-grade inertial navigation solution without GNSS signals. The filter employs the INS error model, with a measurement vector containing two-dimensional velocity errors and one differenced altitude in the navigation frame. In computing the altitude difference, the ground slope angle is estimated in a novel way from two bisectional LiDAR signals, under a practical assumption representing a general ground profile. The overall integrated system is implemented within an extended Kalman filter framework, and its performance is demonstrated through a simulation study with an aircraft flight trajectory scenario. © The Korean Society for Aeronautical & Space Sciences.
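The slope-estimation idea can be illustrated with simple planar geometry. The sketch below is an assumption-laden reconstruction, not the paper's actual formulation: it supposes the two bisectional LiDAR beams lie in the vertical plane of motion at known angles from the local vertical, and recovers the slope angle of the line through the two ground intersection points. The function name and beam geometry are hypothetical.

```python
import math

def ground_slope_from_two_beams(r1, r2, theta1, theta2):
    """Estimate the local ground slope angle from two LiDAR ranges.

    Hypothetical geometry (not from the paper): both beams lie in the
    vertical plane of motion, at angles theta1 and theta2 (rad) from
    the local vertical. Returns the slope angle (rad) of the line
    through the two ground intersection points, positive for terrain
    rising in the direction of increasing beam angle.
    """
    # Ground intersection points in a level body-carried frame:
    # x is horizontal offset along track, z is positive-down depth.
    x1, z1 = r1 * math.sin(theta1), r1 * math.cos(theta1)
    x2, z2 = r2 * math.sin(theta2), r2 * math.cos(theta2)
    # Rise over run between the two ground points; a shallower
    # return on the more forward beam implies rising terrain.
    return math.atan2(z1 - z2, x2 - x1)
```

On flat ground both points share the same depth and the function returns zero; on a uniform slope it recovers the slope angle exactly, which is consistent with the paper's piecewise flat/sloped ground model.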
Yun, S., Lee, Y. J., Kim, C. J., & Sung, S. (2014). Integrated navigation design using a gimbaled vision/liDAR system with an approximate ground description model. International Journal of Aeronautical and Space Sciences, 14(4), 369–378. https://doi.org/10.5139/IJASS.2013.14.4.369