Photometric Visual-Inertial Navigation with Uncertainty-Aware Ensembles


Abstract

In this article, we propose a visual-inertial navigation system that directly minimizes a photometric error without explicit data association. We focus on the photometric error parametrized by pose and structure parameters, which is highly nonconvex due to the nonlinearity of image intensity. The key idea is to introduce an optimal intensity gradient that accounts for the projective uncertainty of a pixel. Ensembles sampled from the state uncertainty contribute to the proposed gradient and yield a correct update direction even from a poor initialization. We present two sets of experiments to demonstrate the strengths of our framework. First, a thorough Monte Carlo simulation on a virtual trajectory is designed to reveal robustness to large initial uncertainty. Second, we show that the proposed framework achieves superior estimation accuracy with efficient computation time compared with state-of-the-art visual-inertial fusion methods in a real-world UAV flight test, where most scenes consist of a featureless floor.
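The ensemble idea from the abstract can be illustrated with a minimal sketch: instead of evaluating the image gradient only at the pixel projected from the mean state, one draws state samples from the current state covariance, projects each sample into the image, and averages the image gradients at the resulting pixels. The function names, the `project` callback, and the central-difference gradient below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def image_gradient(image, u, v):
    """Central-difference image gradient at the pixel nearest (u, v).

    Coordinates are clamped to the image interior so the finite
    difference is always well defined.
    """
    iu = min(max(int(round(u)), 1), image.shape[1] - 2)
    iv = min(max(int(round(v)), 1), image.shape[0] - 2)
    gx = 0.5 * (image[iv, iu + 1] - image[iv, iu - 1])
    gy = 0.5 * (image[iv + 1, iu] - image[iv - 1, iu])
    return np.array([gx, gy])

def ensemble_intensity_gradient(image, project, mean_state, cov,
                                n_samples=64, seed=0):
    """Uncertainty-aware intensity gradient (illustrative sketch).

    Draws `n_samples` states from N(mean_state, cov), projects each
    through the user-supplied `project(state) -> (u, v)` callback, and
    averages the per-sample image gradients. With a single sample at
    the mean this reduces to the ordinary photometric gradient.
    """
    rng = np.random.default_rng(seed)
    samples = rng.multivariate_normal(mean_state, cov, size=n_samples)
    grads = [image_gradient(image, *project(s)) for s in samples]
    return np.mean(grads, axis=0)
```

On a locally linear intensity surface the averaged gradient matches the pointwise one, but near intensity discontinuities the averaging smooths the cost landscape over the projective uncertainty, which is what allows a correct update direction from a poor initialization.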

Citation (APA)

Jung, J. H., Choe, Y., & Park, C. G. (2022). Photometric Visual-Inertial Navigation with Uncertainty-Aware Ensembles. IEEE Transactions on Robotics, 38(4), 2039–2052. https://doi.org/10.1109/TRO.2021.3139964
