Robust embedded egomotion estimation


Abstract

This work presents a method for estimating the egomotion of an aerial vehicle in challenging industrial environments. It combines binocular visual and inertial cues in a tightly-coupled fashion and operates in real time on an embedded platform. An extended Kalman filter fuses the measurements and makes the motion estimate rely more on inertial data if the visual feature constellation is degenerate. Errors in roll and pitch are bounded implicitly by the gravity vector. The inertial sensors are used for efficient outlier detection and enable operation in poorly and repetitively textured environments. We demonstrate robustness and accuracy in an industrial scenario as well as in general indoor environments. The former is accompanied by a detailed performance evaluation supported by ground truth measurements from an external tracking system.
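
The abstract describes a tightly-coupled visual-inertial extended Kalman filter in which inertial propagation carries the estimate whenever the visual constraints are weak, and in which the inertial prediction also drives outlier rejection. The sketch below is a heavily simplified illustration of that pattern, not the paper's actual filter: the `SimpleVioEkf` class, the state layout, and the noise parameters are assumptions introduced for the example, attitude estimation is omitted, and the stereo front end is reduced to a single position fix gated with a Mahalanobis test so that rejected measurements leave the inertial prediction untouched.

```python
# Hypothetical, simplified sketch of an IMU-driven EKF with a gated visual update.
# State, noise models, and interfaces are assumptions for illustration only; they
# are not the formulation used in the paper.
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # assumed world-frame gravity


class SimpleVioEkf:
    def __init__(self, dt=0.005):
        self.dt = dt
        # State: position (3) and velocity (3); attitude is omitted for brevity.
        self.x = np.zeros(6)
        self.P = np.eye(6) * 0.01
        self.Q = np.eye(6) * 1e-4        # assumed process noise
        self.R_vis = np.eye(3) * 1e-2    # assumed stereo measurement noise

    def predict(self, specific_force_world):
        """Propagate the state with an IMU specific-force measurement already
        rotated into the world frame (a = f + g)."""
        dt = self.dt
        F = np.eye(6)
        F[0:3, 3:6] = dt * np.eye(3)
        a = specific_force_world + GRAVITY
        self.x[0:3] += self.x[3:6] * dt + 0.5 * a * dt**2
        self.x[3:6] += a * dt
        self.P = F @ self.P @ F.T + self.Q

    def update_visual(self, pos_meas, gate=7.81):
        """Fuse a stereo-derived position fix. The chi-square gate (7.81 is roughly
        the 95% quantile for 3 degrees of freedom) rejects outliers; when a
        measurement is rejected, the estimate keeps following the inertial
        prediction, mirroring the behaviour described in the abstract."""
        H = np.hstack([np.eye(3), np.zeros((3, 3))])
        y = pos_meas - H @ self.x                 # innovation
        S = H @ self.P @ H.T + self.R_vis         # innovation covariance
        if y @ np.linalg.solve(S, y) > gate:
            return False                          # outlier: skip the update
        K = self.P @ H.T @ np.linalg.inv(S)       # Kalman gain
        self.x += K @ y
        self.P = (np.eye(6) - K @ H) @ self.P
        return True
```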

Author-supplied keywords

  • Egomotion Estimation
  • Extended Kalman Filter
  • Stereo Vision
  • Vision-IMU Fusion
  • Visual Odometry

Authors

  • Rainer Voigt
  • Janosch Nikolic
  • Christoph Hürzeler
  • Stephan Weiss
  • Laurent Kneip
  • Roland Siegwart
