Visual odometry provides astronauts with accurate knowledge of their position and orientation. Wearable astronaut navigation systems should be simple and compact, so monocular vision methods are preferred over the stereo vision systems commonly used in mobile robots. However, the projective nature of monocular visual odometry causes a scale ambiguity problem. In this paper, we focus on integrating a monocular camera with a laser distance meter to solve this problem. The most remarkable advantage of the system is its ability to recover a global trajectory from monocular image sequences by incorporating direct distance measurements. First, we propose a robust and easy-to-use extrinsic calibration method between the camera and the laser distance meter. Second, we present a navigation scheme that fuses distance measurements with monocular sequences to correct the scale drift. In particular, we explain in detail how to match the projection of the invisible laser pointer in other frames. Our proposed integration architecture is examined using a live dataset collected in a simulated lunar surface environment. The experimental results demonstrate the feasibility and effectiveness of the proposed method. © 2014 by the authors; licensee MDPI, Basel, Switzerland.
Citation
Wu, K., Di, K., Sun, X., Wan, W., & Liu, Z. (2014). Enhanced monocular visual odometry integrated with laser distance meter for astronaut navigation. Sensors (Switzerland), 14(3), 4981–5003. https://doi.org/10.3390/s140304981