Inertial sensor self-calibration in a visually-aided navigation approach for a micro-AUV

Abstract

This paper presents a new solution for underwater observation, image recording, mapping and 3D reconstruction in shallow waters. The platform, designed as a research and testing tool, is based on a small underwater robot equipped with a MEMS-based IMU, two stereo cameras and a pressure sensor. The data provided by these sensors are fused, adjusted and corrected in a multiplicative error-state Kalman filter (MESKF), which returns a single state vector containing the pose and twist of the vehicle and the biases of the inertial sensors (the accelerometer and the gyroscope). Including these biases in the state vector permits their self-calibration and stabilization, improving the estimates of the robot's orientation. Experiments in controlled underwater scenarios and at sea have demonstrated satisfactory performance and the capacity of the vehicle to operate in real environments and in real time.
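
The self-calibration described above follows from augmenting the filter state with the bias terms, so that the visual and pressure updates make the biases observable. As a rough illustration only (not the authors' MESKF, which is multiplicative and estimates the full 3D pose, twist and both IMU biases), the 1-D sketch below shows how a Kalman filter whose state includes an accelerometer bias recovers that bias from periodic position fixes; all names and values are assumptions chosen for the example.

```python
# Illustrative 1-D sketch: a Kalman filter with the accelerometer bias
# augmented into the state vector. Position fixes stand in for the
# stereo-visual / pressure observations of the real system.
import numpy as np

dt = 0.01
rng = np.random.default_rng(0)

# State x = [position, velocity, accel_bias]; bias modelled as a random walk.
# The measured acceleration drives the prediction and the bias is subtracted:
#   v_new = v + (a_meas - bias) * dt
F = np.array([[1.0, dt, -0.5 * dt**2],
              [0.0, 1.0, -dt],
              [0.0, 0.0, 1.0]])
B = np.array([[0.5 * dt**2], [dt], [0.0]])   # input matrix for a_meas
H = np.array([[1.0, 0.0, 0.0]])              # only position is observed
Q = np.diag([1e-6, 1e-4, 1e-8])              # process noise (incl. bias walk)
R = np.array([[0.05**2]])                    # position-fix noise

x = np.zeros((3, 1))                         # filter starts with zero bias
P = np.diag([1.0, 1.0, 1.0])

true_bias = 0.3                              # m/s^2, unknown to the filter
p_true, v_true = 0.0, 0.0

for k in range(2000):
    a_true = np.sin(0.5 * k * dt)                        # some true motion
    a_meas = a_true + true_bias + rng.normal(0, 0.02)    # biased IMU reading

    # Propagate the true trajectory
    p_true += v_true * dt + 0.5 * a_true * dt**2
    v_true += a_true * dt

    # Predict
    x = F @ x + B * a_meas
    P = F @ P @ F.T + Q

    # Update with a noisy position fix every 10 steps
    if k % 10 == 0:
        z = np.array([[p_true + rng.normal(0, 0.05)]])
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(3) - K @ H) @ P

print(f"estimated accel bias: {x[2, 0]:+.3f}  (true: {true_bias:+.3f})")
```

Because a constant accelerometer bias produces a quadratically growing position error, the periodic position fixes are enough to render the bias observable, and the estimate converges to the true value; the same mechanism, applied to both IMU biases in the MESKF, is what stabilizes the orientation estimates reported in the paper.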

Citation (APA)

Bonin-Font, F., Massot-Campos, M., Negre-Carrasco, P. L., Oliver-Codina, G., & Beltran, J. P. (2015). Inertial sensor self-calibration in a visually-aided navigation approach for a micro-AUV. Sensors (Switzerland), 15(1), 1825–1860. https://doi.org/10.3390/s150101825
