Tracking a Mobile Robot Position Using Vision and Inertial Sensor


Abstract

Wheeled mobile robots are still the first choice for industrial and domotic applications. The robot's navigation system aims to reliably determine the robot's position, velocity, and orientation and to provide them to the control and trajectory-guidance modules. The most frequently used sensors are inertial measurement units (IMUs) combined with an absolute position-sensing mechanism. The dead-reckoning approach using an IMU suffers from integration drift due to noise and bias. To overcome this limitation we propose using the inertial system in combination with mechanical odometers and a vision-based system. These two sensors complement each other: the vision sensor is accurate at low velocities but requires long computation times, while the inertial sensor can track fast movements but suffers from drift. The information from the sensors is integrated through a multi-rate fusion scheme. Each sensor system is assumed to have its own independent sampling rate, which may be time-varying. Data fusion is performed by a multi-rate Kalman filter. The paper describes the inertial and vision navigation systems and the data fusion algorithm. Simulation and experimental results are presented. © IFIP International Federation for Information Processing 2014.
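The multi-rate fusion scheme the abstract describes can be illustrated with a minimal sketch: a high-rate inertial sensor drives the Kalman prediction step at every cycle, while a slower vision fix triggers a measurement update only when it arrives. This is a 1-D toy model, not the paper's actual filter; the state layout, noise values, and the 100 Hz / 10 Hz rates are assumptions for demonstration only.

```python
import numpy as np

def predict(x, P, a, dt, q=0.05):
    """Propagate state [position, velocity] using an IMU acceleration sample."""
    F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])              # acceleration input vector
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])          # process-noise covariance
    x = F @ x + B * a
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, r=0.01):
    """Correct the state with a vision position measurement z (variance r)."""
    H = np.array([[1.0, 0.0]])                   # vision observes position only
    S = H @ P @ H.T + r                          # innovation covariance
    K = (P @ H.T) / S                            # Kalman gain (2x1)
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Multi-rate loop: the IMU (100 Hz here) drives prediction every step;
# a vision fix arrives only every 10th step and triggers an update.
rng = np.random.default_rng(0)
x, P = np.zeros(2), np.eye(2)
true_pos, true_vel, dt = 0.0, 0.0, 0.01
for k in range(1000):
    a = 1.0 if k < 500 else -1.0                 # accelerate, then brake
    true_vel += a * dt
    true_pos += true_vel * dt
    x, P = predict(x, P, a + rng.normal(0, 0.1), dt)   # noisy IMU sample
    if k % 10 == 9:                              # low-rate vision sample
        x, P = update(x, P, true_pos + rng.normal(0, 0.05))
print(abs(x[0] - true_pos))                      # residual position error
```

Running prediction alone would let the integrated IMU noise drift without bound; the periodic vision updates bound the error, which is the complementarity the abstract points to. A time-varying vision rate is handled the same way: update whenever a measurement happens to arrive.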

Citation (APA)

Coito, F., Eleutério, A., Valtchev, S., & Coito, F. (2014). Tracking a Mobile Robot Position Using Vision and Inertial Sensor. In IFIP Advances in Information and Communication Technology (Vol. 423, pp. 201–208). Springer New York LLC. https://doi.org/10.1007/978-3-642-54734-8_23
