Movement direction estimation using omnidirectional images in a SLAM Algorithm

Abstract

This work presents a method to estimate the movement direction of a mobile robot using only visual information, without any additional sensor. The visual information is provided by a catadioptric system mounted on the robot, formed by a camera pointing towards a convex mirror. This system provides the robot with omnidirectional images covering a 360° field of view around the camera-mirror axis. A SLAM algorithm is also presented to test the movement-direction estimation method. This SLAM method uses two different global-appearance descriptors to calculate the orientation of the robot and the distance between two positions. The movement direction itself is estimated through landmark extraction, using SURF features. A set of omnidirectional images has been used to test the effectiveness of the method.
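As a rough illustration of the orientation-estimation idea described above (not the authors' actual descriptor-based method), consider that in an omnidirectional image every landmark appears at an azimuth around the camera-mirror axis, and a pure rotation of the robot shifts all azimuths by the same offset. A minimal NumPy sketch, assuming landmarks have already been matched (e.g., via SURF) and converted to bearing angles:

```python
import numpy as np

def estimate_rotation(bearings_a, bearings_b):
    """Estimate the robot's rotation between two omnidirectional views.

    bearings_a, bearings_b: azimuths (radians) of matched landmarks as
    seen from pose A and pose B. A pure rotation about the camera-mirror
    axis shifts every azimuth by the same offset, so the rotation is
    taken as the circular mean of the per-landmark angle differences,
    which is robust to 2*pi wraparound.
    """
    d = np.asarray(bearings_b) - np.asarray(bearings_a)
    return np.arctan2(np.sin(d).mean(), np.cos(d).mean())
```

For example, if four matched landmarks all shift by 0.5 rad between two frames, `estimate_rotation` recovers 0.5 rad; averaging over many landmarks attenuates per-feature matching noise. The function names and the bearing-angle representation here are illustrative assumptions, not taken from the paper.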

CITATION STYLE

APA

Berenguer, Y., Payá, L., Reinoso, O., Peidró, A., & Jiménez, L. M. (2018). Movement direction estimation using omnidirectional images in a SLAM Algorithm. In Advances in Intelligent Systems and Computing (Vol. 693, pp. 640–651). Springer Verlag. https://doi.org/10.1007/978-3-319-70833-1_52
