Visual localization based on quadtrees

Abstract

Autonomous mobile robots move through their environment to perform the tasks for which they were programmed. A robot's proper operation largely depends on the quality of the self-localization information used when navigating globally in its environment. This paper describes a method for maintaining a self-localization probability distribution over a set of states representing the robot's position. The novel feature of this approach is to represent the state space as a quadtree that dynamically evolves to use the minimum set of states without loss of accuracy. We demonstrate the benefits of this approach by localizing a robot in the RoboCup SPL environment using the information provided by its camera.
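The core idea of the abstract can be illustrated with a minimal sketch: a belief over a 2-D pose space stored in a quadtree whose cells subdivide where probability mass concentrates and merge back where it is negligible. The class names, thresholds, and split/merge policy below are illustrative assumptions, not the paper's actual algorithm.

```python
class QuadNode:
    """A quadtree cell covering a square region with some probability mass."""

    def __init__(self, x, y, size, prob):
        self.x, self.y, self.size = x, y, size
        self.prob = prob          # probability mass assigned to this cell
        self.children = None      # None means this cell is a leaf

    def split(self):
        """Refine a high-probability cell into four equal quadrants."""
        half = self.size / 2
        self.children = [
            QuadNode(self.x + dx * half, self.y + dy * half, half, self.prob / 4)
            for dx in (0, 1) for dy in (0, 1)
        ]

    def merge(self):
        """Collapse four quadrants back into one coarse cell."""
        self.prob = sum(c.prob for c in self.children)
        self.children = None

    def leaves(self):
        if self.children is None:
            return [self]
        return [leaf for c in self.children for leaf in c.leaves()]


def adapt(root, split_thresh=0.25, merge_thresh=0.05, min_size=1):
    """One adaptation pass: refine likely regions, coarsen unlikely ones.

    Thresholds are hypothetical; a real system would tie them to sensor
    noise and the accuracy required by the navigation task.
    """
    for node in root.leaves():                      # snapshot of current leaves
        if node.prob > split_thresh and node.size > min_size:
            node.split()

    def try_merge(node):                            # bottom-up merge pass
        if node.children is None:
            return
        for c in node.children:
            try_merge(c)
        if (all(c.children is None for c in node.children)
                and sum(c.prob for c in node.children) < merge_thresh):
            node.merge()

    try_merge(root)


root = QuadNode(0, 0, 8, 1.0)   # uniform belief over an 8x8 field
adapt(root)                      # all mass in one cell, so the root refines
print(len(root.leaves()))        # number of cells after one adaptation pass
```

The payoff of this representation is that the number of states tracked grows only where the belief is concentrated, instead of scaling with a fixed grid resolution over the whole field.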

Citation (APA)

Martín, F. (2016). Visual localization based on quadtrees. In Advances in Intelligent Systems and Computing (Vol. 418, pp. 599–610). Springer Verlag. https://doi.org/10.1007/978-3-319-27149-1_46
