Perceptual interpretation for autonomous navigation through dynamic imitation learning

Abstract

Achieving high-performance autonomous navigation is a central goal of field robotics. Efficient navigation by a mobile robot depends not only on the individual performance of perception and planning systems, but on how well these systems are coupled. When the perception problem is clearly defined, as in well-structured environments, this coupling (in the form of a cost function) is also well defined. However, as environments become less structured and more difficult to interpret, more complex cost functions are required, increasing the difficulty of their design. Recently, a class of machine learning techniques has been developed that relies upon expert demonstration to learn a function mapping perceptual data to costs. These algorithms choose the cost function such that the robot's planned behavior mimics an expert's demonstration as closely as possible. In this work, we extend these methods to address the challenge of dynamic and incomplete online perceptual data, as well as noisy and imperfect expert demonstration. We validate our approach on a large-scale outdoor robot with hundreds of kilometers of autonomous navigation through complex natural terrains.
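
The abstract summarizes a family of imitation-learning methods that learn a cost function from expert demonstration: plan with the current cost function, compare the planned path to the demonstrated one, and adjust costs so the planner favors the demonstrated route. The sketch below illustrates that core loop only, in the spirit of maximum-margin planning / LEARCH; it is not the authors' algorithm, and the toy grid world, linear-in-features cost, learning rate, and hand-made expert path are all assumptions made for illustration.

```python
# Minimal sketch of cost-function learning from demonstration (illustrative only).
import heapq
import numpy as np

def plan(cost):
    """Dijkstra shortest path on a 4-connected grid from top-left to bottom-right."""
    h, w = cost.shape
    dist = np.full((h, w), np.inf)
    prev = {}
    dist[0, 0] = cost[0, 0]
    pq = [(cost[0, 0], (0, 0))]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == (h - 1, w - 1):
            break
        if d > dist[r, c]:
            continue
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and d + cost[nr, nc] < dist[nr, nc]:
                dist[nr, nc] = d + cost[nr, nc]
                prev[(nr, nc)] = (r, c)
                heapq.heappush(pq, (dist[nr, nc], (nr, nc)))
    # Reconstruct the planned path from the predecessor map.
    path, node = [(h - 1, w - 1)], (h - 1, w - 1)
    while node != (0, 0):
        node = prev[node]
        path.append(node)
    return path[::-1]

def feature_counts(path, features):
    """Sum the perceptual feature vectors of every cell visited by a path."""
    return sum(features[r, c] for r, c in path)

# Toy perceptual features: one vector per grid cell (e.g. roughness, vegetation).
rng = np.random.default_rng(0)
features = rng.random((10, 10, 2))
# A hand-made "expert" demonstration hugging the top and right edges (assumption).
expert = [(0, c) for c in range(10)] + [(r, 9) for r in range(1, 10)]

w = np.zeros(2)   # weights of the learned cost function
lr = 0.1          # learning rate (assumption)
for _ in range(50):
    cost = np.exp(features @ w)   # positive cost map from the current weights
    planned = plan(cost)
    # Raise costs on features the planner prefers but the expert avoided,
    # and lower costs on features the expert's demonstration used.
    w += lr * (feature_counts(planned, features) - feature_counts(expert, features))
print("learned weights:", w)
```

After enough iterations, terrain features that appear on planner-preferred but undemonstrated routes become expensive, and the planned path converges toward the demonstrated behavior; the paper's contribution is extending this idea to dynamic, incomplete online perception and noisy demonstrations.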

Citation (APA)

Silver, D., Bagnell, J. A., & Stentz, A. (2011). Perceptual interpretation for autonomous navigation through dynamic imitation learning. In Springer Tracts in Advanced Robotics (Vol. 70, pp. 433–449). https://doi.org/10.1007/978-3-642-19457-3_26
