Visual navigation of mobile robot using optical flow and visual potential field

Abstract

In this paper, we develop a novel algorithm for navigating a mobile robot using a visual potential. The visual potential is computed from the optical flow of successive images captured by a camera mounted on the robot. We assume that the direction to the destination is provided only at the robot's initial position. Using this direction, the robot dynamically selects a local pathway to the destination without colliding with obstacles. The proposed algorithm does not require any prior knowledge or environmental map of the robot's workspace. Furthermore, it uses only a monocular uncalibrated camera to detect the feasible region of navigation, since we apply dominant-plane detection for this purpose. We present experimental results of navigation in synthetic and real environments. Additionally, we evaluate the robustness of the optical flow computation against lighting effects and various kinds of textures. © 2008 Springer-Verlag Berlin Heidelberg.
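To illustrate the overall idea, the following is a minimal sketch of potential-field steering driven by optical flow. It assumes OpenCV's Farneback dense optical flow and a simple left/right flow-magnitude heuristic as a stand-in for the paper's dominant-plane detection and visual potential field; all function names, gains, and the heuristic itself are illustrative assumptions, not the authors' formulation.

```python
import numpy as np
import cv2


def steering_direction(prev_gray, curr_gray, goal_direction,
                       goal_gain=1.0, obstacle_gain=1.0):
    """Combine an attractive term toward the destination with a repulsive
    term derived from optical flow, in the spirit of a visual potential
    field. Parameter names and gains are illustrative assumptions."""
    # Dense optical flow between two successive grayscale frames
    # (Farneback method); flow has shape (h, w, 2).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    mag = np.linalg.norm(flow, axis=2)

    # Heuristic: larger average flow magnitude on one side suggests
    # nearer structure (a potential obstacle) on that side.
    left_mag = mag[:, : w // 2].mean()
    right_mag = mag[:, w // 2:].mean()

    # Repulsive component: steer toward the side with smaller flow.
    repulsive = np.array([left_mag - right_mag, 0.0])

    # Attractive component: unit vector toward the given destination.
    attractive = goal_direction / (np.linalg.norm(goal_direction) + 1e-9)

    # Gradient-descent-style combination of the two potential terms.
    direction = goal_gain * attractive + obstacle_gain * repulsive
    return direction / (np.linalg.norm(direction) + 1e-9)


if __name__ == "__main__":
    # Toy usage with synthetic frames; in practice the frames would come
    # from the robot's onboard camera.
    prev_frame = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
    curr_frame = np.roll(prev_frame, 2, axis=1)  # simulate small motion
    goal = np.array([1.0, 0.0])                  # destination direction
    print(steering_direction(prev_frame, curr_frame, goal))
```

In the paper itself, the repulsive component comes from the detected dominant plane (the obstacle-free ground region) rather than a left/right split, but the combination of an attractive goal term and a flow-based repulsive term follows the same potential-field pattern sketched above.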

Citation (APA)

Ohnishi, N., & Imiya, A. (2008). Visual navigation of mobile robot using optical flow and visual potential field. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 4931 LNCS, pp. 412–426). https://doi.org/10.1007/978-3-540-78157-8_32
