Accurate and robust monocular SLAM with omnidirectional cameras

Abstract

Simultaneous localization and mapping (SLAM) is a fundamental building block of many emerging technologies, such as autonomous driving and augmented reality. In this paper, we develop an improved monocular visual SLAM system that exploits the wider field of view of omnidirectional cameras. Our method extends the ORB-SLAM framework with the enhanced unified camera model as the projection function, which applies to catadioptric systems and wide-angle fisheye cameras with a 195-degree field of view. The proposed system can use the full image area even under strong distortion. We propose a map initialization method for omnidirectional cameras and analytically derive the Jacobian matrices of the reprojection error with respect to the camera pose and the 3D positions of map points. The proposed SLAM system has been extensively tested on real-world datasets. The results show a positioning error of less than 0.1% in a small indoor environment and less than 1.5% in a large environment, and demonstrate that our method runs in real time and improves accuracy and robustness over conventional systems based on the pinhole model. The source code is available at https://github.com/lsyads/fisheye-ORB-SLAM.
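For reference, the enhanced unified camera model (EUCM) describes projection with two parameters, alpha and beta, on top of the usual pinhole intrinsics. Below is a minimal C++ sketch of the EUCM projection under its published formulation; it is not taken from the authors' repository, and the structure and names (EUCM, project) are illustrative only.

#include <cmath>
#include <optional>
#include <utility>

// Minimal sketch of the enhanced unified camera model (EUCM) projection.
// Parameter names (fx, fy, cx, cy, alpha, beta) follow the published EUCM
// formulation; this is not the fisheye-ORB-SLAM implementation.
struct EUCM {
    double fx, fy, cx, cy; // pinhole intrinsics (pixels)
    double alpha;          // blending parameter, 0 <= alpha <= 1
    double beta;           // ellipticity parameter, beta > 0

    // Project a 3D point in camera coordinates to pixel coordinates.
    // Returns std::nullopt for points outside the model's valid region.
    std::optional<std::pair<double, double>>
    project(double x, double y, double z) const {
        const double d = std::sqrt(beta * (x * x + y * y) + z * z);

        // Validity region of the EUCM: z > -w * d. Points with z < 0 can
        // still be projectable, which is what lets a fisheye with a field
        // of view beyond 180 degrees use the full image area.
        const double w = (alpha <= 0.5) ? alpha / (1.0 - alpha)
                                        : (1.0 - alpha) / alpha;
        if (z <= -w * d) return std::nullopt;

        const double denom = alpha * d + (1.0 - alpha) * z;
        return std::make_pair(fx * x / denom + cx, fy * y / denom + cy);
    }
};

Because the model remains well defined for some points behind the image plane, the usual pinhole-style z > 0 visibility tests in tracking and bundle adjustment are replaced by the validity check above, and the Jacobians of the reprojection error with respect to pose and point follow by the chain rule through this projection.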

Citation (APA)

Liu, S., Guo, P., Feng, L., & Yang, A. (2019). Accurate and robust monocular SLAM with omnidirectional cameras. Sensors, 19(20), 4494. https://doi.org/10.3390/s19204494
