Relative pose estimation and fusion of 2D spectral and 3D Lidar images

Abstract

This paper presents a unified approach for the relative pose estimation of a spectral camera and 3D Lidar pair without the use of any special calibration pattern or explicit point correspondences. The method requires no specific setup or calibration targets, using only a single pair of 2D-3D data. Pose estimation is formulated as a 2D-3D nonlinear shape registration task that is solved without point correspondences or complex similarity metrics. The registration is then reduced to the solution of a nonlinear system of equations which directly provides the calibration parameters between the bases of the two sensors. The method has been extended to both perspective and omnidirectional central cameras and was tested on a large set of synthetic Lidar-camera image pairs as well as on real data acquired in an outdoor environment.

Citation (APA)

Kato, Z., & Tamas, L. (2015). Relative pose estimation and fusion of 2D spectral and 3D Lidar images. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 9016, pp. 33–42). Springer Verlag. https://doi.org/10.1007/978-3-319-15979-9_4
