This paper presents a novel approach for estimating the extrinsic parameters of an omnidirectional camera with respect to a 3D Lidar coordinate frame. The method works without a specific setup or calibration targets, using only a single pair of 2D-3D data. Pose estimation is formulated as a 2D-3D nonlinear shape registration task that is solved without point correspondences or complex similarity metrics. It relies on a set of corresponding regions, and the pose parameters are obtained by solving a small system of nonlinear equations. The efficiency and robustness of the proposed method were confirmed on both synthetic and real data in urban environments.
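The abstract's final step, solving a small system of nonlinear equations for the six pose parameters, can be illustrated with a minimal sketch. Note this is a loose simplification, not the authors' region-based method: it assumes point bearings on a spherical (omnidirectional) camera model and recovers the pose with a plain Gauss-Newton iteration using a finite-difference Jacobian; all function names and the synthetic setup are my own.

```python
import numpy as np

def rotation(rx, ry, rz):
    """Rotation matrix from X-Y-Z Euler angles (an illustrative parameterization)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project(points, pose):
    """Transform 3D Lidar points into the camera frame and normalize onto the
    unit sphere -- a common idealization of an omnidirectional camera."""
    rx, ry, rz, tx, ty, tz = pose
    q = points @ rotation(rx, ry, rz).T + np.array([tx, ty, tz])
    return q / np.linalg.norm(q, axis=1, keepdims=True)

def residuals(pose, pts3d, bearings):
    """Stacked difference between predicted and observed bearing vectors."""
    return (project(pts3d, pose) - bearings).ravel()

def gauss_newton(pose0, pts3d, bearings, iters=20, eps=1e-6):
    """Solve the small nonlinear system for the 6 pose parameters."""
    pose = np.asarray(pose0, dtype=float)
    for _ in range(iters):
        r = residuals(pose, pts3d, bearings)
        J = np.empty((r.size, 6))
        for k in range(6):          # forward-difference Jacobian, column by column
            d = np.zeros(6)
            d[k] = eps
            J[:, k] = (residuals(pose + d, pts3d, bearings) - r) / eps
        step, *_ = np.linalg.lstsq(J, -r, rcond=None)
        pose = pose + step
        if np.linalg.norm(step) < 1e-10:
            break
    return pose

# Synthetic example: points in front of the camera, a mild ground-truth pose.
rng = np.random.default_rng(0)
pts3d = rng.uniform([-2, -2, 4], [2, 2, 8], (20, 3))
true_pose = np.array([0.10, -0.05, 0.08, 0.30, -0.20, 0.10])
bearings = project(pts3d, true_pose)
est_pose = gauss_newton(np.zeros(6), pts3d, bearings)
# est_pose converges toward true_pose up to numerical precision.
```

In the paper itself the residual is built from corresponding 2D-3D *regions* rather than point bearings, which is what removes the need for point correspondences; the sketch above only mirrors the shared structural idea of a small least-squares system over the six extrinsic parameters.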
Citation
Tamas, L., Frohlich, R., & Kato, Z. (2015). Relative pose estimation and fusion of omnidirectional and Lidar cameras. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 8926, pp. 640–651). Springer Verlag. https://doi.org/10.1007/978-3-319-16181-5_49