LIDAR and panoramic camera extrinsic calibration approach using a pattern plane


Abstract

Mobile platforms typically combine several data acquisition systems, such as lasers, cameras, and inertial systems. However, the geometric combination of the different sensors requires their calibration, at least through the definition of the extrinsic parameters, i.e., the transformation matrices that register all sensors in the same coordinate system. Our system generates an accurate association between the platform sensors, estimating parameters including rotation, translation, focal length, and the world and sensor reference frames. The extrinsic camera parameters are computed by Zhang's method using a pattern composed of white rhombuses and rhombus-shaped holes, while the LIDAR is calibrated using the results of previous work. Points acquired by the LIDAR are projected into images acquired by the Ladybug cameras using a new calibration pattern visible to both sensors. By establishing a correspondence between each laser point and its position in the image, the texture and color of each LIDAR point can be recovered. © 2013 Springer-Verlag Berlin Heidelberg.
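The projection step the abstract describes can be illustrated with a minimal sketch: a LIDAR point is transformed into the camera frame via the extrinsic parameters (rotation R, translation t) and then projected with a simple pinhole model using the estimated focal length. All function names, values, and the principal-point parameters below are illustrative assumptions, not the paper's own implementation.

```python
import numpy as np

def project_lidar_point(p_lidar, R, t, f, cx, cy):
    """Project a 3D LIDAR point into pixel coordinates (hypothetical sketch).

    p_lidar : 3-vector in the LIDAR/world frame
    R, t    : extrinsic rotation (3x3) and translation (3-vector)
    f       : focal length in pixels; (cx, cy): assumed principal point
    """
    # Extrinsic transform: register the LIDAR point in the camera frame.
    p_cam = R @ p_lidar + t
    x, y, z = p_cam
    if z <= 0:
        return None  # point behind the camera, not visible
    # Pinhole projection into pixel coordinates.
    u = f * x / z + cx
    v = f * y / z + cy
    return np.array([u, v])

# Illustrative example: identity rotation, zero translation.
R = np.eye(3)
t = np.zeros(3)
p = np.array([1.0, 0.5, 5.0])
uv = project_lidar_point(p, R, t, f=800.0, cx=640.0, cy=480.0)
# u = 800*1.0/5 + 640 = 800.0, v = 800*0.5/5 + 480 = 560.0
```

Once each LIDAR point has a pixel position, the image color at that pixel can be attached to the point, which is the texture/color association the abstract refers to.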


APA

García-Moreno, A. I., Gonzalez-Barbosa, J. J., Ornelas-Rodriguez, F. J., Hurtado-Ramos, J. B., & Primo-Fuentes, M. N. (2013). LIDAR and panoramic camera extrinsic calibration approach using a pattern plane. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 7914 LNCS, pp. 104–113). https://doi.org/10.1007/978-3-642-38989-4_11
