Joint Calibration of a Multimodal Sensor System for Autonomous Vehicles

Abstract

Multimodal sensor systems require precise calibration to be useful in the field, yet calibrating them remains an open problem because of the difficulty of obtaining corresponding features across different modalities. We present a systematic approach for calibrating a set of cameras of different modalities (RGB, thermal, polarization, and dual-spectrum near-infrared) with respect to a LiDAR sensor using a planar calibration target. First, we propose a method for calibrating a single camera with respect to the LiDAR sensor; the method works with any modality in which the calibration pattern can be detected. We then present a methodology for establishing a parallax-aware pixel mapping between different camera modalities. Such a mapping can be used to transfer annotations, features, and results between highly differing camera modalities, facilitating feature extraction as well as deep learning-based detection and segmentation methods.
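
As a concrete illustration of the single-camera step, the sketch below recovers a rigid LiDAR-to-camera transform by solving a PnP problem on the calibration-target corners. This is a minimal sketch under stated assumptions, not the authors' exact pipeline: it presumes the target corners have already been detected in the image and located in the LiDAR frame (e.g., by fitting a plane to the target's LiDAR returns), and that the intrinsics `K` and distortion `dist` are known; all names are illustrative.

```python
# Minimal sketch (not the authors' exact pipeline): recover the rigid
# LiDAR -> camera transform by PnP on target corners, assuming the
# corners are detected in the image and expressed in the LiDAR frame.
import numpy as np
import cv2


def lidar_camera_extrinsics(lidar_pts, pixel_pts, K, dist):
    """lidar_pts: (N, 3) target corners in the LiDAR frame.
    pixel_pts: (N, 2) matching corner detections in the image.
    K, dist: camera intrinsics and distortion (assumed known).
    Returns a 4x4 homogeneous LiDAR -> camera transform."""
    ok, rvec, tvec = cv2.solvePnP(
        lidar_pts.astype(np.float32), pixel_pts.astype(np.float32), K, dist)
    assert ok, "PnP failed - check the 2D-3D correspondences"
    R, _ = cv2.Rodrigues(rvec)             # rotation vector -> 3x3 matrix
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()  # assemble homogeneous transform
    return T
```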
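The parallax-aware mapping can be pictured as back-projecting each pixel of camera A to a 3-D point using per-pixel depth (e.g., LiDAR returns projected into A), transforming that point into camera B's frame, and reprojecting it. The sketch below follows this standard reading rather than the paper's specific formulation; the intrinsics `K_a`, `K_b` and the 4x4 A-to-B extrinsic `T_ab` are assumed known from the calibration step.

```python
import numpy as np


def map_pixels_a_to_b(uv_a, depth_a, K_a, K_b, T_ab):
    """uv_a: (N, 2) pixel coordinates in camera A.
    depth_a: (N,) metric depth of each pixel along A's optical axis
    (e.g., obtained by projecting the LiDAR cloud into A).
    T_ab: 4x4 rigid transform from A's frame to B's frame.
    Returns (N, 2) corresponding pixel coordinates in camera B."""
    ones = np.ones((uv_a.shape[0], 1))
    rays = (np.linalg.inv(K_a) @ np.hstack([uv_a, ones]).T).T  # z = 1 rays
    pts_a = rays * depth_a[:, None]                  # 3-D points in A's frame
    pts_b = (T_ab[:3, :3] @ pts_a.T).T + T_ab[:3, 3] # into B's frame
    proj = (K_b @ pts_b.T).T
    return proj[:, :2] / proj[:, 2:3]                # perspective divide
```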

Citation (APA)

Muhovič, J., & Perš, J. (2023). Joint Calibration of a Multimodal Sensor System for Autonomous Vehicles. Sensors, 23(12), Article 5676. https://doi.org/10.3390/s23125676
