Radar style transfer for metric robot localisation on lidar maps

Abstract

Lidar and visual data are heavily affected in adverse weather conditions due to their sensing mechanisms, which brings potential safety hazards for vehicle navigation. Radar sensing is desirable for building a more robust navigation system. In this paper, a cross-modality radar localisation method on prior lidar maps is presented. Specifically, the proposed workflow consists of two parts: first, bird's-eye-view radar images are transferred to fake lidar images by training a generative adversarial network offline; then, with online radar scans, a Monte Carlo localisation framework is built to track the robot pose on the lidar maps. The whole online localisation system only needs a rotating radar sensor and a pre-built global lidar map. In the experimental section, the authors conduct an ablation study on image settings and test the proposed system on the Oxford Radar RobotCar Dataset. The promising results show that the proposed localisation system can track the robot pose successfully, thus demonstrating the feasibility of radar style transfer for metric robot localisation on lidar maps.
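
As a rough illustration of the online stage described above, the Python sketch below shows how a Monte Carlo localisation loop might weight pose particles against a prior lidar occupancy grid using points extracted from a GAN-generated fake lidar image. This is a minimal sketch under assumed interfaces, not the authors' implementation; names such as lidar_map, fake_lidar_points and the grid resolution parameter are hypothetical.

import numpy as np

def predict(particles, odom, noise=(0.05, 0.05, 0.01)):
    # Propagate (x, y, yaw) particles with a noisy planar odometry increment
    # expressed in the robot body frame (e.g. from radar odometry).
    dx, dy, dyaw = odom
    n = len(particles)
    c, s = np.cos(particles[:, 2]), np.sin(particles[:, 2])
    particles[:, 0] += c * dx - s * dy + np.random.normal(0.0, noise[0], n)
    particles[:, 1] += s * dx + c * dy + np.random.normal(0.0, noise[1], n)
    particles[:, 2] += dyaw + np.random.normal(0.0, noise[2], n)
    return particles

def update(particles, fake_lidar_points, lidar_map, resolution):
    # Weight each particle by how many fake-lidar points (Nx2, metres, sensor
    # frame, e.g. thresholded occupied pixels of the GAN output) land on
    # occupied cells of the prior lidar occupancy grid (row = x, column = y).
    weights = np.zeros(len(particles))
    for i, (x, y, yaw) in enumerate(particles):
        c, s = np.cos(yaw), np.sin(yaw)
        pts = fake_lidar_points @ np.array([[c, -s], [s, c]]).T + np.array([x, y])
        cells = np.floor(pts / resolution).astype(int)
        inside = ((cells >= 0) & (cells < np.array(lidar_map.shape))).all(axis=1)
        weights[i] = lidar_map[cells[inside, 0], cells[inside, 1]].sum() + 1e-9
    return weights / weights.sum()

def resample(particles, weights):
    # Multinomial resampling proportional to the particle weights.
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx].copy()

A typical frame would call predict() with the latest odometry increment, update() with points extracted from the current fake lidar image, then resample(), estimating the robot pose as the weighted mean of the particles.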

Citation (APA)

Yin, H., Wang, Y., Wu, J., & Xiong, R. (2023). Radar style transfer for metric robot localisation on lidar maps. CAAI Transactions on Intelligence Technology, 8(1), 139–148. https://doi.org/10.1049/cit2.12112
