Key Indicators to Assess the Performance of LiDAR-Based Perception Algorithms: A Literature Review

Abstract

Perception algorithms are essential for autonomous or semi-autonomous vehicles to interpret the semantics of their surroundings through tasks such as object detection, panoptic segmentation, and tracking. Decision-making in safety-critical situations, such as autonomous emergency braking and collision avoidance, relies on the outputs of these algorithms. This makes it essential to correctly assess such perception systems before deployment and to monitor their performance while in use. Testing and validating these systems, particularly at runtime, is difficult due to the high-level and complex representations of their outputs. This paper presents an overview of existing metrics used to evaluate LiDAR-based perception systems, with particular emphasis on object detection and tracking algorithms, given their importance to the final perception outcome. In addition to commonly used metrics, we discuss the impact of the Planning KL-Divergence (PKL), Timed Quality Temporal Logic (TQTL), and Spatio-temporal Quality Logic (STQL) metrics on object detection algorithms. For panoptic segmentation, the Panoptic Quality (PQ) and Parsing Covering (PC) metrics are analysed using pretrained models. Finally, the paper addresses the application of these diverse metrics to evaluate different pretrained models and their respective perception algorithms on publicly available datasets. Beyond identifying the various metrics that have been proposed, their performance and influence on the models are also assessed, either by conducting new tests or by reproducing the experimental results of the references under consideration.
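To make one of the metrics mentioned above concrete, the following minimal Python sketch computes single-class Panoptic Quality using its standard definition: the sum of IoUs over matched segment pairs divided by TP + 0.5*FP + 0.5*FN. This is not code from the paper; the set-of-pixel-indices mask representation and the 0.5 matching threshold are illustrative assumptions.

def iou(pred_pixels, gt_pixels):
    """Intersection over union of two segments given as sets of pixel indices."""
    intersection = len(pred_pixels & gt_pixels)
    union = len(pred_pixels | gt_pixels)
    return intersection / union if union else 0.0

def panoptic_quality(pred_segments, gt_segments, match_threshold=0.5):
    """Single-class PQ: sum of matched IoUs / (TP + 0.5*FP + 0.5*FN)."""
    matched_ious = []
    matched_preds = set()
    for gt in gt_segments:
        for idx, pred in enumerate(pred_segments):
            if idx in matched_preds:
                continue
            score = iou(pred, gt)
            if score > match_threshold:  # IoU > 0.5 guarantees the match is unique
                matched_ious.append(score)
                matched_preds.add(idx)
                break
    tp = len(matched_ious)
    fp = len(pred_segments) - tp   # predictions with no matching ground truth
    fn = len(gt_segments) - tp     # ground-truth segments left unmatched
    denominator = tp + 0.5 * fp + 0.5 * fn
    return sum(matched_ious) / denominator if denominator else 0.0

For example, a predicted segment covering pixels {1, 2, 3} compared with a ground-truth segment {2, 3, 4} has an IoU of exactly 0.5, so it is not counted as a true positive under the strict > 0.5 matching rule.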

Cite (APA)

Karri, C., Da Silva, J. M., & Correia, M. V. (2023). Key Indicators to Assess the Performance of LiDAR-Based Perception Algorithms: A Literature Review. IEEE Access, 11, 109142–109168. https://doi.org/10.1109/ACCESS.2023.3321912
