Semantic Cameras for 360-Degree Environment Perception in Automated Urban Driving

Abstract

The European UP-Drive project addresses transportation-related challenges by providing key contributions that enable fully automated vehicle navigation and parking in complex urban areas, resulting in a safer, more inclusive, affordable and environmentally friendly transportation system. For this purpose, the project consortium developed a prototype electric vehicle equipped with camera and LiDAR sensors that is capable of autonomously driving around the city and finding available parking spots. In UP-Drive, we created an accurate, robust and redundant multi-modal environment perception system that provides 360° coverage around the vehicle. This paper summarizes the project's work on surround-view semantic perception using fisheye and narrow field-of-view semantic virtual cameras. Deep learning-based semantic, instance and panoptic segmentation networks that satisfy accuracy and efficiency requirements have been developed and integrated into the final prototype. The UP-Drive automated vehicle has been successfully demonstrated in urban areas after extensive experiments and numerous field tests.
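The abstract mentions narrow field-of-view semantic virtual cameras derived from fisheye input. The paper's actual camera models and calibration are not given here; purely as an illustration, the sketch below remaps a hypothetical virtual pinhole view onto an equidistant fisheye label image, so a segmentation map computed on the fisheye image can be read out through the virtual camera. All function names, parameters and the equidistant model are assumptions, not the authors' implementation:

```python
import numpy as np

def virtual_camera_lookup(h, w, f_pin, f_fish, fish_shape):
    """Build a lookup table mapping each pixel of a hypothetical virtual
    pinhole camera (h x w, focal length f_pin) to coordinates in an
    equidistant fisheye image of shape fish_shape (assumed model)."""
    cx, cy = w / 2.0, h / 2.0
    v, u = np.mgrid[0:h, 0:w].astype(np.float64)
    # Back-project virtual pixels to unit viewing rays in the camera frame.
    x = (u - cx) / f_pin
    y = (v - cy) / f_pin
    z = np.ones_like(x)
    norm = np.sqrt(x * x + y * y + z * z)
    x, y, z = x / norm, y / norm, z / norm
    # Equidistant fisheye model: image radius is proportional to the
    # angle between the ray and the optical axis.
    theta = np.arccos(np.clip(z, -1.0, 1.0))
    phi = np.arctan2(y, x)
    r = f_fish * theta
    fh, fw = fish_shape
    fu = fw / 2.0 + r * np.cos(phi)
    fv = fh / 2.0 + r * np.sin(phi)
    return fu, fv

def sample_labels(label_img, fu, fv):
    """Nearest-neighbour sampling of a per-pixel semantic label map
    (nearest neighbour preserves discrete class ids)."""
    fh, fw = label_img.shape
    iu = np.clip(np.round(fu).astype(int), 0, fw - 1)
    iv = np.clip(np.round(fv).astype(int), 0, fh - 1)
    return label_img[iv, iu]
```

By construction, the optical axis of the virtual camera lands at the fisheye image centre, and labels predicted once on the distortion-heavy fisheye image can be consumed by downstream modules as if they came from a conventional narrow field-of-view camera.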

Citation (APA)
Petrovai, A., & Nedevschi, S. (2022). Semantic Cameras for 360-Degree Environment Perception in Automated Urban Driving. IEEE Transactions on Intelligent Transportation Systems, 23(10), 17271–17283. https://doi.org/10.1109/TITS.2022.3156794
