Abstract
In this paper, we provide a survey of automotive surround-view fisheye optics, with an emphasis on the impact of optical artifacts on computer vision tasks in autonomous driving and ADAS. The automotive industry has advanced in applying state-of-the-art computer vision to enhance road safety and provide automated driving functionality. When using camera systems on vehicles, there is a particular need for a wide field of view to capture the vehicle's entire surroundings, in use cases such as low-speed maneuvering, automated parking, and cocoon sensing. However, one crucial challenge for surround-view cameras is the strong optical aberrations of fisheye lenses, an area that has received little attention in the literature. Additionally, comprehensive datasets are needed for testing safety-critical scenarios in vehicle automation. The industry has turned to simulation as a cost-effective strategy for creating synthetic datasets with surround-view camera imagery. We examine different simulation methods (such as model-driven and data-driven simulation) and discuss each simulator's ability, or lack thereof, to model real-world optical performance. Overall, this paper highlights the optical aberrations present in automotive fisheye datasets and the limits of optical realism in simulated fisheye datasets, with a focus on computer vision in surround-view optical systems.
Jakab, D., Deegan, B. M., Sharma, S., Grua, E. M., Horgan, J., Ward, E., … Eising, C. (2024). Surround-View Fisheye Optics in Computer Vision and Simulation: Survey and Challenges. IEEE Transactions on Intelligent Transportation Systems, 25(9), 10542–10563. https://doi.org/10.1109/TITS.2024.3368136