Evaluation of video-based driver assistance systems with sensor data fusion by using virtual test driving

Abstract

Research and/or Engineering Questions/Objective: The vehicle of the future will support its driver by warning of potential hazards. An essential prerequisite is the sensor-based perception of the traffic situation. For the recognition of traffic-related objects, camera-based sensors, depth cameras, and vehicle sensors are used, as well as radar and lidar sensors. For the future development of ADAS, the fusion of data from multiple sensors into a consistent picture of the environment will play a key role. Evaluation based on real-world driving tests alone will no longer be sufficient because of the complexity of the system interactions. New simulation methods are needed to evaluate ADAS in virtual test driving with realistic vehicle behavior and a complex traffic environment.

Methodology: It is therefore important to integrate camera-based components into a closed-loop simulation platform in order to test sensor data fusion technologies under realistic conditions. Today, new driver assistance systems are tested in a simulation environment by filming animation data and subsequently feeding it to an image processing or fusion algorithm. This method cannot be applied, however, when wide-angle cameras such as cameras with fisheye lenses are used. Within a research framework for autonomous driving functions, a new simulation technology was developed to integrate virtual cameras, alongside the established environment sensors, into the vehicle dynamics simulation CarMaker. For this purpose, the real-time animation was extended with a sophisticated virtual camera model, called "VideoDataStream", which generates simultaneous video data (including PMD data for 3D images). The camera positions and camera properties can be configured individually. In addition, the type of lens (e.g. fisheye) can be freely defined, together with lens settings such as the opening angle and typical lens errors (e.g. distortion and vignetting). With this technology, camera and radar data can be provided synchronized in time and space to the fusion algorithm under test.

Results: The video data can be used to evaluate image processing and sensor data fusion in Model-, Software- and Hardware-in-the-Loop applications under virtual test driving conditions. The developed method and examples of image-based perception of the vehicle environment, as well as sensor data fusion algorithms, are presented. Among other things, this covers the recognition of traffic lanes, traffic signs, and other traffic participants, and the fusion of the individual pieces of information into a comprehensive picture of the environment. A further field of application is the connection with navigation systems and digital maps: the virtual vehicle supplies the navigation system with its GPS position and receives the "MPP - Most Probable Path" together with the "electronic horizon", a type of predictive sensor that provides all the preview information ahead of the vehicle defined in the ADASIS protocol.

Conclusion: With the introduced method, the capability and efficiency of function development and testing in the area of Advanced Driver Assistance Systems are significantly improved. Thanks to a powerful simulation environment, a broad range of validation tests can be shifted into simulation, because even complex test scenarios can be replicated and the tests are reproducible. The simulation data can be provided synchronized in time and space, which is essential, e.g. for a fusion algorithm under test. © Springer-Verlag 2013.
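
The abstract does not publish the internals of the "VideoDataStream" camera model. Purely as an illustration of the kind of lens parameterization it describes (opening angle, fisheye projection, distortion, vignetting), the following is a minimal sketch assuming an equidistant fisheye projection with a single polynomial distortion term and a cos^4 vignetting law; all function names and parameters are hypothetical and not taken from CarMaker.

```python
import numpy as np

def project_fisheye(point_cam, fov_deg=185.0, resolution=(1280, 800), k1=0.05):
    """Project a 3D point given in camera coordinates with an equidistant
    fisheye model (hypothetical sketch): the angle to the optical axis maps
    almost linearly to the image radius, modified by a distortion term k1."""
    x, y, z = point_cam
    r_xy = np.hypot(x, y)
    theta = np.arctan2(r_xy, z)                    # angle to the optical axis
    theta_d = theta * (1.0 + k1 * theta**2)        # simple radial distortion
    max_theta = np.radians(fov_deg) / 2.0
    focal = (min(resolution) / 2.0) / max_theta    # pixels per radian
    r_img = focal * theta_d
    phi = np.arctan2(y, x)
    u = resolution[0] / 2.0 + r_img * np.cos(phi)
    v = resolution[1] / 2.0 + r_img * np.sin(phi)
    vignetting = np.cos(min(theta, np.pi / 2)) ** 4  # cos^4 fall-off toward the rim
    return (u, v), vignetting

# Example: a point 10 m ahead and 3 m to the side of the camera
(uv, gain) = project_fisheye(np.array([3.0, 0.0, 10.0]))
print(uv, gain)
```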
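
The abstract stresses that camera and radar data must reach the fusion algorithm synchronized in time and space; how the simulation platform achieves this internally is not described. The sketch below only illustrates the generic idea of pairing each camera frame with the radar scan closest in time, using hypothetical sample records.

```python
from bisect import bisect_left
from dataclasses import dataclass

@dataclass
class Sample:
    t: float         # simulation time in seconds
    payload: object  # image or radar target list (hypothetical)

def pair_nearest(camera, radar, max_skew=0.01):
    """Pair each camera frame with the radar scan nearest in time.
    Both lists are assumed sorted by timestamp; pairs whose time difference
    exceeds max_skew (seconds) are dropped."""
    radar_times = [s.t for s in radar]
    pairs = []
    for frame in camera:
        i = bisect_left(radar_times, frame.t)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(radar)]
        best = min(candidates, key=lambda j: abs(radar[j].t - frame.t))
        if abs(radar[best].t - frame.t) <= max_skew:
            pairs.append((frame, radar[best]))
    return pairs

# Example: 30 Hz camera frames vs. 20 Hz radar scans over one second
cam = [Sample(t=i / 30.0, payload=f"img{i}") for i in range(30)]
rad = [Sample(t=i / 20.0, payload=f"scan{i}") for i in range(20)]
print(len(pair_nearest(cam, rad)))  # only frames within 10 ms of a scan survive
```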

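The ADASIS protocol defines incremental binary messages for the electronic horizon that are not reproduced here. Purely to illustrate the concept mentioned in the abstract (preview attributes along the most probable path ahead of the vehicle), here is a hypothetical data structure and lookup; it is not the ADASIS message format.

```python
from dataclasses import dataclass

@dataclass
class HorizonPoint:
    offset_m: float       # distance ahead of the vehicle along the MPP
    curvature: float      # path curvature in 1/m
    speed_limit_kmh: int

# Hypothetical electronic horizon along the most probable path (MPP)
horizon = [
    HorizonPoint(0.0,   0.000, 100),
    HorizonPoint(250.0, 0.004,  80),   # curve begins, lower speed limit
    HorizonPoint(600.0, 0.000, 100),
]

def preview(horizon, lookahead_m):
    """Return the horizon points within the given preview distance."""
    return [p for p in horizon if p.offset_m <= lookahead_m]

print(preview(horizon, 300.0))  # attributes an ADAS function can react to in advance
```
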
Cite

Schick, B., & Schmidt, S. (2013). Evaluation of video-based driver assistance systems with sensor data fusion by using virtual test driving. In Lecture Notes in Electrical Engineering (Vol. 196 LNEE, pp. 1363–1375). Springer Verlag. https://doi.org/10.1007/978-3-642-33738-3_36
