The development of autonomous vehicles is becoming increasingly popular, and gathering real-world data is considered a valuable task. Many datasets have been published recently in the autonomous vehicle sector, with synthetic datasets gaining particular interest due to their availability and low cost. For a real implementation and correct evaluation of vehicles at higher levels of autonomy, however, it is also necessary to consider human interaction, which is precisely what existing datasets lack. This article presents the UPCT dataset, a public dataset containing high-quality, multimodal data obtained with state-of-the-art sensors and equipment installed on board the UPCT's CICar autonomous vehicle. The dataset includes data from a variety of perception sensors, including 3D LiDAR, cameras, IMU, GPS, and encoders, as well as driver biometric data and driver behaviour questionnaires. In addition to the dataset, the software developed for data synchronisation and processing has been made available. The quality of the dataset was validated using an end-to-end, multiple-input neural network model that predicts speed and steering wheel angle, and it obtained very promising results.
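As an illustration only, the sketch below shows what a multi-input end-to-end model of the kind described in the abstract might look like: a convolutional branch for a camera frame fused with a fully connected branch for a vector of synchronised sensor readings, regressing speed and steering wheel angle. The class name, branch sizes, and input dimensions are assumptions for illustration, not the authors' actual architecture.

```python
# Minimal sketch (assumed architecture, not the one from the paper): fuse a
# camera image with a low-dimensional vector of synchronised sensor features
# and regress two targets: vehicle speed and steering wheel angle.
import torch
import torch.nn as nn


class MultiInputDrivingModel(nn.Module):
    def __init__(self, sensor_dim: int = 16):
        super().__init__()
        # Convolutional branch for the camera image (3 x 66 x 200 assumed).
        self.image_branch = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),      # -> (N, 48)
        )
        # Fully connected branch for the sensor vector (e.g. IMU / encoders).
        self.sensor_branch = nn.Sequential(
            nn.Linear(sensor_dim, 64), nn.ReLU(),
            nn.Linear(64, 32), nn.ReLU(),               # -> (N, 32)
        )
        # Fusion head with two regression outputs: speed and steering angle.
        self.head = nn.Sequential(
            nn.Linear(48 + 32, 64), nn.ReLU(),
            nn.Linear(64, 2),
        )

    def forward(self, image: torch.Tensor, sensors: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.image_branch(image), self.sensor_branch(sensors)], dim=1)
        return self.head(fused)  # [:, 0] -> speed, [:, 1] -> steering wheel angle


if __name__ == "__main__":
    model = MultiInputDrivingModel()
    img = torch.randn(4, 3, 66, 200)   # batch of camera frames
    aux = torch.randn(4, 16)           # batch of synchronised sensor vectors
    print(model(img, aux).shape)       # torch.Size([4, 2])
```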
CITATION STYLE
Rosique, F., Navarro, P. J., Miller, L., & Salas, E. (2023). Autonomous Vehicle Dataset with Real Multi-Driver Scenes and Biometric Data. Sensors, 23(4). https://doi.org/10.3390/s23042009