Can ADAS Distract Driver's Attention? An RGB-D Camera and Deep Learning-Based Analysis


Abstract

Driver inattention is the primary cause of vehicle accidents; hence, manufacturers have introduced systems to support the driver and improve safety. Nonetheless, advanced driver assistance systems (ADAS) must be properly designed so that the feedback they provide does not itself become a source of distraction for the driver. In the present study, an experiment involving auditory and haptic ADAS was conducted with 11 participants, whose attention was monitored during their driving experience. An RGB-D camera was used to acquire the drivers' face data. These images were then analyzed with a deep learning-based approach, i.e., a convolutional neural network (CNN) specifically trained to perform facial expression recognition (FER). Analyses were carried out to assess possible relationships between these results and both ADAS activations and event occurrences, i.e., accidents. A correlation between attention and accidents emerged, whereas facial expressions and ADAS activations were found to be uncorrelated; thus, no evidence emerged that the designed ADAS are a possible source of distraction. In addition to the experimental results, the proposed approach proved to be an effective tool for monitoring the driver through non-invasive techniques.
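
As a rough illustration of the kind of pipeline the abstract describes (CNN-based facial expression recognition on face crops, followed by a correlation analysis against recorded events), the Python sketch below shows one possible implementation. The network layout, the seven expression classes, the per-trial attention proxy, and all data values are hypothetical assumptions for illustration only, not the authors' actual model or measurements.

    # Minimal sketch of the analysis pipeline described in the abstract.
    # All names (network layout, 7 expression classes, attention proxy)
    # are illustrative assumptions, not the authors' actual model or data.
    import torch
    import torch.nn as nn
    from scipy.stats import pearsonr

    class FerCNN(nn.Module):
        """Toy CNN classifying a 48x48 grayscale face crop into 7 expressions."""
        def __init__(self, n_classes: int = 7):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Linear(32 * 12 * 12, n_classes)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return self.classifier(self.features(x).flatten(1))

    # Per-frame inference on (stand-in) face crops from the RGB-D stream.
    model = FerCNN().eval()
    frames = torch.randn(100, 1, 48, 48)           # placeholder for real face crops
    with torch.no_grad():
        expressions = model(frames).argmax(dim=1)  # predicted expression per frame

    # Correlate a per-trial attention proxy with event occurrences (dummy values).
    attention_score = [0.9, 0.7, 0.8, 0.4, 0.6]    # e.g. fraction of "neutral" frames
    accidents       = [0,   1,   0,   1,   1  ]    # accidents recorded per trial
    r, p = pearsonr(attention_score, accidents)
    print(f"Pearson r = {r:.2f}, p = {p:.3f}")

In practice, the study would use face crops extracted from the RGB-D recordings and a FER network trained on labeled expression data; the correlation step above merely illustrates how expression-derived scores could be tested against ADAS activations and accident occurrences.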

Citation (APA)

Ulrich, L., Nonis, F., Vezzetti, E., Moos, S., Caruso, G., Shi, Y., & Marcolin, F. (2021). Can ADAS distract driver's attention? An RGB-D camera and deep learning-based analysis. Applied Sciences (Switzerland), 11(24). https://doi.org/10.3390/app112411587
