The safety of road users and vehicle passengers is of great importance on increasingly crowded roads and highways. To this end, correct perception of driving conditions is essential for a driver to react appropriately to a given driving situation. Various sensors are currently used to recognize the driving context. To further enhance this perception of the driving environment, this paper proposes the use of UAVs (unmanned aerial vehicles, also known as drones). In this work, drones are equipped with sensors (radar, lidar, camera, etc.) capable of detecting obstacles, accidents, and similar events. Owing to their small size and mobility, drones can collect perception data and transmit them to the vehicle over a secure link, such as an RF, VLC, or hybrid communication protocol. The data obtained from these different sources are then combined and processed using a knowledge base and a set of logical rules. The knowledge base is represented by an ontology; it contains logical rules about the weather, the suitability of each sensor for a given weather condition, and the activation of the UAVs carrying these sensors. It also includes rules for selecting which communication protocol to use. Finally, various driving context cognition rules are provided. The result is a more reliable perception of the environment for the vehicle. When necessary, users are provided with driving assistance information, leading to safer driving and fewer road accidents. As a proof of concept, several use cases were tested in a driving simulator in the laboratory. Experimental results show that the system is an effective tool for improving driving context recognition and preventing road accidents.
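To make the rule-based reasoning described above more concrete, the following is a minimal, hypothetical sketch (not the authors' implementation, and not the actual ontology) of the kind of logic involved: given a weather condition, it selects the UAV sensors suited to that condition and a communication link for relaying perception data to the vehicle. All names, rules, and thresholds here are illustrative assumptions.

```python
# Hypothetical sketch of weather-aware sensor activation and protocol selection,
# loosely mirroring the ontology rules described in the abstract. Illustrative only.

from dataclasses import dataclass

# Illustrative suitability rules: which UAV sensors work well in which weather.
SENSOR_RULES = {
    "clear": ["camera", "lidar", "radar"],
    "fog":   ["radar"],               # optical sensors degrade in fog
    "rain":  ["radar", "lidar"],
    "night": ["radar", "lidar"],      # camera needs ambient light
}


def select_protocol(weather: str, line_of_sight: bool) -> str:
    """Illustrative link selection: VLC needs line of sight and favorable
    conditions; otherwise fall back to a hybrid scheme or plain RF."""
    if line_of_sight and weather in ("clear", "night"):
        return "VLC"
    if line_of_sight:
        return "hybrid RF/VLC"
    return "RF"


@dataclass
class UavTasking:
    sensors: list
    protocol: str


def plan_uav(weather: str, line_of_sight: bool) -> UavTasking:
    """Activate the sensors appropriate to the weather and pick a link."""
    sensors = SENSOR_RULES.get(weather, ["radar"])  # radar as a safe default
    return UavTasking(sensors=sensors,
                      protocol=select_protocol(weather, line_of_sight))


if __name__ == "__main__":
    print(plan_uav("fog", line_of_sight=False))
    # UavTasking(sensors=['radar'], protocol='RF')
```

In the paper itself, such rules are expressed in an ontology rather than hard-coded tables; the sketch only conveys the decision flow (weather, then sensor set, then communication protocol).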