Driver Assistance System
"Driver Assistance System" integrates functional logic with the driving simulation, allowing the user to detect different kinds of driver behavior, such as drowsiness indicators (blinks, yawns, head position) or distraction indicators (detected through the head position).
Furthermore, "Driver Assistance System" uses information from the vehicle (currently we extract vehicle data through a ROS plug-in running alongside the driving simulation), which is filtered through the functional logic to identify specific vehicle statuses and determine whether the system must be enabled or disabled.
Once enabled, the system's functional logic keeps filtering vehicle data to decide when a detected behavior should be considered relevant in terms of potential danger.
(For more about Drowsiness and Distraction status indicators and values, please check Alarm Notification System.)
Once the simulator is integrated correctly, the system can use the "Vehicle Data" to drive logic such as initializing Driver Assistance and establishing different types of alarms.
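As a rough sketch of how such logic might consume the vehicle data, the snippet below gates the system on the ignition status. The `VehicleData` fields and function names are illustrative assumptions, not the project's actual API; the real values come from the ROS plug-in.

```python
# Hypothetical sketch: gating the Driver Assistance System on vehicle data.
# Field and function names are illustrative; the real project reads these
# values from the ROS 2 plug-in running alongside the driving simulation.
from dataclasses import dataclass


@dataclass
class VehicleData:
    ignition_on: bool
    gear_status: str    # e.g. "Driving", "Reverse", "Parking"
    speed_kmh: float


def assistance_enabled(data: VehicleData) -> bool:
    """The system starts only once the vehicle's engine is on."""
    return data.ignition_on


print(assistance_enabled(VehicleData(True, "Parking", 0.0)))    # True
print(assistance_enabled(VehicleData(False, "Driving", 30.0)))  # False
```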
Currently, the Driver Assistance System offers different enabling options depending on the intended usage. On the one hand, by following the "Usage" instructions described in our Installation Guide, the user can run the system through a console terminal, independently from the simulation, and validate all the recognition models implemented. There you will find not only example images/videos but also the command lines required to run the system with your own camera device.
On the other hand, Driver Assistance System enablement can be executed along with the driving simulation, in order to tie the enabling conditions to the available vehicle data, such as the vehicle's ignition status. For further instructions on how to set up the driving simulation, please refer to the following link.
First, given that the driving simulation (using the ROS 2 plug-in) is set up and running, the Driver Assistance System will be paired to the vehicle's ignition status. While in the simulation, pressing the E key turns the vehicle's engine on or off. Once the engine is started, the Driver Assistance System performs facial recognition to verify that the person in the cockpit is an authorized driver. For this, we compare the detected face with a photo stored in an "Authorized Drivers Database" (check Driver Recognition), and if they match, the Driver Assistance System proceeds to driver behavior detection.
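The authorization check can be sketched as follows. This is a minimal illustration assuming face embeddings compared by Euclidean distance against a stored database; the function names, embedding format, and the 0.6 threshold are placeholders, not the project's actual recognition model.

```python
# Illustrative sketch of the driver authorization check: a face embedding from
# the camera is compared against embeddings stored for authorized drivers.
# The embeddings, distance metric, and 0.6 threshold are all placeholder
# assumptions, not the project's actual recognition model.
import math


def euclidean(a, b):
    """Euclidean distance between two equal-length embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def is_authorized(detected, authorized_db, threshold=0.6):
    """True if the detected embedding is close enough to any stored driver."""
    return any(euclidean(detected, ref) < threshold for ref in authorized_db)


db = [[0.1, 0.2, 0.3]]  # stored "Authorized Drivers Database" embeddings
print(is_authorized([0.12, 0.21, 0.29], db))  # True: close match
print(is_authorized([0.90, 0.90, 0.90], db))  # False: unknown face
```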
The use case for handling "Unauthorized Driver" detection is currently under development.
Caution: If you have the "Detections Results" window open (it is generated when running our development), take special care not to press any key, because that stops the execution.
To enhance our inference accuracy, we have defined different functional rules within the system's logic that use vehicle data to determine when and how to manage the detection process (note that this only works when Driver Assistance runs along with the driving simulation). In this matter, we have chosen to focus mainly on the vehicle's speed and gear status.
For Drowsiness State Detection, we consider specifically the gear status. For example:
- Given [gearStatus=Driving], when the user blinks/yawns repeatedly, then the Driver Assistance System will detect these facial expressions as Drowsiness Status indicators.
- Given [gearStatus=Reverse], when the user blinks/yawns repeatedly, then the Driver Assistance System will detect these facial expressions as Drowsiness Status indicators.
- Given [gearStatus=Parking], when the user blinks/yawns repeatedly, then the Driver Assistance System will not detect any Drowsiness Status indicator, because while the car is parked the driver's behavior does not represent a potentially dangerous situation.
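The gear-based rules above can be condensed into a single predicate. This is a minimal sketch, assuming gear status arrives as the strings used in the examples; the actual implementation may differ.

```python
# Sketch of the gear-based drowsiness rule from the examples above.
# Gear values are assumed to be the strings used on this page.
def report_drowsiness(gear_status: str, indicator_detected: bool) -> bool:
    """Blinks/yawns count as Drowsiness indicators only in Driving or
    Reverse; a parked car poses no potential danger."""
    return indicator_detected and gear_status in ("Driving", "Reverse")


print(report_drowsiness("Driving", True))  # True
print(report_drowsiness("Parking", True))  # False
```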
For Distraction State Detection, we take into account not only the gear status but also the vehicle's speed. For example:
- Eyes out of the Road
- Given [gearStatus=Driving] and [vehicleSpeed>5km/h], when the user moves the head to a side, then the Driver Assistance System will detect an "Eyes Out Of Road" event.
- Given [gearStatus=Reverse] and [vehicleSpeed>5km/h], when the user moves the head to a side, then the Driver Assistance System will not detect an "Eyes Out Of Road" event, since the driver is expected to look to the sides while the vehicle moves in reverse.
- Given [gearStatus=Parking], when the user moves the head to a side, then the Driver Assistance System will not detect any "Eyes Out Of Road" event.
- Stop looking at (...)
- Given [gearStatus=Driving] and [vehicleSpeed>2km/h], when the user moves the head to a side, then the Driver Assistance System will detect a "Stop looking at (...)" event.
- Given [gearStatus=Reverse] and [vehicleSpeed>2km/h], when the user moves the head to a side, then the Driver Assistance System will detect a "Stop looking at (...)" event.
- Given [gearStatus=Parking], when the user moves the head to a side, then the Driver Assistance System will not detect any "Stop looking at (...)" event.
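Both distraction rules can likewise be sketched as predicates over gear status and speed, using the 5 km/h and 2 km/h thresholds from the examples above. The function names and signatures are illustrative, not the project's actual code.

```python
# Sketch of the two distraction rules, using the 5 km/h and 2 km/h
# thresholds from the examples. Names and signatures are illustrative.
def eyes_out_of_road(gear: str, speed_kmh: float, head_turned: bool) -> bool:
    """Flagged only while driving forward above 5 km/h; looking sideways
    in Reverse or Parking is expected behavior."""
    return head_turned and gear == "Driving" and speed_kmh > 5


def stop_looking_at(gear: str, speed_kmh: float, head_turned: bool) -> bool:
    """Flagged in Driving and Reverse above 2 km/h, never while parked."""
    return head_turned and gear in ("Driving", "Reverse") and speed_kmh > 2


print(eyes_out_of_road("Reverse", 20.0, True))  # False: sideways look expected
print(stop_looking_at("Reverse", 20.0, True))   # True
```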