Will emotion recognition enable safer driving experiences?

by Naveen Joshi – Director at Allerin

Works on Data Analytics and Strategies, Process Automation, Connected Infrastructure (IoT)

Driver emotion recognition in automobiles can make travel safer by triggering responses and recommendations that match the driver's emotional state.

When it comes to road safety, many people think of factors like speeding and reckless behavior. One of the most important factors, however, is often forgotten: the emotional state of the driver, which strongly influences how he or she drives. Anger, for instance, encourages aggressive driving and increases the chances of a road accident. Stress is another example; one of its most common triggers behind the wheel is traffic congestion, and it can erode focus and attention, again raising the risk of an accident. Because emotions influence how a driver drives, automobile manufacturers are adopting driver emotion recognition to help prevent road accidents.

How driver emotion recognition can increase safety

An emotion recognition system scans the driver's face and compares it against data on various emotional expressions to determine the driver's current emotional state. Detection cannot happen only at the start of a trip, either: a driver's emotions can change during the journey, so the system must monitor them continuously. Computer vision-enabled cameras continuously capture the driver's facial expressions, and deep learning networks then match those expressions against labeled datasets to recognize the driver's emotions.
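As a rough illustration, the continuous monitoring loop might be structured like the Python sketch below. The `Frame` type, the `toy_classifier`, and the feature values are all hypothetical stand-ins for the camera input and the trained deep network described above:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Frame:
    """Stand-in for a camera frame; a real system would hold pixel data."""
    face_features: List[float]  # features extracted from the driver's face

def monitor_driver(frames, classify: Callable[[Frame], str]) -> List[str]:
    """Classify every frame so that changing emotions are tracked
    throughout the journey, not just at the start of the trip."""
    return [classify(frame) for frame in frames]

def toy_classifier(frame: Frame) -> str:
    """Toy stand-in for a deep network trained on labeled emotional
    faces; it simply thresholds a single feature value."""
    score = frame.face_features[0]
    if score > 0.7:
        return "anger"
    if score < 0.3:
        return "fatigue"
    return "neutral"

journey = [Frame([0.5]), Frame([0.8]), Frame([0.2])]
print(monitor_driver(journey, toy_classifier))  # one label per frame
```

In a real vehicle, `classify` would wrap a trained network and `frames` would stream from an in-cabin camera; the loop structure stays the same.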


Once the driver's emotion is detected, the emotion recognition system can signal other AI-enabled devices to take appropriate action. In the case of fatigue, for instance, an AI-enabled chatbot can converse with the driver to help him or her stay awake, and, based on the driver's mood, it can also interact to ease stress. Sometimes, though, conversation alone will not change the situation.
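A minimal sketch of how detected emotions might be mapped to first-line chatbot responses; the emotion labels and response strings are illustrative assumptions, not a real in-vehicle API:

```python
def respond(emotion: str) -> str:
    """Map a detected emotion to a first-line, non-intrusive response.
    The response strings are placeholders for real chatbot behavior."""
    responses = {
        "fatigue": "chatbot: start a conversation to keep the driver awake",
        "stress": "chatbot: play a calming prompt and suggest a break",
        "anger": "chatbot: suggest slowing down and taking a breath",
    }
    return responses.get(emotion, "no action needed")

print(respond("fatigue"))
```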

For such scenarios, the emotion recognition system can trigger other vehicle systems to pre-charge the brakes. And if it is embedded in a semi-automated vehicle, the automated systems can take over driving without the driver noticing: the driver may believe he or she is still driving, while the automated systems are actually in control.
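The escalation step could be sketched as a simple rule: if a risky emotion persists despite chatbot intervention, signal the vehicle systems. The window size, emotion labels, and action names below are assumptions for illustration:

```python
def escalate(emotion_history, window=3):
    """If the same class of risky emotion persists for `window`
    consecutive readings, escalate beyond conversation: pre-charge the
    brakes and, on a semi-automated vehicle, request that the automated
    driving system take over."""
    risky = {"anger", "fatigue", "stress"}
    recent = emotion_history[-window:]
    if len(recent) == window and all(e in risky for e in recent):
        return ["pre_charge_brakes", "request_automated_takeover"]
    return []  # chatbot intervention continues, no escalation

print(escalate(["neutral", "fatigue", "fatigue", "fatigue"]))
```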

Adjusting the in-vehicle ambiance can also make the driver more comfortable and provide some non-distracting entertainment. Depending on the driver's mood, for instance, the music, ambient lighting, and air-conditioning can all be adjusted.
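One way to picture the ambiance logic is a lookup from mood to cabin presets; every concrete value here (playlist styles, light settings, temperatures) is an illustrative assumption:

```python
def adjust_ambiance(emotion: str) -> dict:
    """Suggest cabin settings for a detected mood. The concrete
    values are illustrative, not recommendations."""
    presets = {
        "stress": {"music": "slow tempo", "light": "soft warm", "ac_celsius": 21},
        "anger": {"music": "calm instrumental", "light": "dim blue", "ac_celsius": 20},
        "fatigue": {"music": "upbeat", "light": "bright cool", "ac_celsius": 18},
    }
    # Unknown or neutral moods leave the cabin as it is.
    return presets.get(emotion, {"music": "unchanged", "light": "unchanged", "ac_celsius": 22})

print(adjust_ambiance("stress"))
```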

Using driver emotion recognition for a safer journey does sound appealing, but several challenges can affect the accuracy of these systems. In-vehicle lighting, for instance, can reflect off the driver's face and make it appear shiny, which can cause inaccurate emotion detection. Another challenge is partial visibility of the face.

Drivers might wear sunglasses, for instance, or turn their heads to check the rearview mirrors. In both cases, the driver's entire face will not be visible to the system's cameras. To recognize emotions accurately, emotion recognition systems should therefore be trained on such scenarios so that they become more robust and reliable.
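Training for partial faces usually means augmenting the dataset with occluded examples. Below is a minimal, library-free sketch that zeroes out a rectangular patch of a grayscale face (a list of pixel rows) to mimic sunglasses or a turned head; the patch sizes and data format are assumptions:

```python
import random

def occlude(face, top, left, height, width):
    """Return a copy of a 2-D grayscale face with a rectangular
    region zeroed out, mimicking sunglasses or a turned head that
    hides part of the face from the camera."""
    out = [row[:] for row in face]
    for r in range(top, min(top + height, len(out))):
        for c in range(left, min(left + width, len(out[r]))):
            out[r][c] = 0
    return out

def augment(dataset, rng=random.Random(0)):
    """Pair each training face with an occluded variant so the
    classifier also learns from partially visible faces."""
    augmented = []
    for face in dataset:
        augmented.append(face)
        top = rng.randrange(len(face))
        left = rng.randrange(len(face[0]))
        augmented.append(occlude(face, top, left, 2, 3))
    return augmented

face = [[1] * 6 for _ in range(6)]  # toy 6x6 all-ones "face"
aug = augment([face])
print(len(aug))  # original plus one occluded copy
```

A production pipeline would apply the same idea to real images, typically alongside other augmentations such as rotation and lighting changes.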