Emotion Recognition Technology in the Automotive Industry: A Careful Second Seater

The book “The Loop” by Jeremy Robert Johnson argues that our minds make most of their decisions through unconscious instinct. These instincts follow strong patterns, yet no pattern-recognition algorithm can fully predict what the instinctive system will do next. For example, yawning can point to drowsiness or boredom, and heavy eyelids signal fatigue or drowsiness. Sleep is one of the unconscious instincts behind these signs: when you are tired, your brain makes the decision to sleep without asking you. Now consider driving a vehicle while showing signs of drowsiness or fatigue. Staying alert is critical at that moment, and an unconscious instinct like sleep could prove fatal, so for safety’s sake we must be warned as soon as any such sign appears. You will not always have a second seater with you while driving, but technology can accompany you everywhere, and often more reliably than a human passenger. One such technology is Emotion Recognition Technology, or Emotion A.I. This raises a few questions.

What is Emotion Recognition Technology?

Emotion Recognition Technology recognizes a wide range of human emotions by reading facial features that are captured by a camera and analyzed with machine learning. These features include the movement of facial parts such as the eyes, mouth, and head, and once an emotion has been identified accurately, a computer can be programmed to take an appropriate next step. In this article, we will look at Emotion Recognition Technology in detail.

Emotion Recognition Technology (ERT)

Emotion Recognition Technology uses several signals to automatically detect a person’s emotional state. These signals may include facial expressions, tone of voice, and body posture. Expressions such as lip movements, brow furrows, and eye and head movements help the algorithm infer a person’s state. Here we will discuss the circumplex model, in which a person’s state is determined from the valence and arousal ratings of their facial expressions. Valence describes how positive or negative an emotion is, whereas arousal describes its intensity; together, valence and arousal ratings can describe a wide range of emotional states. Emotion Recognition Technology is used in almost every field, but this article focuses on its use and effectiveness in the automotive industry.
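The circumplex idea can be sketched in a few lines of code. The category names and the simple quadrant split below are illustrative assumptions, not a standard taxonomy; real systems score valence and arousal continuously from facial features.

```python
# Minimal sketch of the circumplex model: mapping hypothetical valence and
# arousal scores (each assumed to lie in -1.0 .. 1.0) to a coarse emotional
# state by quadrant. Labels and thresholds are illustrative only.

def circumplex_state(valence: float, arousal: float) -> str:
    """Classify an emotion by its quadrant in valence-arousal space."""
    if valence >= 0 and arousal >= 0:
        return "excited/happy"    # positive emotion, high intensity
    if valence >= 0 and arousal < 0:
        return "calm/relaxed"     # positive emotion, low intensity
    if valence < 0 and arousal >= 0:
        return "angry/stressed"   # negative emotion, high intensity
    return "sad/drowsy"           # negative emotion, low intensity

print(circumplex_state(-0.4, -0.7))  # → sad/drowsy
```

Low valence combined with low arousal is the region most relevant to driver monitoring, since drowsiness sits in that negative, low-intensity quadrant.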

Importance of Emotion Recognition Technology in Vehicles

Drivers often experience drowsiness, in which the mind cannot stay alert on the road. In that condition, involuntary lapses such as microsleep are extremely dangerous and easy to miss. Drowsy driving puts both the driver and everyone around them at risk. According to the National Highway Traffic Safety Administration (NHTSA), drowsy driving is a factor, directly or indirectly, in roughly 2-3% of traffic deaths each year. Another study found that about 4% of drivers, or 1 in 25, reported falling asleep at the wheel within the previous month. It is therefore important to take the danger of drowsy driving seriously and to put measures in place that control the considerable risks it creates.

An Emotion Recognition A.I. system identifies the driver’s state and suggests safety measures. The in-vehicle Human Machine Interface (HMI) observes and compares facial cues such as yawning, eye movements, and head nodding; when the driver shows signs of being drowsy, sleepy, or fatigued, it prompts them to take a break and freshen up before continuing. The alert can be repeated through different channels, such as a seat vibration or an audio message on the speakers. In the most advanced cars, the Electronic Control Unit (ECU) can even engage an autopilot mode to slow the car down and stop it on the road shoulder after detecting a sleeping driver, while also alerting the driver with a visual sign and an audio beep.
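The escalation described above can be sketched as a small state machine. This is not any vendor’s actual HMI interface; the sign names, window size, and thresholds are all assumptions chosen to show the idea of escalating from a warning to an intervention as evidence accumulates.

```python
# Illustrative sketch: escalate alerts as drowsiness signs accumulate over a
# rolling window of recent observations. All names and thresholds are
# hypothetical, not a real in-vehicle API.

from collections import deque

DROWSY_SIGNS = {"yawn", "eyes_closed", "head_nod"}

class DrowsinessAlerter:
    def __init__(self, window: int = 30, warn_at: int = 5, intervene_at: int = 10):
        self.events = deque(maxlen=window)  # keeps only the last `window` signs
        self.warn_at = warn_at
        self.intervene_at = intervene_at

    def observe(self, sign: str) -> str:
        """Record one observation and return the action to take."""
        self.events.append(sign)
        hits = sum(1 for s in self.events if s in DROWSY_SIGNS)
        if hits >= self.intervene_at:
            return "stop_vehicle"   # e.g. slow down and pull over
        if hits >= self.warn_at:
            return "alert_driver"   # e.g. seat vibration plus audio message
        return "ok"
```

A stream of repeated drowsy signs would first trigger `alert_driver` and, if the driver does not recover, escalate to `stop_vehicle`, mirroring the warn-then-intervene behavior the paragraph describes.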

How Emotion Recognition Technology Works

Emotion Recognition Technology is an artificial intelligence technique built on machine learning or deep learning, where results improve as the labeled training data grows. The prerequisite for an effective Emotion Recognition A.I. system is a high-quality camera that captures facial expressions such as eye, mouth, and head movements. With the help of A.I. and machine learning, these expressions are analyzed on the vehicle’s computer or in-cabin human-machine interface (HMI). The driver’s captured expression data is compared against a model trained on labeled datasets, and the driver’s emotional state is produced in real time.
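The capture-extract-classify loop can be made concrete with a stubbed sketch. A real system would read frames from the in-cabin camera and run a trained deep-learning model; here both steps are replaced by stand-ins, and every function name, field name, and threshold is an assumption made for illustration.

```python
# Simplified, self-contained sketch of the detection pipeline. The camera and
# model are stubbed; only the pipeline structure is meant to be realistic.

def capture_frame():
    # Stand-in for grabbing one frame from the in-cabin camera.
    return {"eye_openness": 0.2, "mouth_opening": 0.8, "head_pitch": -15.0}

def extract_features(frame):
    # In a real system: face detection followed by facial-landmark regression.
    return (frame["eye_openness"], frame["mouth_opening"], frame["head_pitch"])

def classify(features):
    # Stand-in for a model trained on labeled facial-expression datasets.
    eye_openness, mouth_opening, head_pitch = features
    if eye_openness < 0.3 or mouth_opening > 0.7 or head_pitch < -10.0:
        return "drowsy"   # nearly closed eyes, wide yawn, or nodding head
    return "alert"

state = classify(extract_features(capture_frame()))
print(state)  # → drowsy
```

In production this loop would run continuously on each camera frame, with the classifier output feeding the HMI’s alert logic.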

Challenges in Emotion Recognition Technology

The stored dataset is one of the most important components of an efficient Emotion Recognition A.I. system. The following are some of the challenges that must be overcome to make Emotion Recognition Technology an effective model.

1. External light source

Drivers respond differently to external light sources. Glare from the headlights of oncoming vehicles, or shadows cast on the face by streetlights, distorts the captured images of the driver’s facial expressions. These irregularities can cause the driver’s mood to be captured and analyzed incorrectly, so the system displays the wrong emotional state.

2. Different poses

A driver can be engaged in many activities behind the wheel: eating or drinking, daydreaming, or even using a mobile phone. Because of all these different poses, the system must be able to cover every possible viewing angle and account for each of these cases.

3. Things that hide your face

An Emotion Recognition A.I. system may fail to capture and process the driver’s emotional state accurately if the driver wears sunglasses or a cap, or uses the sun visor, which can lead to missed or faulty alerts. The inability to detect eye movements is a genuine limitation of the technology. One way to work around it is to fall back on the driver’s head pose and body posture whenever eye movement cannot be detected.
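The fallback idea can be sketched as a simple channel switch. The field names, the eyelid-closure score, and the head-pitch conversion below are all illustrative assumptions; real systems use measures such as PERCLOS (percentage of eyelid closure over time) when the eyes are visible.

```python
# Illustrative fallback: use an eyelid-closure score when the eyes are visible,
# otherwise fall back to head pose as a weaker proxy. Fields and thresholds
# are hypothetical.

def drowsiness_score(obs: dict) -> float:
    """Return a drowsiness score in 0.0 .. 1.0 from whichever cue is available."""
    if obs.get("eyes_visible", False):
        # Primary channel: PERCLOS-style eyelid-closure ratio.
        return obs["eye_closure_ratio"]
    # Fallback channel: sustained downward head pitch (degrees), scaled so
    # that a 30-degree droop maps to the maximum score.
    return min(1.0, max(0.0, -obs["head_pitch_deg"] / 30.0))

print(drowsiness_score({"eyes_visible": True, "eye_closure_ratio": 0.6}))  # → 0.6
print(drowsiness_score({"eyes_visible": False, "head_pitch_deg": -15.0}))  # → 0.5
```

The fallback score is deliberately coarser, reflecting that head pose is a weaker signal than direct eye observation.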

4. Camera position

Camera positioning is also a factor of profound importance. The image or video feed that serves as the basis for judging a person’s state needs to be well captured; without good shots, the system’s accuracy inevitably suffers. Emotion Recognition Technology is an effective safety strategy for drowsy drivers, but the factors above make it challenging, and only if all of them are handled properly can a person’s emotional state be detected with high accuracy.

Extension of Emotion Recognition Technology in a vehicle

If we can build an Emotion Recognition A.I. system with a high level of accuracy, it can help driver safety in many more ways. For example, we can use it to curb aggressive driving and prevent road rage. If anger is detected, the vehicle’s ECU can take actions such as limiting the top speed, playing the driver’s favorite music, or playing an audio message that reminds the driver of their loved ones, all to defuse the aggression and prompt the driver to pay more attention to the road.
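This kind of state-to-response policy is naturally expressed as a lookup table. The action names below are hypothetical, not a real ECU interface; they simply mirror the interventions described above.

```python
# Hypothetical mapping from detected emotional state to mitigation actions.
# Action names are illustrative; a real ECU would expose its own interface.

MITIGATIONS = {
    "angry":  ["limit_top_speed", "play_favorite_music", "play_family_message"],
    "drowsy": ["vibrate_seat", "audio_alert", "suggest_break"],
}

def actions_for(state: str) -> list:
    """Return the mitigation actions for a detected state (none if unknown)."""
    return MITIGATIONS.get(state, [])

print(actions_for("angry"))
# → ['limit_top_speed', 'play_favorite_music', 'play_family_message']
```

Keeping the policy in a table like this makes it easy to tune or extend the responses per emotional state without touching the detection code.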

In the most advanced vehicles, the technology aims to provide autonomous control in a medical emergency such as cardiac arrest. For example, suppose the driver suffers a heart attack while driving, and the computer recognizes this through the camera and various sensors. The computer would then be programmed to take autonomous control of the vehicle and bring it to a safe stop, while a doctor monitors the driver through the camera and assesses their condition remotely.

Conclusion

The dangers associated with fatigued, stressed, or drowsy driving are worrisome because thousands of precious lives are lost to them. It is therefore important to protect our fellow drivers from casualties and fatalities. All new fleets of vehicles should be delivered with Emotion Recognition A.I. software pre-installed in the in-cabin human-machine interface (HMI) as a standard feature. Governments should encourage, or require, existing owners and long-distance truck and logistics fleets to equip their vehicles with a separate dashcam that ships with Emotion Recognition A.I. or driver-fatigue-detection software, and should make a working Emotion Recognition A.I. system a requirement for a vehicle to pass its assessment test. Several European countries have already made this an essential requirement and are applying it successfully. All of this may seem far-fetched today, but it could happen soon if governments everywhere recognize how many lives it could save.
