Opsis emotion AI helps make ride experiences safer and more enjoyable by alerting the driver to emotional disturbances that could lead to road accidents and by powering a fun, intuitive and empathic avatar.

The automotive industry worldwide is showing keen interest in, and adoption of, facial-expression emotion AI for fleet management and driver attention monitoring. Facial expression analysis, object recognition and upper-body movement tracking also support safety use cases inside the car. The industry is likewise innovating with avatar technologies that connect speech AI, computer vision, natural language understanding and recommendation engines to redefine the in-car experience.


Opsis emotion AI uses facial-expression, voice and gesture emotion analytics in driver assistance systems to detect the driver’s attention level and affective behaviour patterns on the road. Our emotion AI is trained to detect negative emotions with a high degree of accuracy, especially when these could reduce the driver’s attention and impair the ability to drive safely. It measures facial expressions and their intensity, and is sensitive to split-second changes in subtle expressions; voice and gesture analytics further improve accuracy.
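
As an illustration of the multimodal approach described above, the sketch below fuses hypothetical per-modality negative-emotion scores into a single driver-state estimate. The modality names, score ranges and fusion weights are assumptions for this example, not the Opsis API.

```python
from dataclasses import dataclass

# Hypothetical per-frame scores in [0, 1] from each analytics modality.
@dataclass
class ModalityScores:
    face_negative: float     # negative-emotion intensity from facial expressions
    voice_negative: float    # negative-emotion intensity from voice analytics
    gesture_negative: float  # agitation inferred from gestures / upper-body movement
    attention: float         # estimated attention level (1.0 = fully attentive)

# Illustrative fusion weights: the face carries most of the signal,
# while voice and gestures refine it.
WEIGHTS = {"face": 0.6, "voice": 0.25, "gesture": 0.15}

def fuse_negative_emotion(s: ModalityScores) -> float:
    """Combine the three modality scores into one negative-emotion score."""
    return (WEIGHTS["face"] * s.face_negative
            + WEIGHTS["voice"] * s.voice_negative
            + WEIGHTS["gesture"] * s.gesture_negative)

if __name__ == "__main__":
    frame = ModalityScores(face_negative=0.8, voice_negative=0.6,
                           gesture_negative=0.4, attention=0.55)
    print(f"fused negative emotion: {fuse_negative_emotion(frame):.2f}")
```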


Our emotion sensing also enables the creation of a human-like, empathic avatar – a fun, expressive and intuitive in-car companion. By making the digital assistant on the centre dashboard screen emotionally intelligent, it can go beyond standard infotainment functions to weave an emotional bond between the vehicle, its driver and its passengers. It is able to sense and respond to each user’s emotions, and learns user preferences over time, thereby deepening the sense of emotional engagement.

Safe Driving

  • Monitor indicators of the driver’s attention level, such as eye closure, drowsiness/fatigue and distraction, to ensure that attention stays on the road
  • Detect negative emotions such as anger, aggressiveness and ‘road rage’, anxiety, sadness and depression with high accuracy
  • Issue prompt alerts when negative emotions reduce the driver’s attention level and impair the ability to drive safely (see the sketch after this list)
  • Provide analytics on driving behaviour patterns that can be used to help tailor insurance premiums
  • When integrated with devices that measure physiological data such as heart rate and respiratory rate, identify sudden changes in medical condition that affect the driver’s ability to drive safely
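
As a concrete example of the alerting behaviour referenced in this list, the following snippet shows one way fused attention and negative-emotion scores could be mapped to alert levels. The threshold values and alert names are illustrative assumptions, not product parameters.

```python
# Illustrative alerting rule combining attention and negative-emotion scores.
# The thresholds and alert levels are assumptions for this sketch, not product values.
ATTENTION_FLOOR = 0.4        # below this the driver is treated as inattentive
NEGATIVE_EMOTION_CEIL = 0.7  # above this negative emotion is treated as severe

def driver_alert_level(attention: float, negative_emotion: float) -> str:
    """Map fused driver-state scores to a simple three-level alert."""
    if attention < ATTENTION_FLOOR and negative_emotion > NEGATIVE_EMOTION_CEIL:
        return "critical"  # e.g. audible warning plus a prompt to pull over
    if attention < ATTENTION_FLOOR or negative_emotion > NEGATIVE_EMOTION_CEIL:
        return "warning"   # e.g. dashboard prompt suggesting a break
    return "normal"

if __name__ == "__main__":
    print(driver_alert_level(attention=0.3, negative_emotion=0.8))  # -> critical
    print(driver_alert_level(attention=0.9, negative_emotion=0.2))  # -> normal
```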

Human-like, Empathic Avatar

  • Sense the emotions of the driver and passengers and adjust naturally for a more enjoyable, playful ride experience
  • Enable a mood-reactive conversation with the avatar, turning it into your in-car companion
  • Observe the driver’s or passengers’ sentiment to personalise preferences based on past mood metrics
  • Adjust ambient lighting and temperature upon sensing the driver’s frustration with heavy traffic, or according to the time of day
  • Remind drowsy drivers to stay alert and play more energizing music upon sensing frequent yawning and/or signs of eye fatigue during long trips (a simple rule sketch follows this list)
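
The rule sketch referenced above illustrates how a mood-reactive avatar might map a sensed cabin mood to simple actions; the mood labels, yawn threshold and actions are hypothetical examples rather than a fixed feature list.

```python
# Illustrative mapping from a sensed cabin mood to avatar/cabin actions.
# Mood labels, the yawn threshold and the actions are hypothetical examples.
def avatar_response(mood: str, yawns_per_minute: float = 0.0) -> list:
    """Return the cabin actions the avatar might take for a sensed mood."""
    actions = []
    if mood == "frustrated":  # e.g. stop-and-go traffic detected
        actions += ["soften ambient lighting", "lower cabin temperature slightly",
                    "switch to a calming playlist"]
    elif mood == "drowsy" or yawns_per_minute > 2:
        actions += ["spoken reminder to stay alert", "play more energizing music"]
    elif mood == "happy":
        actions += ["mirror the expression on the avatar", "suggest a favourite playlist"]
    return actions

if __name__ == "__main__":
    print(avatar_response("frustrated"))
    print(avatar_response("neutral", yawns_per_minute=3))
```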

In-Cabin Advertising for Public Transport

  • Measure responses to entertainment content, advertising and campaigns that are tested continuously inside the vehicle (a minimal engagement-aggregation sketch follows this list)
  • Measure commuters’ engagement with interactive displays that both “push” information and gather data on customer preferences and satisfaction
  • Provide an additional revenue stream for private-hire operators
  • Use in-cabin crowd sensing to detect and alert on likely aggressive behaviour, helping to safeguard commuters’ safety
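
As a minimal illustration of the engagement measurement referenced in this list, the sketch below aggregates hypothetical per-viewer samples into a campaign summary; the field names and the simple positive-emotion share are assumptions for this example.

```python
from statistics import mean

# Illustrative aggregation of per-viewer engagement with an in-cabin display.
# The field names and the positive-emotion share are assumptions for this sketch.
def campaign_engagement(samples):
    """Summarise attention time and positive-emotion share for one campaign."""
    attention = mean(s["seconds_looking"] for s in samples)
    positive = mean(1.0 if s["emotion"] in ("happy", "surprised") else 0.0
                    for s in samples)
    return {"avg_attention_s": round(attention, 1),
            "positive_share": round(positive, 2),
            "viewers": len(samples)}

if __name__ == "__main__":
    data = [{"seconds_looking": 4.2, "emotion": "happy"},
            {"seconds_looking": 1.5, "emotion": "neutral"},
            {"seconds_looking": 6.0, "emotion": "surprised"}]
    print(campaign_engagement(data))
```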