Emoj is contributing its intelligent-systems expertise to the Next Perception project, co-funded by the EU H2020 programme.
The Next Perception project is driven by the need for accurate smart devices that can deeply understand a subject and their environment. The partners' mission is to develop smart perception sensors and advance the distributed-intelligence paradigm to build versatile, secure, reliable, and proactive human-monitoring solutions for the health, well-being, and automotive domains.
The Next Perception project brings together 43 partners from 7 countries and is organised around three main Use Cases: UC1, integral vitality monitoring for health and well-being; UC2, safety and comfort at intersections (automotive); UC3, driver monitoring (automotive).
Specifically, Emoj participated in UC2. This scenario aimed to design a Driver Monitoring System that can classify both the driver's cognitive state (distraction, fatigue, workload, drowsiness) and the driver's emotional state and intentions/activities. This information will be used for autonomous driving functions, including take-over requests and driver support.
Emoj has developed machine learning and computer vision algorithms to detect biometric traits and facial expressions, relating them to Ekman's seven main emotions: neutral, happy, surprised, angry, sad, fearful, and disgusted. In addition to emotions, we can detect the level of engagement and satisfaction by measuring parameters such as valence and arousal.
Valence identifies the quality of the emotion felt, whether positive or negative, while arousal measures how intense that emotion is, following Russell's circumplex model of affect.
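As a minimal sketch of how emotion classes can be related to valence and arousal, the snippet below places each of the seven emotion labels at an illustrative point on Russell's circumplex and combines per-emotion probabilities into a single (valence, arousal) estimate. The coordinates and the weighted-mean rule are assumptions for illustration, not Emoj's actual model.

```python
# Hypothetical placement of Ekman's emotion classes on Russell's
# circumplex: valence in [-1, 1] (negative -> positive),
# arousal in [-1, 1] (calm -> excited). Coordinates are illustrative.
CIRCUMPLEX = {
    "neutral":   (0.0,  0.0),
    "happy":     (0.8,  0.5),
    "surprised": (0.2,  0.8),
    "angry":     (-0.7, 0.7),
    "sad":       (-0.7, -0.4),
    "fearful":   (-0.6, 0.7),
    "disgusted": (-0.6, 0.3),
}

def affect_from_probs(probs):
    """Combine per-emotion probabilities (summing to 1) into one
    (valence, arousal) pair as a probability-weighted mean."""
    valence = sum(p * CIRCUMPLEX[e][0] for e, p in probs.items())
    arousal = sum(p * CIRCUMPLEX[e][1] for e, p in probs.items())
    return valence, arousal
```

With this rule, a classifier output dominated by "happy" lands in the positive-valence, moderate-arousal quadrant, while a mix of "angry" and "fearful" lands in the negative-valence, high-arousal quadrant.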
In addition to coding facial expressions, we have developed a head-analysis module for evaluating the driver's behaviour, giving a 360° understanding of the driver so that the best possible support can be provided.
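A head-analysis module of this kind is commonly built on top of a head-pose estimator. The sketch below assumes such an estimator is available and applies a simple, hypothetical rule: flag a distraction episode when the head stays off the road for too long. The angle thresholds, frame rate, and time limit are all assumptions, not values from the project.

```python
# Assumed input: per-frame (yaw, pitch) head-pose angles in degrees,
# where (0, 0) means facing the road ahead.
YAW_LIMIT = 30.0    # assumed gaze-off-road threshold, degrees
PITCH_LIMIT = 20.0  # assumed threshold, degrees

def is_attentive(yaw_deg, pitch_deg):
    """True if the head is roughly oriented towards the road."""
    return abs(yaw_deg) <= YAW_LIMIT and abs(pitch_deg) <= PITCH_LIMIT

def distraction_episode(poses, fps=30, max_off_road_s=2.0):
    """Flag a distraction episode when the head stays off the road
    for longer than max_off_road_s seconds (hypothetical rule)."""
    off_frames = 0
    for yaw, pitch in poses:
        off_frames = 0 if is_attentive(yaw, pitch) else off_frames + 1
        if off_frames / fps > max_off_road_s:
            return True
    return False
```

For example, three seconds of frames with the head turned 45° away would trigger the flag, while brief glances at a mirror would not, because the counter resets whenever the driver looks back at the road.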
The driver monitoring system creates an empathic connection with the driver: a deep understanding of their emotional state allows the real-time activation of the Human-Machine Interface (HMI) and intelligent devices to mitigate negative emotions and steer the emotional state towards a more comfortable and pleasant driving experience.
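One way such real-time HMI activation can work is a small rule layer that maps the estimated affective state to a mitigation strategy. The sketch below is a hypothetical illustration of that idea; the action names and thresholds are assumptions, not Emoj's actual interface.

```python
# Hypothetical mapping from estimated affect to an HMI mitigation
# strategy. Thresholds and action names are illustrative only.
def hmi_action(valence, arousal):
    """Return a mitigation strategy given valence and arousal
    estimates in [-1, 1]."""
    if valence < -0.3 and arousal > 0.5:
        # Stressed or angry: e.g. soften cabin lighting, calmer music.
        return "calming"
    if valence < -0.3 and arousal < -0.3:
        # Sad or fatigued: e.g. suggest a break, brighter HMI.
        return "energizing"
    return "none"
```

The point of a rule layer like this is that the HMI reacts to the continuous valence/arousal estimate rather than to a single discrete emotion label, which makes the intervention less jittery frame to frame.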
We carried out several tests with highly realistic driving simulators to improve the driver face-detection rate under varying lighting conditions and when several people are detected, not necessarily directly in front of the webcam.
The solution was tested in several use cases to ensure maximum reliability and accuracy of the driver’s analysis results.
You can consult the multimedia materials for a practical understanding of how the driver monitoring system works via our LinkedIn channel: https://www.linkedin.com/company/emoj/ .
Don’t forget to follow us to stay up-to-date on the latest implementations and more!
For more information on the software developed for the Next Perception project, email us at firstname.lastname@example.org