New emotion detector can make driving safer
London: Scientists have developed a new device that can identify drivers' emotions, using embedded cameras that film their faces, to help make driving safer.
Reading facial expressions and identifying which of the seven universal emotions a person is feeling can be very useful in video game development, medicine, marketing, and, perhaps less obviously, in driver safety, researchers said.
In addition to fatigue, the emotional state of the driver is a risk factor. Irritation, in particular, can make drivers more aggressive and less attentive, they said. Researchers at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland, in collaboration with PSA Peugeot Citroën, developed an on-board emotion detector based on the analysis of facial expressions.
The challenge was to get the device to recognise irritation on a driver's face. Everyone expresses this state somewhat differently – a kick, an epithet, a nervous tic or an impassive face, researchers said.
The solution explored by scientists in EPFL's Signal Processing 5 Laboratory (LTS5) was to adapt a facial detection system for use in a car, using an infrared camera placed behind the steering wheel. Hua Gao and Anil Yuce, who spearheaded the research, tracked only two expressions – anger and disgust – whose manifestations are similar to those of irritation.
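In software terms, such a system amounts to a loop that grabs frames from the camera, locates the driver's face and hands the cropped face to an expression classifier. The sketch below illustrates that loop under loose assumptions: a standard webcam stands in for the infrared camera, OpenCV's stock face detector is used, and PlaceholderExpressionModel is a made-up stand-in rather than the LTS5 team's actual classifier.

    import cv2

    class PlaceholderExpressionModel:
        """Hypothetical stand-in for a trained expression classifier."""
        def predict(self, face_image):
            # A real model would return "anger", "disgust" or "neutral"
            # based on the face crop; this stub always says "neutral".
            return "neutral"

    face_detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    model = PlaceholderExpressionModel()

    capture = cv2.VideoCapture(0)  # stand-in for the camera behind the wheel
    while capture.isOpened():
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        for (x, y, w, h) in faces:
            face = cv2.resize(gray[y:y + h, x:x + w], (64, 64))
            if model.predict(face) in ("anger", "disgust"):
                print("possible irritation detected")
    capture.release()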
Two phases of tests were carried out. First, the system "learned" to identify the two emotions using a series of photos of subjects expressing them. Then the same exercise was carried out using videos.
The images were taken both in an office setting and in real-life situations, in a car that was made available for the project. How quickly the filmed images could be compared against the database – and thus how quickly irritation could be detected – depended on the analysis methods used.
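A rough illustration of that two-phase procedure, assuming facial descriptors have already been extracted from the images: a classifier is first fitted on labelled still photos, then evaluated on frames drawn from video. The feature vectors and labels below are random stand-ins, not the project's data.

    import numpy as np
    from sklearn.metrics import accuracy_score
    from sklearn.svm import SVC

    # Phase 1: "learn" the two emotions from labelled photos
    # (descriptors would come from a facial-landmark or texture feature).
    photo_features = np.random.rand(200, 128)   # stand-in descriptors
    photo_labels = np.random.choice(["anger", "disgust"], size=200)
    classifier = SVC(kernel="rbf").fit(photo_features, photo_labels)

    # Phase 2: repeat the exercise on frames taken from videos,
    # covering both office and in-car footage.
    video_features = np.random.rand(500, 128)
    video_labels = np.random.choice(["anger", "disgust"], size=500)
    predictions = classifier.predict(video_features)
    print("frame-level accuracy:", accuracy_score(video_labels, predictions))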
Overall, the system worked well and irritation could be accurately detected in the majority of cases. Additional research aims to explore real-time updates to complement the static database, a self-taught human-machine interface, and a more advanced facial monitoring algorithm, said Gao.
Detecting emotions is only one of several indicators for improving driver safety and comfort.
In this project, it was coupled with a fatigue detector that measures the percentage of eyelid closure.
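Percentage of eyelid closure (often called PERCLOS) is typically computed over a sliding window of recent frames. The fragment below is a generic sketch of that bookkeeping; the window size, threshold and openness values are illustrative, not the project's actual parameters.

    from collections import deque

    WINDOW = 900            # e.g. 30 seconds of frames at 30 fps (assumed)
    CLOSED_THRESHOLD = 0.2  # eye treated as closed below 20% openness (assumed)

    recent_frames = deque(maxlen=WINDOW)

    def update_perclos(eye_openness):
        """Record one frame's eye openness (0 = closed, 1 = open) and return
        the fraction of recent frames in which the eye was closed."""
        recent_frames.append(eye_openness < CLOSED_THRESHOLD)
        return sum(recent_frames) / len(recent_frames)

    # Example: feed per-frame openness estimates from an eye tracker.
    for openness in [0.9, 0.1, 0.05, 0.8, 0.15]:
        print(round(update_perclos(openness), 2))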