
Emotion detectors could make driving safer

© EPFL/Jamani Caillet

14.03.14 - EPFL scientists are studying how to identify drivers’ emotions using embedded cameras that film their faces.

Technology now allows us to read facial expressions and identify which of the seven universal emotions a person is feeling: fear, anger, joy, sadness, disgust, surprise, or contempt. This is very useful in video game development, medicine, marketing, and, perhaps less obviously, in driver safety. We know that in addition to fatigue, the driver’s emotional state is a risk factor. Irritation, in particular, can make drivers more aggressive and less attentive. EPFL researchers, in collaboration with PSA Peugeot Citroën, have developed an on-board emotion detector based on the analysis of facial expressions. Tests carried out with a prototype suggest that the idea has promising applications.

It’s not easy to measure emotions within the confines of a car, especially non-invasively. The solution explored by scientists in EPFL’s Signal Processing 5 Laboratory (LTS5), who specialize in facial detection, monitoring and analysis, is to get drivers’ faces to do the job. In collaboration with PSA Peugeot Citroën, LTS5 adapted a facial detection device for use in a car, using an infrared camera placed behind the steering wheel.
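
To make the setup more concrete, here is a minimal sketch of such a camera-plus-face-detection loop in Python with OpenCV. The camera index, the stock Haar cascade, and the display window are illustrative assumptions, not details of the LTS5 prototype, which relies on an infrared camera and its own detection and tracking algorithms.

```python
import cv2

# Minimal sketch: grab frames from a dashboard camera and locate the driver's face.
# Camera index 0 and the bundled Haar cascade are assumptions for illustration only.
cap = cv2.VideoCapture(0)
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect the driver's face; the detected region would feed the expression analysis.
    for (x, y, w, h) in face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("driver", frame)
    if cv2.waitKey(1) == 27:  # press Esc to stop
        break

cap.release()
cv2.destroyAllWindows()
```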

The challenge was to get the device to recognize irritation on a driver’s face. Everyone expresses this state somewhat differently – a kick, an epithet, a nervous tic, or an impassive face. To simplify the task at this stage of the project, Hua Gao and Anil Yüce, who spearheaded the research, chose to track only two expressions as proxies for irritation: anger, and disgust, whose facial manifestations resemble those of anger.
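
As a rough illustration of this simplification, the sketch below flags irritation whenever an expression classifier reports anger or disgust. The classify_expression function and the confidence threshold are hypothetical placeholders, not the researchers’ actual method.

```python
# Sketch: treat a frame as showing irritation whenever the expression
# classifier reports anger or disgust (the two proxies described above).
# classify_expression is a hypothetical stand-in for the lab's classifier.

IRRITATION_EXPRESSIONS = {"anger", "disgust"}

def is_irritated(face_roi, classify_expression) -> bool:
    """Return True if the detected expression is one of the irritation proxies."""
    label, confidence = classify_expression(face_roi)  # e.g. ("anger", 0.83)
    return label in IRRITATION_EXPRESSIONS and confidence > 0.5
```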

Two phases of tests were carried out. First, the system “learned” to identify the two emotions from a series of photos of subjects expressing them. Then the same exercise was carried out with videos. The images were taken both in an office setting and in real-life situations, in a car made available for the project.
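
The two phases could look roughly like the sketch below: a classifier fitted on labelled still photos, then applied frame by frame to video. The scikit-learn SVM and the extract_features helper are assumptions made for illustration; the article does not describe the actual learning method used.

```python
# Sketch of the two test phases: fit on labelled photos, then evaluate on video.
# extract_features() is a hypothetical stand-in for the system's facial features.
from sklearn.svm import SVC

def train_on_photos(photos, labels, extract_features):
    clf = SVC(kernel="linear")
    clf.fit([extract_features(p) for p in photos], labels)
    return clf

def evaluate_on_video(clf, video_frames, extract_features):
    # Classify each frame independently; a real system would also smooth over time.
    return [clf.predict([extract_features(f)])[0] for f in video_frames]
```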

How quickly the filmed images could be compared – and thus how quickly irritation could be detected – depended on the analysis methods used. Overall, though, the system worked well, and irritation was accurately detected in the majority of cases. When the test failed, it was usually because this state varies so much from one individual to another; given the diversity of ways we express anger, that is where the difficulty will always lie. Follow-up research will explore updating the system in real time to complement the static database, a self-learning human-machine interface, and more advanced facial tracking algorithms, says Hua Gao.

Emotion detection is only one of several indicators for improving driver safety and comfort. In this project, it was coupled with a fatigue detector that measures the percentage of eyelid closure. The LTS5 is also working on detecting other states on drivers’ faces, such as distraction, and on lip reading for use in speech recognition. These projects are coordinated by EPFL’s Transportation Center and carried out in collaboration with PSA Peugeot Citroën.
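
The eyelid-closure measure mentioned here is commonly computed as the proportion of time, within a sliding window, that the eyes are nearly closed (often called PERCLOS). The sketch below assumes a per-frame eye-openness score between 0 and 1; the window length and closure threshold are illustrative choices, not values from the EPFL/PSA project.

```python
# Sketch of a percentage-of-eyelid-closure (PERCLOS-style) fatigue measure.
from collections import deque

class EyelidClosureMonitor:
    def __init__(self, window_size: int = 900, closed_threshold: float = 0.2):
        self.samples = deque(maxlen=window_size)   # e.g. 30 s of frames at 30 fps
        self.closed_threshold = closed_threshold   # openness below this counts as closed

    def update(self, eye_openness: float) -> float:
        """Add one frame's eye-openness and return the current closure percentage."""
        self.samples.append(eye_openness)
        closed = sum(1 for v in self.samples if v < self.closed_threshold)
        return 100.0 * closed / len(self.samples)
```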

Alumni
Olivier Glauser
Degree
Master’s in Computer Science and Communication Systems, 1994
Career
1994 - 1996 HP
1996 - 1998 Philip Morris
1998 - 2005 MBA, Harvard University
2005 - 2009 ROTH Cl Partners
Position
Managing Director of Steamboat Ventures, Beijing

Contacts

Olivier Glauser
Steamboat Ventures
222 Hu Bin Road, Shanghai 200021
Tel: 86 (21) 2308 1800
olivier.glauser@steamboatvc.com
A3 EPFL
Rolex Learning Center
Case postale 122
1015 Lausanne 15
Tel: +41 (0)21 693 24 91