From research to application - Detecting emotions revealed by our eyes

As part of the National Centre of Competence in Research (NCCR) “IM2 – Interactive Multimodal Information Management”, nViso, a young start-up company, is developing a system for detecting emotions, based on an analysis of facial expressions and eye movement. The researchers aim to offer these innovative solutions in the marketing sector.

Our expressions sometimes tell quite a different story from what we say. “Non-verbal language analysis has advanced considerably in recent years,” explains Jean-Philippe Thiran, professor at the Signal Processing Laboratory of the EPFL. Within the NCCR, Thiran and his team are developing technologies capable of detecting expressions and eye movement. These are data-processing models trained with the help of facial databases. Using a camera, they can recognise different parts of a face, track changes in time and space, and relate them to specific facial expressions. “This analysis is primarily performed by calculating facial angles and differently coloured areas of complexion,” says Thiran. The eyes are a fundamental element in this identification process. At its most refined, the technology can even pinpoint the pupils and indicate the gaze direction.
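As a rough illustration of the geometric idea Thiran describes — relating the positions of facial features to an expression — the sketch below classifies a smile from the angle formed at the upper lip by two mouth-corner landmarks. The landmark layout, the image-style coordinate convention (y grows downward), and the angle threshold are all illustrative assumptions, not nViso's actual model.

```python
import math

def mouth_corner_angle(left_corner, right_corner, upper_lip):
    """Angle in degrees at the upper lip, between the two mouth corners."""
    def angle_at(p, a, b):
        # Vectors from vertex p toward a and b.
        v1 = (a[0] - p[0], a[1] - p[1])
        v2 = (b[0] - p[0], b[1] - p[1])
        dot = v1[0] * v2[0] + v1[1] * v2[1]
        n1 = math.hypot(*v1)
        n2 = math.hypot(*v2)
        return math.degrees(math.acos(dot / (n1 * n2)))
    return angle_at(upper_lip, left_corner, right_corner)

def looks_like_smile(left_corner, right_corner, upper_lip, threshold=150.0):
    # When the mouth corners rise relative to the lip centre, the angle at
    # the lip narrows; a flat, neutral mouth yields an angle near 180 deg.
    # The 150-degree threshold is an arbitrary choice for this sketch.
    return mouth_corner_angle(left_corner, right_corner, upper_lip) < threshold

# With y pointing downward (image coordinates), raised corners have
# smaller y than the lip centre.
print(looks_like_smile((0, -3), (10, -3), (5, 0)))   # raised corners
print(looks_like_smile((0, 0), (10, 0), (5, -1)))    # flat mouth
```

A production system would of course derive such landmarks from detected face regions in video frames and combine many features over time, rather than relying on a single hand-tuned angle.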

Targeting the marketing sector
The analysis of facial expressions offers several potential applications. Thirty-six-year-old Matteo Sorci completed his doctorate last year under the guidance of Jean-Philippe Thiran, and went on to co-found the start-up company nViso. nViso’s business is based on the development of techniques which detect facial expressions linked to emotions. “We have identified key commercial markets, notably in the marketing sector,” says Sorci.

The first product developed by nViso is a system which, with the help of a simple webcam, can analyse consumers’ expressions as they complete a marketing questionnaire on the Internet. The aim is to distinguish between written and non-verbal responses.

nViso is now developing a new application. “By tracking the eyes, we will be able to pinpoint the area of the screen which arouses a particular emotion. Our application will be all the more useful when we manage to locate the precise area of the screen on which the eyes are fixed,” explains Sorci.
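The pairing Sorci describes — a gaze location plus a detected emotion — can be pictured as a simple tally over screen regions. The sketch below bins gaze fixations into a coarse grid and counts which emotion was observed while each cell was being viewed; the grid size, emotion labels, and sample data are assumptions for illustration only.

```python
from collections import Counter

def gaze_cell(x, y, screen_w, screen_h, cols=3, rows=3):
    """Map a gaze coordinate (in pixels) to a (row, col) grid cell."""
    col = min(int(x / screen_w * cols), cols - 1)
    row = min(int(y / screen_h * rows), rows - 1)
    return row, col

def emotion_heatmap(fixations, screen_w, screen_h):
    """Tally (cell, emotion) pairs from (x, y, emotion) fixation samples."""
    heatmap = Counter()
    for x, y, emotion in fixations:
        heatmap[(gaze_cell(x, y, screen_w, screen_h), emotion)] += 1
    return heatmap

# Hypothetical samples on a 1920x1080 screen: two "joy" fixations near the
# top-left, one "surprise" fixation near the bottom-right.
samples = [(100, 100, "joy"), (120, 90, "joy"), (1800, 1000, "surprise")]
print(emotion_heatmap(samples, 1920, 1080))
```

Aggregated over many viewers, such a tally would indicate which on-screen areas co-occur with which emotional reactions — the kind of information the application aims to provide.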

One question remains: what about the ethical implications of eliciting such information for marketing purposes? Thiran agrees that this is a pertinent question: “Researchers perfect technologies and it’s up to society to determine how it intends to use them. As a citizen, I can only hope that this technology will be used with maximum transparency.”

National Centre of Competence in Research (NCCR) «IM2 – Interactive Multimodal Information Management»
The NCCR IM2 aims at advancing research and developing prototypes in the field of advanced human-machine interaction. More specifically, the present NCCR addresses the technologies that coordinate natural input modes (such as speech, writing, touch, hand gestures, head and body movements) with multimedia system output, such as speech, sounds, images, 3D graphics, and animation.
As part of government economic stabilisation measures, the NCCR’s start-up company nViso received 150'000 Swiss francs to develop innovative solutions for industry.

Last modified 2010-05-18 10:02
