

From research to application - Detecting emotions revealed by our eyes

As part of the National Centre of Competence in Research (NCCR) “IM2 – Interactive Multimodal Information Management”, nViso, a young start-up company, is developing a system for detecting emotions, based on an analysis of facial expressions and eye movement. The researchers aim to offer these innovative solutions in the marketing sector.

Our expressions sometimes tell quite a different story from what we say. “Non-verbal language analysis has advanced considerably in recent years,” explains Jean-Philippe Thiran, professor at the Signal Processing Laboratory of the EPFL. Within the NCCR, Jean-Philippe Thiran and his team are developing technologies capable of detecting expressions and eye movement. These technologies rely on data-processing models trained with the help of facial databases. By means of a camera, they can recognise different parts of a face, track changes in time and space, and relate them to specific facial expressions. “This analysis is primarily performed by calculating facial angles and differently coloured areas of complexion,” says Thiran. The eyes are a fundamental element in this identification process. Refined to its ultimate degree, the technology can even pinpoint the pupils and provide information on gaze direction.
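The angle calculation Thiran mentions can be illustrated with a minimal sketch: given the 2D positions of a few facial landmarks (here, hypothetical toy coordinates for a mouth corner and its lip neighbours, not data from the actual system), the angle formed at a landmark changes as the expression changes — for instance, widening as the mouth opens into a smile.

```python
import math

def angle_at(vertex, p1, p2):
    """Angle (degrees) formed at landmark `vertex` by landmarks p1 and p2."""
    v1 = (p1[0] - vertex[0], p1[1] - vertex[1])
    v2 = (p2[0] - vertex[0], p2[1] - vertex[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    return math.degrees(math.acos(dot / (n1 * n2)))

# Toy (x, y) landmarks: one mouth corner and the nearest lip points.
mouth_corner = (10.0, 20.0)
upper_lip = (14.0, 18.0)
lower_lip = (14.0, 22.0)
print(round(angle_at(mouth_corner, upper_lip, lower_lip), 1))  # → 53.1
```

In a real system, landmark positions would come from a face tracker and such angles would be one feature among many fed to a classifier trained on facial databases.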

Targeting the marketing sector
The analysis of facial expressions offers several potential applications. Thirty-six-year-old Matteo Sorci completed his doctorate last year under the guidance of Jean-Philippe Thiran, and went on to co-found the start-up company nViso. nViso’s business is based on the development of techniques which detect facial expressions linked to emotions. “We have identified key commercial markets, notably in the marketing sector,” says Sorci.

The first product developed by nViso is a system which, with the help of a simple webcam, can analyse consumers’ expressions as they complete a marketing questionnaire on the Internet. The aim is to compare the written answers with the non-verbal responses.

nViso is now developing a new application. “By tracking the eyes, we will be able to pinpoint the area of the screen which arouses a particular emotion. Our application will be all the more useful when we manage to locate the precise area of the screen on which the eyes are fixed,” explains Sorci.
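Locating the screen area the eyes are fixed on amounts to mapping a measured pupil offset onto screen coordinates. A minimal sketch of one common approach — a linear calibration between the pupil offsets recorded while the user looks at known screen corners and the screen itself (all values here are hypothetical, not nViso's actual method):

```python
def gaze_to_screen(pupil, cal_min, cal_max, screen_w, screen_h):
    """Linearly map a camera-measured pupil offset (px, py) onto screen
    coordinates, using the offsets recorded at two calibration extremes
    (e.g. looking at the top-left and bottom-right screen corners)."""
    x = (pupil[0] - cal_min[0]) / (cal_max[0] - cal_min[0]) * screen_w
    y = (pupil[1] - cal_min[1]) / (cal_max[1] - cal_min[1]) * screen_h
    # Clamp to the screen so noisy measurements stay on-screen.
    return (min(max(x, 0.0), screen_w), min(max(y, 0.0), screen_h))

# A centred pupil offset maps to the centre of a 1920x1080 screen:
print(gaze_to_screen((0.0, 0.0), (-1.0, -1.0), (1.0, 1.0), 1920, 1080))
# → (960.0, 540.0)
```

Aggregating such gaze points over time yields the per-region attention map that, combined with expression analysis, would link an emotion to a specific area of the screen.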

One question remains: what about the ethical implications of eliciting such information for marketing purposes? Thiran agrees that this is a pertinent question: “Researchers perfect technologies and it’s up to society to determine how it intends to use them. As a citizen, I can only hope that this technology will be used with maximum transparency.”

National Centre of Competence in Research (NCCR) “IM2 – Interactive Multimodal Information Management”
The NCCR IM2 aims to advance research and develop prototypes in the field of advanced human-machine interaction. More specifically, the NCCR addresses technologies that coordinate natural input modes (such as speech, writing, touch, hand gestures, head and body movements) with multimedia system output, such as speech, sounds, images, 3D graphics, and animation.
As part of government economic stabilisation measures, the NCCR’s start-up company nViso received 150,000 Swiss francs to develop innovative solutions for industry.

Last modified 2010-05-18 10:02
