Software Enables Avatar to Reproduce Our Emotions in Real Time

© 2012 Alain Herzog

A virtual character reproduces the same facial expressions as its user in real time, making video games, chats, and animated films both more fun and faster to produce. Faceshift, an EPFL spin-off, launches its software on the market today.


You move, he moves. You smile, he smiles. You get angry, he gets angry. “He” is the avatar you chose. Faceshift, a spin-off from EPFL’s Computer Graphics and Geometry Laboratory, now offers a software program that could save time for designers of animated films and video games. Thibaut Weise, founder of the start-up, smiles and nods. On the screen his avatar, a fantasy creature, instantly reproduces his gestures. The system could shape the future of video games or simply make video chats more fun.

Only one tool is required: a camera with motion and depth sensors, such as the Microsoft Kinect or Asus Xtion, well known to gamers. On first use, the software needs only ten minutes to recognize the user’s face: the user reproduces several basic expressions requested by the program, such as smiling or raising the eyebrows. “The more movement is captured across the program’s 50 positions, the more realistic the results,” explains Thibaut Weise, creator of the start-up, currently based at the Technopark in Zurich. You can then slip into the skin of your character and animate it simply by moving. “It’s almost like leaving your body to enter that of your avatar,” jokes the young entrepreneur.
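
For the curious, here is a minimal sketch of what such a calibration step can look like conceptually: a generic set of expression shapes is adapted to the user’s face from the few scans recorded during the ten-minute setup. The Python below is purely illustrative, under assumed names (template_deltas, example_scans), and is not Faceshift’s actual code.

```python
# Illustrative only -- assumed data layout, not Faceshift's implementation.
import numpy as np

def calibrate_user_model(neutral_scan, example_scans, template_deltas, reg=0.1):
    """Adapt generic expression offsets ("blendshapes") to one user's face.

    neutral_scan    : (V, 3) vertices of the user's neutral face
    example_scans   : dict {shape_index: (V, 3) scan of the user performing it}
    template_deltas : (K, V, 3) generic per-expression offsets (e.g. K = 50)
    Returns personalized (K, V, 3) offsets.
    """
    personalized = template_deltas.copy()
    for k, scan in example_scans.items():
        observed = scan - neutral_scan           # how this user actually moves
        # Pull the generic offset toward the observation; the regularizer keeps
        # unobserved regions close to the plausible template shape.
        personalized[k] = (observed + reg * template_deltas[k]) / (1.0 + reg)
    return personalized
```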

Saving time for animated films

The challenge for the research team in the Computer Graphics and Geometry Laboratory was to find an algorithm that superimposes the depth data from the camera onto the color image and the avatar in a single step. They demonstrated that 3D facial movements can be reconstructed in real time without facial markers or complex scanning hardware.
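
To give a sense of how depth data can drive an avatar in real time without markers, the sketch below fits the expression weights to one depth frame with a single regularized least-squares solve. This is a simplified assumption of the approach, not the published method: the actual system also exploits the color image and stronger temporal priors, and all names here are invented for illustration.

```python
# Simplified per-frame tracking sketch; names and terms are assumptions.
import numpy as np

def track_frame(depth_points, neutral, deltas, prev_w, smooth=0.5):
    """Solve for weights w so that neutral + sum_k w_k * delta_k matches the
    observed depth points while staying close to the previous frame's weights.

    depth_points : (V, 3) observed 3D points corresponded to model vertices
    neutral      : (V, 3) personalized neutral face
    deltas       : (K, V, 3) personalized expression offsets
    prev_w       : (K,) weights solved for the previous frame
    """
    K, V, _ = deltas.shape
    A = deltas.reshape(K, V * 3).T                # (3V, K) linear basis
    b = (depth_points - neutral).reshape(V * 3)   # observed displacement
    # Data term + temporal smoothness term in one linear solve.
    lhs = A.T @ A + smooth * np.eye(K)
    rhs = A.T @ b + smooth * prev_w
    w = np.linalg.solve(lhs, rhs)
    return np.clip(w, 0.0, 1.0)                   # keep weights in a valid range
```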

In an animated film or a video game, the facial expressions of characters are usually defined with a program that moves the different parts of the face step by step. To simulate anger, for example, the animator has to knit the eyebrows in two or three clicks, then stretch the mouth down, and so on. With the Faceshift software, the avatar’s mimicry and “emotions” follow those of the actor, making the work more fun and certainly faster. “This new tool can reduce the time to make a film by up to 30%,” asserts Weise.
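
The time savings come from the fact that the tracked weights can drive the avatar’s own rig directly, frame after frame, instead of each pose being set by hand. A rough sketch, assuming the avatar exposes blendshapes with the same meaning as the tracked ones:

```python
# Rough retargeting sketch; assumes the avatar's blendshapes match the
# tracked expression semantics one-to-one.
import numpy as np

def animate_avatar(avatar_neutral, avatar_deltas, tracked_weights):
    """Yield one posed avatar mesh per captured frame."""
    for w in tracked_weights:                             # (K,) weights per frame
        offsets = np.tensordot(w, avatar_deltas, axes=1)  # weighted sum of offsets
        yield avatar_neutral + offsets                    # (V, 3) posed face
```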

One can also imagine directly animating the face of one’s avatar in a video game. Already in contact with the major video game designers, Thibaut Weise believes that the next generation of 3D cameras will enable his company to take off. In the meantime, it is offering versions for the general public, integrated into applications such as Skype or online games.


