Face/Off: Live Facial Puppetry

by Sean Andrist on January 23, 2011

in Assignment 1

Selected from list

1. Sentence: We present a complete integrated system for live facial puppetry that enables high-resolution real-time facial expression tracking with transfer to another person’s face. (Abstract, 1st sentence)

2. Problem: Producing compelling facial animations for digital characters is time-consuming and challenging. Facial performance capture is the current industry standard, but mapping the captured performance onto a digital model is still a complex process that requires substantial manual assistance.

3. Key Idea: A real-time structured light scanner provides dense 3D geometry and texture. Offline, a generic template mesh is fitted to a rigid reconstruction of the actor's face and then tracked through a set of recorded expression sequences; from these sequences a person-specific linear face model is built, which the live stage fits to each incoming frame (see the sketch below).
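
To make "person-specific linear face model" concrete, here is a minimal sketch (not the paper's implementation): a face is the neutral mesh plus a weighted sum of expression basis vectors, and tracking amounts to fitting those weights to an observed frame. All names and sizes below are invented for illustration.

    # A minimal sketch of a person-specific linear face model: every expression
    # is the neutral mesh plus a weighted sum of basis deformations learned
    # offline from the actor's expression sequences. All sizes are invented.
    import numpy as np

    rng = np.random.default_rng(0)
    n_vertices = 5000   # hypothetical mesh resolution
    n_basis = 40        # hypothetical number of expression basis vectors

    # Stand-ins for what the offline stage would produce:
    mean_face = rng.standard_normal(3 * n_vertices)          # neutral geometry (x, y, z stacked)
    basis = rng.standard_normal((3 * n_vertices, n_basis))   # expression basis (e.g. from PCA)

    def synthesize(coeffs):
        """Reconstruct a face as mean + linear combination of basis deformations."""
        return mean_face + basis @ coeffs

    def fit_coefficients(observed_frame):
        """Tracking in spirit: least-squares fit of the expression coefficients
        so the model matches the scanner's observation of the actor."""
        coeffs, *_ = np.linalg.lstsq(basis, observed_frame - mean_face, rcond=None)
        return coeffs

    # Fake "scanner frame": the model in some expression plus a little noise.
    true_coeffs = rng.standard_normal(n_basis)
    frame = synthesize(true_coeffs) + 0.01 * rng.standard_normal(3 * n_vertices)
    print("coefficient error:", np.linalg.norm(fit_coefficients(frame) - true_coeffs))

The low-dimensional coefficients are what make real-time operation plausible: each frame only requires solving a small linear system rather than re-fitting the full mesh.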

4a. What the paper does: The authors show that plausible live animations of different characters are possible even when only a single rigid model of the target face is available, and illustrate this beautifully by bringing a Roman statue to “life”.
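
As a hedged continuation of the sketch above: assuming an expression basis compatible with the actor's has already been derived for the target character (the paper obtains it from the source model plus a single rigid scan of the target, such as the statue), live retargeting reduces to reusing the actor's tracked coefficients. Everything named below is hypothetical.

    # Hypothetical retargeting step: a compatible expression basis is assumed to
    # exist for the target (derived from the source model and a single static
    # scan of the target, e.g. the statue), so driving the target live is just
    # reusing the actor's tracked expression coefficients.
    import numpy as np

    rng = np.random.default_rng(1)
    n_vertices_target = 8000   # the target mesh need not match the source resolution
    n_basis = 40               # but its basis must align with the source's semantics

    target_mean = rng.standard_normal(3 * n_vertices_target)              # target's neutral scan
    target_basis = rng.standard_normal((3 * n_vertices_target, n_basis))  # transferred basis

    def retarget(actor_coeffs):
        """Map the actor's per-frame expression coefficients onto the target face."""
        return target_mean + target_basis @ actor_coeffs

    statue_frame = retarget(rng.standard_normal(n_basis))   # one retargeted frame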

4b. What it could be used for: Directors of animated movies could use this system to get an immediate preview of a face performance, including the emotional and perceptual aspects. In TV shows and video games, live performances of digital characters become possible with direct control by an actor.

5. Resources: Here is a nice video.
