This is a strange, possibly creepy, and utterly amazing video.
We've become accustomed to the idea of motion capture: you put white dots on an actor's face and body, and these points are mapped onto the equivalent parts of a computer-generated model.
The moving computer model is essentially a framework that is then "skinned" with the appearance of the character called for in the script. It's a means of realistically animating computer models, and the method is getting better all the time.
Here's a variation on the theme that is impressive as well as mildly disturbing.
Rather than mapping the motion capture data onto a computer-generated model, the images are projected, in real time, back onto the actor's face.
The computer "knows" the shape and position of the face from the motion capture data, and so can modify the projected images as the actor's position changes.
For this to work, everything has to be precisely aligned. This is not a trivial thing to set up.
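The core geometric idea can be sketched in a few lines: the motion capture system supplies the face's pose (a rotation and translation), and the projector's calibration determines where each point of the face model lands in the projected image. This is only an illustrative sketch of that projection step, not the production pipeline used in the video; all names and numbers here are assumptions.

```python
import numpy as np

def project_points(points_3d, R, t, K):
    """Map 3D face-model points into projector pixel coordinates.

    R, t: face pose (rotation matrix, translation) from motion capture.
    K: hypothetical 3x3 projector intrinsics matrix.
    """
    cam = points_3d @ R.T + t          # move model points to the tracked pose
    pix = cam @ K.T                    # apply projector intrinsics
    return pix[:, :2] / pix[:, 2:3]    # perspective divide -> pixel coords

# Toy example: three face landmarks, identity pose, simple intrinsics.
K = np.array([[800.0,   0.0, 640.0],
              [  0.0, 800.0, 360.0],
              [  0.0,   0.0,   1.0]])
pts = np.array([[0.0, 0.0, 1.0],     # a point 1 m in front of the projector
                [0.1, 0.0, 1.0],
                [0.0, 0.1, 1.0]])
uv = project_points(pts, np.eye(3), np.zeros(3), K)
print(uv)  # the first point lands at the principal point (640, 360)
```

As the actor moves, R and t change every frame, and the projected image is re-rendered accordingly; the "precise alignment" mentioned above amounts to calibrating K and the projector's position accurately.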
But the results are amazing, as you'll see below.
NOBUMICHI ASAI (PLANNER / PRODUCER / TECHNICAL PRODUCER - P.I.C.S.)
HIROTO KUWAHARA (ART DIRECTOR & MAKEUP)
PAUL LACROIX (TECHNICAL DIRECTOR / PROGRAMMER - TRANSIT DIGITAL WORKS)
JIN HASEGAWA (CG DESIGNER - SPADE)
TAKASHI ISHIBASHI (CG DESIGNER - SPADE)
AYAKA MOTOYOSHI (PRODUCTION MANAGER - P.I.C.S.)
AYA KUMAKURA (PRODUCTION MANAGER)
YOSHIHIRO UENO (PRODUCTION MANAGER - P.I.C.S.)
KAZUHIRO NAKAMURA (COLORIST - McRAY)
KENJI NAKAZONO (PHOTOGRAPHER - CREATIVE STUDIO WORKS)
KIMIHIRO MORIKAWA (PHOTOGRAPHER - SHOOTING & LIGHTING)
JIN HASEGAWA (SPADE)
RHEA TOR’S INC.
K.FURUMOTO (HAIR - &'S MANAGEMENT)
YUKA SEKIMIZU (MODEL - SATORU JAPAN)
HIDEAKI TAKAHASHI (MUSIC - mjuc)
SPICE (OPTITRACK MOTION CAPTURE SENSOR)