Creativity: What's so unique about the challenge of creating a living, breathing Orville Redenbacher on screen?
Ulbrich: In our business, doing a completely believable human being that delivers lines in close-up to the camera is the holy grail. There is nothing more difficult that can be done in computer animation. We've done digital actors before, certainly—we've done them in stunts, and we've even done them in commercials. But they've never been in close-up delivering lines. This is the first time it's ever been attempted anywhere. It was the most challenging thing we've ever done. There was no magic software package that you could just buy and suddenly you're making digital people. Our team looks back at the spot and we see all the flaws, but I have to keep reminding them that we're going into uncharted territory. For a long time, this was considered unattainable. And we're there, or at least damn close. We've learned a hell of a lot, and we're getting ready to do a similar thing in a major motion picture. (ed: David Fincher's The Curious Case of Benjamin Button?) Over the next two years, this will get better and faster and cheaper, it's going to be much more commonplace, and eventually it's going to be standard.
Was it a matter of refining existing techniques, or inventing whole new ones?
Both. There is no one tool set that allowed us to do this. Internally at Digital Domain, we've got this little initiative called Project H.A.R.D. (human animation research and development). That group has been evaluating available software tools and rendering technologies, and we're doing software pipeline engineering to figure out the best ways to do this. It's a hybrid pipeline comprised of lots and lots of different tools—putting together software tools that weren't necessarily meant to go together, and then writing proprietary code that allows them to talk to each other. And when we find out that there is no tool to achieve a particular aspect of this whole puzzle that is a digital human being, we've got to create one. It's really hard. [Laughs]
How was Orville brought to life on set?
Basically, there were three actors who comprised Orville. One was the actor who played the body, who was in the scenes with [director] David Fincher, interacting with the other actors and holding the popcorn. He wore a blue ski mask, for lack of a better description. We did a little bit of motion capture with that, using tracking markers on the head and shoulders to determine how we would track a digital head back onto that body. Then, separately, we shot another actor who was brought in to play the face, to deliver the performance and the lines, which gave us a template that the teams of animators used as a reference for our animation process. And then the third actor was the voice. The guys at Crispin did extensive casting, auditioning over 400 people before they ended up with the guy they chose. At the end, he was brought into a looping session to put the words back into Orville's mouth after the fact.
How did you go about recreating Orville's exact likeness?
Achieving authentic likeness was probably the most difficult part of this, because we didn't have the benefit of the real man to look at. When we've created 3D humans in other cases, we've at least had the real person to reference, so you can take exact measurements or even scan a sculpture or a bust that's been pulled from a mold of the real person. We didn't have that. In this case, we had family photos, existing publicity photos that were taken of Orville, and the old commercials made back in the '70s. The quality of those commercials was not great, and there are different lenses and film stocks and things that put certain distortions on those images. And then you realize, 'Wow, we don't have any shots of the back of his head!' It was really daunting. So we brought in a team of Hollywood sculptors who are kind of artists-meet-forensic sculptors; from those photos and reference footage, they pieced together their impression of Orville in three dimensions. We then scanned that into the computer to create a 3D model, and then there was a very extensive rigging process where we basically built a muscle system and all the tools and handles that the animators and programmers could use to drive the face.
What was more challenging, the animation process, or just getting the skin textures correct?
All of the above. [Laughs] The skin was enormously challenging. Again, it's one thing if you're doing a static head, or a head with very little range of motion, like making a grimace or something. But to actually have someone speak, you have to deal with skin deformation, the way the skin behaves around the mouth, the eyes, the nose, the neck. Hugely complex. And the hair? Intensely complex. Millions of calculations to create the hair. The other thing is, he's an icon. Everybody knows what Orville looked like. So we had to match his hairstyle. Extraordinarily difficult. I'd say the easiest thing we did was tracking the head onto the body. [Laughs] Everything else, on a scale from one to ten, I think was about a twelve.
Do you see this technology potentially extending beyond the world of commercials and film?
I think the hardware in video games specifically is getting very robust. The Gears of War commercial that we did was a breakthrough spot in that it was all done in the video game engine. So we've been studying that industry very closely, and we're starting to realize that it is not a leap of faith anymore to think that we can now start using these engines in advertising and entertainment. If you look at the quality we got in Gears of War, I think we're very close to the point where you could make an animated movie in an engine. At that point, we're able to create actors as digital assets, and those actors could have a life in a movie and also have a life in a game simultaneously. With what we've just done with Orville Redenbacher, we can start to bring a level of realism to these characters in games at a level that no one's ever seen before. They can be celebrities and recognized athletes and fictitious characters, and they're all going to look pretty good, because right now we're at the point where the hardware platform—people's PS3s and Xbox 360s—is at a level where it can handle flying around 800,000 polygons. So I think the video game space is hugely exciting, and it holds great promise.