12.12.2004

Face of CGI is often... Creepy

Why do we sometimes feel sympathy for animated characters who are 'robotic,' while at other times we do not? The answer has to do with how the mind perceives reality: with genetic, pre-wired factors, and with the experience we log looking at thousands of faces from infancy to the present. Since some approaches to detecting memory loss revolve around facial recognition -- for example, Stephen Ferris' work at NYU -- we may begin to understand the underlying mechanisms if we treat the commercial phenomenon of animated films as a 'lab' experiment on human perception, held at arm's length....

By Neil Parmar
Correspondent


Computer-animated films may be grounded in fantasy, but one of the secrets behind making them blockbusters remains buried deep within human psychology.

From "Toy Story' to "Finding Nemo,' Pixar Animation Studios has trumped its rivals for almost a decade by using computer graphics to generate cartoon characters that look real -- but not too real. Artists at DreamWorks Animation have followed suit and stuck to anthropomorphically lucrative heroes like Shrek, and Oscar from "Shark Tale.'

So when execs at Warner Bros. Animation gambled more than $265 million to produce and promote the lifelike cast of "The Polar Express," only to watch them get derailed by Pixar's "The Incredibles," psychologists weren't all that surprised by the film's initially lackluster box-office performance.




According to the "uncanny valley" theory developed in 1970 by Masahiro Mori, a Japanese roboticist, we increasingly empathize with a robot the more it looks like a human being (recall C-3PO from "Star Wars"). Yet if a robot appears too humanlike, our compassion peaks, then plummets into a chasm of emotional detachment and disgust. That's because we can usually still detect a robot's eerie, machinelike movement or cold, mechanistic facial expressions -- no matter how much it resembles us. Once a robot becomes completely lifelike, however, our emotional guard melts and we may actually feel affection toward, say, the more lifelike androids in Steven Spielberg's "A.I. Artificial Intelligence."

The uncanny valley theory has yet to be proved -- or disproved -- by a scientific study. But it may exist because our eyes pick up subtle differences between things that appear similar but are not quite identical, says Donald Norman, author of "Emotional Design" and a professor of psychology and cognitive science at Northwestern University in Evanston, Ill. Experts say that it's during this process of mental nitpicking that we notice off-putting features: cheeks that fail to bulge, eyes without shadows beneath them, wrinkles that don't crease across the forehead, and oddly textured skin.

Norman notes that the uncanny valley phenomenon is equally applicable to video games and computer-generated films like "The Polar Express" because the more a character seems real, "the more we expect them to behave like us." When a computer carbon copy of Tom Hanks, for instance, falls short of impressing us or creeps us out, we're left disappointed, and buzz for the film quickly fades.

Although a captivating story is obviously crucial in holding our interest at the theater, Norman argues that the emotion-based region of our brain places more importance on our ability to "identify with a movie's actors." "How are we going to identify with a computer-generated character, or a physical or virtual robot?" he asks.

Marian Bartlett, an assistant research professor at the Machine Perception Lab at the University of California, San Diego, has studied human facial expressions and recently helped design an animated, computer-based tutor for children. She says the problem with films like "The Polar Express" and certain video game characters is that artists often fail to capture subtle changes in facial expressions.

"If you just capture the movement of facial features -- and not the surface changes in shadows -- characters are less appealing,' says Barlett. "The onset of facial expressions and how that coordinates with speech are also important. If you don't get the timing right things really look bad.'

Indeed, the time between when a character speaks and when the coinciding facial expression changes is vital in sidestepping the uncanny valley. Our eyes shift, for example, a fraction of a second before our head moves when we turn to talk to our neighbor. Yet in many scenes from "The Polar Express," the Christmas-happy protagonist moves his head about and chats with his conductor friend while his eyes remain scarily frozen in place.
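As a rough illustration of that timing principle -- the eyes leading the head by a fraction of a second -- here is a minimal, hypothetical keyframe sketch in Python. The Keyframe structure, the head_turn helper and the 120-millisecond lead time are assumptions made for illustration; they are not drawn from any studio's actual animation pipeline.

    from dataclasses import dataclass

    # Hypothetical keyframe structure, for illustration only.
    @dataclass
    class Keyframe:
        time_s: float    # when the pose is reached, in seconds
        yaw_deg: float   # rotation toward the conversation partner

    EYE_LEAD_S = 0.12  # assumed ~120 ms: the gaze starts moving before the head

    def head_turn(start_s: float, duration_s: float, target_yaw: float):
        """Return (eye_keys, head_keys) so the gaze leads the head turn."""
        eye_keys = [
            Keyframe(start_s - EYE_LEAD_S, 0.0),
            Keyframe(start_s - EYE_LEAD_S + duration_s, target_yaw),
        ]
        head_keys = [
            Keyframe(start_s, 0.0),
            Keyframe(start_s + duration_s, target_yaw),
        ]
        return eye_keys, head_keys

    if __name__ == "__main__":
        eyes, head = head_turn(start_s=1.0, duration_s=0.4, target_yaw=35.0)
        print("eye keys: ", eyes)   # gaze begins turning at 0.88 s
        print("head keys:", head)   # head begins turning at 1.00 s

Dropping that small offset -- keying the eyes and head at the same instant -- is exactly the kind of detail that leaves a character's gaze looking frozen.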

Most animation companies now employ full-time "localization translators" who determine how much and in what way a character's mouth will move when it speaks. One such employee is Kristopher Tan, a 25-year-old video game designer at the Canada-based company BioWare. He notes that numerous animators have been trained to avoid the uncanny valley by perfecting the fine dynamics of speech.

In fact, Tan studied the theory in a computer graphics class at university and now spends his days at BioWare translating character-speak into languages other than English. He then sends text codes to animators so that they can realistically alter the facial expressions of characters as they converse. That way, international gamers won't be put off.
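For a sense of what such "text codes" might look like, here is a hypothetical sketch in Python that maps a timed phoneme list from a localized line of dialogue to frame-tagged mouth-shape (viseme) codes an animator could key against. The phoneme-to-viseme table and the tag format are invented for illustration; they are not BioWare's actual pipeline.

    # Hypothetical phoneme-to-viseme table, invented for illustration.
    PHONEME_TO_VISEME = {
        "AA": "open_jaw", "IY": "wide_smile", "UW": "round_lips",
        "M": "closed_lips", "B": "closed_lips", "F": "teeth_on_lip",
    }

    def encode_line(phonemes, frame_rate=30):
        """Turn a timed phoneme list [(phoneme, start_s), ...] into frame-tagged viseme codes."""
        codes = []
        for phoneme, start_s in phonemes:
            viseme = PHONEME_TO_VISEME.get(phoneme, "neutral")
            frame = round(start_s * frame_rate)
            codes.append(f"{frame:04d}:{viseme}")
        return " ".join(codes)

    # Example: a short localized line already broken into timed phonemes.
    line = [("B", 0.00), ("AA", 0.08), ("M", 0.30), ("IY", 0.42)]
    print(encode_line(line))
    # -> "0000:closed_lips 0002:open_jaw 0009:closed_lips 0013:wide_smile"

The point of a format like this is that the same animation rig can be re-keyed for each language's dialogue without hand-animating every localized mouth shape.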

What's groundbreaking about "The Polar Express" is that director Robert Zemeckis attempted to create a world at the intersection of films, video games and reality. That's why he purposely combined old-fashioned human acting, motion-capture technology and computer graphics to get a look "somewhere in between." Unfortunately, it appears he left his latest creation parked in the middle of the uncanny valley.
