The state of the art in computer character representation resembles a marionette: an animator emotes through puppet-like controls on the character. This project builds on the concept of a character as a puppet. Our research abstracts animator-infused emotions into a form of intelligence and personality for the character. This model of intelligence captures the unique expressions and mannerisms that make a character instantly recognizable. Because the general concept of intelligence in a character is very broad, we draw on work in anatomy, artificial intelligence, neural networks, and speech and cognition as we explore this area. This notion of intelligence is in many ways independent of any environmental setting and is more a manifestation of the character's mind.
We are also investigating approaches that allow a character to establish a
personality by learning and embedding behavioral traits specific to the
character and its interaction with the environment. Video processing and
motion-capture techniques harness the wealth of behavior that can be abstracted
from the real world. These layers of abstraction should bring
computer-generated characters closer in nature to real actors, leaving
animators and directors free to focus on the creative aspects of an animation.
In parallel, we explore the potential of retargeting various aspects of a
character to other geometrically similar characters. Even at the lowest level,
transferring skeletal animation from the bone structure of one character to
another of different proportions has proved an interesting challenge in itself.
Transferring muscle and skin behavior to a different geometric skin adds a
whole new level of complexity. Finally, retargeting secondary motion, moods,
and other abstractions of intelligence is a sizable problem of both scientific
interest and industrial importance.
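To make the lowest-level problem concrete, one common starting point for transferring skeletal animation between skeletons of matching topology but different proportions is to copy local joint rotations unchanged and rescale the root translation by a limb-length ratio. The sketch below illustrates only this idea; the function names are hypothetical, and production systems add constraint solving (for example, inverse kinematics to keep feet planted).

```python
import numpy as np

def retarget_root_translation(src_root_pos, src_leg_length, tgt_leg_length):
    """Scale the source character's root translation by the ratio of
    leg lengths, so a shorter character covers proportionately less
    ground per step. (Hypothetical helper; real retargeting also
    enforces contact constraints.)"""
    scale = tgt_leg_length / src_leg_length
    return np.asarray(src_root_pos, dtype=float) * scale

def retarget_pose(src_joint_rotations):
    """For characters sharing a skeleton topology, local joint rotations
    can often be copied verbatim; the target's own bone lengths then
    determine the resulting world-space pose."""
    return dict(src_joint_rotations)  # rotations carried over unchanged

# Example: a source walk frame advances the root 0.8 m; the target has
# legs half as long, so its root should advance 0.4 m.
src_root = [0.8, 0.0, 0.0]
new_root = retarget_root_translation(src_root,
                                     src_leg_length=0.9,
                                     tgt_leg_length=0.45)
print(new_root)  # [0.4 0.  0. ]
```

This naive per-frame scaling preserves the style of the motion but not its contacts, which is precisely why retargeting remains an interesting challenge even at the skeletal level.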