The essence of great character performance at Image Engine
They say “the devil is in the details”. Over the years, what has Image Engine learned about the smaller, more microscopic elements of a creature that really sell its believability to the audience? We focus on five creatures from recent years and chart the detailed performance elements that went into making each feel so absolutely believable, along with the incremental, year-over-year improvements in creature performance.
May 22, 2019
Where lies the essence of great character performance? Is it found in white-knuckle action sequences, where behemoth dragons rain fire on battlefields or robots clash in artillery-ridden firefights? Or is it found in something altogether more subtle – is it the near-invisible, subliminal aspects of a performance that, paradoxically, cause it to stand out?
In this article, we’ll dive into the philosophy behind creature performance at Image Engine to reveal the answer. We’ll chart the progression in artistry and tech behind five of our CG characters to show that what makes for an exceptional performance isn’t always that which is most obvious or noticeable.
“Animation is usually the first department that shows the director the vision of a shot, so the performance has to sing,” begins Jeremy Mesana, Animation Supervisor. “That’s why our tools are always evolving from show to show. We’re constantly seeking new ways to learn from what we’ve done, to improve efficiency, quality and speed, and to nail the details that make for the perfect performance…”
A Real Boy – Chappie (2015)
Chappie’s robot protagonist speaks to both sides of the character performance question. Based on the in-camera performance of Sharlto Copley, but with no motion capture involved, Chappie’s performance was hand-animated via a process termed robo-mation. It made for gripping battle scenes, but it also laid the foundation for a less overt yet narrative-critical element of the character: his burgeoning individuality.
“Sharlto would do subtle, emotive things with his body or expression that we would have to translate to Chappie’s largely immovable face,” says Mesana. “Eye movements would become slight head movements, or Chappie’s ‘ears’ would shift to signify a reaction. We actually eschewed early ideas like a waveform mouth to designate emotion, because it felt more effective to communicate it in this subtler way.”
What the team faced here was an interesting challenge: they needed to convey a ‘soul’ in Chappie, yet at the same time convince viewers they were watching an actual robot, not a human being pretending to be one. The limits of Chappie’s 100% mechanically accurate CG model were both a blessing and a curse in this regard, and technical ingenuity was required to create a rig onto which animators could impart their imagination.
“Neill (Blomkamp, director) was adamant on avoiding ball joints,” says Mesana. “That meant there were many axis-restricted joints throughout Chappie. Doing every rotation as a single axis would be extremely difficult, so we built a multi-chain, nine-joint IK system that solved the joints without the need to run a simulation. This empowered our animators to work within the constraints of the model and deliver a half robotic, half human performance.”
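Image Engine’s multi-chain, nine-joint solver isn’t public, but the core idea – iteratively solving single-axis joints toward a target without running a simulation – can be sketched with a simple cyclic coordinate descent (CCD) solver. Everything below (the function names, the planar setup, the angle limits) is illustrative, not the studio’s implementation:

```python
import math

def forward_kinematics(lengths, angles):
    """Return joint positions for a planar hinge chain rooted at the origin."""
    pts, x, y, heading = [(0.0, 0.0)], 0.0, 0.0, 0.0
    for seg, a in zip(lengths, angles):
        heading += a
        x += seg * math.cos(heading)
        y += seg * math.sin(heading)
        pts.append((x, y))
    return pts

def solve_ccd(lengths, angles, limits, target, iterations=50):
    """CCD IK: each joint rotates about its single allowed axis, clamped
    to its (lo, hi) limits, to bring the end effector toward `target`."""
    angles = list(angles)
    for _ in range(iterations):
        for j in reversed(range(len(angles))):
            pts = forward_kinematics(lengths, angles)
            end, pivot = pts[-1], pts[j]
            # Compare the direction to the end effector with the direction
            # to the target, as seen from this joint's pivot.
            a_end = math.atan2(end[1] - pivot[1], end[0] - pivot[0])
            a_tgt = math.atan2(target[1] - pivot[1], target[0] - pivot[0])
            lo, hi = limits[j]
            angles[j] = max(lo, min(hi, angles[j] + (a_tgt - a_end)))
    return angles
```

Clamping each joint to a single rotation axis with hard limits is what keeps a solved pose inside the mechanical constraints of a model like Chappie’s, rather than letting the solver invent rotations the hardware design forbids.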
Performance isn’t always about how a character reacts within the world, however, but also how it reacts back to them – such as Chappie’s ‘zef’ bling that swings from his chassis and gives him a physical connection to the world. The creation of the system that powered this bling was the first stepping stone on a new path of innovation for Image Engine.
“The sooner we can show the director a more complete vision of a shot, the better – if we can show a version of something we know will be added down the line, it’s super helpful,” says Maia Neubig, Lead Creature FX TD. “For Chappie, we developed a way for animators to quickly view a near-final render output of the bling on a blocking pass. It was instant feedback: if something was broken, they could reevaluate and adjust at this early stage.”
For Mesana, empowering animators via automation in this way is key to improving performance: “Any time we give animators quicker insight into their work we increase the efficacy of our iteration cycles, because they spend less time fixing issues and more time fine-tuning what works about that performance.”
Spared no expense – Blue, Jurassic World (2015)
This was certainly the case on Jurassic World. Although altogether more organic in nature, the performance of the fiercely intelligent velociraptor ‘Blue’ was founded in the same thinking as Chappie’s robot: namely, more robust technical animation processes equals more nuanced, and therefore more believable, character performance.
Helping animators to create more refined work in this way is the main directive of Jared Embley, Rigging Lead. “Over the years we have developed a suite of custom, under-the-hood approaches to help animators work faster.
“With something like Blue, for example, a full muscle and skin setup can dramatically impact the silhouette of the character, which in turn informs the animator’s performance decisions. If we can automate that setup, and let the animator see how the muscle flexing sims react to their work at as early a stage as possible, they can get the performance where they want it to be much faster.”
And Blue’s performance is certainly impressive. She stands out in Image Engine’s bestiary due to her pairing of animal ferocity with keen intellect. This is a predator that thinks. The systems created by Mesana and team were crucial in communicating this – they provided the foundation for the emotive performance the animation team layered on top.
“Raptors have rigid brow bones, so there was no expressive gesture there that we could use to impart Blue’s intelligence,” says Embley. “Given this, we went full steam on rigging the eyes. We added fidelity with blend shapes, which could be used to rig specific shapes based on the movement of the eye controls. Animators could also tweak the clusters, enabling them to get very specific with the movements these eyes made.”
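The blend-shape approach the quote describes – sculpted target shapes mixed by animator-driven weights – boils down to adding weighted per-vertex deltas to the neutral mesh. A minimal sketch, where the names, data layout and the “blink” target are assumptions rather than details of the production rig:

```python
def apply_blend_shapes(base, targets, weights):
    """Linear blend shapes.

    base:    neutral mesh as a list of (x, y, z) vertices
    targets: {shape_name: list of (x, y, z)} sculpted target meshes
    weights: {shape_name: weight}, e.g. driven by an eye control
    """
    result = [list(p) for p in base]
    for name, weight in weights.items():
        target = targets[name]
        for i, (b, t) in enumerate(zip(base, target)):
            for axis in range(3):
                # Add the weighted delta between target and neutral shape.
                result[i][axis] += weight * (t[axis] - b[axis])
    return [tuple(p) for p in result]
```

Driving such weights from the eye controls, and letting animators tweak the underlying clusters on top, is what allows very specific eye shapes without hand-sculpting every frame.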
Using this system, the team could incorporate elements like the nictitating membrane of alligators into Blue’s performance. Other reference from ostriches and lions further fed into the sense of Blue being an entity, not a mesh. “We had to choose where to blend these real-world references into the physicality of a raptor, while still being faithful enough to the reference that the viewer subconsciously sensed that they were looking at something real.”
Simply put, audiences had to feel these real-world details, rather than see them, if they were to buy into Blue on a subconscious level. “Sometimes, it certainly is that which audiences don’t see that pushes the quality of a performance to the next level, rather than what they do,” says Embley.
The speed and efficiency of Blue’s rig, despite her dense mesh, has given it something of a legacy status at the studio. “Blue’s quad leg has been used on pretty much every character since, but improved each time,” says Embley. “We don’t reinvent the wheel every project, we evolve it – the work on Blue enabled us to take on creatures like the Graphorn, for instance, which is in many ways a direct descendant of Blue…”
A fantastic beast – Graphorn, Fantastic Beasts & Where to Find Them (2016)
The Graphorn and Blue’s shared DNA is highly observable in the systems that power them, but the creative challenges were of a different pedigree. Here, animal reference came in the form of tigers, elephants and buffalo, blended with the feline grace of a house cat. And while the Graphorn didn’t present the complexities of Blue’s emotional intelligence, they still needed to breathe, shake and snort like a believable biological being.
“We modelled skeleton, muscles and fascia beneath the Graphorn hide to make them move and act like an animal,” says Neubig. “That worked in conjunction with a layered sim running from the inside out, resulting in skin slide, muscle deformation and other physiological elements of the performance.”
To make this movement as realistic and immersive as possible for the audience, the animation team employed a tetrahedral mesh for the sim – a closed mesh, like a muscle, with an internal volume and dynamic constraints, enabling it to react like a voluminous mass as opposed to a cloth sim.
“When you use a tetrahedral mesh version of the finite element method it actually preserves volume because it has that internal structure,” says Neubig. “Taking that approach with the Graphorn represented that next evolution in terms of how we tackled muscles and skin slide – and we were about to take it further again…”
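Why does a tetrahedral mesh preserve volume where a hollow, cloth-style surface mesh can’t? Each tet has a well-defined signed volume (a scalar triple product of its edge vectors), so a finite element solver can penalise any deviation from the rest volume. A toy illustration of that one idea – the names are assumed, and real FEM solvers do far more:

```python
def tet_volume(p0, p1, p2, p3):
    """Signed volume of a tetrahedron: (1/6) * ((p1-p0) x (p2-p0)) . (p3-p0)."""
    e1 = [p1[i] - p0[i] for i in range(3)]
    e2 = [p2[i] - p0[i] for i in range(3)]
    e3 = [p3[i] - p0[i] for i in range(3)]
    cross = [e1[1] * e2[2] - e1[2] * e2[1],
             e1[2] * e2[0] - e1[0] * e2[2],
             e1[0] * e2[1] - e1[1] * e2[0]]
    return sum(c * e for c, e in zip(cross, e3)) / 6.0

def volume_penalty(rest_volume, current_points, stiffness=1.0):
    """Quadratic energy penalising a tet's change from its rest volume.
    A surface-only cloth sim has no such per-element volume to preserve."""
    v = tet_volume(*current_points)
    return stiffness * (v - rest_volume) ** 2
```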
A dance with dragons – Drogon, Game of Thrones season 7 (2017)
In Game of Thrones season 7, Drogon has done some growing up. No longer a youngling, the mythical beast has matured into an awe-inspiring killing machine capable of decimating entire armies. Image Engine needed to communicate this raw power in every element of his performance.
“Real-world speed and scale were important, but also a challenge given that Drogon is the size of a Boeing 747,” says Nathan Fitzgerald, Lead Animator. “We had to tread a fine line between reality and pushing the physics so Drogon could perform his aerial manoeuvres despite his incredible weight.”
Image Engine experimented with speed options early in development, dialling in on a result that avoided a ‘slow motion’ feel while still selling the mass of the character. “The flight cycle, for instance, featured a timing breakup on the wing’s downstroke, emphasising air resistance and Drogon’s power as he pushed to gain altitude,” says Jenn Taylor, Animation Supervisor (animation lead on Game of Thrones season 7).
“We also worked to sell the feeling of the air supporting that large mass under the wings. We set up ‘wing puffs’ for the wing membranes, for example, which animators could trigger to fill with air when Drogon made larger turns and put additional force on his body.”
The more such control was given to animators, the more their imagination could soar and the more believable Drogon became. “Most creatures are limited to what the rig can do, but with Drogon’s deformer system animators could manipulate the mesh and paint weighting for shot-specific fixes,” says Taylor. “These could be exported down-pipe for other animators to import; it was very powerful in crafting a unified performance.”
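The workflow Taylor describes – shot-specific mesh fixes with painted weights, exported for other animators to import down-pipe – can be sketched as serialised per-vertex corrective deltas. This is purely illustrative of that workflow; the file format, names and JSON choice are assumptions, not Image Engine’s pipeline:

```python
import json

def export_fix(path, deltas, weights):
    """Write corrective deltas {vertex_id: (dx, dy, dz)} plus a painted
    weight map {vertex_id: weight} so another artist can reuse the fix."""
    with open(path, "w") as f:
        json.dump({"deltas": deltas, "weights": weights}, f)

def import_and_apply(path, points):
    """Load a fix and apply its weighted deltas to a copy of the mesh points."""
    with open(path) as f:
        fix = json.load(f)
    out = [list(p) for p in points]
    for vid, delta in fix["deltas"].items():
        w = fix["weights"].get(vid, 1.0)  # unpainted vertices default to 1.0
        for axis in range(3):
            out[int(vid)][axis] += w * delta[axis]
    return [tuple(p) for p in out]
```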
When Drogon had finished raining fire from above and came to rest beside his mother figure Daenerys, he still needed to feel alive with movement. “Even when ‘motionless’ we would make Drogon emote – a muscle flex here, some breathing, frills lifting up, wind in the membranes,” says Jason Snyman, Animation Supervisor. “All of this gives your subconscious enough information to suggest the illusion that the creature is real.”
From Fantastic Beasts to Game of Thrones and then back full circle: the work on Drogon provided the roots for Image Engine’s work on the Thestrals in Fantastic Beasts and the Crimes of Grindelwald. “The wings and rigging system were updated and improved,” says Neubig. “That sense of constant iteration, of building upon past successes, is what makes our creature performance ever better.”
Humanising non-humans – Robot, Lost In Space (2018)
Three years on from its defining work on Chappie, Image Engine again tackled the challenge of imbuing a robot with humanity – this time plying its creature performance talents on the smaller screen for Netflix’s Lost In Space.
“The task was similar to that of Chappie – evoking nuances of humanity from a character without a face,” says Julia Flanagan, Lead Animator. “Every scene in which the robot appeared had to carry a message about the enigmatic nature of his character. The challenge was in making acting choices in this context while still maintaining his robotic nature.”
There was another, more practical, consideration too: many shots throughout the show featured an in-camera actor wearing a physical suit. “Some sequences would go from the actor, to our CG character, back to the actor. And the actor moved in a very specific way for a 6ft 2 man in a giant suit – not like a normal person at all. So that was a unique thing to match in the performance without breaking the viewer’s immersion.”
Motion capture was used for specific shots, which would give the team the human aspect of the actor. Animators would then “roboticise” the performance using keyframe animation, adding small details that cemented the character into his narrative role.
The CG model the animators had to puppeteer is certainly impressive – a muscular hulk of shifting armour plates encasing an inner blue light. That aesthetic came at a geometric cost, however. “Each of those armour plates was a rigid piece of geometry deformed from multiple controls, so we couldn’t use rigid techniques to animate,” says Embley.
“We used our own Image Engine skin cluster node instead, which makes point deformation calculations every frame. It echoes the work on Drogon’s deformations, but was drastically updated for this character.”
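Image Engine’s skin cluster node is proprietary, but the general idea of per-frame point deformation can be sketched as linear blend skinning: each point is a weighted average of copies transformed by every influencing joint, recomputed each frame. All names and the simplified transform representation below are hypothetical:

```python
def transform(point, matrix, translation):
    """Apply a 3x3 rotation matrix plus a translation to one point."""
    return [sum(matrix[r][c] * point[c] for c in range(3)) + translation[r]
            for r in range(3)]

def skin_point(point, joints, weights):
    """Linear blend skinning for one point.

    joints:  list of (matrix, translation) per influencing joint
    weights: per-joint weights for this point, summing to 1.0
    """
    out = [0.0, 0.0, 0.0]
    for (m, t), w in zip(joints, weights):
        p = transform(point, m, t)
        for axis in range(3):
            # Blend each joint's transformed copy by its painted weight.
            out[axis] += w * p[axis]
    return tuple(out)
```

Because the blend is evaluated per point, plates can be weighted to multiple controls and nudged out of the way for extreme poses – the kind of flexibility Flanagan describes below.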
Thanks to these updates the animation team could work in a flexible manner: “We were able to massage the plates into whatever position we needed, so if the robot was in a weird position or lifting his arms above his head, we could pull a plate out of the way to make that movement work,” says Flanagan. “That allowed us to focus on drawing out that emotive performance, instead of worrying about the technicalities of a rig.”
This final point is a solid summation of all we have discussed – and it brings us neatly back to our opening question.
A great creature performance isn’t rooted in an either/or dichotomy between subconscious understatement and overt action. It’s rooted in a union of both; a gestalt effect that wows and immerses simultaneously.
And while this gestalt effect is certainly driven by the imagination of those animators who studiously scrutinise reference material and inject life into the meshes with which they work, it is, as Neubig concludes, equally the result of the rigs that grease the gears and allow for such grace in movement to be more easily achieved – whether it’s for intense scenes of robot battle or the smallest ripple of a hide as a creature exhales warm breath.
Excellent character performance is found in the coupling of the pronounced and the subliminal; the creative and the technical – in the multiple layers of research, talent, technical wizardry and painstaking animation that must work in harmony. When that happens, audiences get to see pixels come to life on screen.