New ADAM Short Looks Amazing

Unity’s newest digital short, created entirely with Unity’s cinematic tools, tells the story of a tribe of human survivors.

Unity Technologies has finally released the long-anticipated third episode of the ADAM series. This time we get a look at a tribe of humans who live in the post-apocalyptic wilderness and kill robots. Pretty intense stuff.

The visuals look pretty neat indeed: wonderful work with the facial animation, very cool camera tricks. I’ve seen some of this stuff during a presentation at SIGGRAPH 2017 in LA, and there’s a lot of delightful cinematic stuff you can do with Unity now. The whole production process is actually documented on the official website: ADAM: The Mirror and ADAM: Episode 3 were achieved with Unity 2017.1.

Farewell to fossilized pipelines

To produce the next two ADAM installments, which are around six minutes each, Neill Blomkamp’s Oats Studios knew they had considerable technical challenges ahead of them. To realize their first-ever CG film “in engine,” they adopted real-time rendering and artist-focused sequencing tools from Unity 2017.1.

In just five months, the Oats team produced in real-time what would normally take close to a year using traditional rendering. “This is the future of animated content,” declares CG supervisor Abhishek Joshi, who was CG lead on Divergent and Game of Thrones. “Coming from offline, ray-traced renders, the speed and interactivity has allowed us complete creative freedom and iteration speed unheard of with a non-RT workflow.”

 

Oats wanted to make the films look as real as possible, and Blomkamp promised, “With high-definition facial capture, dense polygonal environments, and with lots of characters, it’s going to max out the allotted computational power.” What followed was an intense period of experimentation using new features.

Technical director Jim Spoto recalls, “One of the most ambitious features that we co-developed with Unity was Alembic streams for cloth and face animation. Alembic is a standard for animated geometry cache data – it’s a staple in use at VFX studios, and the richness and fidelity it provides has been crucial.”
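If you want to poke at this idea yourself, here’s a minimal sketch of scrubbing an imported Alembic (.abc) cache from script. It assumes Unity’s Alembic package (com.unity.formats.alembic) and its AlembicStreamPlayer component with CurrentTime and Duration properties; it’s just an illustration, not Oats’ production setup, where playback was driven from Timeline.

```csharp
// Minimal sketch: looping playback of a baked Alembic cache (cloth, face, etc.)
// by advancing the stream's playback time each frame. Assumes Unity's Alembic
// package (com.unity.formats.alembic); in production this is usually driven
// by an Alembic track in Timeline rather than a script like this.
using UnityEngine;
using UnityEngine.Formats.Alembic.Importer;

public class AlembicScrubExample : MonoBehaviour
{
    // Assign the AlembicStreamPlayer created when the .abc file was imported.
    public AlembicStreamPlayer cachedSim;

    void Update()
    {
        // Loop the cache by wrapping the current game time into its duration.
        cachedSim.CurrentTime = Mathf.Repeat(Time.time, cachedSim.Duration);
    }
}
```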

Coming off big-budget movie productions like Avatar and Star Wars: The Force Awakens, rigging technical director Eric Legare remarked that the Alembic integration lets the team slot Unity into any VFX film pipeline.

 

Photogrammetry for locations and clothes

True to his live-action background, Blomkamp insisted on real environments for ADAM. “We shot 30,000 photos,” he says. The crew spent time outside Indio, California, at a decommissioned iron mine, where they found ideal story locations. “We wanted the nuances and authenticity of a real environment that can’t be modeled or manufactured.” Using drones and armed with several cameras, they captured their data over a couple of days, then dressed the set with digital props.

This significantly cut down on asset-creation time. The team saw the environments instantly appear in the Editor and could never have modeled that level of detail from scratch in Maya. Not only were they able to start lighting right away, they could also start propping the set. “Seeing the virtual environment while we did motion capture with actors was really useful as well,” says production designer Richard Simpson. “It was much easier to walk around seeing where the performers should go and such.”

 


Using Marvelous Designer, the team recreated the real-world costumes and simulated the physical behavior of the cloth, just as they would for a feature film. With the real physical costumes and reference video on hand, the team then tuned the nuances of the cloth simulation to ensure that it behaved correctly.

The resulting cloth simulation was piped out using Alembic to cache it for playback in Unity. The final result is much higher fidelity than would typically be possible in a real-time engine.

Tackling CG humans

“The most difficult aspect has been RT photoreal humans,” Neill admits. Eager to put the engine through its paces, Blomkamp didn’t shy away from tackling one of the biggest challenges in VFX – humans. To achieve this, Chris Harvey focused on the two pillars of realism: shaders and animation.

Subsurface scattering (SSS) shaders are essential for believable skin. SSS is the phenomenon where light penetrates the surface of a translucent object, interacts with it, and then exits from a different location. In materials such as skin, milk, and marble, SSS plays a crucial role in creating a soft translucency.
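The article doesn’t show the actual ADAM skin shader, but as a rough illustration of the idea, one common cheap stand-in for SSS in real-time rendering is “wrap” diffuse lighting, which lets light bleed a little past the shadow terminator the way scattered light softens skin:

```csharp
using UnityEngine;

public static class SkinLightingSketch
{
    // Illustrative only: a classic "wrap lighting" term, a cheap real-time
    // approximation of subsurface scattering (not the ADAM skin shader itself).
    // 'wrap' in [0..1] controls how far light bleeds past the shadow terminator;
    // 0 gives standard Lambert diffuse, higher values give a softer, waxier falloff.
    public static float WrapDiffuse(Vector3 normal, Vector3 lightDir, float wrap)
    {
        float ndotl = Vector3.Dot(normal.normalized, lightDir.normalized);
        return Mathf.Clamp01((ndotl + wrap) / (1f + wrap));
    }
}
```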


The other challenge haunting CG artists working on humans is animation. “The uncanny valley occurs when we animate a face but cannot animate the perceptible yet invisible tiny movements of the face,” says Eric Legare. “This creates an uneasy feeling like we are watching something fake or unreal.” To avoid this problem, the Oats crew decided to lose the facial rig.  

Blomkamp explains how they captured traditional motion data for the body, then did something quite different for the head: “We scanned the actors’ facial performance by photographing them the way you do an environment, using photogrammetry. This was done in high resolution at 60 frames a second. We ended up with 60 heads, which translated as 60 different meshes deforming in Unity upon playback.” “Sort of like classic Mickey Mouse animation,” he adds.

This hyper-realistic facial animation dispenses with rigs, bones and all the trappings of classic 3D animation – all it needed was Alembic support.
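As a rough sketch of that “flipbook of heads” idea, here is a hypothetical component that swaps in one scanned mesh per captured frame. It is not the actual ADAM pipeline, which streamed the sequence through Alembic rather than juggling individual Mesh assets like this:

```csharp
using UnityEngine;

// Hypothetical illustration of mesh-sequence ("flipbook") facial playback:
// one scanned head mesh per captured frame, swapped in at 60 fps.
[RequireComponent(typeof(MeshFilter))]
public class HeadSequencePlayer : MonoBehaviour
{
    public Mesh[] headFrames;           // one scanned mesh per captured frame
    public float framesPerSecond = 60f; // capture rate quoted in the article

    MeshFilter meshFilter;

    void Awake()
    {
        meshFilter = GetComponent<MeshFilter>();
    }

    void Update()
    {
        if (headFrames == null || headFrames.Length == 0) return;

        // Pick the scan that corresponds to the current playback time, looping.
        int frame = Mathf.FloorToInt(Time.time * framesPerSecond) % headFrames.Length;
        meshFilter.sharedMesh = headFrames[frame];
    }
}
```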

 

Custom Render Textures

The Mirror’s hypnotic eyes involved a different setup, as technical director Jim Spoto explains: “It literally wouldn’t have been possible without the new features like the Custom Render Texture functionality.” The team used the Alembic-streamed animation for her face but also had the engine run a custom shader involving GPU tessellation (via a compute shader), with the vertex animation generated procedurally on the GPU from a mask.
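As a hedged illustration of the Custom Render Texture feature itself (not the actual Mirror shader), a script can create one and let a material regenerate its contents every frame:

```csharp
using UnityEngine;

// Minimal sketch: a CustomRenderTexture whose contents are redrawn every frame
// by an update material. The shader on 'updateMaterial' is where a procedural
// pattern (in ADAM's case, the animated eye effect) would actually be computed.
public class EyeTextureExample : MonoBehaviour
{
    public Material updateMaterial;   // shader that writes the procedural pattern
    public Renderer targetRenderer;   // surface that should display the result

    void Start()
    {
        var crt = new CustomRenderTexture(512, 512)
        {
            material = updateMaterial,
            initializationMode = CustomRenderTextureUpdateMode.OnLoad,
            updateMode = CustomRenderTextureUpdateMode.Realtime
        };
        crt.Initialize();

        // Feed the continuously updated texture into the character's material.
        targetRenderer.material.mainTexture = crt;
    }
}
```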

For more information, see Custom Render Textures in the Unity User Manual.  

Timeline forms ADAM’s backbone

Another component that helped Oats tell the ADAM story, while ensuring the team’s workflow scaled well, was Timeline, a sequencing tool that was used for animation and scene management.

The studio nested dozens of timelines within a single master timeline, as technical director Mike Ferraro explains: “We broke the film into sequences so each could be worked on simultaneously. Within each sequence we nested timelines to further divide the work – animation, Alembic caches, and FX. It even helped for things like background crowds, where a whole group of characters has its animation sequenced in a timeline that’s used as a ‘clip’ in the sequence, making it easy to adjust and offset that group’s overall timing per shot.”
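As a rough, hypothetical sketch of that per-discipline structure (in practice this is authored in the Timeline window, and nested timelines are referenced through Control Track clips on a master timeline), a sequence TimelineAsset can be assembled from script with one track per department and handed to a PlayableDirector:

```csharp
using UnityEngine;
using UnityEngine.Playables;
using UnityEngine.Timeline;

// Hypothetical sketch: build a sequence timeline with one track per discipline,
// mirroring how the ADAM work was divided, then play it through a director.
public class SequenceBuilderExample : MonoBehaviour
{
    public PlayableDirector director; // director that will play this sequence
    public Animator heroAnimator;     // character driven by the animation track

    void Start()
    {
        var sequence = ScriptableObject.CreateInstance<TimelineAsset>();

        // One track per discipline.
        var animTrack = sequence.CreateTrack<AnimationTrack>(null, "Animation");
        sequence.CreateTrack<ControlTrack>(null, "Alembic Caches"); // would reference cached sims
        sequence.CreateTrack<ControlTrack>(null, "FX");             // would reference FX sub-timelines

        // Bind the animation track to a character and play the sequence.
        director.playableAsset = sequence;
        director.SetGenericBinding(animTrack, heroAnimator);
        director.Play();
    }
}
```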

“Thanks to its real-time rendering capability, it doesn’t even feel like I’m working while I am in-engine,” adds lighting artist Nate Holroyd.

 

Guerrilla VFX filmmaking

Blomkamp admits, “I have been obsessed with real-time graphics since I was around 16. It feels like some 21st-century playpen of creativity.” Blomkamp’s background as a VFX artist, and the crew he’s assembled at his studio, mean that Oats can compete with most major studios in terms of quality.

Trailblazing with new tools is second nature to them. “I believe that there is no such thing as standing still,” Chris Harvey says, “You are either moving forward or you are moving backwards . . . I want to be pushing ahead.”
