The Airborn Studios team explained how the materials for the project were made, spoke about its approach to architecture, revealed how it achieved a painterly look, and discussed its custom solution for creating clouds.
80.lv: Please introduce yourselves to our readers. How and when did you all team up?
Steffen Unger, Character Lead: I’m one of the founders of Airborn Studios and general Character Lead. I’ve contributed to games like Ori and the Blind Forest, Overwatch, and Fortnite among others. I got to helm the character pipeline of the Showcase.
Benjamin Sauder, Environment Lead: I’m one of the Environment Leads at Airborn Studios and was a Technical Lead on the Showcase project.
As for our history? Steffen Unger and I got to know each other while working at another game development studio (Spieleentwicklungskombinat Ost) way back. It’s when the first ideas for Airborn came up.
The aforementioned studio got shut down after its project got completed, but Steffen and I stayed in touch, kept at it, and posted our work in the Digital Art Forum, which used to be a hub for German-speaking artists at that time. And gradually, more artists flocked to it and joined – and that’s also how Johannes and Simon got involved among others. Since then, we've also worked together on numerous other projects, while still steadily expanding the world of Airborn and developing new prototypes.
80.lv: How did your level artists approach initial layouts? What were the goals? Where and how did they prepare blockouts?
Benjamin Sauder: When building an environment, one of the usual questions is where the viewer's or player's attention will be – and where it has to be. It’s one of the factors that inform our approach to tackling a given scene. The blockouts were very rough models; Simon did most of the work on this, as he was also responsible for a first draft of what the flow would be. The whole animatic part was set up and shaded in Blender – I then did a slight cleanup pass, as concept models tend to be a bit too messy for actual usage, and moved everything over to Unreal.
Cliffs and Vertical Structures
80.lv: How did you craft different cliffs and vertical structures? Did you use a kind of modular system to streamline the production or craft everything manually?
Benjamin Sauder: Yes, these rocks and cliffs are built as kits. It’s not built like a grid-snapped modular kit, but more like a collection of well-complementing pieces (from small to big, straight to curved, flat to round, etc.) that enables us to build bigger structures.
We had a pretty good storyboard cinematic coupled with quite a few nicely painted concept designs for each scene – so the requirements were always pretty clear from the very beginning.
The workflow for these assets is almost always the same – as a first step, we build a block mesh. This can then be placed in the scene, and we can verify that the big shapes are working. Once all is looking good, it can be detailed and finalized. All the rocks got a sculpt pass – as we started this as an Unreal Engine 4.25 project, most of the models still use a low poly model and baked-down textures – we could leverage Unreal Engine 5 Nanite on quite a few pieces later on.
Each of the different biomes needed its own small asset library. The jungle theme was our biggest set overall because we knew right from the start that it would be the theme that gets the most screen time. The kits are often just a handful of pieces – the thick vegetation layer made it possible to hide the heavy reuse quite well. Modular kits really help to fill the world – but to make things shine, you always need a healthy amount of unique pieces to keep it interesting.
80.lv: Please tell us about your approach to architecture. How did you work on different buildings? How did you assemble the city we saw in the middle of your demo? What modular elements did you craft for the project and how did you modify them?
Benjamin Sauder: As with all our environment work, the approach and level of detail depend on whether the player/camera can get really close to an object or if it’s a backdrop piece that one should rather not spend too much time on. In either case, the settlement surely was the biggest set piece to be built.
The buildings are put together from small kits – the building body, the roof, and then add-ons like platforms, windows, stairs, etc. The foreground buildings are built with a mix of tiling textures and unique pieces. To make the place come to life, it needed a heavy set-dressing pass with lots of props and decals on top of everything.
Simon [Kopp, Art Director/Concept Artist] defined which materials the buildings are constructed from, and then we created the wood, wicker, plaster materials, etc.
Production-wise this was all pretty standard – we authored the models in the usual applications – 3ds Max, Maya, Blender, or ZBrush, whatever the artist was most comfortable working with – and then texturing happened in Substance 3D Painter & Designer.
Manuel Virks (Environment Lead) prepared some filters to achieve the splotchy painted texturing style, which allowed for a relatively quick and coherent texturing process. Everything you see in the trailer was handcrafted to get a nice consistent visual language.
Assembling the World
80.lv: How did you fill your universe and scatter details when assembling the world? How did you make sure everything looked readable? What were the challenges? How difficult was it to find the right organic look?
Benjamin Sauder: Set dressing was a huge part of this project, and it took some time to get everything into its current state. Most assets are hand-placed, but the Unreal foliage tool was also used in places. It’s very important to balance areas from heavy to low detailing. The detailing helps the overall readability and is also supposed to visually guide the audience to specific parts. After all, that’s part of the job when doing level art.
To ensure everything is readable, it’s important to keep the viewer's main path clear and not obscured by too much detail. The use of bright and dark areas, as well as negative space, helped a lot to structure the scenes. For our look, it was also important to have distinct depth layers with color variations. In some scenes, we layered in some Kuwahara filtering in the background, which reduces detail but keeps interesting shapes. All of this is surprisingly hard and can take quite some time to get right.
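To illustrate the idea behind that background treatment – this is a minimal Python/NumPy sketch of a naive Kuwahara filter, not the studio's actual implementation (in Unreal this would live in a post-process material):

```python
import numpy as np

def kuwahara(img, radius=2):
    """Naive Kuwahara filter on a 2D grayscale array.

    For each pixel, look at the four overlapping (radius+1)^2 quadrants
    around it and output the mean of the quadrant with the lowest variance.
    This flattens fine detail while keeping strong edges and shapes intact.
    """
    h, w = img.shape
    padded = np.pad(img, radius, mode="edge")
    out = np.empty_like(img, dtype=float)
    for y in range(h):
        for x in range(w):
            py, px = y + radius, x + radius  # position in the padded image
            quadrants = [
                padded[py - radius:py + 1, px - radius:px + 1],  # top-left
                padded[py - radius:py + 1, px:px + radius + 1],  # top-right
                padded[py:py + radius + 1, px - radius:px + 1],  # bottom-left
                padded[py:py + radius + 1, px:px + radius + 1],  # bottom-right
            ]
            # The most homogeneous quadrant wins – that's what preserves edges.
            best = min(quadrants, key=np.var)
            out[y, x] = best.mean()
    return out
```

Run on a hard step edge, the filter leaves the edge in place while averaging away texture inside each region – exactly the "less detail, interesting shapes" trade-off described above.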
Getting the lighting and mood right was crucial and a long-winded process. It was sometimes difficult to hit the concept vision for each scene. Another challenge was to make shots feel different from one another even though they are placed in the same environment, using mostly the same assets.
We also faced a few technical challenges – most notably, flying through the sky with a cloud landscape was a big obstacle. Luckily, Unreal 4.26 added the volumetric cloud system right in time for us – but it still needed quite some head-scratching to get a usable system out of it. We couldn’t really use the typical approach of layering volume noises to fill the whole sky, as we needed much more local control over the clouds. Manuel and I developed a small system that made it possible to place cloud blobs in the scene exactly where we needed them. The sky system is quite taxing on performance, but it was fine for our needs.
The Most Time-Consuming Tasks
80.lv: How much time did it take to assemble this enormous unique world? What tasks were the most time-consuming?
Benjamin Sauder: The majority of the work on the prototype was done between September 2020 and September 2021. The first 3-4 months were pre-production to define the general look and feel and to do some world-building. The first phase of that was an analysis of the older Airborn work to assess what the immovable core of the IP is that has to be reflected and honored in the new showcase.
While concept work kicked off, some 3D artists did a bit of R&D work on shaders, foliage, and other elements. Once the first concepts were finalized, we picked an exemplary character, environment setting, and vehicle to define the 3D workflows and use them as proof of concept.
3D asset production was mostly straightforward then, but there still were some technical challenges. Unreal Engine 5 got announced and went into Early Access as we were about 7 months into development. Transitioning to that came with the expected risks – it was Early Access after all with some features still missing or being WIP and Epic Games not providing any official support at that point.
Steffen Unger: On the character side, one of the biggest challenges has certainly been hair grooming. Until recently, our main task on hair was always to model or sculpt it – the way Overwatch or, in a more complex form, Fortnite are doing it right now: very strand-based haircuts with little to no natural breakouts.
We wanted to try something else and check out what Unreal’s capabilities are. Grooms are only going to become bigger in games – they will be more present in the future. So we wanted to figure out a way to bring some nice stylization into them.
At the time we got into the topic, there was little to no documentation at hand, so we had to improvise a lot. Thankfully, Michael Cauchi made a great tutorial series on XGen, which was the tool best documented for Unreal use at the time, so we focused on using that one.
Besides, we certainly had a long R&D phase on how to get the painterly look into the characters without having to paint every brushstroke by hand.
In general, managing technical risks was one of the challenges. Most of the work on the Airborn Showcase was done between September 2020 and September 2021, and when everything kicked off, there was no Unreal Engine 5 yet. We had speculated that it would follow along with the new console generation but didn’t have any definitive info or timeline on that.
UE5 was announced in Spring 2021 and went into Early Access as we were 7 months into development. We always planned for everything to wrap up in September of that year, but since we were determined to use the latest technology available to us and also tap into features like Lumen and Nanite, we decided to transition from UE4 to UE5. It was a bit of an R&D endeavor since the Early Access version was not officially supported by Epic Games, but we had set aside some buffer time for a potential engine upgrade early on.
80.lv: What tools did you use to create materials for the project? Was it mostly about crafting in Substance 3D Designer?
Steffen Unger: For the characters, we pretty much stuck with Substance 3D Painter. We tried a few things in Substance 3D Designer, but SP was able to cover our needs for this project efficiently.
Benjamin Sauder: On the environment side of things, we employed Substance 3D Designer for all big and modular parts of the levels. For more unique assets, we used Substance 3D Painter in tandem with a few premade generators and materials from SD.
80.lv: Did you use AI-powered workflows to generate new surfaces faster?
Steffen Unger: Not at all, everything is handcrafted. Could AI be used? Probably yes, but we would have to train it on our own material, not just stuff off the internet.
Bringing Painterly Feel
80.lv: What were your main goals when creating materials for the demo? What were you aiming for? You wanted the project to be stylized but also show a certain level of realism, right?
Steffen Unger: We wanted to bring back some painterly feel within a physically based rendering environment. While we all love the way Disney or Pixar treat their surfaces, we felt like these are too clean and realistic. We wanted to get a sense of handcrafted materials that still show some brush strokes here and there and are rough around the edges – not perfectly polished, but nicely noisy and detailed.
80.lv: Could you also share some details on how you texture characters? Was it mostly about texturing details in SP?
Steffen Unger: We wanted to achieve a painterly-looking PBR-based style for the characters. While we do enjoy the cross-hatched look we used in some of the earlier iterations of Airborn, we decided to go for a more subtle approach. Back when we worked on this, there hadn’t been many games doing a cross-hatched look with outlines successfully yet. Nowadays, there are plenty of games that do this near perfectly. We felt like we wanted to play with something else, bringing similar ideas into a more physically based shading system.
A huge part of the texturing and shading of these characters is based on stylized normals, as well as actually groomed hair, no sculpted hair meshes or hair cards.
Importing Materials in UE5
80.lv: How did you import materials in UE5 and set up the desired look? Did you have to tweak something? What shaders did you set up in Unreal? It’d be great if you could share a couple of examples.
Steffen Unger: We tend not to trust Substance's preview very much in general. The vast majority of our productions, even though often PBR-based, are stylized enough that a full 100% conversion from Substance 3D Painter to the respective engine does not happen. Materials have to be tweaked and fine-tuned with the engine in tandem. Doing an asset fully and solely in Substance 3D Painter or other texturing tools is pretty much never a thing.
80.lv: How much time did it take to prepare all the materials and get the desired look? What were the main challenges? What tricks did you use to streamline the process?
Steffen Unger: We had a pre-production phase of a couple of weeks to months, but it kind of had to flow into production to stay within the government funding. We would have loved to spend more time in vis-dev, but the way the subsidies are set up did not allow for that.
So initially, a lot of time went into testing and feeding the Unreal hair system, which had not been documented very well at that point.
At the same time, we tested how to stylize our textures and how much could be done in texturing versus shading. We experimented a lot with creating brush strokes and stylized normals in various tools but settled on the simplest solution: a paint filter for Adobe Photoshop, which did what we wanted better than what we achieved with Substance 3D Designer ourselves, so we didn't spend extra time trying to figure this out entirely on our own.
80.lv: What were your goals for clouds in the project? What results did you want to achieve?
Steffen Unger: While impressive, the cloud solutions currently in use for UE4 or UE5 fall into two camps: great-looking but fully procedural skies based on stacked noises, or art-directed (and also great-looking) but very expensive clouds based on volumetric textures, for example from tools such as Houdini.
We wanted a solution that is lighter weight but can be fully art-directed. As we had concepts for all the cloudscapes, the goal was to match these as well as possible – and none of the solutions we knew at that point delivered that.
80.lv: You built a custom solution, right? What is it based on? Did you use the standard Unreal tools as the foundation?
Steffen Unger: Yes, it's absolutely the UE volumetric cloud system. But instead of stacking a lot of noises to build an entirely random sky, or using only a handful of clouds because of the expensive textures, we went a route that is comparable to metaballs: we have cloud spheres that write into the Unreal Engine volume. We can adjust them – the density, the tiling of noises, the size and height of the shapes – and also tweak colors. The system is pretty versatile and could easily be expanded, say, to support dark thunderclouds and lightning.
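The metaball-style accumulation described here can be sketched as follows – a hypothetical Python stand-in, not the actual UE material code (in the engine, this logic would feed the volumetric cloud material):

```python
import numpy as np

def cloud_density(points, spheres):
    """Metaball-style density field from hand-placed cloud spheres.

    points:  (N, 3) sample positions in the sky volume.
    spheres: iterable of (center, radius, strength) – one entry per
             hand-placed cloud blob.
    """
    density = np.zeros(len(points))
    for center, radius, strength in spheres:
        # Normalized distance to the blob center: 0 at the center, 1 at the surface.
        d = np.linalg.norm(points - center, axis=1) / radius
        # Smooth falloff that reaches exactly 0 at the boundary, so overlapping
        # blobs blend into one shape instead of showing hard seams.
        falloff = np.clip(1.0 - d * d, 0.0, 1.0) ** 2
        density += strength * falloff
    return np.clip(density, 0.0, 1.0)
```

Because every blob is an explicit, editable scene entry, clouds can be dragged exactly where a shot needs them – the local control that the stacked-noise approach lacks.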
Testing Different Variations of Clouds
80.lv: How did you tweak clouds and test different variations? Please tell us about the speed here.
Steffen Unger: Initially, we played with several of the publicly available solutions and checked whether they were feasible. While they created really impressive cloudscapes, they lacked the controls we needed.
Achieving a Realistic Look
80.lv: What parameters did you customize to achieve a realistic look? How did you recreate real-life dynamics?
Steffen Unger: Here we are not doing anything different from the Unreal system – just instead of stacking the 3D noises to build an entire sky, we only stack them inside our cloud spheroids. We have global and local parameters to fine-tune how it looks.
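A rough sketch of what "stacking noises only inside a spheroid" could look like – the hash noise is a cheap stand-in for Unreal's real 3D volume noises, and `coverage` / `noise_scale` are illustrative global/per-blob parameters, not the studio's actual parameter names:

```python
import numpy as np

def value_noise(p, seed=0):
    """Cheap hash-based noise in [0, 1] – a stand-in for a real 3D volume noise."""
    i = np.floor(p).astype(np.int64)
    h = (i[:, 0] * 73856093) ^ (i[:, 1] * 19349663) ^ (i[:, 2] * 83492791) ^ seed
    return (h % 65536) / 65535.0

def fbm(p, octaves=4, lacunarity=2.0, gain=0.5):
    """Stack noise octaves, each at higher frequency and lower amplitude."""
    total, norm, amp, freq = 0.0, 0.0, 1.0, 1.0
    for octave in range(octaves):
        total = total + amp * value_noise(p * freq, seed=octave)
        norm += amp
        amp *= gain
        freq *= lacunarity
    return total / norm

def spheroid_density(p, center, radii, noise_scale=1.0, coverage=0.5):
    """Evaluate the stacked noises only inside one cloud spheroid.

    `coverage` acts as a global erosion knob (how much of the noise
    survives), `noise_scale` as a per-blob noise tiling knob.
    """
    d = np.linalg.norm((p - center) / radii, axis=1)
    mask = np.clip(1.0 - d * d, 0.0, 1.0)       # 0 everywhere outside the spheroid
    detail = fbm(p * noise_scale)
    eroded = (detail - (1.0 - coverage)) / max(coverage, 1e-6)
    return np.clip(mask * eroded, 0.0, 1.0)
```

The key point is the `mask` multiply: the noise never contributes outside the spheroid, so the rest of the sky stays empty and cheap instead of being filled by a global noise field.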