Quixel’s Rebirth: Making a Real-Time Photorealistic Cinematic

Dan Woje, Teddy Bergsman Lind, and Wiktor Öhman from the talented Quixel team shared with us some of the details behind the production of Rebirth, a real-time photorealistic cinematic designed to show the power of UE4 and Megascans.

Check out Rebirth and the lecture covering its production below:

Project Rebirth

80lv: Could you guys tell us how this project started? Did you think about it before your trip to Iceland, or did you come up with the idea after you'd already scanned all of the content? What were the main goals of this project?

Dan: We had been having discussions for some time about bringing partners together to collaborate on a project. We knew we wanted to do something big – something ground-breaking, and we had been throwing ideas around about what this may look like since 2017. We settled on the goal of creating the most photorealistic real-time cinematic ever.

Our trip to Iceland in mid-2018 wasn’t specifically for the Rebirth project, however, we had wanted to visit that place for a very long time to scan assets for the Megascans library. It is a beautiful, unforgiving, and exceptionally diverse country with unique biomes that we were really excited to capture.

Following this trip, we realized that the Iceland scans would lend themselves to the kind of cinematic we had been dreaming of for so long, and so the Iceland assets became a pillar for the project.

Cost of the Resources

80lv: The project’s main “wow” factor, apart from the STUNNING visual quality, is the way it’s all done in UE4. So here’s our next question: how expensive are the Megascans assets and resources you’re using in the project?

Teddy: One of the aims of the project was to prove that the kinds of results we achieved with Rebirth are accessible, not only for studios with huge budgets and large teams but also for freelancers and independents. If an individual wanted to download and use the assets we did in Rebirth, it would cost the equivalent of around $75. While Rebirth may look complex, in reality we used a small set of assets throughout the entire cinematic. As each Megascans asset is quite visually complex, this allowed us to repurpose the same assets across many different shots whilst creating a varied look.

Running in Real-Time

80lv: It would be awesome if you could tell us how it's even possible to have all these amazing elements running in real-time in UE4. You've obviously got a lot of LODs already done, but Galen mentioned in one presentation that the team actually used a lot of Houdini to build game-ready LODs for assets by stripping out a lot of the super high-density detail. The whole process seems to be fully automated, which must have been a great time-saver. Could you guys talk a bit about that?

Dan: Since we were creating a cinematic rather than a playable environment, we initially ended up with high-resolution assets at around 1.5 million polygons. Game LODs were too low-poly for what we wanted, so we settled on a 250K-polygon target for the assets. The assets held up well at that resolution and gave us some overhead and wiggle room with the frame rate. Working at 24 frames per second meant we could work in real-time while keeping the quality high.

It was possible to have these elements running in real-time partly because of the versatility of the assets and partly because of their quality. The assets are incredibly detailed and scanned in such a way that they can be used dynamically. We could copy them around quickly and build up large areas of detail. Therefore, whilst certain shots might appear to have hundreds of assets in them, in reality we only used a few. Using high-quality assets efficiently meant that it wasn't too taxing to run in real-time, even at 250K polygons.

We have our own LODing pipeline for the Megascans library, and SideFX helped us with a custom tool for generating the high LODs for the project. This was fully automated, which meant it took only a day and a half to process the assets. We could then bring them into UE4 and get them palette-ready.
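
As a rough illustration of what such an automated decimation pass can look like, here is a minimal sketch using Houdini's Python API. The file paths, the 1.5M source density, and the PolyReduce parameter names are assumptions for illustration – this is not Quixel's actual pipeline.

import hou

# Batch-reduce scanned assets toward the ~250K polygon budget mentioned above.
ASSET_FILES = ["/scans/rock_a.bgeo", "/scans/cliff_b.bgeo"]  # hypothetical inputs
SOURCE_POLYS = 1_500_000   # assumed source density
TARGET_POLYS = 250_000     # the budget from the interview

geo_net = hou.node("/obj").createNode("geo", "lod_batch")

for path in ASSET_FILES:
    loader = geo_net.createNode("file")
    loader.parm("file").set(path)

    # PolyReduce 2.0 decimates while preserving silhouette; the "percentage"
    # parameter name is assumed and may differ between Houdini versions.
    reducer = geo_net.createNode("polyreduce::2.0")
    reducer.setInput(0, loader)
    reducer.parm("percentage").set(100.0 * TARGET_POLYS / SOURCE_POLYS)

    # Write the reduced asset next to the source file.
    writer = geo_net.createNode("rop_geometry")
    writer.setInput(0, reducer)
    writer.parm("sopoutput").set(path.replace(".bgeo", "_lod250k.bgeo"))
    writer.parm("execute").pressButton()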

Natural Assets Production

80lv: We're amazed by all these incredible assets you've got in the scene. It would be great if you could help us understand how your team adapts natural assets, turns them into usable elements, and polishes them up to achieve the flexibility and versatility needed for all kinds of situations later.

Teddy: The most important aspect is determining which assets to scan in the first place. For this, we employ what we call an "Ecosystem Philosophy". An ecosystem is essentially a large collection of elements that all work together to form a cohesive whole. To create a new environment set, we capture many elements from the same location, so that when the environment is recreated, all the pieces work together seamlessly.

Assets are captured in various sizes to reproduce the full natural range of an area, including not only objects but also large surfaces and fully scanned vegetation. Unique variations are captured to recreate the natural variety of shapes and textures. Having a predetermined palette to draw from when creating a scene also simplifies art direction and gives artists a solid base to build upon.

A key advantage of ecosystems is how much they accelerate the production of a photorealistic environment. Since the overhead of producing assets that work perfectly together is reduced, an artist can immediately focus on the final look. And with the way the assets are prepared, the library caters to every level of artist and every aspect of the industry, which was especially useful for this project, where we aimed to bridge the gap between the games and VFX industries.

Assembling Rebirth

80lv: Could you talk about the way you’ve worked on the assembly process? Rebirth was presented as a video, so we’re guessing you were more preoccupied with managing a good shot than building a playable environment. It would be awesome to learn more about your process and the way you approached it. Was it more like a matte-painting post-production story or did your team want to build most of the scenes in UE4?

Teddy: One of the things that makes Rebirth unique is that the small team was made up of individuals from games, VFX, and architectural visualization. This convergence of 3D industries is something that Quixel believes in and supports. Victor Bonafonte, CEO of the ArchViz company Beauty and The Bit, was the art director for Rebirth. We handed our reference, scouting, and asset photos from the scanning trip over to him, and he created concept art based on those, adding his own ideas along the way.

Victor worked on this shot by shot. Some of the shots we tried to replicate exactly, and for others the artists put in their own twist. There was a lot of creative exploration to make it work, and all final pixels were supervised by Dan, who had worked in VFX at Blur for seven years. Everything was built 100% in UE4, with no matte-painting at all.

Scale

80lv: Another great element of the film is its sheer scale. How did you manage to fill the whole space with content? Did the artists do all of that by hand, or did you rely on procedural scattering? What are the most efficient ways to build such a large space filled with huge structures?

Dan: Since we were working on a per-shot basis, we didn't build a whole world, so in this sense the scale and expanse were just an illusion. The artists built everything using standard set dressing and placement of assets, with a few exceptions. Some scattering tools were used, and while we tested procedural scattering, in the end we opted for hand-placing everything in order to achieve the complexity we desired.

Luiz (SideFX): This relates to something Dan touches upon a little bit later, which is the challenge of the compressed timeline. There wasn’t a lot of time to iterate on the tools to get them to where they needed to be. Some tools were easier to assemble and were extensively used, others, like the new scatter tool, were developed a bit too late to make the cut.
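
For context, a procedural scatter test of the kind described above usually boils down to a scatter-plus-copy setup. Here is a minimal Houdini Python sketch of that idea; the file paths, point counts, and node versions are assumptions for illustration.

import hou

geo = hou.node("/obj").createNode("geo", "scatter_test")

terrain = geo.createNode("file")
terrain.parm("file").set("/scans/terrain_tile.bgeo")  # hypothetical terrain tile

rock = geo.createNode("file")
rock.parm("file").set("/scans/rock_a_lod250k.bgeo")   # hypothetical rock LOD

# Scatter points over the terrain; reseeding gives quick layout variations.
scatter = geo.createNode("scatter::2.0")
scatter.setInput(0, terrain)
scatter.parm("npts").set(500)

# Instance the rock onto the scattered points (older builds use "copytopoints").
copy = geo.createNode("copytopoints::2.0")
copy.setInput(0, rock)
copy.setInput(1, scatter)
copy.setDisplayFlag(True)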

Mountains

80lv: How did you work on the mountains? It's funny to learn that you used a lot of Open Topography data from Alaska rather than Iceland. It would be awesome to hear how you decided to approach these terrains and texture them to achieve such incredible realism.

Teddy: Most of the mountains were created with the help of the amazingly talented Houdini GameDev team at SideFX. They built a system for us that allowed us to create realistic and modular mountains with procedurally generated masks that we used in tandem with Megascans rock and moss surfaces. For one of the top-down terrains, we worked with Dax Pandhi, creator of Gaea, who helped us build a fantastic looking terrain in the 11th hour of production.

Luiz (SideFX): Joe Garth was, from the very start, pretty adamant about using real-world elevation data to build the mountains, which gives the general shapes a realistic feel. Alaska was chosen because there isn't a lot of high-resolution elevation data for Iceland, and Alaska has similar topography that gave us some of the base shapes we needed.

A few patches of terrain were created to be kitbashed and reused in multiple shots, and that was the bulk of the terrain work. Then, there were a few shots that required a custom setup.

Houdini was used to generate masks on those base tiles that were then fed to Unreal to be mixed with the Megascans tileables. In a couple of shots, we did some procedural modeling of the terrain using basic shapes to match the concept art.
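
The mask idea itself is easy to reproduce: steep areas read as bare rock, flatter areas collect moss. Here is a generic NumPy/Pillow sketch of a slope mask of that kind – the file names are hypothetical, and this is a reconstruction of the concept, not SideFX's actual tool.

import numpy as np
from PIL import Image

# Load a grayscale heightmap as floats (hypothetical input file).
height = np.asarray(Image.open("terrain_height.png").convert("F"))

# Slope magnitude from the height gradient: larger values = steeper terrain.
gy, gx = np.gradient(height)
slope = np.sqrt(gx * gx + gy * gy)

# Normalize to 0..1 and bias the curve so mid-slopes transition smoothly.
mask = np.clip(slope / slope.max(), 0.0, 1.0) ** 0.5

Image.fromarray((mask * 255).astype(np.uint8)).save("slope_mask.png")
# In UE4, a mask like this would drive a lerp between rock and moss
# tileables in the landscape material.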

Texturing

80lv: How did you use materials here? Did you get a chance to mix some complex scans, especially for the structure? It'd be interesting to discuss how you managed to build such high-level content!

Wiktor: The entire structure was made using just a handful of concrete scans from the Megascans library as a base with some imperfection scans and some decals on top of that to add definition and weathering.

The cockpit shot was brought up very late in production. I got it on a Monday morning, and by the end of the day I had the first prototype ready. The Material Editor in Unreal and the accessible Blueprint system did the heavy lifting and allowed us to rapidly prototype the materials. The leather material for the cockpit dashboard was taken from Megascans, with some details added.

All of the handcrafted assets, namely the structure and the vehicle, were textured entirely in-engine using surfaces, imperfections, and decals from the Megascans library. We also leveraged Quixel Mixer for some of the unique surfaces, such as the barracks in one of the intro shots. Mixer let us quickly create completely new, customized surfaces from existing photorealistic materials in the Megascans library.
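
To give a sense of what driving materials from editor scripting looks like, here is a minimal Unreal editor-Python sketch that instances a master material and points it at scanned textures. The asset paths and parameter names are hypothetical, not the actual Rebirth content.

import unreal

# Create a material instance asset under a hypothetical content path.
asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
instance = asset_tools.create_asset(
    "MI_Concrete_Weathered", "/Game/Materials",
    unreal.MaterialInstanceConstant, unreal.MaterialInstanceConstantFactoryNew())

# Parent it to a (hypothetical) Megascans-style master material.
parent = unreal.EditorAssetLibrary.load_asset("/Game/Materials/M_Surface_Master")
unreal.MaterialEditingLibrary.set_material_instance_parent(instance, parent)

# Assign a scanned albedo texture and tweak a scalar for weathering.
albedo = unreal.EditorAssetLibrary.load_asset("/Game/Surfaces/T_Concrete_Albedo")
unreal.MaterialEditingLibrary.set_material_instance_texture_parameter_value(
    instance, "Albedo", albedo)
unreal.MaterialEditingLibrary.set_material_instance_scalar_parameter_value(
    instance, "RoughnessIntensity", 1.2)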

Procedural Panels

80lv: We loved that you approached the panels procedurally. Could you talk a bit about the advantages of Houdini here? Why not build them by hand? Also, how did you decide to texture them?

Wiktor: SideFX helped us build a procedural paneling tool that could iterate shapes and panels based on input. This saved us days of hand modeling, and we could use seeding to get a variety of looks quickly. SideFX also added support for vertex coloring to the tool: each panel got its own vertex color in the red, green, or blue channel. This allowed me to manipulate the reflector values and gave me more control over the separate panels.

Luiz (SideFX): As Wiktor mentioned, efficiency was behind a lot of the reasoning. Wiktor came to me with the structure partially modeled and wondered if there was anything Houdini could do to help speed the process up. This is something we're going to be looking into a lot more over the next year – how we can leverage proceduralism in the time-consuming aspects of modeling. It's relatively fast to get your base shapes done, but the detail phase is usually pretty repetitive and time-consuming.
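
A minimal reconstruction of the vertex-coloring step can be written as a Houdini Python SOP. It assumes each panel already carries an integer "piece" attribute on its primitives (for example, from a Connectivity SOP) and simply cycles the red, green, and blue channels per panel – an illustration of the idea, not SideFX's tool.

# Runs inside a Python SOP; "piece" is an assumed per-primitive panel id.
import hou

node = hou.pwd()
geo = node.geometry()

CHANNELS = [(1.0, 0.0, 0.0),   # red
            (0.0, 1.0, 0.0),   # green
            (0.0, 0.0, 1.0)]   # blue

geo.addAttrib(hou.attribType.Prim, "Cd", (1.0, 1.0, 1.0))

for prim in geo.prims():
    panel_id = prim.attribValue("piece")
    prim.setAttribValue("Cd", CHANNELS[panel_id % 3])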

Fog

80lv: Guys, you've been using a lot of very cool visual effects here, especially the fog, and as far as we understood, it was partially done within Houdini. Are we correct? What were the advantages of using Houdini to generate those amazing flipbooks used for the effects?

Dan: The volumetric fog was used out-of-the-box from UE4. Beyond that, SideFX helped us build flipbooks using a brilliant animated procedural noise system they created for us, which in this case was much more efficient than a particle simulation. We used a variety of techniques, including animated cards and custom cards made by our artists, in addition to the amazing flipbooks from SideFX.

Luiz (SideFX): The flipbook approach is definitely usable in production; it's your traditional bread-and-butter effects trick. Depending on your memory constraints, you might not have as many variants or as high a resolution, but flipbooks are the most common way of getting Houdini FX into a game engine.

We chose the procedural approach over a more traditional simulation so that the Quixel team could iterate quickly. We knew the timeline was tight, and they aren't set up with a large render farm to iterate on a full-blast smoke simulation. So, while we prototyped ideas, we chose something that would give a similar look but allow the Quixel team to iterate on the shots themselves.
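
As an illustration of the flipbook trick, here is a small Python/Pillow sketch that packs a rendered frame sequence into a single atlas texture the engine can step through. The frame paths and the 8x8 grid are hypothetical.

from PIL import Image

FRAMES = [f"fog_sim.{i:04d}.png" for i in range(64)]  # hypothetical render output
GRID = 8     # 8x8 grid holds 64 frames
TILE = 256   # per-frame resolution inside the atlas

atlas = Image.new("RGBA", (GRID * TILE, GRID * TILE))

for i, path in enumerate(FRAMES):
    frame = Image.open(path).resize((TILE, TILE))
    atlas.paste(frame, ((i % GRID) * TILE, (i // GRID) * TILE))

atlas.save("fog_flipbook_8x8.png")
# In UE4, a SubUV/flipbook material setup steps through the 64 tiles over time.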

Lighting

80lv: How did you work on lighting here? You’ve done an amazing job with setting up atmospheric lights, and we would love to know how you figured out the balance and the values.

Dan: Just one HDR light was used throughout the entire cinematic, except for the last shot. It was important that all the shots were coherent, so it was set in stone early on that we'd pick one and stick with it. We didn't use any area lights or direct lighting, just a skydome.

For the grading, we took a vertical slice of three different shots and made sure that these looked great on their own. We then used Unreal’s grading tools to achieve the look that we wanted, before tweaking the shot.
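
A single-skydome setup like the one described above can be reproduced with a few lines of Unreal editor Python; the cubemap asset path below is hypothetical.

import unreal

# Spawn a skylight and point it at one HDR cubemap for the whole scene.
sky = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.SkyLight, unreal.Vector(0, 0, 0))
component = sky.get_component_by_class(unreal.SkyLightComponent)

# Use a specified cubemap instead of capturing the scene.
component.set_editor_property(
    "source_type", unreal.SkyLightSourceType.SLS_SPECIFIED_CUBEMAP)
cubemap = unreal.EditorAssetLibrary.load_asset("/Game/HDRI/T_Sky_HDR")
component.set_cubemap(cubemap)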

Challenges

80lv: Overall, what would you say were the biggest challenges during the production of this project? How did you manage to solve them?

Dan: One of the biggest challenges was being in an R&D state whilst simultaneously having to deliver shots. We weren't always sure whether what we were doing was going to work until we delivered the vertical slice. We learned a lot about how to create large terrain efficiently and how to iterate on that terrain quickly.

Finally, part of the beauty of the project was that it involved working with various different partners. I'm sure the end result wouldn't have been as good without this cross-industry effort. However, having to keep stakeholders across different companies, countries, and timezones on the same page was sometimes tricky. Once this was ironed out and we worked out the most efficient ways of working, it became much easier!

Dan Woje, Creative Director at Quixel

Teddy Bergsman Lind, CEO at Quixel

Wiktor Öhman, 3D Artist & Art Lead at Quixel

Luiz Kruel, Technical Artist at SideFX

Interview conducted by Kirill Tokarev
