The Epic Games Team on Lumen, MetaHuman, UE5.2 Demo & AI

Epic Games CTO Kim Libreri, VP of Engineering Nicholas Penwarden, and VP of Digital Humans Technology Vladimir Mastilovich discussed the company's GDC 2023 announcements, explained how UE5's Lumen came to be, talked about the production process behind the UE5.2 demo, and shared some behind-the-scenes info about Unreal Engine's procedural tools and MetaHuman.

Left to right: Vladimir Mastilovich, Kim Libreri, Nicholas Penwarden

Epic Games' GDC 2023 Announcements

80.lv: What are you most excited about with recent announcements?

Kim Libreri, CTO at Epic Games: It's very hard to pick one thing because they all fit into the vision that we've had for the last decade coming together. The MetaHumans, for example. They're called MetaHumans because they're going to be essential for the shared vision of a metaverse we laid out at GDC. At the foundation of that vision, and of everything we do, is, of course, Unreal Engine. Fortnite is built on top of UE5, and built on top of Fortnite is UEFN, which is a continuation of Epic's philosophy of helping people succeed as much as possible.

Bringing all these things together is really what it's all about for us and UEFN is about what we can do that's more than just the engine, or MetaHumans, or Quixel Megascans, or all the components that we have right now. We're very proud of MetaHuman Animator, for example, that we’ve been working on for around three years. But more than anything, we want to connect developers directly to the massive community of players that are hungry for content and experiences. 

So, it's hard to pick the best. The best thing from GDC might not be a product at all, but the reaction of the people in the room: when we moved that cliff structure, the whole room lit up in amazement, and it was like, "Okay, we're doing something right."

80.lv: I'm most excited about Lumen. Every time developers download a new Unreal Engine build and play around with it, it leaves them speechless. 

Kim Libreri: Did you ever hear the story about how Lumen really came about? I've been at Epic for nine years now, and during the early days of Unreal Engine 4, you had this foggy Global Illumination. Daniel Wright and some of our immensely talented Graphics Programmers had been trying to figure out how to do real-time global illumination for the longest time, and we couldn't do it. We couldn't quite work out how to do it on the hardware at that point. We got to the point where we had a Distance Field-based bounce off the ground, but we parked it there.

Until one day when Brian Karis, who led the development of Nanite, started to do the prototype. Our team has always had this hunger to get movie-quality graphics into video games. We all love video games, and a bunch of us had experience working on movies in the past. We had to work out how to do this, and with Brian getting Nanite working, we realized that finally, we had geometry that starts to rival what we can do in feature films. But with all that resolution, the lighting had to be better.

That's where it came full circle and became a call to action for Daniel and our team, who'd sustained a passion to see global illumination in Unreal Engine become a reality. UE5 without a Global Illumination solution to go hand in hand with Nanite would have been a visual failure. We would have been in the Uncanny Valley, and that was the motivation.

Nicholas Penwarden, VP of Engineering at Epic Games: And there's another component too: as we were looking to build Unreal Engine 5, we wanted to make sure Unreal was capable of being an open-world game engine with massive worlds, to the point where it would be totally impractical to rebuild lighting with an offline solution.

Plus, with a large open world, you're generally going to want a time of day, and you're going to have lighting conditions changing constantly. If we couldn't make Dynamic Global Illumination work, we wouldn't have had the right solution for open-world lighting. All of our technical plans for Unreal Engine were based around having not just Nanite, but Dynamic Global Illumination to support making sprawling open worlds.

Kim Libreri: We knew we were going to bring MetaHuman Animator to GDC, but on the Unreal Engine feature side, we didn't really have plans for a big demo. Quite frankly, the engine team had been working really hard: getting UE5 out, upgrading Fortnite to UE5, getting The Matrix Awakens: An Unreal Engine 5 Experience out, getting UE5 into customers' hands, and then getting Fortnite to run with Nanite and Lumen. Initially, we weren't planning to do anything too big at all. We were just going to show some customer showcases and demonstrate the amazing work everybody is doing with UE5, and now that we look at all the work that the community is producing, we know that the work has paid off.

The Quixel team had also been scanning the Pacific Crest Trail in California, which goes along the Sierra Nevada mountains. That's where they started to put together some scenes as a demonstration. At that point, we'd already got World Position Offset working with Nanite, and we'd got leaves and foliage working with Nanite, and they started to utilize all of these advancements we'd accomplished. It was the beginning of January, and everybody at Epic got an early look at what Quixel had made with UE5. The graphics team didn't really want to do a demo, but it looked so real, and we were all looking at it in 4K, all mesmerized by how beautiful this thing looked.

So then we look at each other across the Zoom call and we go: “Let's do it, let's do a demo.” We basically grabbed hold of the engine team and we were like: “Do you want to do it? Should we do it? Are you okay?” Then we were off to the races.

This whole thing was almost like jazz, game development jazz. The environment starts, they start laying it out, and we're looking at it with the procedural team. The procedural tools are still in the pre-release stage, but we want our customers to play with them and tell us what they think, and it's just mesmerizing, it looks so real. And Richard Ugarte, one of our producers, who produced The Matrix Awakens: An Unreal Engine 5 Experience, says it looks like some of the trails where he takes his Land Cruiser off-road here in California, and we're like: "Off-road? Hold on."

Rivian uses Unreal Engine for their Human-Machine Interface (HMI), and we're quite close to them because they are one of the first adopters to use the engine for a car, and they also use it for design, and they're a super cool, progressive team here in the Bay Area. We just ring them up and go: "Hey, could we borrow one of your cars for a demo?" "What does that mean?" "Oh, I will tell you." Thankfully, they're super kind and they let us do that. And then with the physics team, it was like: "Physics team, do you think the new vehicle system in Unreal can handle this?" We did a little bit of an upgrade for 5.2, and now we even have physics on the tires deforming as they go over the rocks.

At that point, everyone is so inspired by this jazz session between all the engineers and artists that another idea develops: "Let's put in a fluid simulation!" "Are you crazy?" And it just kept growing and growing…and growing into what you saw at GDC. But it was never premeditated: "Hey, we're going to do an environment demo." It was the result of everyone adding and contributing; it was music.

Even the opal paint came out of that. We have this beautiful new material authoring system, Substrate, that can do things you could never do in a game engine before. I'm just driving home one night, and the code name for the demo was "Opal", because we just picked a letter and randomly assigned something that begins with "O", and it happened to be Opal. And I'm like, "Hold on, the demo is called Opal." That's just the story of how organically this demo came together.

Unreal Engine 5's Procedural Tools

80.lv: One of the things that struck me was the introduction of procedural tools. Can you tell us a little bit about them and how they can be used?

Nicholas Penwarden: John Sebastian, one of the developers who worked on the project, would constantly say: "We're not just making a scattering tool." When you think about the first layer of procedural content generation, it’s scattering trees and rocks, and stuff like that. But, again, "we're not just making a scattering tool".

When we showed the procedural assembly, that was the kind of tool we want to give developers, where an artist can create this beautiful assembly that can just customize itself. Here's a design element that we want to have, but rather than having an Environment Artist take a pass and spend hours set-dressing around it, it does that on its own. We also have other tools in there that we didn't show on stage, like being able to draw a spline and have that carve out a space in the forest that could be used to create a creek bed.

There's a kind of spline that forms a circle with a sort of interior so that it creates a creek bed, and that's one of the elements that communicates with the assembly we used in the demo. It was really about how we can use proceduralism to create tools that artists can then use to give themselves superpowers; again, "not just scattering tools". Of course, the tools can do the scattering. We showed some of that as well, but the cool thing is that those tools can actually lay out these other features, which then lay out a procedural path that can be moved, customized, and so on.

It's really about empowering artists to make tools for artists, as opposed to pushing a button and getting something. It is about tool creation.
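To make the "not just a scattering tool" idea a little more concrete, here is a minimal, engine-agnostic C++ sketch of the pattern described above: scatter candidate points over a region, then let a spline (approximated here by a simple polyline) carve out an exclusion zone, the way the creek-bed spline clears space in the forest. The names and structure are illustrative assumptions and do not reflect Unreal Engine's actual PCG API.

```cpp
// Minimal, engine-agnostic sketch of spline-driven scattering.
// Points are scattered over a region, then culled near a "creek" polyline,
// mirroring the idea of a spline that carves space out of a procedural forest.
// Illustrative only; this is not Unreal's PCG framework.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <random>
#include <vector>

struct Vec2 { double x, y; };

// Distance from point p to the segment ab.
double DistToSegment(const Vec2& p, const Vec2& a, const Vec2& b) {
    const double abx = b.x - a.x, aby = b.y - a.y;
    const double len2 = abx * abx + aby * aby;
    double t = len2 > 0.0 ? ((p.x - a.x) * abx + (p.y - a.y) * aby) / len2 : 0.0;
    t = std::max(0.0, std::min(1.0, t));
    const double dx = p.x - (a.x + t * abx), dy = p.y - (a.y + t * aby);
    return std::sqrt(dx * dx + dy * dy);
}

// Shortest distance from p to a polyline (our stand-in for a spline).
double DistToPolyline(const Vec2& p, const std::vector<Vec2>& pts) {
    double best = 1e30;
    for (size_t i = 0; i + 1 < pts.size(); ++i)
        best = std::min(best, DistToSegment(p, pts[i], pts[i + 1]));
    return best;
}

int main() {
    // "Creek" spline approximated by a polyline, with a clearance radius.
    const std::vector<Vec2> creek = {{0, 50}, {30, 55}, {60, 45}, {100, 50}};
    const double clearance = 8.0;

    // Scatter candidate tree positions uniformly over a 100x100 region.
    std::mt19937 rng(42);
    std::uniform_real_distribution<double> uni(0.0, 100.0);
    std::vector<Vec2> trees;
    for (int i = 0; i < 500; ++i) {
        Vec2 p{uni(rng), uni(rng)};
        // Keep the point only if it is outside the creek's carved-out zone.
        if (DistToPolyline(p, creek) > clearance) trees.push_back(p);
    }
    std::printf("placed %zu of 500 candidate trees\n", trees.size());
    return 0;
}
```

In a real procedural graph, this kind of culling would be just one rule among many, feeding assemblies that re-dress the cleared space, but the core idea of geometry-driven rules shaping scattered content is the same.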

MetaHuman

80.lv: Can you also tell us more about MetaHuman, its present, and its use cases? 

Vladimir Mastilovich, VP of Digital Humans Technology at Epic Games: The MetaHuman technology is based on 15 years of experience in building characters for open-world games. I founded 3Lateral, which joined Epic four years ago, and we have Cubic Motion, who also joined the team at Epic. Together, we specialize in runtime rigs for large open-world titles that are highly optimized both to run a large number of characters and to be able to LOD out very quickly. When, for example, there is a cutscene with one extreme close-up of a character, you want to put all the resources you have on that one character, but then, as soon as the camera pulls out and you see multiple characters, you want to be able to redistribute those computational resources without any noticeable artifacts. That's our expertise, and that's sort of what is built into the MetaHuman product.
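As a rough illustration of the resource redistribution described here (not 3Lateral's or MetaHuman's actual LOD system), one can picture a per-frame scheduler that weights each visible character by screen coverage and splits a fixed rig budget proportionally, so a close-up character absorbs nearly all of it while a crowd shares it thinly. All names and numbers below are hypothetical.

```cpp
// Hedged sketch: distribute a fixed per-frame rig budget across visible
// characters in proportion to screen coverage, so an extreme close-up gets
// almost all of it and a wide crowd shot spreads it thinly.
// Illustrative only; not the actual MetaHuman/3Lateral LOD system.
#include <cstdio>
#include <vector>

struct Character {
    const char* name;
    double screenCoverage;  // fraction of screen area this character occupies
    double budgetMs = 0.0;  // rig/animation time granted this frame
};

void DistributeRigBudget(std::vector<Character>& chars, double totalBudgetMs) {
    double total = 0.0;
    for (const auto& c : chars) total += c.screenCoverage;
    if (total <= 0.0) return;
    for (auto& c : chars)
        c.budgetMs = totalBudgetMs * (c.screenCoverage / total);
}

int main() {
    // Cutscene close-up: one character dominates the frame.
    std::vector<Character> closeUp = {{"hero", 0.60}, {"extra", 0.02}};
    DistributeRigBudget(closeUp, 2.0);  // hypothetical 2 ms rig budget per frame

    // Wide shot: many characters share the same budget.
    std::vector<Character> wide = {{"hero", 0.05}, {"a", 0.04}, {"b", 0.04}, {"c", 0.03}};
    DistributeRigBudget(wide, 2.0);

    for (const auto& c : closeUp) std::printf("close-up %-5s -> %.2f ms\n", c.name, c.budgetMs);
    for (const auto& c : wide)    std::printf("wide     %-5s -> %.2f ms\n", c.name, c.budgetMs);
    return 0;
}
```

A production system would also need hysteresis and smooth LOD transitions so the redistribution stays free of visible artifacts, which is exactly the hard part alluded to above.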

We started with MetaHuman Creator two years ago, and it enables you to create an asset that normally would take months to make, even for professionals. Now, with a sort of parametric asset, anybody can create a MetaHuman in literally minutes, even kids who do not know professional pipelines. But what's striking about it is that even though it's easy to create, it's still a professional asset. People can choose the depth at which they engage with this product, but just producing an asset is not enough.

We are slowly removing barriers to the complete adoption of the MetaHuman product. One of these barriers is that a professional user wants to be able to calibrate their MetaHuman to a high-quality 3D scan, which is a feature that we released last year. Now you can load a scan in Unreal Engine, and we have very accurate landmark detection that runs automatically and fits the MetaHuman mesh to it. It may not seem like a major feature, but it's quite an important one. As for MetaHuman Animator, we consider it to be one of the most complex features of the MetaHuman product, maybe even more complex than the original MetaHuman product, because it requires so many pieces of the engine to run in unison.
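The scan-calibration step can be pictured as a landmark-alignment problem: detect matching landmarks on the scan and on the template head, then solve for the transform that best maps one onto the other before any finer fitting happens. The sketch below uses Eigen's umeyama solver on invented landmark positions to recover a similarity transform; it is a simplified assumption about that first step only, not the MetaHuman fitting pipeline.

```cpp
// Hedged illustration of landmark-based alignment: given matching landmarks
// on a template head and on a scan, recover the similarity transform
// (rotation, translation, uniform scale) that best maps template -> scan.
// The real MetaHuman mesh fitting goes far beyond this rigid step; the
// landmark data here is invented for the example. Requires Eigen.
#include <Eigen/Dense>
#include <iostream>

int main() {
    // Four template landmarks as columns (e.g. nose tip, chin, eye corners).
    Eigen::Matrix<double, 3, 4> templ;
    templ << 0.0,  0.0, -3.0, 3.0,
             0.0, -5.0,  2.0, 2.0,
             1.0,  0.0,  0.5, 0.5;

    // The same landmarks as detected on a scan: scaled, rotated, translated.
    const double scale = 1.2;
    const Eigen::Matrix3d R =
        Eigen::AngleAxisd(0.3, Eigen::Vector3d::UnitY()).toRotationMatrix();
    const Eigen::Vector3d t(10.0, -2.0, 4.0);
    const Eigen::Matrix<double, 3, 4> scan = (scale * R * templ).colwise() + t;

    // Umeyama gives the least-squares similarity transform template -> scan.
    const Eigen::Matrix4d T = Eigen::umeyama(templ, scan, /*with_scaling=*/true);

    std::cout << "Recovered template->scan transform:\n" << T << "\n";
    return 0;
}
```

Everything beyond this rigid step, such as dense surface fitting and the parametric MetaHuman model itself, is out of scope for this sketch.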

What was easy with MetaHuman Creator was that it's detached from the engine in a way: it was in the cloud, it was a separate thing. It was essentially a game that talks to our backend, which ultimately builds and delivers the asset. MetaHuman Animator, on the other hand, runs in Unreal and requires many things to come together, including the new neural network inference module. We want to make MetaHumans useful, enabling easy animation that can be done with just a phone, all the way up to professional devices that cost thousands, sometimes even millions of dollars. We feel that's a huge accomplishment.

The difference in quality is actually not as dramatic as you would expect. An iPhone user would get roughly 80% of the quality of a professional user, which we feel is the true meaning of democratization: giving that ability to our customers. And then, looking a little bit more into the future, Fab, the new creator marketplace from Epic, and UEFN are the ultimate delivery paths for creators. So, if you look at the overall message that was delivered at the keynote by the whole company, we are removing barriers between game developers and players, and MetaHuman Animator is another announcement that supports our GDC story.

Plans for the Future

80.lv: What's the endgame for Unreal Engine? What are you trying to achieve overall?

Kim Libreri: I don't think it's feasible to turn the engine into the tool for everybody. And we're actually quite proud of the symbiosis between packages like Maya, Houdini, ZBrush, and Unreal Engine.

We wanted to be the place where people make great entertainment, not just games, but also filmmaking, education, and more. We just want to make an engine that has no limits and allows everyone to make compelling content the way that they want to make it. We don't want it to necessarily be everything to everybody, because it's not. We have to choose our specialties, but being the ultimate place for real-time entertainment, that's what we care about with the engine.

80.lv: And it's not just games, right? 

Kim Libreri: It's not just games. Every industry that uses computer graphics, that wants to enable sophisticated interactions within that computer graphics, is pretty much using Unreal Engine at this point, but primarily it's a game engine. Game engines are built to make simulated worlds that can host lots of people concurrently doing things together, and that has lots of advantages for other people, but our primary purpose is to make a place that allows incredible entertainment. And UEFN is part of that continuum because it brings the people who want to partake in the entertainment to the developers in an easier fashion.

Thoughts on AI Technology

80.lv: Do you plan to do anything more with AI?

Kim Libreri: We believe in it, but we also want to respect people's privacy and IP ownership. We use machine learning as the core of MetaHuman Animator, for example. I think that the game industry would benefit when it comes to UVs and other tasks. There are plenty of repetitive tasks that artists have to do today that I do feel deep learning could contribute to, but that deep learning needs to be trained.

As an industry, whether it's our engine or somebody else's, we have to come up with the right mechanisms to allow people to participate if they want to in providing data that helps machines get better at helping them. How do we come up with a framework for ethical contributions to AI training in a way that the community actually agrees is the right thing to do?

I think 80 Level can actually have a role to play here because so many people are listening to what's happening, and there have been so many complex scandals. At the same time, I'm pretty sure most artists that do UVing would prefer not to do UVing. We just have to come up with the right mechanisms and the right standards for contributing your data, so that you actually see the benefits and other people aren't monetizing on your back.

It is early days, but I'm pretty sure in the next couple of years, we'll start seeing great ground rules laid down for participation in the training of AI that is generally useful and does not disrespect the contribution of the human spirit and artistic creation.


Interview conducted by Kirill Tokarev
