Building Transmutator in Unreal and Houdini

VFX guru Benoit Onillon did an amazing breakdown of his real-time visual effect experiment and showed how it all works. The project was created for The Great Transmutator competition, sponsored by PopcornFX. You can also download the file and play around with the finished product.

Introduction

I’ve always been the curious type: I started 3D by trying everything and never stopped. I’ve worked on characters, animations, environments, really a bit of everything, though I spent a lot of time as an environment artist.

Then curiosity led me to programming, as a hobby at first, and as my technical side started to get noticed I began working on shaders. I used a node-based tool and had to learn the basic math node by node. I’ve just kept on learning since then, getting more and more into programming and taking on various technical art tasks like scripting tools and optimization.

I’m really happy to work in an environment where being a jack of all trades is an actual, prized strength!

TGT competition

When I read about the theme, I immediately started to imagine how it could be used in a game, because I like practical goals. Thanks, PopcornFX, for coming up with such an inspiring theme!

Then I realized that having bits of exploded objects flying around would be much better done by FX specialists and that I would have much more fun trying something different. My main idea was to feel the object’s matter travel as a continuous transforming mass.

I also imagined mechanical alien worms with glowing eyes swimming in the goo but that would have been a visual mess. Still like the idea though.

My first shot at it wasn’t really that fast. I knew what I needed to do it, but I also added the challenge of using Unreal Blueprints, which I had never used before. Working every evening, it took me about a week to get the basics down, be confident the idea was feasible, and start posting.

The two main parts are the object blueprint with the melt shader and the transfer effect with the mesh particles.

When an object blueprint is instantiated, it automatically plays the melting effect in reverse to make the object appear, then waits for a collision. When that happens, the blueprint sends the collision position and time to the object’s shader to start the melting effect.

Just before this first blueprint is destroyed, it spawns the transfer effect blueprint at its current position. This new blueprint moves toward the destination position with all of its attached particle emitters playing, and once it arrives it spawns a new object blueprint before destroying itself. And you’re back to step one.
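For readers more comfortable with code than with Blueprint graphs, here is a rough C++ sketch of the object half of that cycle. It is not the project’s actual setup: the class and parameter names ("HitPosition", "MeltStartTime", "Reverse") are made up for illustration, but the structure is the same idea, a dynamic material instance per object that receives the hit position and time.

// MeltableObject.cpp: an illustrative sketch, not the project's actual Blueprint setup.
// Mesh (UStaticMeshComponent*) and MeltMaterial (UMaterialInstanceDynamic*) are assumed
// members declared in the header, and OnHit is declared there as a UFUNCTION.
#include "MeltableObject.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

void AMeltableObject::BeginPlay()
{
    Super::BeginPlay();

    // One dynamic material instance per object so each one can melt independently.
    MeltMaterial = Mesh->CreateAndSetMaterialInstanceDynamic(0);

    // Play the melt in reverse first so the object "grows" into place.
    MeltMaterial->SetScalarParameterValue(TEXT("Reverse"), 1.0f);
    MeltMaterial->SetScalarParameterValue(TEXT("MeltStartTime"), GetWorld()->GetTimeSeconds());

    // Then wait for a collision.
    Mesh->OnComponentHit.AddDynamic(this, &AMeltableObject::OnHit);
}

void AMeltableObject::OnHit(UPrimitiveComponent* HitComp, AActor* OtherActor,
                            UPrimitiveComponent* OtherComp, FVector NormalImpulse,
                            const FHitResult& Hit)
{
    // Forward the collision position and time to the melt shader.
    MeltMaterial->SetVectorParameterValue(TEXT("HitPosition"), FLinearColor(Hit.ImpactPoint));
    MeltMaterial->SetScalarParameterValue(TEXT("MeltStartTime"), GetWorld()->GetTimeSeconds());
    MeltMaterial->SetScalarParameterValue(TEXT("Reverse"), 0.0f);

    // Once the melt has finished (e.g. after a timer matching the melt duration),
    // spawn the transfer effect blueprint here and destroy this actor.
}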

There’s also a small blueprint that stays in the scene to manage global variables like the object cycle and the destination positions, and of course there’s smoke and bubbles here and there.

The melting effect and the liquid vertex animation

These are two very different techniques.

The melting is all shader-based and is only about adding time to mask values and vertex positions. The trick is to send the collision position to the shader: once you compare it to the vertex positions, you get a nice gradient to work with.

There’s no pre-rendered animation here: you can hit the object wherever you want, and it actually works as-is with any 3D model.

Here’s how it looks in the material editor after adding new features over and over:

It all starts here with the hit position coming from the blueprint:

Then time is added in order to give each pixel and vertex a value that grows over time but kicks in sooner or later depending on its distance from the hit position.

And that’s all that actually makes the progressive melting. This value is simply used to interpolate between the original position of the vertices and a modified one where the vertical component is canceled.

Using interpolation makes the effect easy to play in reverse because original positions are always known. That’s how the object appears using the same shader.
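Written out as plain code instead of material nodes, the whole thing is just a distance, a time offset and a lerp. This is only a sketch of the idea; the parameter names (MeltSpeed, MeltRadius) are made up, not taken from the project:

#include <algorithm>
#include <cmath>

struct Vec3 { float X, Y, Z; };

// Distance from the hit position drives everything.
static float Distance(const Vec3& A, const Vec3& B)
{
    const float DX = A.X - B.X, DY = A.Y - B.Y, DZ = A.Z - B.Z;
    return std::sqrt(DX * DX + DY * DY + DZ * DZ);
}

// 0 = untouched, 1 = fully melted. The value grows with time since the hit,
// but kicks in later for vertices that are farther from the hit position.
float MeltValue(const Vec3& VertexPos, const Vec3& HitPos,
                float TimeSinceHit, float MeltSpeed, float MeltRadius)
{
    const float Dist = Distance(VertexPos, HitPos);
    return std::clamp((TimeSinceHit * MeltSpeed - Dist) / MeltRadius, 0.0f, 1.0f);
}

// World position offset: interpolate between the original vertex position and
// the same position with its vertical component cancelled (flattened to the ground).
// Playing the effect in reverse just means feeding (1 - Melt) instead of Melt,
// since the original positions are always known.
Vec3 MeltedPosition(const Vec3& VertexPos, float GroundZ, float Melt)
{
    return { VertexPos.X, VertexPos.Y, VertexPos.Z + Melt * (GroundZ - VertexPos.Z) };
}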

The same value is used to change the colors, using a bitmap mask texture.

Then there are a few parameters to work on timing, switch conditions to reverse the animation, custom normals for the black melting part, some vertex color so that vertices don’t all move at the same speed, additional layers of color for the glowing parts, and even a little opacity mask so that you don’t see all the flat polygons on the ground.

But in the end all of these parts are just modified by the same animated distance value.
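As one concrete example, the color side can be as simple as comparing the painted mask to the melt value, so the black goo creeps across the surface in the same order the vertices collapse. The smoothstep width below is an illustrative value, not the project’s:

#include <algorithm>

// The same melt value as the vertex offset, reused to blend the surface color.
// PaintedMask is the grayscale bitmap mask (0..1); EdgeWidth is illustrative.
float GooBlend(float Melt, float PaintedMask, float EdgeWidth = 0.05f)
{
    const float T = std::clamp((Melt - PaintedMask) / EdgeWidth, 0.0f, 1.0f);
    return T * T * (3.0f - 2.0f * T);   // smoothstep, for a soft melting edge
    // The result drives a lerp between the original color and the black goo;
    // the glow layers and the opacity mask can be keyed off the same value.
}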

Now for the transfer effect: I had some simple particles with animated transparency at first and saw a perfect excuse to play with the brand new Houdini 16 vertex animation tools made by the awesome Luiz Kruel. I was lucky they had just come out.

In Houdini, the idea was to adapt the simulation to fit a particle behavior as much as I could. The liquid is created from a central point, grows bigger, gets the bubble motion, and then gets sucked back into the middle to disappear naturally. A bit of scaling on the final particles took care of the last drops.

A fun detail to add was to prepare a mask using the original simulation objects in order to use it in the shader for the glow effect.

The base fluid comes from shelf tools like “Flip fluid from object” and “Sink from Objects” and I added some moving colliders to create the bubbles like this:

There are of course some fun attribute transfers, merges, remeshes and quick VEX code lines, but it’s overall very common Houdini usage.

Then for exporting it’s only a matter of finding the new “Games” shelf and clicking on this:

It creates a node in the Out context with everything you need, including the source code for the Unreal shader you can just copy-paste.

This is the kind of image it creates for you:

What you see are vertex positions for every frame. These are fed to the shader to recreate your animated geometry while storing the mesh only once.
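Conceptually the texture is a little lookup table: one column per vertex, one row per frame, with each texel’s RGB holding that vertex’s position. The sketch below only shows the decoding idea; the real tool’s encoding details (axis handling, remapping positions by exported bounds) depend on its settings, so this is an assumption-laden simplification.

#include <vector>

struct Vec3 { float X, Y, Z; };

// A vertex-animation texture, conceptually: NumVertices columns, NumFrames rows,
// each texel storing one vertex's position for one frame.
struct PositionTexture
{
    int NumVertices = 0;
    int NumFrames = 0;
    std::vector<Vec3> Texels;   // row-major: Frame * NumVertices + VertexIndex

    Vec3 Sample(int VertexIndex, int Frame) const
    {
        return Texels[Frame * NumVertices + VertexIndex];
    }
};

// Rebuild one vertex's position at a normalized time (0..1). In the real shader
// this is a texture fetch in the vertex stage, so the mesh is stored only once
// and the animation costs one lookup per vertex.
Vec3 AnimatedPosition(const PositionTexture& Tex, int VertexIndex, float NormalizedTime)
{
    const int Frame = static_cast<int>(NormalizedTime * (Tex.NumFrames - 1));
    return Tex.Sample(VertexIndex, Frame);
    // For the mesh particles, NormalizedTime comes from each particle's relative
    // lifetime instead of a global time parameter (the swap described just below).
}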

The Unreal shader to play this comes from the code given by Luiz in the tool.

The only change I had to make was replacing the reference position with the particle position and the time with the particle lifetime. The glowing part is the mask made in Houdini, rendered into a separate color texture and plugged into the emissive output. And since this method is so efficient, you can just have fun with the particle emission rate:

Smoke effect

This is going to be the disappointing part I’m afraid because Houdini and Unreal did everything.

Here’s a step by step:

Click the smoke shelf tool you like in Houdini, tweak the parameters until you’re satisfied with the animation, render it as an image sequence, import it back into a comp in the img context, and use the Mosaic node to export a sprite sheet.

Then create a new shader in Unreal, drop in a flipbook node and feed it the sprite sheet from the previous step. That’s it. Done. Quickest smoke effect ever.
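Under the hood, all the flipbook node has to do is pick a cell in the sprite sheet from the animation time and squeeze the UVs into it. A minimal sketch, with the grid size and frame rate as example inputs rather than the project’s values:

// Pick the current cell of a Columns x Rows sprite sheet and remap the UVs into it.
struct UV { float U, V; };

UV FlipbookUV(UV BaseUV, float Time, int Columns, int Rows, float FramesPerSecond)
{
    const int FrameCount = Columns * Rows;
    const int Frame = static_cast<int>(Time * FramesPerSecond) % FrameCount;

    const int Col = Frame % Columns;   // cells read left to right...
    const int Row = Frame / Columns;   // ...then top to bottom

    return { (BaseUV.U + static_cast<float>(Col)) / static_cast<float>(Columns),
             (BaseUV.V + static_cast<float>(Row)) / static_cast<float>(Rows) };
}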

Adding Substance Designer

What’s really amazing is how quickly Substance becomes your natural tool for anything bitmap related. When working on effects you always need to make masks with shapes mixed with gradients and stored in some specific way. Substance is just perfect for that.

The most useful tool for my masks here was the tri-planar projection. I was able to make very simple shapes (a noise for the black goo progression and the cracks) and have them projected onto the UVs without seams:
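In general terms, a tri-planar projection samples the same 2D pattern along the three world axes and blends the results by the surface normal, which is what removes the UV seams. A minimal sketch, with a stand-in pattern function in place of the actual noise:

#include <cmath>

struct Vec3 { float X, Y, Z; };

// Stand-in for the 2D noise or crack shape being projected.
static float Pattern(float U, float V)
{
    return 0.5f + 0.5f * std::sin(U * 10.0f) * std::sin(V * 10.0f);
}

// Sample the pattern along X, Y and Z and blend by the absolute normal,
// so every surface orientation gets a seam-free projection.
float TriplanarSample(const Vec3& WorldPos, const Vec3& Normal)
{
    const float AX = std::fabs(Normal.X);
    const float AY = std::fabs(Normal.Y);
    const float AZ = std::fabs(Normal.Z);
    const float Sum = AX + AY + AZ;

    const float FromX = Pattern(WorldPos.Y, WorldPos.Z);   // projection along X
    const float FromY = Pattern(WorldPos.X, WorldPos.Z);   // projection along Y
    const float FromZ = Pattern(WorldPos.X, WorldPos.Y);   // projection along Z

    return (FromX * AX + FromY * AY + FromZ * AZ) / Sum;
}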

It’s also very easy to pack grayscale masks into your final texture’s channels according to your shader:

I generated one mask texture file per object, just switching the object in Substance and re-saving the output.

And after testing your effect in the engine you can just come back, adjust your original noises and shapes and reexport the final composited texture with one click. A huge time saver!

Difficulties

Texture vertex animation and mesh particles are already widely used in FX for games. I recommend watching GDC’s FX conferences to see awesome demonstrations of how they can be used. Those were a huge inspiration for me.

It also becomes easier as tools are being created to make those effects and Houdini is becoming the new big thing in games for that reason. It offers the level of freedom and control we need to go into more and more complex processes.

Requirements are very easily met by any modern GPU because the technique was invented precisely to use what GPUs can already do. We can already render a lot of vertices easily; we just need to keep any additional movement computation parallelized and stored efficiently.

On my side, the time-consuming part was learning about Blueprints, but it gets easy very fast. Then fine-tuning colors and animations took forever, as I’m not a VFX specialist, and once it was close enough to a playable executable I had to fix a lot of bugs.

These were essentially optional goals I added myself, just because I like to try new stuff.

I’d like to thank PopcornFX again for this great inspiring contest and I hope they’ll make another one soon.

Benoit Onillon, 3D Artist

Interview conducted by Kirill Tokarev
