Adding Depth and Realism to 2D Fluids in Unreal

Andras Ketzer, the creator of a toolkit for stylized smoke and fire in Unreal Editor called FluidNinja, shared his studies on how to add 3D depth to 2D fluid simulations with Ray Marching and Parallax Occlusion Mapping.

Tutorial Unreal Project (UE 4.20 and above, 45 Mbytes)

The project is included in Ninja v1.1 (released 8 April 2020) and is also available as a free, downloadable package (see above). It is a collection of blueprints, materials, and baked data, arranged on two levels, exploring techniques to make 2D flipbooks “look 3D” by adding depth and self-shadows.

Introduction

Recently, we have seen amazing volumetric simulations made inside (1) and outside (2) Unreal. Indeed, 3D fluids already run in real time when running standalone, and could probably be deployed in-game in a year or two. Right now, the sim must be pre-rendered and baked to a VolumeDataBase (VDB) file to be practically usable (3). VDB files are emerging as a universally supported format (4): Modo and Blender can already import them, and there are workarounds to pull simulated volume data into Unreal (5), (6).

Baked volume files are neat, except that a 128 x 128 x 128 3D explosion sequence consumes 128x more memory than a 2D flipbook of similar per-frame resolution (128 depth slices instead of one). It is unlikely we will ship a library of visual-effect VDBs with a game anytime soon. So here we are: real-time 3D sims are a bit too GPU-heavy, and baked volume files are a bit too memory-heavy.

In cases where a true 3D sim is not even needed - think of games with a fixed camera angle, or environmental effects in the far background - enhanced 2D seems like a good tradeoff.

VFX people have invented many tricks to incorporate 2D data into 3D space - we might just pick a few of these. Parallax Occlusion Mapping (POM) and Ray Marching (RM) are definitely the best candidates. 

In UE 4.20 and above, both techniques are implemented as material functions and can be used as compact nodes in the material editor. By channeling baked input data through these nodes, our 2D simulation can be lit and spatialized in a 3D scene fairly well.

RayMarchHeightMap Node

Ray Marching (RM) can be used to shade 2D simulation data dynamically (every frame) by sampling a density map from the direction of a light vector. The result is gorgeous self-shadowing, in sync with a local or global light source in the 3D scene.

Note 1: the RM node requires unlit density data, with no lighting or self-shadows pre-baked. Imagine the density data as a heightmap used to generate a landscape, and this landscape being hit by a directional light. The classical ray casting approach (ray tracing) assumes that light interacts with the surfaces of solid objects: it finds the first intersection per ray and computes the surface luminance at that point, which results in hard shadows. In the case of Ray Marching, light rays can penetrate each point of the density landscape: lower density means lower light absorption (the classic Beer-Lambert optical law is applied), so a ray keeps hitting new points until it is completely absorbed. This way, we can generate more accurate self-shadows for density data describing "fuzzy stuff" like smoke or clouds - even the "slopes behind the peaks" are lit; it looks a bit like subsurface scattering, a bit like radiosity.
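To make the absorption idea concrete, here is a minimal CPU-side sketch of the technique in C++ (not the UE material function; sampleDensity is a hypothetical stand-in for the baked density data):

    #include <cmath>

    // Hypothetical density lookup: returns density in [0, 1] at a 3D point.
    // In the material graph, this role is played by the baked density texture.
    float sampleDensity(float x, float y, float z);

    // March from 'start' toward the light and return the remaining transmittance.
    // Beer-Lambert: T = exp(-sigma * integral of density along the ray).
    float lightTransmittance(const float start[3], const float lightDir[3],
                             int numSteps, float stepLength, float sigma)
    {
        float opticalDepth = 0.0f;
        float p[3] = { start[0], start[1], start[2] };

        for (int i = 0; i < numSteps; ++i)
        {
            // Step toward the light source.
            p[0] += lightDir[0] * stepLength;
            p[1] += lightDir[1] * stepLength;
            p[2] += lightDir[2] * stepLength;

            // Low density absorbs little light, so the ray keeps accumulating
            // contributions from new points instead of stopping at a surface.
            opticalDepth += sampleDensity(p[0], p[1], p[2]) * stepLength;
        }
        return std::exp(-sigma * opticalDepth);
    }

A transmittance close to 1 means the point is fully lit; close to 0 means it sits deep in the smoke's own shadow.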

Note 2: the RM node requires unsampled simulation data (practically a Texture Object) as input - which means that any pre-processing of the baked data should happen in a separate material and be written to a RenderTarget. The main concern is the playback of the baked sim frames: while the RM node supports sub-UV sampling, it does not interpolate between frames, so you would need a lot of frames for smooth playback. In our demo, this is simply solved with NinjaPlayBasic, which plays a flipbook and writes it to a RenderTarget - and this serves as input for both the RM and POM nodes.

FluidNinja saves density and velocity as separate flipbooks when baking a simulation. The velocity data contains the speed and direction of motion for each pixel. NinjaPlay (implemented as a material) uses this velocity data to blend between the density frames. The result: smooth playback from a very low number of frames (16 frames can be enough for 3-4 seconds of playback).
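The general idea of velocity-based (motion-vector) frame blending can be sketched like this in C++ - sampleFrame and sampleVelocity are hypothetical helpers standing in for the flipbook lookups, and the actual NinjaPlay material may differ in detail:

    // Hypothetical helpers: sample a flipbook frame's density, and its 2D motion
    // vector (in UV units per frame), at UV coordinates (u, v).
    float sampleFrame(int frame, float u, float v);
    void  sampleVelocity(int frame, float u, float v, float& du, float& dv);

    // Blend between frame f and frame f+1 at sub-frame position t in [0, 1).
    // Instead of cross-fading in place, both frames are advected along the
    // motion vectors toward the in-between moment, which hides the low frame count.
    float blendFrames(int f, float t, float u, float v)
    {
        float du, dv;
        sampleVelocity(f, u, v, du, dv);

        // Frame f is warped forward by t, frame f+1 is warped back by (1 - t).
        float a = sampleFrame(f,     u - du * t,          v - dv * t);
        float b = sampleFrame(f + 1, u + du * (1.0f - t), v + dv * (1.0f - t));

        return a * (1.0f - t) + b * t;
    }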

Note 3: the RM node requires a light vector as input. Let's plug a vector-3 parameter into this input; we can then either (A) adjust the param manually in a material instance, or (B) drive it dynamically by feeding the vec3 with fresh data every frame from a blueprint.

  • In the case of a directional light, the light's rotation can be transformed into a light vector.
  • In the case of a point light, we subtract the VFX mesh world position (the mesh with the raymarch material applied) from the point light world position, constructing a light direction vector.

See the example project for use cases, the sketch below, and the image below.
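In UE C++ terms, the two cases might look roughly like this (a sketch with generic actor pointers, not the project's Blueprint code):

    #include "GameFramework/Actor.h"

    // Directional light: the light actor's rotation turned into a unit direction vector.
    FVector LightVectorFromDirectional(const AActor* DirectionalLight)
    {
        return DirectionalLight->GetActorRotation().Vector();
    }

    // Point light: subtract the VFX mesh world position from the point light
    // world position (as described above), then normalize.
    FVector LightVectorFromPoint(const AActor* PointLight, const AActor* VfxMesh)
    {
        return (PointLight->GetActorLocation() - VfxMesh->GetActorLocation()).GetSafeNormal();
    }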

ParallaxOcclusionMapping Node

POM can be used to add 3D depth to the baked 2D data without adding 3D geometry, by splitting the 2D texture space into multiple layers/slices based on a heightmap (in this case, the density map).

Imagine the contour lines of a classical map and the process of manually building a landscape model from cardboard by cutting out the layers and stacking them on top of each other. The parallax shader uses these layers to perform a differentiated texture UV offset based on the camera vector, creating the illusion that some texels are further away from the surface of the object.
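Stripped to its core, the layer-stepping could be sketched like this in C++ (sampleHeight is a hypothetical stand-in for the density map; the actual UE node adds refinements such as interpolating between the last two steps):

    // Hypothetical height lookup in [0, 1] at UV coordinates (the density map here).
    float sampleHeight(float u, float v);

    // Walk the view ray through 'numLayers' slices of the height field and return
    // the offset UV where the ray first drops below the stored height.
    void parallaxOcclusionUV(float u, float v,
                             float viewX, float viewY, float viewZ, // camera vector in tangent space
                             float heightScale, int numLayers,
                             float& outU, float& outV)
    {
        // UV shift per layer, projected from the camera vector
        // (sign conventions depend on the tangent space setup).
        const float stepU = (viewX / viewZ) * heightScale / numLayers;
        const float stepV = (viewY / viewZ) * heightScale / numLayers;
        const float layerStep = 1.0f / numLayers;

        float rayDepth = 0.0f; // how deep the ray has sunk so far
        float surfaceDepth = 1.0f - sampleHeight(u, v);

        // March until the ray sinks below the height field ("cardboard layer" hit).
        while (rayDepth < surfaceDepth)
        {
            u += stepU;
            v += stepV;
            rayDepth += layerStep;
            surfaceDepth = 1.0f - sampleHeight(u, v);
        }
        outU = u;
        outV = v;
    }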

Note 1: POM-node also requires unsampled simulation data - and we are using the same TextureObject input for POM and RM nodes.

Note 2: the POM node can output not only offset UVs but also an offset pixel depth - this way the fake 3D surface can accurately capture cast shadows. However, this trick does not work with translucent materials - only opaque/masked ones.

Note 3: the POM node was able to output self-shadow data until UE 4.24, when this feature broke. The function was somewhat redundant with Ray Marching anyway.

Introducing the UE Demo Project

The project is also introduced in a tutorial video - have a look!

The project folder structure follows the FluidNinja conventions and can be merged with the Ninja main branch. The baked sample data was generated using the Dryice2 and RadialSwirl1 presets, located under /Game/FluidNinja/Input/FluidPresets (in case you are running the full Ninja, try generating your own versions!).

The demo assets are located under /Game/FluidNinja/Usecases/ParallaxMapping. The flipbook data and the player materials handling this data are located in /BakedData. The blueprint BP_WriteNinjaPlayOutputToRenderTarget writes the flipbook-player output to a RenderTarget that serves as input for the POM/RM nodes. Important: (1) this blueprint must be placed on the level and executed every frame; (2) the BP fills its data cache on first run, so a few slots are empty at project startup. The RenderTargets (for ease of use) are non-dynamic uassets located under the ParallaxMapping root folder with an "RT_*" prefix. The POM/RM base materials are in /BaseMaterials. In the scene, we are using instances of the base materials, located in /MaterialInstances.

The project content is distributed across two Levels (/Game/FluidNinja/Levels):

Use cases additional 1, 2

The second level - Use cases additional 2 - contains the cauldron scene (see the cover pic), where the Dryice sim data is used in three different lighting setups. Setup (A) features a moving point light source that is controlled from the LevelBlueprint by a sine/cosine function. The same data is transformed in the same blueprint and pushed to the POM/raymarch material instance as a vec3 param, LightingDirection, every frame.

Note: the sync between the moving light source and the light vector is maintained in this blueprint. The process could be automated (e.g. calculating the light vector automatically from the light position) - see BP_InEditorLigtingTest placed on Level 1. The (B) and (C) setups use non-moving light sources, and the light vector value is adjusted manually in the material instances to match the on-level light position.
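Sketched in C++, the per-frame work of setup (A) could look like the following - the function, its parameters, and the orbit math are illustrative stand-ins for the LevelBlueprint logic; only the LightingDirection parameter name comes from the project:

    #include "GameFramework/Actor.h"
    #include "Materials/MaterialInstanceDynamic.h"

    // Per-frame update: orbit a point light with sine/cosine and push the
    // resulting light vector into the POM/raymarch material instance.
    void UpdateCauldronLighting(float TimeSeconds, float Radius, float Height,
                                const FVector& CauldronCenter,
                                AActor* PointLight, AActor* VfxMesh,
                                UMaterialInstanceDynamic* FluidMaterial)
    {
        // Circular motion around the cauldron.
        const FVector LightPos = CauldronCenter +
            FVector(FMath::Cos(TimeSeconds) * Radius,
                    FMath::Sin(TimeSeconds) * Radius,
                    Height);
        PointLight->SetActorLocation(LightPos);

        // The same data, transformed into a light vector and pushed to the
        // material as the per-frame LightingDirection vec3 parameter.
        const FVector LightVector =
            (LightPos - VfxMesh->GetActorLocation()).GetSafeNormal();
        FluidMaterial->SetVectorParameterValue(TEXT("LightingDirection"),
                                               FLinearColor(LightVector));
    }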

Level 2 also features Cascade particles with a volumetric material (PS_VolumeParticles_DryiceGravity), forming a dim fog around the cauldron. The volumetric material lets the light sources directly affect particle brightness - a simple way to help our 2D flipbooks look more 3D.

The first level - Use cases additional 1 - contains example data arranged in a grid.

Column 1: vector-field-driven GPU particle systems, generated in the same process as the flipbooks - demonstrating that Ninja is capable of generating vector field data. In the case of "Dryice", the GPU particle system is a simple by-product of the baking process, just like the flipbook. In the case of the "Radial Swirl" system, the process is a bit more complex: an initial, simple (paint-stroke-based) simulation generates a vector field, which is used to drive a complex particle system, which in turn is used to generate a much more complex simulation and its flipbooks. I'd call this process "bootstrapping" - there is a separate tutorial video explaining it (have a look!).

Column 2: flipbooks. Zoom in to check the actual frame matrix below the playback area. Note how velocity-based frame blending achieves smooth playback using only a few frames. Try switching off interpolation in the player material to see how it would look without frame blending!

Column 3: Ray Marching + POM advanced materials, performing real-time processing of flipbook data.

Columns 4-6: simple materials - POM opaque, POM translucent, POM + normal-based lighting/shading. Note: normal-based lighting is an efficient, generic, traditional approach and could probably be used on mobile devices, in case you'd like to shade your flipbooks without Ray Marching.

Top row: a real-time, in-editor interactive set. Try moving the light bulb icon and check how the raymarching / parallax mapping behaves.

The point light is embedded in a blueprint (BP_InEditorLigtingTest) that calculates the light vector in its Construction Script - this is what makes in-editor interaction possible. Edit the blueprint to see how the light vector is constructed and forwarded to the material instance.

Performance | Dryice Test Scene

Overall performance

MEM: 1.25 Mbytes of texture memory for the dry ice flipbooks (density, velocity).

GPU: 281 instructions per pixel (IPP) in the main material pipeline: velocity-based frame blending on flipbooks, parallax mapping, Ray Marching, and a transparent material. With linear frame blending, and skipping DepthFade and POM, the GPU load drops below 200 IPP.

FPS: the scene ran at 182 FPS on an NVIDIA GeForce GTX 1070 in the Editor (PIE), fullscreen at 1920 x 1080, with the raymarching cauldron setup fully occupying screen space. With optimizations: 250 FPS. The optimizations: switching off the additional volume particles, setting flipbook frame blending to linear, switching off DepthFade in the transparency pipeline, and baking the scene lighting (leaving only the oscillating, raymarch-controlling light source set to Movable).

Materials | Depth and Lighting material performance

  • 233 IPP: POM, RM, advanced transparency (DepthFading, EdgeFading)
  • 199 IPP: POM, RM, basic transparency
  • 182 IPP: RM only, advanced transparency
  • 148 IPP: RM only, basic transparency
  • 147 IPP: POM, basic transparency

Materials | Flipbook player material performance

  • 92 IPP, NinjaPlayerBasic, using advanced frame blending (velocity interpolation)
  • 64 IPP, NinjaPlayerBasic, using basic frame blending (linear interpolation)
  • 36 IPP, NinjaPlayerBasic, no frame blending

Other factors influencing performance

(A) The test scene contains a particle system with a volumetric material to enhance the "dry ice" feel. The volume-grid resolution is forced to 4 via an "Execute console command" level blueprint node (r.VolumetricFog.GridPixelSize 4); the UE default is 8.
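The same override can also be issued from C++ if you prefer - a sketch, whereas the demo itself uses the Blueprint node:

    #include "Engine/Engine.h"
    #include "Engine/World.h"

    // Equivalent of the "Execute console command" node: force the volumetric fog
    // grid to 4-pixel cells (the UE default is 8).
    void ForceFineVolumetricFogGrid(UWorld* World)
    {
        if (GEngine && World)
        {
            GEngine->Exec(World, TEXT("r.VolumetricFog.GridPixelSize 4"));
        }
    }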

(B) All light sources in the scene are set to Movable (the performance-heavy mobility mode).

Texture assets

  • DryIce density flipbook: uasset size 1024 kbytes (1 MB), 32 frames in an 8x4 matrix; resolution 2048 x 1024; channels: monochrome; compression: alpha, BC4 on DX11
  • DryIce velocity flipbook: uasset size 256 kbytes (0.25 MB), 32 frames in an 8x4 matrix; resolution 1024 x 512; channels: RGB; compression: default, DXT1

FluidNinja VFX Tools for Unreal, © Andras Ketzer 2020
