Experimenting in Unity

Guido Ponzini shared his extensive experience in visual and interactive programming for industrial exhibitions, his Adam Unfold Test, and lots of other experiments in Unity.

Introduction

My name is Guido Ponzini, and for the past 5 years I have worked as a Gamification and VR/AR/MR programmer for industry and brand promotions, creating experiences for exhibitions, simulations, stores, and training.

I started as a musician and worked in that field for more than 10 years, playing on the soundtrack of Assassin’s Creed Brotherhood, recording for Paramount/Viacom, Warner Music, and Nintendo Europe, performing in a live show featuring Wii Music, and touring and recording as an artist in musical productions. While working with electronic music, I experimented extensively with 3D immersive audio, both on live spatialization systems (this year I taught a course on interactive spatialization on wavefield synthesis systems at the A. Boito Conservatory and the University of Parma) and on VR/AR binaural spatialization. I am now also a Unity3D certified teacher of Gamification and VR/AR Game Programming at the NABA university in Milan.

Visual experiments

During my work in the music field, I started to use visual programming with Max/MSP and some programming with Processing and Csound. I always dreamed of creating visual experiences, so I started to use generative visual tools like Jitter and Processing, making interactive musical experiences with visuals driven by sensors and external hardware. After studying Objective-C and then Swift to make some iOS applications, I decided to start studying Unity3D and the C# language.

A friend introduced me to a company that works in the field of industrial exhibitions, and they asked me to create some apps to visualize interactive 3D models of several industrial components. Starting with a transparent touch display app for BAUMA in Munich, I created interactive experiences for several exhibitions around the world, from ConExpo in Las Vegas and EIMA in Bologna to Agritechnica in Hannover and Intermat in Paris.

In those interactive experiences, I tried to bring Gamification principles right into software with a technical purpose: I merged a GTA-like approach with all the needs of a technical presentation and achieved the following:

  • the user can freely explore big agricultural or mining environments in the first person, discovering places and areas, driving machinery with physical simulation, and collecting some easter eggs
  • photorealistic 3D models of all the products can be accessed by tapping on the machines and selecting them, and it’s possible to rotate, zoom, explode, and select single parts of the industrial component (see the selection sketch after this list)
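As a minimal sketch of the tap-to-select idea (the "Machine" tag and the names are illustrative assumptions, not taken from the actual apps):

```csharp
using UnityEngine;

// Hypothetical sketch: raycast from the tap/click position and report
// the machine that was hit, so the detailed model can be loaded.
public class MachineSelector : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))   // mouse click or single touch
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;

            if (Physics.Raycast(ray, out hit, 500f) &&
                hit.collider.CompareTag("Machine"))   // assumed tag on machinery colliders
            {
                Debug.Log("Selected machine: " + hit.collider.name);
                // From here the app could load the explodable product model.
            }
        }
    }
}
```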

Working on the interactive experiences

During these projects, I was a solo developer, so I dealt with the CAD industrial models for the components myself: I converted them, then UV-unwrapped, optimized, and animated them inside Unity. I also created all the Substance Designer materials, including metal grids for filters and many other specific industrial materials like sintered copper, and wrote some custom shaders for particular cases. In addition, I worked on all the environments: creating batching systems for multi-tiled terrains, dealing with high-res machinery models (I made several LOD steps, as each model can reach 450k polys), and building environments with drivable machinery, 2.5×2.5 km areas with vegetation, and an immediate system for loading the products to explore, everything running at 60 fps on a 980 and 120 fps on a 1080.
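As a minimal sketch of how LOD steps for a high-poly machinery model can be set up from script (the three pre-decimated meshes and the transition thresholds here are illustrative, not the values I actually used):

```csharp
using UnityEngine;

// Hypothetical helper: builds a LODGroup for a machinery model from
// three pre-decimated renderers. Thresholds are illustrative only.
public class MachineryLodSetup : MonoBehaviour
{
    public Renderer lod0;   // full-detail mesh (e.g. ~450k polys)
    public Renderer lod1;   // medium decimation
    public Renderer lod2;   // far-distance proxy

    void Awake()
    {
        var group = gameObject.AddComponent<LODGroup>();

        var lods = new LOD[]
        {
            // screenRelativeTransitionHeight: below this screen coverage,
            // the group switches to the next LOD level.
            new LOD(0.60f, new[] { lod0 }),
            new LOD(0.25f, new[] { lod1 }),
            new LOD(0.05f, new[] { lod2 }),
        };

        group.SetLODs(lods);
        group.RecalculateBounds();
    }
}
```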

Through these projects, I discovered incredible power inside Unity, especially in the Animation system, which let me create accurate explosion steps for technical components, with the possibility to edit the trajectory of each single component without too many problems.

Here are some of the showcases:

CaronteFX plugin

While I was working with that kind of approach, I came to face some physical simulation needs. In the beginning, it was very simple stuff, like flags, cloth, or rigid/soft bodies. I started with cloth and rigid-body simulations within Unity, but I was looking for something different. Then I found the CaronteFX plugin on the Asset Store. Once I started using it, I was completely impressed by its possibilities, especially for my kind of work.

I started a series of tests with this product (finding a stable workflow for my apps) and came into contact with Next Limit Technologies. I became a CaronteFX XPert user for Next Limit Technologies, creating some tutorials on using their plugin for Unity.

Complex machinery simulation in Unity

While I was working with CaronteFX, I started to deal with some heavy industrial simulations made with MATLAB-like software, trying to bring them right inside Unity to create exact replicas of complex machinery, and so I started to use cached simulations. I immediately started to test a workflow for importing RealFlow fluid simulations right inside Unity.

After many experiments with all the different parameters and programs, I arrived at a first stable solution.

Using the Unity FBX Exporter (when the scene was made right within Unity), or just importing the original model, I set up the simulation in RealFlow. Then, once the simulation is calculated, I fine-tune the optimizations inside the mesh VDB node, trying to find the best balance between polycount and fluid quality.

After that, within 3ds Max, I use the RealFlow bridge plugin to import the BIN fluid into an empty project. Here I apply some modifiers, trying to optimize further and get a smoother look.

At the end, I export a cached Alembic sequence. This workaround also helped me avoid some crashes caused by incompatibilities I found with other Alembic export options.

Unity works great with Alembic, especially thanks to the amazing job done for the Adam movie. They created a really solid solution for importing Alembic sequences into your project, and they can be integrated into the Timeline in a few seconds.

Once imported with the Alembic Importer package, you can simply create an Alembic Shot clip in the Timeline and assign your sequence to it. After that, I apply some materials made with custom shaders for triplanar mapping and additional refractive options.
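The Alembic track and its bindings are authored in the Timeline asset itself; as a minimal sketch, a Timeline containing such a shot can be started from script like this (the component and field names are illustrative, not from the original project):

```csharp
using UnityEngine;
using UnityEngine.Playables;

// Hypothetical trigger: plays the Timeline that contains the Alembic shot
// when the scene starts. The director reference is wired up in the Inspector.
public class FluidShotTrigger : MonoBehaviour
{
    public PlayableDirector fluidTimeline;

    void Start()
    {
        // Only playback is started here; the Alembic shot and its materials
        // are already set up inside the Timeline asset.
        fluidTimeline.Play();
    }
}
```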

Unfolding workflow

Recently I faced a new challenge: a company asked me to show how their new industrial area will look in the near future, featuring an animation of the construction of the buildings.

I have always loved origami and unfolding effects, so I decided to use this kind of animation for the entire sequence.

I made two public tests with this unfolding workflow.

This time I started to work within Cinema 4D, which was a true discovery for me! It’s really easy to move things between Unity and Cinema 4D; I didn’t run into the common problems you can face with other software and cached meshes, such as strange pivot points and so on.

I made the animation inside Cinema 4D and used the Alembic importer to integrate it into the Timeline.

Timeline is an exceptional tool: for a complex animation that I made directly within Unity, with more than 500 elements and more than 30 Alembic sequences put together, the workflow was extremely easy and stable, and I was also able to make changes on the fly by mixing standard Unity animation with the Alembic sequences.

Advantages of Unity

Over these years, I have used Unity’s animation tools extensively, often working on components that featured 150 or more single pieces to animate one by one. Mixing some scripted animations (especially through coroutines) with traditional clips made within Unity, I love the incredible possibilities offered by this software.
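As a minimal sketch of what I mean by a scripted, coroutine-driven animation for a single piece (the part, offset, and timing here are illustrative assumptions, not taken from a real project):

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical example: slides a single machine part outwards over time,
// the kind of per-piece motion that can be mixed with authored clips.
public class PartSlide : MonoBehaviour
{
    public Transform part;                              // one of the many single pieces
    public Vector3 offset = new Vector3(0f, 0.5f, 0f);  // local displacement for the "exploded" position
    public float duration = 1.5f;

    public IEnumerator SlideOut()
    {
        Vector3 start = part.localPosition;
        Vector3 end = start + offset;

        for (float t = 0f; t < duration; t += Time.deltaTime)
        {
            // SmoothStep gives an ease-in/ease-out feel similar to a keyed curve in a clip.
            part.localPosition = Vector3.Lerp(start, end, Mathf.SmoothStep(0f, 1f, t / duration));
            yield return null;
        }

        part.localPosition = end;
    }

    void Start()
    {
        StartCoroutine(SlideOut());
    }
}
```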

In the last few years, a lot of solutions have appeared that help you rig, animate, and model right inside Unity: with UMotion Pro, ProBuilder (which is now free), UModeler, Surforge, and Puppet3D, most of these things can be done right inside the software.

I have also seen the new PiXYZ solution for CAD, which could be a game-changer for people like me who have to deal with so many CAD files: over the years, I have spent a huge amount of project time working with many different CAD formats, trying to optimize, UV unwrap, texture, and prepare all the elements to best fit a real-time engine. A seamless solution for that is something that can really make life much easier, especially from the optimization point of view. Many times the products have thousands of single pieces, and working on them can be very time-consuming.

Adam Unfold Test

For the Adam unfolding test, what I wanted to achieve was higher visual quality. Working with brand advertising and industrial visualization, I always needed a perfect result with clean metallic parts. In the past, especially working with jewelry, I created custom shaders to get good results with studio HDRIs that I was not able to reach with the Standard shader, but then I started to use the amazing Apollo shader from Rispat Momit, which is incredible for this kind of visualization (https://assetstore.unity.com/packages/vfx/shaders/apollo-light-based-shaders-beta-3-103978).

For the Adam demo, I ported all of Adam’s materials into the Apollo shaders workflow and, with just a line of code to make them double-faced, I found an incredible improvement in, and control over, the visual result of the scene. I was able to tune each element to match my idea, with really deep control over all the different aspects of the shading process.
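The exact line depends on the shader; as a sketch only, assuming the shader exposes a culling property named _Cull (an assumption on my part, not something taken from the Apollo documentation), disabling back-face culling from script could look like this:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

// Hypothetical sketch: renders a material double-faced by turning off
// back-face culling, assuming the shader exposes a "_Cull" property.
public class MakeDoubleFaced : MonoBehaviour
{
    void Start()
    {
        foreach (var material in GetComponent<Renderer>().materials)
        {
            if (material.HasProperty("_Cull"))
            {
                material.SetInt("_Cull", (int)CullMode.Off);
            }
        }
    }
}
```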

Unity’s Post Processing Stack

For post-processing, I think the Post Processing Stack is absolutely awesome. To my mind, the only detail still missing is an easy connection between Depth of Field and Timeline keyframing. I know it can be done through Cinemachine, but I think it would be important to have an easier way to keyframe it right inside the Timeline, especially if you are working with hand-made camera animations. For now, I usually create a simple script for tracking the changes.
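As a minimal sketch of the kind of helper I mean, written against the Post Processing Stack v2 API (the idea of tracking a focus target transform and the names used here are illustrative assumptions):

```csharp
using UnityEngine;
using UnityEngine.Rendering.PostProcessing;

// Hypothetical helper: keeps the Depth of Field focus distance locked on a
// target transform, so hand-animated cameras stay in focus without having
// to keyframe the post-processing settings inside the Timeline.
[RequireComponent(typeof(PostProcessVolume))]
public class FocusTracker : MonoBehaviour
{
    public Transform focusTarget;
    public Camera shotCamera;

    private DepthOfField depthOfField;

    void Start()
    {
        var volume = GetComponent<PostProcessVolume>();
        volume.profile.TryGetSettings(out depthOfField);
    }

    void LateUpdate()
    {
        if (depthOfField == null || focusTarget == null) return;

        // Drive the focus distance from the camera-to-target distance each frame.
        float distance = Vector3.Distance(shotCamera.transform.position, focusTarget.position);
        depthOfField.focusDistance.value = distance;
    }
}
```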

Apart from post-processing, a plugin that I recently discovered on sale and really love is CTAA, Cinematic Temporal Anti-Aliasing, which can get rid of some artifacts in post-processing and make your scene look really gorgeous, especially thanks to its compatibility with VR devices.

Conclusion

All those tests are fast experiments that I make to try to bring some elements of the traditional animation workflow into Unity, especially effects or simulations that can’t really be done with current real-time physics calculations. Now my aim is to make them as light as possible so that they can be easily included in a commercial game. For that reason, I’m experimenting with shader workflows that might help make the cached meshes lighter (or even replace them).

The collection of my experiments in Unity that I’m keeping updated is here.

Thank you!

Guido Ponzini, Unity3D Certified Developer / CaronteFX Xpert – Programmer
