Cowboy Bebop Inspired Environment Production in UE4

Alex Zemskov prepared a detailed breakdown of his environment fan art inspired by Cowboy Bebop: blockout, painting, skybox creation, lighting, and more.

Introduction

Hello everyone! For those who don’t know me – I am Alex Zemskov. I graduated from Staffordshire University back in 2015 and have since had the opportunity to work as an environment artist at Playground Games, helping with the development of both Forza Horizon 3 and Forza Horizon 4.

Cowboy Bebop – Earth 2071

Inspiration

There were a lot of reasons why I decided to work specifically on this project. Cowboy Bebop had its 20th anniversary last year, Netflix is currently working on their own live-action adaptation of the series (which is stirring up a lot of buzz), it has a unique art direction and amazing music – the list goes on.

The most important factor for me, however, is that we have all this amazing game development technology at our disposal these days, and up until this point we have only seen two Cowboy Bebop video games, both published exclusively in Japan – Cowboy Bebop on the original PlayStation in 1998 and Cowboy Bebop: Tsuioku no Serenade on PlayStation 2 in 2005. That leaves us with an astonishing 14-year game technology gap that has not yet been utilized by Bandai within this franchise.

When I realized there was such incredible potential to show how far you can push a 3D fan art piece within this universe and demonstrate what a modern Cowboy Bebop video game could look like, I immediately started to look for inspiration and references within the anime itself. While there is a ton of material to work with, one particular environment in episode 24 caught my eye, even though it only lasted a mere 6 seconds. Cloudy horizon, clear blue sky above, ruined skyscrapers, endless blue sea – it all looked severely post-apocalyptic and yet so serene. It was a perfect candidate.

At this point, I started asking myself: “What do I really want to achieve here?” Do I want to make this scene look realistic? Should it be stylised? Maybe a blend of both? Eventually, I decided it would be a fair challenge to try to replicate all of the hand-painted details that had already been engraved into the source material as closely as possible in Unreal Engine 4, and to see how far I could push the composition of this scene, considering it was originally rendered in a 4:3 (classic TV) aspect ratio.

Blockout Phase

During the first stages of development, it was incredibly important to nail down the correct building proportions according to the reference and carve out basic subtractions to see the exact location of the demolished areas. Eyeballing was not an option, so I started to look for effective alternatives that would allow me to do a fast and precise blockout.

I stumbled upon a cool program on GitHub called OnTopReplica, which allows you to select any window on your desktop and project it on top of another application while controlling the transparency, size, and even click-through behavior of the original window. You can get it here.

After overlaying my reference image, I would start messing around with BSP blocks inside Unreal, making sure that the position and rotation of all meshes aligned with the buildings on the projected image. It is important to note that the camera used during this process must remain in a fixed position at all times in order to avoid any complications.
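
Since this alignment only works from one exact vantage point, it can help to store the camera transform and snap the viewport back to it whenever it gets nudged. Below is a minimal Unreal editor Python sketch of that idea; the transform values are placeholders, not the ones used for this scene, and it assumes the Python and Editor Scripting Utilities plugins are enabled:

    import unreal

    # Placeholder transform for the fixed blockout camera (not the scene's actual values).
    BLOCKOUT_CAM_LOCATION = unreal.Vector(0.0, -4500.0, 350.0)
    BLOCKOUT_CAM_ROTATION = unreal.Rotator()
    BLOCKOUT_CAM_ROTATION.yaw = 90.0

    def restore_blockout_camera():
        """Snap the active level viewport back to the fixed reference camera."""
        unreal.EditorLevelLibrary.set_level_viewport_camera_info(
            BLOCKOUT_CAM_LOCATION, BLOCKOUT_CAM_ROTATION)

    restore_blockout_camera()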

Once the initial blockout was completed, I would export each individual building block out of Unreal; further down the line, these would be used to create the final assets with the help of 3ds Max and ZBrush.

Later in production, once all the buildings visible in the core reference were modeled, unwrapped, textured, and fully integrated into Unreal, I created a second camera with a wider viewing angle (110 degrees, to be precise) in order to expand the scene even more and add a slightly more cinematic feel to it. Prioritizing in this way makes it easier to focus on replicating the original shot, while ensuring that any improvised additions to the scene will work in unity with the selected reference closer to the end of development.
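
For reference, here is a rough editor Python sketch of how such a second, wider camera could be spawned and set to a 110-degree field of view. The actor label and transform are placeholders rather than the values used in the project:

    import unreal

    # Placeholder location and rotation for the second, wider camera.
    location = unreal.Vector(0.0, -4500.0, 350.0)
    rotation = unreal.Rotator()
    rotation.yaw = 90.0

    camera_actor = unreal.EditorLevelLibrary.spawn_actor_from_class(
        unreal.CameraActor, location, rotation)
    camera_actor.set_actor_label("Cam_Cinematic_110")

    camera_component = camera_actor.get_component_by_class(unreal.CameraComponent)
    camera_component.set_field_of_view(110.0)  # wider than the main 4:3 reference camera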

Planning Phase

At this stage, it became pretty clear that every single building would require unique texture information and that I would have to apply certain details by hand in order to make them look authentic and true to the original painting. To accomplish this quickly and efficiently, I decided to create a strict development pipeline to keep things nice and organized, on top of some other general guidelines.

Material creation:

The scene will not use unique tileable materials within the engine. Instead, it will have one master material that is applied as an instance to every single mesh, making it easy to link unique textures to individual assets (see the sketch below).

For painting purposes, the scene will require 3 procedural concrete materials: one to be applied as a base, and 2 others to be painted into damaged areas for extra variation.
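
To illustrate the master material setup mentioned above, here is a hedged editor Python sketch that creates a Material Instance Constant for one asset and links its unique textures. The asset paths and the “Albedo”/“Normal” parameter names are placeholders for whatever the master material actually exposes; in practice, the instances can of course simply be created by hand in the Content Browser:

    import unreal

    asset_tools = unreal.AssetToolsHelpers.get_asset_tools()

    # Placeholder asset paths.
    master = unreal.EditorAssetLibrary.load_asset("/Game/Materials/M_Master")
    albedo = unreal.EditorAssetLibrary.load_asset("/Game/Textures/T_Building01_D")
    normal = unreal.EditorAssetLibrary.load_asset("/Game/Textures/T_Building01_N")

    # Create the instance and parent it to the master material.
    factory = unreal.MaterialInstanceConstantFactoryNew()
    instance = asset_tools.create_asset(
        "MI_Building01", "/Game/Materials/Instances",
        unreal.MaterialInstanceConstant, factory)
    unreal.MaterialEditingLibrary.set_material_instance_parent(instance, master)

    # Hook up the unique textures through assumed texture parameter names.
    unreal.MaterialEditingLibrary.set_material_instance_texture_parameter_value(
        instance, "Albedo", albedo)
    unreal.MaterialEditingLibrary.set_material_instance_texture_parameter_value(
        instance, "Normal", normal)

    unreal.EditorAssetLibrary.save_loaded_asset(instance)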

Asset geometry development:

All assets within a 50 m range of the camera will have to go through a high poly modeling process. Scanned normal information will be used in conjunction with procedural Substance Designer materials that will be manually painted in Substance Painter. Assets located beyond the 50 m range will only require a single low poly model.
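
As a sanity check for the 50 m rule, a small editor Python script can list which meshes fall inside or outside that range. This is just a sketch with a placeholder camera position, and it assumes the default scale of 1 Unreal unit = 1 cm:

    import math
    import unreal

    # Placeholder camera position; with 1 unit = 1 cm, the 50 m cutoff is 5000 units.
    CAMERA_LOCATION = unreal.Vector(0.0, -4500.0, 350.0)
    HIGH_POLY_RANGE_CM = 5000.0

    for actor in unreal.EditorLevelLibrary.get_all_level_actors():
        if not isinstance(actor, unreal.StaticMeshActor):
            continue
        loc = actor.get_actor_location()
        distance = math.sqrt((loc.x - CAMERA_LOCATION.x) ** 2 +
                             (loc.y - CAMERA_LOCATION.y) ** 2 +
                             (loc.z - CAMERA_LOCATION.z) ** 2)
        treatment = "high poly + bake" if distance <= HIGH_POLY_RANGE_CM else "low poly only"
        unreal.log("{}: {:.1f} m -> {}".format(
            actor.get_actor_label(), distance / 100.0, treatment))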

Geometry painting:

All assets will have unique diffuse information overlaid on top of the painted Substance Designer materials and sculpted normal information, in order to replicate the original colors with maximum precision. For this particular technique to work, the concrete materials should be as light as possible; to be precise, albedo values should not go lower than “#b3b3b3”, or 70% grey.
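
Since #b3b3b3 works out to 179/255, or roughly 70% grey, a quick offline check can confirm that an exported albedo map never dips below that floor. The following is a small Pillow-based sketch with a placeholder file name:

    from PIL import Image  # Pillow

    ALBEDO_FLOOR = 0xB3  # "#b3b3b3" -> 179/255, roughly 70% grey

    def darkest_albedo(path):
        """Return the darkest RGB channel value found in an exported albedo map."""
        image = Image.open(path).convert("RGB")
        red, green, blue = image.split()
        return min(band.getextrema()[0] for band in (red, green, blue))

    # Placeholder file name for one of the exported concrete albedo maps.
    darkest = darkest_albedo("T_Concrete_Base_D.png")
    if darkest < ALBEDO_FLOOR:
        print("Albedo dips to {}, darker than the #b3b3b3 floor".format(darkest))
    else:
        print("Albedo stays at or above 70% grey")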

Other guidelines:

To keep this scene from looking too realistic, the roughness values of all materials (except water and glass) have to be as uniform as possible. Ultimately, while allowing some form of variation, all roughness inputs should stay within an “80% grey” value range.

To make the rendering look more anime-like, the scene should utilize an outline post-processing shader that is applied to specific, hand-picked meshes within the environment.
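
One common way to restrict an outline effect to hand-picked meshes (and the assumption behind this sketch) is to render them into the Custom Depth pass and let the outline post-process material read that buffer. The actor labels below are placeholders:

    import unreal

    # Hand-picked actor labels to outline (placeholder names).
    OUTLINED_ACTORS = {"SM_Building_Yellow", "SM_Building_Tower_01"}

    for actor in unreal.EditorLevelLibrary.get_all_level_actors():
        if actor.get_actor_label() not in OUTLINED_ACTORS:
            continue
        for component in actor.get_components_by_class(unreal.PrimitiveComponent):
            # The outline post-process material can then isolate these meshes
            # through the SceneTexture:CustomDepth input.
            component.set_render_custom_depth(True)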

Production Phase

Asset production was divided into 2 major phases. The first phase would include the following tasks:

  • Converting white box mesh into a low polygon model
  • Model unwrapping
  • High polygon model creation and sculpting (only applies to assets within close proximity)
  • Normal and cavity map scanning with Marmoset Toolbag

Phase 2 would mainly focus on:

  • Normal and cavity map integration into Substance Painter
  • Model painting
  • Texture integration into Unreal Engine 4
  • Potential lightmap adjustments to the secondary UV channel in case of projection artifacts

Sculpting & Painting Techniques

My first serious task was to nail down the details on the high polygon model. Here is a guide to how I approached this process in ZBrush:

Once the high poly was completed, it was time for the biggest challenge – model painting in Substance Painter. For this process, I relied heavily on the Artistic Heavy Sponge, Artistic Brushing, and Basic Hard brushes. I also used a slight touch of the Leaks and Leaks Heavy particles to create fake paint-drip artifacts. For prototyping, I painted the yellow building in the central part of the composition first, which would eventually define the primary painting workflow for the rest of the assets in the scene.

Painting procedure steps included:

  • Clean concrete application
  • Scanned normal overlay application
  • Rough or damaged concrete application on top of sculpted damaged areas
  • Cavity map integration
  • Painting of solid, black lines around the edges of damaged areas for improved artistic effect
  • Color information overlay
  • Window shader application
  • Artistic touch-ups according to the reference, as well as personal preference where deemed necessary for compositional or technical reasons

Debris Creation Workflow

Initially, I wanted to handle both the creation and placement of the debris with some form of script, to make it look as believable as possible and save some time in production. However, upon doing some research, it became clear that most of the professional options currently available on the market cost a lot of money and take some time to get used to. In the end, I decided to utilize the MassFX Tools inside 3ds Max and see if I could achieve the desired results.

I started by creating a variety of different debris chunks that would later be used to form clusters of rocks around selected buildings. These chunks would undergo the same texturing procedures as all the other assets in the scene.

Once the textures were completed, it was time to set up the MassFX parameters for both the building upon which the concrete blocks would fall and the debris chunks themselves. However, before I did any of that, I decided to run a little test to see whether the entire workflow worked from start to finish.

First, I created a small pool that would imitate the final building mesh. I applied the MassFX Rigid Body modifier, set the rigid body type to Static, and set the shape type to Original. This ensures that the building does not move during the simulation, yet still provides precise collision boundaries for the debris.

Next, I created separate copies of each debris chunk that I wanted to see in one cluster. Then I applied the MassFX Rigid Body modifier once again, except this time I set the Rigid Body Type to Dynamic and the shape type to Custom. It is important to note that for this to work during the simulation, the modifier has to be applied to each individual chunk separately and not to a group selection of several pieces.

Finally, I took all the modified debris pieces and placed them above the pool. I then copied the entire stack 10 times over along the Z axis and adjusted the rotation and positioning of several pieces to make the placement look a bit more organic.
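
For those who prefer to script this kind of setup, here is a rough pymxs sketch of the same test: a static rigid body for the pool, dynamic rigid bodies applied per chunk, and the stacked copies above it. The object names are placeholders, and the integer values assumed for the modifier’s type property should be double-checked against the MAXScript reference for MassFX_RBody:

    import random
    from pymxs import runtime as rt

    def make_static_collider(node):
        """Pool/building mesh: static rigid body so it never moves during simulation."""
        body = rt.MassFX_RBody()
        rt.addModifier(node, body)
        body.type = 3  # assumed enum: 3 = Static; mesh type "Original" is set in the UI

    def make_dynamic_chunk(node):
        """Debris chunk: dynamic rigid body, applied to each chunk individually."""
        body = rt.MassFX_RBody()
        rt.addModifier(node, body)
        body.type = 1  # assumed enum: 1 = Dynamic; mesh type "Custom" is set in the UI

    # Placeholder object names.
    pool = rt.getNodeByName("Pool_Collision")
    chunks = [rt.getNodeByName("Debris_Chunk_0{}".format(i)) for i in range(1, 6)]

    make_static_collider(pool)

    # Stack 10 copies of every chunk above the pool with a slight random rotation,
    # so the simulated pile reads a bit more organically.
    for chunk in chunks:
        for layer in range(10):
            clone = rt.copy(chunk)
            clone.pos = rt.Point3(chunk.pos.x, chunk.pos.y,
                                  chunk.pos.z + 50.0 * (layer + 1))
            rt.rotate(clone, rt.eulerAngles(0, 0, random.uniform(0.0, 360.0)))
            make_dynamic_chunk(clone)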

If all MassFX modifiers were adjusted correctly, the simulation process should look like this:

And here is the final result, which went through the exact same process, except that I added a couple of extra collision boxes in front of the building so that the debris wouldn’t slide off so easily.

Skybox Creation

While I could rely on minor procedural techniques to paint my buildings, the skybox, on the other hand, would not allow it. For this task, I used Photoshop to paint both the cloud diffuse and cloud alpha textures.

In Unreal Engine, I used the default sky to generate a soft blue gradient in the background, while creating a separate half-sphere skybox mesh to project the painted clouds underneath it.

Lighting & Post-Processing Phase

This was the last and by far the most challenging phase. Any value change could either make it or break it. At first, I was trying to go for a “mid-summer, 4:00 p.m.” type of look, heavily relying on GI. Evidently, I was just not quite satisfied with the results. While everything looked pretty close to the reference and you would get this nice, cozy feeling while looking into the distance, in my mind I had imagined it being a little bit colder and “fresher”. I decided to slightly alter the original lighting and post-process settings in order to make the scene feel like around 10 a.m.

Here are the settings I ended up with for my final lighting bake.

Conclusion

While I wish I could have made this environment bigger and the project as a whole a bit more ambitious, I had to stop here due to the deadline I had set for myself. I hope this little breakdown has given you a better insight into how I approached the development of this scene, so you can use this knowledge to your own benefit.

I’d like to thank Kirill for conducting this interview and thank YOU for reading this article. If you wish to see more similar projects from me in the near future, feel free to follow me on ArtStation.

Until next time! Bye-bye!

Alex Zemskov, Environment Artist

Interview conducted by Kirill Tokarev
