Courtesy of Jason Huang

Retrofit AF26 Robot Lookdev - A Texturing Story with Jason Huang

Pierre Bosset on March 6 2019 | Stories, Substance Painter, Film/VFX

Introduction

Hi, I’m Jason Huang, an LA-based CG artist actively seeking challenging projects where I can contribute collaboratively, putting crafted pixels and inspiring visuals on screens. I’m currently working in Digital Domain’s Digital Human Group. I explore all things CG through R&D projects, and enjoy digging into the evolution of cinematic shading, lighting, and rendering. I love computer graphics, and I’m always striving to learn more. Here, I’m pleased to share some of my work with the community.

The Beginning

AF26 is a shot from an independent sci-fi live-action short film intended as a pitch for a feature film project. It was completed as a collaborative effort among artists and producers across the globe, working online. The original short can be viewed here. I was involved as the lighting lead, setting up the hero light rig as well as doing shot lighting for the project.

Fast forward four years: I wanted to make my own iteration of this particular shot, redoing the lookdev, lighting, and compositing, to put it into my demo reel.

In this blog post, I’ll share the workflow and some tips and tricks behind creating the shot, particularly in the context of lookdev, lighting, and compositing. I’ll also add a bit of background information about the hardware and software used:

Software: Maya, Substance Painter, V-Ray, Nuke
Hardware: PC with AMD Threadripper 1950X, 32GB RAM, GTX 1060 (6GB VRAM)

Texturing and Lookdev

The texturing and lookdev of Robo is an integrated process that I carry out in Substance Painter. Robo has 180 UDIM tiles. At the time of texturing Robo, Substance Painter 2018.3 hadn’t been released yet, so I didn’t yet have the Sparse Virtual Texture (SVT) and viewport upgrades, which make working with heavy assets a bit easier.

I have to separate Robo into 9 FBX files, resulting in 9 Substance Painter scenes, so that I’m able to texture him with good viewport response. It comes down to finding a good balance between the total number of scenes and the number of UDIM tiles in each scene that I have to wrangle in Substance Painter. If I separate Robo into too few parts, each would have too many UDIM tiles or texture sets, and I’d have to keep switching while painting, because painting across texture sets isn’t possible yet. Conversely, a scene with too many UDIM tiles would simply be too heavy to work with. Hopefully, this workaround can be retired with the release of the newer version of Substance Painter, which is supposed to handle much heavier geometry and texture sets.

Robo is broken up into multiple scenes, each with a varying number of Texture Sets.

In order to efficiently texture Robo, I break him up based on material assignments, except for certain areas like the head and the chest battery pack, where I’d like to see materials in context. This way, each Substance Painter scene doesn’t have too many texture sets to manage, and the system doesn’t get bogged down. The number of texture sets per scene ranges from 3 to 30, at 2K resolution.
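The balancing act described above can be sketched as a simple partitioning problem: group parts into scenes without letting any one scene exceed a tile budget. Here is a minimal Python illustration; the part names, tile counts, and the 40-tile cap are all hypothetical, not Robo’s actual breakdown.

```python
# Hypothetical part list: (part name, number of UDIM tiles) pairs.
# Counts are illustrative; they sum to 180 to match the asset's total.
PARTS = [
    ("head", 12), ("chest_battery", 18), ("left_arm", 22),
    ("right_arm", 22), ("torso", 30), ("pelvis", 14),
    ("left_leg", 26), ("right_leg", 26), ("cables", 10),
]

MAX_TILES_PER_SCENE = 40  # cap chosen so each Painter scene stays responsive

def split_into_scenes(parts, max_tiles):
    """Greedily group parts into scenes without exceeding the tile cap."""
    scenes, current, count = [], [], 0
    for name, tiles in parts:
        if current and count + tiles > max_tiles:
            scenes.append(current)
            current, count = [], 0
        current.append(name)
        count += tiles
    if current:
        scenes.append(current)
    return scenes

scenes = split_into_scenes(PARTS, MAX_TILES_PER_SCENE)
for i, scene in enumerate(scenes, 1):
    print(f"scene_{i:02d}: {scene}")
```

Raising the cap yields fewer, heavier scenes; lowering it yields more scenes with more cross-scene switching, which is exactly the trade-off at play.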

For texturing hard-surface assets with weathering effects, I would highly recommend going through Substance Academy’s official tutorial here, which covers in detail the process and workflow tips on texturing aging surfaces in Substance Painter.

With a lookdev contact sheet lying around, or an idea of what material goes on which part of the character, I typically start with Smart Material presets that more or less resemble what I want each part of Robo’s surfaces to be, in terms of surface type and color. I then start tweaking each PBR channel, gradually adding layers of weathering elements such as chipped paint, rust, dirt, grime, and dust. The idea is to add those details with procedural masks and stay in the procedural realm for as long as possible, until I have to get in and start hand-painting.

There are two main benefits to working this way. First, it’s faster to block in lots of weathering effects procedurally. Second, procedural effects are added non-destructively, meaning I can go back and tweak parameters later on. The hand-painting at the end is the final touch that breaks up the procedural look and makes the weathering effects more organic. Coming from using Mari for this type of work in the past, I find it very refreshing to work in Substance Painter’s real-time viewport.

Layer breakdown - clean labeling and naming help.

I also take advantage of Substance Painter’s PBR Validation material, stacked at the top of my layers/groups, which I toggle to check whether the overall albedo and metal reflectance are within a reasonable range. I don’t strictly follow the validation material’s guidelines; I set values based on what I see in the viewport, even if a value ends up slightly outside the suggested range. Still, the validation material is a good tool to ensure the resulting values are not all over the place.
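The kind of check the validation material performs can be sketched numerically. The thresholds below follow commonly cited PBR metal/rough guidelines (roughly 30–240 sRGB for dielectric albedo, 180–255 sRGB for metal reflectance); they are approximations of mine, not the exact values Substance Painter’s material uses.

```python
# A rough numeric version of what a PBR validation check does.
# Ranges are commonly cited guidelines, not Substance Painter's exact ones.
DIELECTRIC_ALBEDO_RANGE = (30, 240)   # sRGB, 0-255
METAL_REFLECTANCE_RANGE = (180, 255)  # sRGB, 0-255

def luminance(srgb_rgb):
    """Rec. 709 luma as a single-number proxy for the base color."""
    r, g, b = srgb_rgb
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def validate_base_color(srgb_rgb, metallic):
    """Check a base color against the suggested albedo/reflectance range."""
    lo, hi = METAL_REFLECTANCE_RANGE if metallic else DIELECTRIC_ALBEDO_RANGE
    return lo <= luminance(srgb_rgb) <= hi

print(validate_base_color((20, 20, 20), metallic=False))    # too dark a dielectric
print(validate_base_color((200, 180, 150), metallic=True))  # plausible metal
```

As in the article, values slightly outside these ranges aren’t necessarily wrong; the check just flags outliers worth a second look.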

PBR validation material in the viewport.

The majority of lookdev is taken care of in Substance Painter. In this case, I use a few self-made HDRs, calibrated and neutralized, as the base of the lookdev environment. The idea is to take the lighting variable out of lookdev so that I can focus on texturing and tweaking material properties. Those calibrated HDRs can be used in both Substance Painter and offline rendering (in this case V-Ray in Maya) to ensure that the lookdev is conducted under consistent lighting, and to validate the look between both tools. As both Substance Painter’s real-time viewport and V-Ray are built on physically based rendering, I know the lookdev-ed asset will come out of lighting and rendering with consistent, plausible results.
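The core of neutralizing an HDR can be illustrated with a small sketch: sample a known gray reference in the capture, then apply per-channel gains so that patch lands on neutral 18% gray, which removes the color cast and normalizes exposure at once. The pixel values below are made up for illustration; see the linked blog posts for the full calibration process.

```python
def neutralize(pixels, gray_patch_rgb, target=0.18):
    """Scale an HDR so a measured gray-card patch becomes neutral 18% gray.

    pixels: list of (r, g, b) linear float tuples.
    gray_patch_rgb: averaged linear RGB sampled from the gray card in the HDR.
    Per-channel gains remove the color cast; the target sets overall exposure.
    """
    gains = tuple(target / c for c in gray_patch_rgb)
    return [(r * gains[0], g * gains[1], b * gains[2]) for r, g, b in pixels]

# The gray card reads slightly warm and underexposed in this made-up sample.
patch = (0.12, 0.10, 0.08)
out = neutralize([patch, (0.24, 0.20, 0.16)], patch)
print(out[0])  # the patch itself becomes (approximately) neutral 0.18 gray
```

Because the same neutralized HDR feeds both Substance Painter and V-Ray, any look difference that remains comes from the materials, not the lighting.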

Screenshots of Substance Painter viewport and VFB to show consistent results between the two applications.

Regarding HDR calibration, I have previously written a couple of blog posts, here and here, about the process. Feel free to check them out to dig deeper into this topic.

Once textures are exported from Substance Painter, it’s fairly straightforward to connect the textures to corresponding slots of a standard VRayMtl or VRayAlSurface in Maya. The results between Substance Painter’s real-time viewport and V-Ray output are visually close enough most of the time for my needs, knowing that this will provide enough room to tweak the look in post.
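The hookup itself happens in Hypershade, but the mapping involved can be sketched as data: each exported channel goes to one shader slot, with Maya’s `<UDIM>` token standing in for the tile number. The export naming and the slot names below are paraphrased placeholders, not guaranteed VRayMtl attribute names.

```python
# Sketch of matching Substance Painter's exported maps to V-Ray material
# slots. File names follow a hypothetical export preset, and the slot
# names are descriptive placeholders, not literal VRayMtl attributes.
# The <UDIM> token lets Maya's file node load the whole tile sequence.
CHANNEL_TO_SLOT = {
    "BaseColor": "diffuseColor",
    "Roughness": "reflectionGlossiness",  # invert, or use a roughness mode
    "Metallic":  "metalness",
    "Normal":    "bumpMap",
    "Height":    "displacement",
}

def build_connections(asset, channels, folder="textures"):
    """Return {shader slot: texture path} for one asset's exported channels."""
    return {
        slot: f"{folder}/{asset}_{ch}.<UDIM>.exr"
        for ch, slot in CHANNEL_TO_SLOT.items()
        if ch in channels
    }

conns = build_connections("Robo_head", ["BaseColor", "Roughness", "Normal"])
for slot, path in conns.items():
    print(f"{slot} <- {path}")
```

One real-world wrinkle worth noting: glossiness-style slots expect the inverse of a roughness map, so the roughness channel either needs inverting or the shader’s roughness mode enabled.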

I make minor color corrections and tweaks in Maya Hypershade at this stage, only to fine-tune the look. A couple of benefits of letting Substance Painter do the heavy lifting of lookdev are:
1. I can rapidly set the look in the real-time viewport, which reliably approximates how the final offline rendering output will look.
2. It’s simple to extend the resulting shading network in Maya Hypershade, to debug the shading if needed.

Lighting and Rendering

The lighting starts out from the HDR shot on set as the base lighting. The set HDR wasn’t ideal in this case, mostly because the light fixtures used on set were all clipped during the HDR capture, so it doesn’t represent the full dynamic range of the set lighting. Nor was there a gray- or chrome-ball photo taken on set as a reference. Given what the integration team or on-set VFX crew can provide, I have to rely on observation and make some artistic choices. This happens in productions, and it imposes the limitations around which I have to work as a lighter. On the plus side, at least the lights used on the set, and their rough directions in the context of the room and set piece, are documented in the HDR.

With the base lighting provided by the set HDR, I add a few area lights representing the key light coming through the screen-left windows. Kick and fill lights are placed to bring out a bit more shape on Robo. The key here is working in IPR to dial in the size and position of the fill and kick lights, so that certain body parts pick up interesting reflections while the overall look stays natural. An area light is also light-linked to Robo’s eye lenses.

Light rig used for the shot in Maya viewport

To speed up the lighting iterations, I typically turn off secondary GI bounces, subdivisions, displacement, and render elements to get quick feedback in the V-Ray frame buffer (VFB). Once I have a version of the lighting blocked in, I flip those options back on and send the shot to render overnight with settings that are acceptable in terms of noise, so that I can check the result in the morning for the next iteration.
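The draft-versus-final toggle described above amounts to maintaining two settings profiles and swapping between them. A minimal sketch of that idea; the keys are descriptive placeholders, not actual V-Ray attribute names.

```python
# Two render profiles mirroring the draft/final toggle described above.
# Keys are descriptive placeholders, not literal V-Ray attribute names.
DRAFT = {
    "secondary_gi": False,
    "subdivision": False,
    "displacement": False,
    "render_elements": False,
}

FINAL = {key: True for key in DRAFT}

def apply_profile(settings, overrides):
    """Merge a profile over the current settings without mutating them."""
    merged = dict(settings)
    merged.update(overrides)
    return merged

scene = {"resolution": (1920, 1080), **FINAL}
draft_scene = apply_profile(scene, DRAFT)
print(draft_scene["secondary_gi"])  # off while iterating on lighting
```

Keeping the overrides in one place makes it harder to forget to re-enable displacement or render elements before the overnight render.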

Finally, I add a bit of color grading to sweeten up the shot. I learned several good tips and tricks from Hugo’s Grading basics in Nuke.

Slap comp of A or B.
Final comped result.

That’s it. Coming from other 3D texture painting applications, working in Substance Painter is a truly refreshing experience. I really enjoy the texturing and lookdev process Substance Painter has provided. I can’t wait to get my hands on it for my next project. Cheers!

All images courtesy of Jason Huang. Follow Jason on Twitter.
