Focus on the New AO Node

Nicolas Wirrmann on January 17 2017 | News, Substance Designer, Software, Tutorials

If you’ve been texturing for a long time, you know how precious ambient occlusion information can be. Whether you want to simulate dust, wear and tear, or give a bit more depth to your 3D object, ambient occlusion is your best friend. This is even more important within Substance Designer, where many effects are partially or completely driven by an ambient occlusion map.

Not so long ago, the only way to obtain this AO map was to bake it from your 3D mesh, which gives good results but relies on an offline process.

As we needed a dynamic solution to build interesting graphs, we created the first version of the Ambient Occlusion node which was driven by a height map only. This was very useful for creating most of the effects you enjoy today.

That said, while it was extremely handy, the quality of the Ambient Occlusion node wasn't comparable to a baked ambient occlusion map, so you had to choose between flexibility and quality.

We decided it was time to rework this node so that you don’t have to make this compromise, and we are extremely proud to introduce the new Ambient Occlusion node in Substance Designer 5.6!

Still based on a height map input, the new AO node now produces results comparable in quality to a baked map, in a fraction of the time :-)

See for yourself:

[Image comparison: raytraced AO bake vs. the new Ambient Occlusion node. Courtesy of Gametextures.com]

In the comparison above, the first image is the raytraced bake and the second is the new AO node. The raytraced bake takes 90 seconds to compute, while the Ambient Occlusion filter takes 5 milliseconds. That's right: it's 18,000 times faster. The number doesn't mean much on its own, since the two algorithms are different, but the results are visually similar.

For the techies out there, the algorithm is based on the Horizon-Based Ambient Occlusion (HBAO) research published by NVIDIA. That research targets 3D scenes; we applied the same principle to 2D height data.
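The article doesn't publish Substance Designer's implementation, but the core idea of horizon scanning on a height map can be sketched in a few lines of NumPy. This is an illustrative approximation only (the function name, parameters, and sampling scheme are our own assumptions, not the shipping filter): for each pixel, march outward in several directions, track the highest "horizon" angle seen along each ray, and darken the pixel according to how much of the sky those horizons block.

```python
import numpy as np

def heightmap_ao(height, n_dirs=8, n_steps=16, step=1.0, z_scale=1.0):
    """Approximate ambient occlusion from a 2D height map by horizon
    scanning, in the spirit of NVIDIA's HBAO applied to 2D data.
    Illustrative sketch only, not the actual Substance Designer filter."""
    h, w = height.shape
    ys, xs = np.mgrid[0:h, 0:w]
    ao = np.zeros_like(height, dtype=np.float64)
    for d in range(n_dirs):
        theta = 2.0 * np.pi * d / n_dirs
        dx, dy = np.cos(theta), np.sin(theta)
        max_tan = np.full((h, w), -np.inf)
        for s in range(1, n_steps + 1):
            r = s * step
            # Sample the height map along the ray (nearest neighbour,
            # clamped at the image borders).
            sx = np.clip(np.round(xs + dx * r).astype(int), 0, w - 1)
            sy = np.clip(np.round(ys + dy * r).astype(int), 0, h - 1)
            tan_angle = (height[sy, sx] - height) * z_scale / r
            max_tan = np.maximum(max_tan, tan_angle)
        # Convert the highest horizon angle into an occlusion term.
        horizon = np.arctan(np.maximum(max_tan, 0.0))
        ao += np.sin(horizon)
    return 1.0 - ao / n_dirs  # 1 = fully open, 0 = fully occluded

# Sanity check: a pit in a flat plane should come out darker than
# the surrounding flat surface.
hm = np.zeros((33, 33))
hm[16, 16] = -1.0
result = heightmap_ao(hm)
```

Because every pixel only reads a fixed number of nearby samples, this scales with image size rather than scene complexity, which is why a filter like this can run in milliseconds where a raytraced bake takes seconds.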

We can’t wait to see what you achieve with this new filter: please share your experience on the forum!
