Here’s the basic node layout of this technique.

The basic layout for this bilateral denoise. Don’t worry, you won’t need to manually enter this. The “Aux” input is for forward compatibility, to make it easier to add another determinant to the bilateral filter.
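If you’d rather see the wiring in script form, here’s a minimal sketch of the core idea in bpy: one noisy pass pushed through a Bilateral Blur node, with the Normal pass acting as the determinator. It assumes a Cycles scene with the Normal pass enabled, and the sigma values are only placeholders; in the real nodegroup each pass gets its own blur.

```python
import bpy

scene = bpy.context.scene
scene.use_nodes = True
tree = scene.node_tree
tree.nodes.clear()

# Pull in the render passes (assumes the Normal pass is enabled on the view layer).
rlayers = tree.nodes.new("CompositorNodeRLayers")

# Bilateral Blur: "Image" is the noisy pass, "Determinator" decides where the edges are.
bilateral = tree.nodes.new("CompositorNodeBilateralblur")
bilateral.iterations = 1      # more iterations rarely helps, as noted below
bilateral.sigma_color = 0.3   # placeholder; tune as described below
bilateral.sigma_space = 5.0   # placeholder; tune as described below

composite = tree.nodes.new("CompositorNodeComposite")

links = tree.links
links.new(rlayers.outputs["Image"], bilateral.inputs["Image"])
links.new(rlayers.outputs["Normal"], bilateral.inputs["Determinator"])
links.new(bilateral.outputs["Image"], composite.inputs["Image"])
```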

Speaking of which, the best way to determine the proper settings for that filter is: crank the color sigma up as high as it goes, increase the space sigma until you can’t see any more splotchiness, then decrease the color sigma until juuust before an unacceptable amount of noise creeps back in. The iterations parameter is pretty useless, so far as I can see, so just leave it at 1. Unfortunately, Blender doesn’t allow you to script those settings, so you’ll have to crack open the nodegroup to tweak it.
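Those sigmas are node properties rather than sockets, so they can’t be promoted to group inputs. If cracking the group open by hand gets tedious, and assuming your build’s Python API exposes them (they show up as sigma_color, sigma_space, and iterations on the Bilateral Blur node), a short helper can at least batch-edit every such node inside the group. This is only a sketch, and the group name is made up:

```python
import bpy

def set_bilateral_settings(group_name, sigma_color, sigma_space, iterations=1):
    """Tweak every Bilateral Blur node inside a node group.

    group_name is hypothetical; use whatever you called the denoise group.
    """
    group = bpy.data.node_groups[group_name]
    for node in group.nodes:
        if node.bl_idname == "CompositorNodeBilateralblur":
            node.sigma_color = sigma_color
            node.sigma_space = sigma_space
            node.iterations = iterations

# Example: big space sigma to kill splotchiness, then walk the color sigma back down.
set_bilateral_settings("Bilateral Denoise", sigma_color=0.25, sigma_space=8.0)
```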

You might want to add a denoise filter to the output, as sharp edges in the colour inputs can result in something that looks like fireflies. There’s no need to use every input, either: skip the ones that are superfluous to your scene, or take advantage of that to blur different passes differently.
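For that clean-up pass, assuming a Blender new enough to have the compositor’s Denoise (OpenImageDenoise) node, a sketch like this would slot it between the group and the composite output; the node names are placeholders for whatever your own tree uses:

```python
import bpy

tree = bpy.context.scene.node_tree

# "Bilateral Denoise" and "Composite" are assumptions about your tree's node names.
group_node = tree.nodes["Bilateral Denoise"]
composite = tree.nodes["Composite"]

# A light OIDN pass to mop up the firefly-like artifacts from sharp colour edges.
denoise = tree.nodes.new("CompositorNodeDenoise")
denoise.use_hdr = True

tree.links.new(group_node.outputs["Image"], denoise.inputs["Image"])
tree.links.new(denoise.outputs["Image"], composite.inputs["Image"])
```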

I also tossed in a quick nodegroup that shows how to consolidate all that pass data into a finished image. You’ll probably break it apart and just use the subset you need, rather than tweak the innards. I’ve also shared copies of Mike Pan’s infamous benchmark and Olivier Vandecasteele’s caustics test (CC-BY-SA) with this technique in play, so you can quickly see it in action.
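If you do break it apart, the arithmetic that consolidation group boils down to is the usual Cycles pass recombination. Here’s a rough sketch in plain Python, with the caveat that the exact pass set depends on your scene and render settings:

```python
# A minimal sketch of the usual Cycles pass recombination; pass names follow the
# Cycles convention, and which ones you actually have depends on the scene.
def recombine(p):
    """p is a dict of per-pixel pass values (floats or arrays)."""
    diffuse = (p["DiffDir"] + p["DiffInd"]) * p["DiffCol"]
    glossy = (p["GlossDir"] + p["GlossInd"]) * p["GlossCol"]
    transmission = (p["TransDir"] + p["TransInd"]) * p["TransCol"]
    return diffuse + glossy + transmission + p["Emit"] + p["Env"]
```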

In this era of cheap render farms, I don’t see the technique being very useful for single images, but with some care it should be a life-saver for animators. It works better than temporal noise reduction because it has access to the scene geometry, though there’s no harm in chaining these techniques together to smooth over their respective weaknesses.