Can anyone clarify the difference between the CUDA and OptiX options under the Render Device settings in Rhino 7? Specifically, why is it either/or?
I also run Maya, and there OptiX is simply the NVIDIA denoiser, which can be enabled or disabled regardless of whether you're using the CPU or GPU to render in Maya's standard renderer, Arnold (assuming you have a GPU that supports it, of course).
So the idea that it's a completely separate renderer in R7 is somewhat foreign to me.
Additionally, I don't understand what it does, as there's no apparent denoising performed on the rendered image when using the OptiX renderer. Only when I install the NVIDIA denoiser via the Package Manager do I see the effect, and it isn't limited to the OptiX renderer; it can be enabled regardless of the render device (CPU, CUDA, or OptiX).
Finally, in a quick render test (with and without the denoiser active), OptiX appears to be slower than CUDA, with the CPU coming in last. The time difference between CPU and GPU rendering also stands out to me. While I know the CPU is typically slower than the GPU, the gap in R7 is significantly larger than it is in either Blender or Maya. I wonder whether Rhino is calculating the correct tile sizes when using the CPU, as they're very different from the tile sizes used on a GPU (32–64 vs. 256).
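For what it's worth, the tile-size split I'd expect follows the usual Cycles rule of thumb: many small tiles so every CPU thread stays busy, and large tiles on the GPU to amortize launch overhead. A minimal sketch of that heuristic (illustrative only — this is my assumption of typical Cycles behavior, not Rhino's actual code; the function name is made up):

```python
def suggested_tile_size(device: str) -> int:
    """Return a typical square tile edge length for a render device.

    Rule-of-thumb values only: small tiles keep all CPU cores fed,
    large tiles reduce per-tile GPU kernel launch overhead.
    """
    if device == "CPU":
        return 32            # many small tiles, one per thread at a time
    if device in ("CUDA", "OptiX"):
        return 256           # big tiles amortize GPU scheduling cost
    raise ValueError(f"unknown device: {device}")

print(suggested_tile_size("CPU"))    # 32
print(suggested_tile_size("CUDA"))   # 256
```

If Rhino were handing the CPU GPU-sized 256px tiles (or vice versa), that alone could explain part of the outsized gap below.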
Blender Cycles shows a ~12-second difference between CPU and CUDA render times.
Rhino Cycles shows a ~40-second difference between CPU and CUDA render times.
So I suppose the question is: why / when would you set the renderer to OptiX, given that it's slower than CUDA?
AMD Threadripper 3970x 32-Core
The following images are the test renders.
OptiX is top
CUDA is middle
CPU is bottom