Can anyone clarify the difference between the CUDA and OptiX options under the Render Device settings in Rhino 7? Specifically, why is it either/or?
I also run Maya, and there OptiX is simply the Nvidia denoiser, which can be enabled or disabled regardless of whether you're using the CPU or GPU to render within Maya's standard renderer, Arnold (assuming you have a GPU that supports it, of course).
So the idea that it's a completely separate renderer in R7 is somewhat foreign to me.
Additionally, I don't understand what it does, as there is no apparent denoising being performed on the rendered image when using the OptiX renderer. Only when I install the Nvidia denoiser via the Package Manager do I see the effect, and it isn't limited to the OptiX renderer; it can be enabled regardless of the render device (CPU, CUDA, or OptiX).
Finally, in a quick render test (with and without the denoiser active), OptiX appears to be slower than CUDA, with the CPU coming in last. The gap between CPU and GPU render times also stands out to me. While I know the CPU is typically slower than the GPU, the difference in R7 is significantly larger than it is in either Blender or Maya. I wonder whether Rhino is calculating the correct tile sizes when using the CPU, as they are very different from the tile sizes used on a GPU (32-64 vs 256).
Blender Cycles: ~12 second difference between CPU and CUDA render times.
Rhino Cycles: ~40 second difference between CPU and CUDA render times.
So I suppose the question is: why/when would you set the renderer to OptiX, given that it's slower than CUDA?
With Rhino 8, is your recommendation of when to use CUDA vs OptiX the same as stated for Rhino 7?
Related: my impression was that Intel's denoiser was the one to use with Rhino 7. Does this still hold in Rhino 8, and with both CUDA and OptiX?
My (probably wrong) impression is that Nvidia is quite a bit ahead of Intel on the graphics front, so I'm curious to understand why you built in the Intel denoiser instead of the Nvidia one.
@jdhill Thanks. Didn't realize that. Certainly a very strong reason for choosing it.
Is there much of a performance penalty on Windows compared to the Nvidia denoiser, or is it a better performer too?