CUDA vs OptiX

Can anyone clarify the difference between the CUDA and OptiX options under the Render Device settings in Rhino 7?

Specifically, why is it either/or?

I also run Maya, and there OptiX is simply the NVIDIA denoiser, which can be enabled or disabled regardless of whether you're using the CPU or GPU to render with Maya's standard renderer, Arnold (assuming you have a GPU that supports it, of course).

So the idea that it's a completely separate renderer in R7 is somewhat foreign to me.

Additionally, I don't understand what it does, as there is no apparent denoising being performed on the rendered image when using the OptiX renderer. Only when I install the NVIDIA denoiser via the Package Manager do I see the effect, and it isn't limited to the OptiX renderer: it can be enabled regardless of the render device (CPU, CUDA, or OptiX).

Finally, running a quick render test (with and without the denoiser active), OptiX appears to be slower than CUDA, with the CPU coming in last. The time difference between CPU and GPU rendering also stands out to me. While I know a CPU is typically slower than a GPU, the gap in R7 is significantly larger than it is in either Blender or Maya. I wonder if Rhino is calculating the correct tile sizes when using the CPU, as they are very different from the tile sizes used on a GPU (32-64 vs 256).

Blender Cycles shows a ~12 second difference between CPU and CUDA render times.
Rhino Cycles shows a ~40 second difference between CPU and CUDA render times.
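For what it's worth, the conventional wisdom about tile sizes can be sketched as a tiny heuristic. This is my own illustration of the rule of thumb, not Rhino's or Blender's actual logic; the function name and the exact values are assumptions:

```python
def suggest_tile_size(device: str) -> int:
    """Rule-of-thumb render tile sizes (illustrative only).

    CPUs render one tile per core, so many small tiles keep all
    cores busy right up to the end of the frame; GPUs prefer one
    large tile that saturates their many shader units at once.
    """
    device = device.upper()
    if device == "CPU":
        return 32    # 32x32: good load balancing across many cores
    if device in ("CUDA", "OPTIX"):
        return 256   # 256x256: amortizes per-tile launch overhead on GPU
    raise ValueError(f"unknown render device: {device}")

print(suggest_tile_size("CPU"))    # 32
print(suggest_tile_size("OPTIX"))  # 256
```

If the engine were using GPU-sized tiles on a 32-core CPU (or vice versa), that alone could account for a disproportionate CPU/GPU gap.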

So I suppose the question is: why/when would you set the renderer to OptiX, given that it's slower than CUDA?

System Specs:

AMD Threadripper 3970X, 32 cores
128 GB RAM

The following images are the test renders.

OptiX is top
CUDA is middle
CPU is bottom

CUDA and OptiX are two different backends used by the Rhino render engine, Cycles.

OptiX will be useful for more complex scenes with lots of reflections and the like, which is where it will shine over regular CUDA.

The scene you show isn’t particularly complex, so CUDA will be your best choice.


Here are some other posts regarding CUDA/OptiX that might be interesting for you to read:

Hello Nathan,

With Rhino 8, is your recommendation of when to use CUDA vs OptiX the same as stated for Rhino 7?

Related: my impression was that the Intel denoiser was the one to use with Rhino 7. Does that still hold in Rhino 8, and with both CUDA and OptiX?

Thank you,


From Rhino 8.3 onwards, the Intel denoiser is built in and automatically turned on (at least currently). That is the case with any render device.

I believe my earlier statement regarding CUDA and OptiX still holds, but I haven’t done any timing measurements with Rhino 8.


My (probably wrong) impression is that NVIDIA is quite a bit ahead of Intel in all things graphics, so I'm curious to understand why you built in the Intel denoiser instead of NVIDIA's.

Intel's is an open-source library that works on Mac: GitHub - OpenImageDenoise/oidn: Intel® Open Image Denoise library

And indeed it is enabled on both Windows and Mac 🙂

@jdhill Thanks, I didn't realize that. Certainly a very strong reason for choosing it.
Is there much of a penalty on Windows versus the NVIDIA denoiser, or is it a better performer too?