GTX 1650 doesn't show up in Cycles / no AO rendering

Hello, I just noticed that my Arctic viewport does not render with ambient occlusion lighting despite the Lighting Method being set to AO; the same goes for Raytracing. While troubleshooting this, I realized that my graphics card does not even show up in Cycles under CUDA.

Is there any way to fix this?

Many thanks in advance.


Device 0: CPU > Intel Core i7-9750H CPU @ 2.60GHz > 0 | False | True | CPU


Non-hybrid graphics.
Primary display and OpenGL: Intel® UHD Graphics 630 (Intel) Memory: 1GB, Driver date: 5-28-2019 (M-D-Y). OpenGL Ver: 1.1.0

Secondary graphics devices.
NVIDIA GeForce GTX 1650 (NVidia) Memory: 4GB, Driver date: 3-18-2019 (M-D-Y).

OpenGL Settings
Safe mode: Off
Use accelerated hardware modes: On
Redraw scene when viewports are exposed: On

Anti-alias mode: None
Mip Map Filtering: Linear
Anisotropic Filtering Mode: High

Vendor Name: Microsoft Corporation
Render version: 0.0
Shading Language: Not supported
Driver Date: 3-18-2019
Driver Version:
Maximum Texture size: 1024 x 1024
Z-Buffer depth: 32 bits
Maximum Viewport size: 16384 x 16384
Total Video Memory: 4 GB




For Arctic (and all the other display modes) to use the Nvidia GPU, you need to make sure the GTX 1650 is set as the primary OpenGL engine. You should be able to do that in the NVIDIA Control Panel for your GTX 1650.

Further, I need to check what compute model the GTX 1650 uses; I may need to compile an extra set of CUBIN files for that model.
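(For reference, Cycles ships a separate precompiled CUDA kernel, a CUBIN, per GPU compute capability, and a card is only usable if a kernel matching its compute model exists. A minimal sketch of that mapping, using a hand-picked lookup table built from NVIDIA's published compute-capability list and the `kernel_sm_XX.cubin` naming convention Blender's Cycles uses; Rhino's build may name its kernels differently:)

```python
# Illustrative subset of NVIDIA's compute-capability table.
# The GTX 1650 is a Turing card, compute capability 7.5 (sm_75).
COMPUTE_CAPABILITY = {
    "GeForce GTX 980": "5.2",    # Maxwell
    "GeForce GTX 1060": "6.1",   # Pascal
    "GeForce GTX 1650": "7.5",   # Turing
    "GeForce RTX 2080": "7.5",   # Turing
}

def cubin_name(card: str) -> str:
    """Return the per-compute-model kernel file Cycles would look for,
    following Blender's kernel_sm_XX.cubin naming convention."""
    major, minor = COMPUTE_CAPABILITY[card].split(".")
    return f"kernel_sm_{major}{minor}.cubin"

print(cubin_name("GeForce GTX 1650"))  # kernel_sm_75.cubin
```

So if no `sm_75` kernel was compiled into the build, a GTX 1650 would not appear as a usable CUDA device even though the driver is working.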

@shahe what Rhino version are you running? You left out the first part of the _SystemInfo output. Please include that too.

Thanks for the quick response, Nathan. You are absolutely correct. I was able to resolve the issue by adding Rhino to the program list in the NVIDIA Control Panel.

I am using Rhino 6 SR18 2019-9-23 (Rhino 6, 6.18.19266.14201, Git hash:master @ 3d84f88dec99b2f4e8b7497e739ed2adc2ba8ef6)
License type: Educational, build 2019-09-23

I took the following steps:

- Open the NVIDIA Control Panel from Windows search
- Go to Manage 3D Settings
- Select the Program Settings tab
- Click Add and select Rhino 6
- Select “High-performance NVIDIA processor”

Thank you very much!

Hi @shahe, is the Cycles render device section still empty for CUDA after the change you made?

Hi @nathanletwory, no, the Cycles render device section now shows the GTX 1650; it was also automatically checked after the adjustment in the NVIDIA Control Panel.

Ok, cool.

Is there really no way for Rhino to choose which graphics card to use from within Rhino? I don’t think games try to run on the Intel GPU when a more powerful card is present.

This is a recurring issue on the NG, so I think it would be time well spent for a developer to look into it.