Hello, I just noticed that my Arctic viewport does not render with ambient occlusion lighting despite the Lighting Method being set to AO, and the same goes for the Raytraced mode. Trying to troubleshoot this, I realized that my graphics card does not even show up in Cycles under CUDA.
Is there any way to fix this?
Many thanks in advance.
RhinoCycles_ListDevices
Device 0: CPU > Intel Core i7-9750H CPU @ 2.60GHz > 0 | False | True | CPU
Vendor Name: Microsoft Corporation
Render version: 0.0
Shading Language: Not supported
Driver Date: 3-18-2019
Driver Version: 25.21.14.1971
Maximum Texture size: 1024 x 1024
Z-Buffer depth: 32 bits
Maximum Viewport size: 16384 x 16384
Total Video Memory: 4 GB
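The key clue in that dump is the OpenGL vendor: "Microsoft Corporation" together with render version 0.0 means Windows' built-in software renderer is answering, not the Nvidia driver. A minimal Python sketch of that check (the heuristic vendor/version values are assumptions based on typical driver strings, not anything Rhino itself does):

```python
def is_software_opengl(vendor: str, render_version: str) -> bool:
    """Heuristic: Windows' GDI fallback reports Microsoft as the
    OpenGL vendor and no usable render version."""
    software_vendors = {"microsoft corporation"}
    return (vendor.strip().lower() in software_vendors
            or render_version.strip() in ("0.0", "1.1"))

# The values reported in the dump above:
print(is_software_opengl("Microsoft Corporation", "0.0"))  # → True
# A healthy Nvidia setup would report something like:
print(is_software_opengl("NVIDIA Corporation", "4.6"))     # → False
```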
For Arctic (and all the other display modes) to use the Nvidia GPU, you need to make sure the GTX 1650 is set as the primary OpenGL engine; you should be able to do that in the control panel for your GTX 1650.
Further, I need to check which compute model the GTX 1650 uses; maybe I need to compile an extra set of CUBIN files for that model.
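For context on the CUBIN remark: Cycles ships precompiled CUDA kernels per compute capability, named with NVCC's sm_XY scheme. A hedged Python sketch of that naming (the GTX 1650's capability of 7.5 is to the best of my knowledge; which kernels actually ship is Rhino/Cycles-version dependent):

```python
def cubin_arch(major: int, minor: int) -> str:
    """Map a CUDA compute capability to NVCC's architecture tag,
    the scheme used to name precompiled kernel (CUBIN) files."""
    return f"sm_{major}{minor}"

# A GTX 1650 (Turing) reports compute capability 7.5:
print(cubin_arch(7, 5))  # → sm_75
```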
Thanks for the quick response Nathan. You are absolutely correct. I was able to resolve the issue by adding Rhino to Nvidia’s Control Panel.
I am using Rhino 6 SR18 2019-9-23 (Rhino 6, 6.18.19266.14201, Git hash:master @ 3d84f88dec99b2f4e8b7497e739ed2adc2ba8ef6)
License type: Educational, build 2019-09-23
I took the following steps:
-Open the Nvidia Control Panel from Windows search
-Go to Manage 3D Settings
-Select the Program Settings tab
-Click to add Rhino 6
-Select “High-performance NVIDIA processor”
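On Windows 10 (1803 and later) the same per-application choice also exists under Settings > Graphics settings, which stores a GpuPreference entry in the registry. A hedged Python sketch of that mechanism (the key path and value format are my understanding of the Windows setting, not documented Rhino behavior; the Rhino.exe path is an example):

```python
import sys

# Per-user registry key Windows uses for per-app GPU preferences.
GPU_PREF_KEY = r"Software\Microsoft\DirectX\UserGpuPreferences"

def gpu_preference_entry(exe_path: str, high_performance: bool = True) -> tuple:
    """Build the (value name, value data) pair for the preference:
    2 = high performance GPU, 1 = power-saving GPU."""
    pref = 2 if high_performance else 1
    return exe_path, f"GpuPreference={pref};"

name, data = gpu_preference_entry(r"C:\Program Files\Rhino 6\System\Rhino.exe")
print(name, data)

if sys.platform == "win32":
    import winreg
    # Write the per-user preference (no admin rights needed).
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, GPU_PREF_KEY) as key:
        winreg.SetValueEx(key, name, 0, winreg.REG_SZ, data)
```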
Is there really no way Rhino could choose which graphics card to use from within Rhino? I don’t think games try to run on the Intel GPU if a more powerful card is present.
This is a recurring issue on the NG, so I think it would be time well spent for a developer to look into it.