Having a terrible experience while rendering... GPU won't run



Having trouble while rendering!
The CPU is running at 99% while the GPU sits at 0%. How come?
As you can see, I set the renderer to the NVIDIA GPU (OptiX), but I still can't get the GPU to run (the driver is up to date). Is there anything I need to understand better?

The renders come out terribly slow.

Start by running the _SystemInfo command in Rhino and paste the output here.

Start by doing what Jeremy asked, but your screenshot seems to indicate the GPU is being used; it would be a lot slower than that if it were running on the CPU. Laptop GPUs are just kind of terrible, like calling it a “4060” is probably legally actionable deceptive marketing. The performance is limited by the wattage budget the laptop gives it, and it will quickly get too hot and throttle down with any real sustained task like this.

Note that the “GPU utilization” graph in Windows may not be showing you CUDA rendering activity; what you need to look at depends on your Windows version.
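For instance, the GPU page in Task Manager usually defaults to graphs like “3D” and “Copy”, and you may need to switch one of them to “Cuda” / “Compute” to see the render load (the exact engine names depend on the driver and Windows build). If you'd rather watch it directly, here is a minimal sketch that polls nvidia-smi, the command-line tool that installs with the NVIDIA driver, once a second; it assumes nvidia-smi is on your PATH and uses standard query fields:

```python
# Minimal sketch: poll nvidia-smi once a second to watch the compute (CUDA)
# load while a render is running. Assumes nvidia-smi, which installs with the
# NVIDIA driver, is on the PATH; the query fields are standard nvidia-smi ones.
import subprocess
import time

QUERY = "utilization.gpu,power.draw,temperature.gpu,clocks.sm"

while True:
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={QUERY}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())   # e.g. "93 %, 78.12 W, 74, 1455 MHz"
    time.sleep(1)
```

Run it while a render is going: if utilization.gpu sits high, Cycles/OptiX really is using the card even though the default Task Manager graph shows 0%.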

He could try some ice packs and a laptop cooler

My first and only laptop did not last very long.

:[

Rhino 8 SR21 2025-7-7 (Rhino 8, 8.21.25188.17001, Git hash:master @ 2e05bb7e11ec03aa58cc543d92330d59df05d32b)
License type: Evaluation, build 2025-07-07
License details: Cloud Zoo
Expires on: 2025-07-28

Windows 11 (10.0.26100 SR0.0) or greater (Physical RAM: 32GB)
.NET 8.0.14

Computer platform: LAPTOP - Plugged in [73% battery remaining]

Hybrid graphics configuration.
Primary display: Intel(R) Iris(R) Xe Graphics (Intel) Memory: 2GB, Driver date: 7-24-2024 (M-D-Y).
> Integrated graphics device with 4 adapter port(s)
- Windows Main Display is laptop’s integrated screen or built-in port
- Secondary monitor attached to adapter port #1
Primary OpenGL: NVIDIA GeForce RTX 4060 Laptop GPU (NVidia) Memory: 8GB, Driver date: 5-29-2024 (M-D-Y). OpenGL Ver: 4.6.0 NVIDIA 555.97
> Integrated accelerated graphics device with 4 adapter port(s)
- Video pass-through to primary display device

OpenGL Settings
Safe mode: Off
Use accelerated hardware modes: On
GPU Tessellation is: On
Redraw scene when viewports are exposed: On
Graphics level being used: OpenGL 4.6 (primary GPU’s maximum)

Anti-alias mode: 4x
Mip Map Filtering: Linear
Anisotropic Filtering Mode: High

Vendor Name: NVIDIA Corporation
Render version: 4.6
Shading Language: 4.60 NVIDIA
Driver Date: 5-29-2024
Driver Version: 32.0.15.5597
Maximum Texture size: 32768 x 32768
Z-Buffer depth: 24 bits
Maximum Viewport size: 32768 x 32768
Total Video Memory: 8188 MB

Rhino plugins that do not ship with Rhino

Rhino plugins that ship with Rhino
C:\Program Files\Rhino 8\Plug-ins\Commands.rhp “Commands” 8.21.25188.17001
C:\Program Files\Rhino 8\Plug-ins\rdk.rhp “Renderer Development Kit”
C:\Program Files\Rhino 8\Plug-ins\RhinoRenderCycles.rhp “Rhino Render” 8.21.25188.17001
C:\Program Files\Rhino 8\Plug-ins\rdk_etoui.rhp “RDK_EtoUI” 8.21.25188.17001
C:\Program Files\Rhino 8\Plug-ins\NamedSnapshots.rhp “Snapshots”
C:\Program Files\Rhino 8\Plug-ins\MeshCommands.rhp “MeshCommands” 8.21.25188.17001
C:\Program Files\Rhino 8\Plug-ins\RhinoCycles.rhp “RhinoCycles” 8.21.25188.17001
C:\Program Files\Rhino 8\Plug-ins\Toolbars\Toolbars.rhp “Toolbars” 8.21.25188.17001
C:\Program Files\Rhino 8\Plug-ins\3dxrhino.rhp “3Dconnexion 3D Mouse”
C:\Program Files\Rhino 8\Plug-ins\Displacement.rhp “Displacement”
C:\Program Files\Rhino 8\Plug-ins\Calc.rhp “Calc”
C:\Program Files\Rhino 8\Plug-ins\SectionTools.rhp “SectionTools”

Thank you, here it is…

So, the first thing is: your drivers are not up to date (Windows reports they are because it only has outdated ones in its stash). Go to your laptop manufacturer’s support site and see if they have later ones. You can then go to Intel and Nvidia for the absolute latest. With some laptops there can be specific integrations that require the manufacturer’s version, so be prepared to roll the Intel or Nvidia ones back if something breaks (e.g. I have a Surface Book with a detachable screen and I have to use ancient Microsoft drivers).

Next, make sure that laptop is set to use the Nvidia card with Rhino. This could be controlled from the Nvidia control panel, Windows display settings or power management settings, depending on the make of laptop and version of Windows.

Also, search the forum for a thread called something like “so you have a laptop” which gives helpful advice (sorry, don’t have time right now to find the link).

HTH
Jeremy


CUDA compute (it’s not “graphics” at all, it just can be used for it) is a totally different ball of wax from those settings; there is no alternate lower-power way of running it. Also, the render window says right there that it’s using the GPU.

The CUDA drivers also don’t change week to week in response to game bugs the way the OpenGL side might.

I suspect the problem is simply that “laptops suck”: the GPU is kneecapped by the thermal and wattage limits imposed by the system. Doing a GPU render on my 3090 adds 250 watts to the power consumption; I don’t know what to expect from a system whose total budget is way less than that. Seeing the actual file might reveal some issue specific to it that makes it take a lot of time.
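If you want to see how hard that wattage budget is biting while a render runs, nvidia-smi can also report the current draw against the power limit it enforces. A rough sketch, again assuming nvidia-smi is on the PATH and a single NVIDIA GPU in the machine:

```python
# Minimal sketch: compare the GPU's current power draw to the reported power
# limit during a render, to see whether the laptop's wattage cap is the
# bottleneck. Assumes nvidia-smi is on the PATH and one NVIDIA GPU is present.
import subprocess

out = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=power.draw,power.limit,temperature.gpu,utilization.gpu",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout.strip()

draw, limit, temp, util = (float(v) for v in out.split(","))
print(f"Power: {draw:.0f} W of a {limit:.0f} W cap ({draw / limit:.0%}), "
      f"{temp:.0f} °C, GPU load {util:.0f}%")
```

Run it mid-render: if the draw sits pinned at the limit, or the temperature climbs and clocks drop, the bottleneck is the laptop’s budget rather than Rhino.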

Hi @dror,

Here’s a question that could have been asked sooner: is this the first Rhino render you have attempted on this machine?

AFAIK, the first time you use the Rhino Cycles renderer after a Rhino install, the render engine prepares itself to run on that system. That preparation takes a significant amount of time. If you cancel the render before it is completed, then Cycles will have to start the preparation afresh the next time.

If you see a message along the lines of “compiling kernels” in the progress bar at the bottom of the render window, that is what is going on. Go grab a coffee, go out to dinner if necessary, but leave it to finish. Your next render will complete without the lengthy preamble.

Let us know if this is what is happening with you.

Regards
Jeremy

The first screenshot you post says you’re using OptiX to render on your RTX 4060. This means the GPU is being used. As mentioned in earlier replies you need to look at the compute core utilization, not at 3D utilization.

The CPU will also be used, but that is for updating the screen with the render results. The window showing that render result applies the post effects (like gamma and denoising). It runs multithreaded and will fully utilize the CPU when given the chance by the operating system.

To further understand why your particular model feels slow with rendering we’d need to have a look at your file. There are many reasons why a rendering could be slow, but from your first screenshot it is hard to say what could be the case.

You can look up the configuration of your GPU here: List of Nvidia graphics processing units - Wikipedia. It should be pretty decent, but as a laptop GPU it will not perform as well as the desktop counterpart of the RTX 4060 (which you can also find on that page).

Addendum: if you have selected a GPU as the render device but the kernels aren’t ready yet when you start rendering, or when you switch a viewport to Raytraced, you’ll see a clear mention of that: in the render window it appears in the title bar, and in the viewport it appears in the HUD.

Hi guys, thanks for all the good support. Since last time I’ve been able to make really good progress and learned how to work faster and more efficiently. First of all, instead of working in meshes I converted all the components to NURBS, so they only get converted to meshes right before rendering, and now the file works much better.

Second, I found out that working in shaded mode is way faster than rendered mode… duh!

Third, updating the driver helped, and now the render works significantly better.

Here are some results before editing in Photoshop:

Time to keep going and get to know my tools better. Until then, thank you guys.
