Cycles, 2 GPUs combined slower than single GPU

[Screenshot: Cycles device settings, showing my cards with only one selected, which gave the faster render]

[Screenshot: render time with one GPU]

[Screenshot: render time with both cards selected]

It is exactly the same scene and angle; basically I just unchecked the P6000, restarted Rhino and rendered again.

I rendered with both cards first, thought that was too slow, so I unchecked the slower card and rendered again.

With multiple cards, the render is paced by the slower card (at best you get roughly twice the slower card's performance), so a faster GPU's advantage is diminished when it is paired with a less powerful one.
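A rough way to see the effect (this is only a back-of-the-envelope model with made-up sample rates, not how Cycles actually schedules work): if the work is split evenly, the render finishes when the slower card finishes its share.

```python
# Back-of-the-envelope model of mixed-GPU pacing. Assumption for
# illustration only: work is split evenly between the cards, so the
# render finishes when the slower card finishes its half. The sample
# rates below are made up, not measured.

def even_split_time(samples_per_sec, total_samples):
    """Wall time when each card gets an equal share of the samples."""
    share = total_samples / len(samples_per_sec)
    return max(share / rate for rate in samples_per_sec.values())

fast_alone = 500 / 120                                        # ~4.2 s
both_cards = even_split_time({"rtx": 120, "p6000": 40}, 500)  # ~6.3 s
print(f"fast card alone: {fast_alone:.1f} s, both cards: {both_cards:.1f} s")
```

Real schedulers balance the load better than an even split, but the slower device can still set the pace toward the end of a pass.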

To speed up your render time, have you tried using the Intel Denoiser from the PackageManager and only calculating to 500 samples before using that post effect to clean up any remaining grain? I don’t know the model or resulting render but I bet you could get down to 10 mins or less.


Thanks Brian.

I might swap my cards around so I have 2 fast cards in the box.

The odd thing is that I remember the faster card ran hotter, around 70 degrees, while the slower card ran cooler, somewhere in the 50s. I was using Task Manager's Performance tab to determine whether both cards were working, because I was RDPing onto the box and the NVIDIA control panel wouldn't launch since I wasn't using an NVIDIA card for my display. RDP could have been using the built-in graphics chip on the motherboard (I guess), or the NVIDIA software was just confused.
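For anyone else stuck checking this over RDP: nvidia-smi (which installs with the driver) still works in an RDP session. Here is a small sketch that polls it; it assumes nvidia-smi is on the PATH.

```python
# Poll both GPUs over RDP using nvidia-smi (ships with the NVIDIA driver).
# Assumes nvidia-smi is on the PATH; on Windows it usually lives under
# C:\Windows\System32 or the NVIDIA Corporation\NVSMI folder.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=name,utilization.gpu,temperature.gpu,memory.used,memory.total",
    "--format=csv,noheader",
]

while True:
    # Prints one CSV line per GPU, e.g. "Quadro P6000, 97 %, 54, 8123 MiB, 24576 MiB"
    print(subprocess.check_output(QUERY, text=True).strip())
    time.sleep(5)
```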

I tried the denoiser some time ago, but I have some coloured glasses and shiny metals and rely on reflections and refractions to get the look I like. It didn't look good with the denoiser; things looked a bit waxy. But that was months ago, maybe things are better now.

I almost always use the denoiser now and prefer the Intel one since my NVIDIA card is doing enough already 🙂. In my experience, getting to somewhere between 200 and 500 samples is good before denoising, but there may be certain material combinations that the algorithm isn't handling well. Share the model with me if you think that's the case and I can file something for the developers to see if it can be improved.

RDP adds a variable for sure but I think the timing difference was due to the slower card setting the pace. Watch your temps and go with just one card if things are getting too hot in the box IMO.

Hi Brian,

I tried the Intel denoiser but it keeps killing Rhino so hard that I don’t even get a Windows or Rhino error report dialog.

Rhino goes straight from rendering to gone. It dies hard. (with a vengeance)

You have an NVIDIA RTX… use the NVIDIA denoiser and just the RTX. That is a beast of a card.

You should get good speeds from that. If not, we'll want to see your scene and your settings.

I’m having problems with rendering in general now. I thought that it might be down to the denoisers but I have uninstalled both the Intel and the NVIDIA ones.

Apart from occasional hard crashes I see renders that complete immediately and look like this:

If I try to render in the viewport it sticks (perhaps forever, I haven’t waited long enough to know if the render ever completes). This is what I see:

I ran another test and saw a CUDA out-of-memory error just before the checker pattern like the one shown above appeared.

This is the current state of my GPU and shared memory, taken after I saw the error

I’m going to watch the memory as Rhino loads the shaders and meshes onto the card.

This is my current system memory status:

Ran the test again and saw this:


So I can’t account for the CUDA memory error.

You will need to give me a link to upload the file. It is too big for the forum software to accept.

I also want to mention that I upgraded my VRay a week or two ago. I don’t know if that is just post hoc ergo propter hoc or if it has any significance.

I seem to get one or two renders out of Rhino before I have to restart.

In case it's not clear: I saw the CUDA memory error on the render window status bar.

I render at a custom resolution: 4096 × 2160 (Rhino never remembers this; I always have to set it anew when I open Rhino).

Well, that was true but I opened the file on another computer and it remembered the setting. I might not have installed VRay or the denoisers on that machine. I’ll check after my render has finished.
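For what it's worth, here is a rough estimate of just the framebuffer memory at that resolution. It assumes full-float RGBA passes and an illustrative pass count; geometry, textures and any denoiser buffers come on top of this, so treat the numbers as illustrative only.

```python
# Rough framebuffer-only estimate for a 4096 x 2160 render. Assumes
# 32-bit float RGBA passes (4 channels x 4 bytes per pixel); the pass
# count is illustrative. Geometry, textures and denoiser buffers
# (albedo, normal) are extra on top of this.
width, height = 4096, 2160
bytes_per_pixel = 4 * 4                    # RGBA, float32
one_pass_mib = width * height * bytes_per_pixel / 2**20
print(f"one full-float pass: {one_pass_mib:.0f} MiB")       # ~135 MiB
print(f"ten such passes:     {10 * one_pass_mib:.0f} MiB")  # ~1350 MiB
```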

A CUDA error often indicates a driver problem. Make sure you have the latest production-branch driver installed (the Studio driver), not the feature branch (the Game Ready driver).

I have tried on two machines. This one had problems…

This one did not.

Anyway, I upgraded the drivers on the problem machine and it is now working fine.

So, false alarm. Sorry guys.

OTOH the 2019 drivers had been working for ages so… ?!?

I'm glad you got this working with the driver update. I have seen Windows updates break GPU drivers as well; that may have been the cause of your issue. In general, I keep Rhino, Windows and the GPU driver all updated in unison.