With multiple cards, the combined speed works out to roughly 2X the slower card, so a faster GPU gets held back when paired with a less powerful one.
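To illustrate that pacing claim, here’s a minimal sketch assuming the renderer splits the frame evenly between the two GPUs (the even 50/50 split is my assumption for illustration, not something V-Ray guarantees):

```python
# Hypothetical illustration: with a static 50/50 work split, each card
# renders half the samples, so the frame finishes only when the slower
# card finishes its half.
def render_time(total_samples, fast_rate, slow_rate):
    """Time to finish when each GPU gets half the samples (assumed even split)."""
    half = total_samples / 2
    return max(half / fast_rate, half / slow_rate)

# Example: 1000 samples, fast card at 100 samples/s, slow card at 50 samples/s.
t_pair = render_time(1000, 100, 50)   # 10.0 s - paced by the slow card
t_fast_alone = 1000 / 100             # 10.0 s - the fast card alone ties it
print(t_pair, t_fast_alone)
```

In other words, the pair behaves like 2X the slower card (here 2 × 50 = 100 samples/s), which in this example is no better than running the fast card by itself.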
To speed up your render time, have you tried installing the Intel Denoiser from the Package Manager and rendering to only 500 samples, then using that post effect to clean up any remaining grain? I don’t know the model or the resulting render, but I bet you could get down to 10 minutes or less.
I might swap my cards around so I have 2 fast cards in the box.
Odd thing is that I remember the faster card ran hotter, around 70 degrees, while the slower card ran cooler, somewhere in the 50s. I was using the Task Manager’s Performance tab to determine whether both cards were working, because I was RDPing onto the box and the NVIDIA Control Panel wouldn’t launch since I wasn’t using an NVIDIA card for my display. RDP could have been using the built-in graphics chip on the motherboard (I guess), or the NVIDIA software was just confused.
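If the NVIDIA Control Panel won’t start over RDP, `nvidia-smi` from a command prompt usually still works and reports load and temperature per card. A rough sketch of reading its CSV output in Python - the sample text below is made up for illustration, not output from your machine:

```python
# `nvidia-smi` can report per-GPU load and temperature even in an RDP session:
#   nvidia-smi --query-gpu=index,name,utilization.gpu,temperature.gpu --format=csv,noheader
# The sample output below is fabricated for illustration.
sample = """0, NVIDIA GeForce RTX 3090, 98 %, 71
1, NVIDIA GeForce GTX 1080, 97 %, 54"""

def parse_gpus(csv_text):
    """Turn the CSV rows into (index, name, utilization %, temperature C) tuples."""
    gpus = []
    for line in csv_text.strip().splitlines():
        idx, name, util, temp = [field.strip() for field in line.split(",")]
        gpus.append((int(idx), name, int(util.rstrip(" %")), int(temp)))
    return gpus

for idx, name, util, temp in parse_gpus(sample):
    print(f"GPU {idx} ({name}): {util}% load, {temp} C")
```

That would have told you directly whether both cards were loaded, without relying on Task Manager or the Control Panel.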
I tried the denoiser some time ago, but I have some coloured glasses and shiny metals and rely on reflections and refractions to get the look I like. It didn’t look good with the denoiser, things looked a bit “waxy”, but that was months ago, maybe things are better now.
I almost always use the denoiser now, and I prefer the Intel one since my NVIDIA card is doing enough already. In my experience, somewhere between 200 and 500 samples is a good point to denoise at, but there may be certain material combinations that the algorithm isn’t handling well. Share the model with me if you think that’s the case and I can file something for the developers to see if it can be improved.
RDP adds a variable for sure but I think the timing difference was due to the slower card setting the pace. Watch your temps and go with just one card if things are getting too hot in the box IMO.
I’m having problems with rendering in general now. I thought that it might be down to the denoisers but I have uninstalled both the Intel and the NVIDIA ones.
Apart from occasional hard crashes I see renders that complete immediately and look like this:
You will need to give me a link to upload the file. It is too big for the forum software to accept.
I also want to mention that I upgraded my VRay a week or two ago. I don’t know if that is just post hoc ergo propter hoc or if it has any significance.
I seem to get one or two renders out of Rhino before I have to restart.
In case it’s not clear: I saw the CUDA memory error on the render window status bar.
I render at a custom resolution: 4096 × 2160 (Rhino never remembers this - I always have to set it anew when I open Rhino).
Well, that was true but I opened the file on another computer and it remembered the setting. I might not have installed VRay or the denoisers on that machine. I’ll check after my render has finished.
A CUDA error often indicates a driver problem. Make sure you have the latest production-branch driver installed (the Studio driver), not the feature branch (the Game Ready driver).
I’m glad you got this working with the driver update. I have seen Windows updates break GPU drivers as well; that may have been the cause of your issue. In general, I keep Rhino, Windows, and the GPU driver all updated in unison.