Would an nVidia graphics card reduce Rhino display times?

My Win11/64 system runs on an Intel i5 2600K 3700 MHz CPU. I have 2 monitors - 1 for Grasshopper and 1 for Rhino. I use GH (only) for all my design work; Rhino is only for display purposes.

Both monitors are plugged into the motherboard's integrated-graphics outputs. My system has no separate graphics card. My question is: would Rhino render faster if I added an Nvidia graphics card and switched my monitors off the integrated graphics and onto the Nvidia card?

Usually Rhino renders very quickly, but if I have some GH geometry that has, for example, several hundred small objects oriented onto a single 3D surface, it can take quite a while (sometimes up to 20 seconds) for the baked geometry to actually be displayed on the Rhino screen. My GH screen shows me when Rhino has finished baking, but in these cases there is an additional delay while the baked geometry gets displayed. I'm guessing a separate graphics card would help with this, but I have no idea how much.

I mean sure, anything is better than any integrated graphics, but your system is otherwise 10-year-old tech, so I’m not sure how much modern GPU to buy before your platform becomes the bottleneck.

Oops! You caught me; my CPU is actually an i5-12600K, which was released in Nov 2021.

I got the motherboard/CPU about a year ago to replace a much older/slower i7 that also ran without a graphics card.

Oh, okay… I mean, I don't know where your bottlenecks are, but the best Nvidia card you can afford will certainly do better than the integrated video at spinning things around in OpenGL…


It sounds like the delay you're seeing might actually be Rhino generating the render meshes for all the individual NURBS objects after they've been baked.
If you are orienting many copies of a surface/polysurface object, what you could try to speed things up is meshing that object first and then orienting copies of the mesh.
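The mesh-once-then-orient idea above can be sketched in plain Python. This is a conceptual illustration only - the function names below are hypothetical stand-ins, not the RhinoCommon API (in Rhino itself the one-time NURBS-to-mesh conversion would be something like `Mesh.CreateFromBrep`, and orienting would be a `Transform` applied to the mesh):

```python
def tessellate(density=200):
    """Stand-in for NURBS-to-mesh conversion -- the expensive step.
    Returns a flat list of (x, y, z) vertices on a unit grid."""
    return [(u / density, v / density, 0.0)
            for u in range(density) for v in range(density)]

def translate(mesh, dx, dy, dz):
    """Orienting an existing mesh is just a cheap per-vertex transform."""
    return [(x + dx, y + dy, z + dz) for (x, y, z) in mesh]

def bake_slow(n):
    """Slow path: every one of the n baked NURBS copies gets
    tessellated from scratch when Rhino needs to display it."""
    return [tessellate() for _ in range(n)]

def bake_fast(n):
    """Fast path: tessellate once, then orient n cheap copies
    of the one mesh."""
    base = tessellate()
    return [translate(base, i, 0, 0) for i in range(n)]
```

The geometry that reaches the display is the same either way; the fast path simply replaces n expensive tessellations with one tessellation plus n cheap transforms, which is why pre-meshing the object before orienting hundreds of copies can cut the display delay so dramatically.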

Howdy!

I built a new PC 2 weeks ago using a 13700K, an RTX 4080, and DDR5 memory. If you have a file you want to test for speed, please feel free to upload it and I can run it with the settings you want so you can get a comparison. I came from a 7-year-old CPU and an RTX 2060 card. The newer PC rendered something simple in half the time, though I admit I thought it would be faster.

Let me know.
Thanks!

Thanks Daniel - that’s a very interesting observation. I’ll run a test case and see what happens.

WOW! What a difference meshing the oriented object makes: 9.7 seconds vs. 4.7 minutes to compute the SrfMorph - plus quite a while to display it.

(screenshot: 2023-09-01_112602)

Spectacular suggestion Daniel - I guess it proves that it really helps when you know what you are talking about. Meshing first would never have occurred to me, so many thanks for your insight.

The attached rather nasty GH file is what I tested with. Normally everything except the inputs is inside a cluster (I never tried to make it pretty), so I expanded and copied everything to make 2 versions. The top version has a Mesh component added right before the geometry for orienting gets made. On the bottom version I disabled the SrfMorph component so there's a minimal wait before the GH file loads. Even so, I suggest locking the solver before opening this file.

On my system, after the image is generated, panning in Rhino is somewhat herky-jerky. For complex images like this one (it's actually a lot simpler than my real ones) I've gotten used to that; maybe the graphics card would eliminate that issue too. I'm considering adding a fanless Nvidia GT 1030 card, since it's only $91.

morphtest.gh (62.1 KB)