Is Rhino 6 optimised specifically for Quadro?

On the official site (http://www.rhino3d.com/v6_mockup/new_mockup/display_mockup), we can read that Rhino 6 graphics are much faster than in Rhino 5. The site also links to benchmarks, which cover only Quadro cards. For Rhino 4 or 5 there is no difference in graphics speed between Quadro and GeForce cards. Does this mean that Rhino 6 graphics will get significant benefits only with Quadros, which are horribly expensive? I hope not.

No, you should see performance improvements with many different GPUs out there. I would recommend trying the beta to see if you notice an improvement with your card.

That said, you may get a lot more “bang” out of these higher end cards now.

Steve,

Thank you for the reply. I have tested two graphics cards of similar computing power: one from the Quadro family (K4000) and one from the GeForce family (GTX 650 Ti). I tested them on NURBS surface models created from meshes using reverse-engineering software (RhinoResurf and Geomagic Design X). The meshes consisted of several million triangles; the NURBS surfaces had several hundred thousand control points. The models were displayed in several modes (wireframe, rendered, ghosted), then moved, rotated and copied. I have to admit that the difference in graphics speed relative to Rhino 5 is significant. However, I have not noticed any difference between the Quadro and GeForce cards. In what cases should I see performance differences between these types of cards?

@grzenda A good indicator of performance is the number of CUDA cores in an NVIDIA GPU.

Quadro K4000: 1536 CUDA cores
GTX 650 Ti: 768 CUDA cores


For Raytraced, the difference in CUDA core count should be quite noticeable. It should be like having two GTX 650 Ti's in one, so pretty much double the rendering speed.

If CUDA core count is the crucial factor for V6 performance, one would be silly to buy a Quadro…

GeForce GTX 1080 Ti: 3584 CUDA cores, ~€700

Quadro P6000: 3840 CUDA cores, >€4200
Quadro P5000: 2560 CUDA cores, >€1900

That works out to roughly €0.20 per CUDA core for the GeForce versus several times that for either Quadro.

The Quadro K4000 has 768 CUDA cores (http://www.nvidia.pl/content/PDF/data-sheet/DS_NV_Quadro_K4000_OCT13_NV_US_LR.pdf), the same number as the GTX 650 Ti.

Why have I compared these two cards? Because in classic 3D/gaming benchmarks they have similar performance. The difference shows up in SPECviewperf, where the K4000 is better. But not every 3D modelling application takes advantage of Quadros, which are very expensive. After some tests in the Rhino Beta I cannot see a real difference between Quadro and GeForce.

Note: the CUDA core difference will only affect the Raytraced display mode in V6.

David Eranen corrected me: CUDA cores are just NVIDIA's branded name for GPU cores, so you actually will see an improvement in OpenGL performance with more cores. Thanks for pointing that out. I was attempting to say that we don't use CUDA computing for our OpenGL display, and I failed. This is really just a technical way of saying the OpenGL display doesn't favor NVIDIA over AMD or Intel.

But… it does matter whether the NVIDIA/AMD/Intel drivers are optimized for OpenGL.

Especially since consumer-grade GeForce gaming cards can apparently detect OpenGL calls typical of CAD applications and then slow down…

I don’t know if the devs found some workaround for Rhino 6, but that would be great.


It would, because viewport performance on an RTX 2070 is not very impressive even though the card isn't running at 100%, so this would explain a lot.

But your Holomark 2.61 score was the highest, until @jeff got his hands on an RTX 6000…


Well maybe it’s my fault and I’m way too demanding.

Can I do some measurements on this scene, e.g. polygon count vs. FPS while orbiting? The buildings have chamfered edge softening and simple mesh handrails, and the trees are simple V-Ray proxies. I have a full-HD screen and orbiting is visibly choppy.
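
Something like this rough sketch in Rhino's Python editor is what I have in mind (assuming a scripted RotateView plus forced redraws is a fair stand-in for interactive orbiting, which it probably only approximates):

```python
import time
import rhinoscriptsyntax as rs

# Total mesh face count in the document (32 = mesh object filter).
meshes = rs.ObjectsByType(32) or []
faces = sum(rs.MeshFaceCount(m) for m in meshes)

# Spin the active viewport one full turn and time the redraws.
frames = 90
start = time.time()
for i in range(frames):
    rs.RotateView(direction=0, angle=360.0 / frames)  # one small orbit step
    rs.Redraw()                                       # force a viewport redraw
elapsed = time.time() - start

print("{} mesh faces, ~{:.1f} fps".format(faces, frames / elapsed))
```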

HWMonitor info (GPU utilization at 47%):


How is your RAM (not VRAM) usage?
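
If you want a number from inside Rhino itself, a throwaway IronPython sketch like this should do; it leans on .NET (ComputerInfo for system RAM, the process working set for Rhino itself), so treat the output as a rough check rather than a profiler:

```python
import clr
clr.AddReference("Microsoft.VisualBasic")
from Microsoft.VisualBasic.Devices import ComputerInfo
from System.Diagnostics import Process

info = ComputerInfo()                # system-wide physical memory, in bytes
rhino = Process.GetCurrentProcess()  # the running Rhino process

print("RAM: {:.1f} GB free of {:.1f} GB; Rhino working set: {:.1f} GB".format(
    info.AvailablePhysicalMemory / 1e9,
    info.TotalPhysicalMemory / 1e9,
    rhino.WorkingSet64 / 1e9))
```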

Out of my 16 GB, only half is used. Sorry for the funky screenshot; I had to add English labels in Photoshop. :smiley:

Cheers
Jonas

Please, please do not do anything to Rhino to optimize it specifically for Quadro.

Workstation-specific cards are a tax on those who create things, because the textured triangles don't care what they're drawing.

I am of the considered opinion that the traditional workstation-configured card often does not have the cooling capacity for continuous real-time raytracing.

@Jonish Assuming you installed your DIMMs in dual-channel mode, it looks like your PC is just fine. To solve your problem, you need a Quadro for comparison. But that’s easier/cheaper said than done. :joy:
