I need help! I so much want to like Rhino 6

Please make sure to try updating your driver to the latest version first, if at all possible. We’ve found this helps in a lot of cases.

We want to make the display work for as many GPUs as possible. Rhino’s new display engine was designed to downgrade to alternative modes of drawing when a driver reports support for lower levels of OpenGL. Obviously we need to fix a few things in cases like @tobias’s (if performance doesn’t improve after a driver update). @tobias, what OpenGL version does your driver report in Rhino, and what is the driver date? This can typically be found either by looking at Rhino’s OpenGL page or by running the SystemInfo command in Rhino 6.

Hi Steve,
My main problems with the new display are the bump map display, the wire thickness and antialiasing, and the constant switching to bounding-box display during dynamic view manipulation.

Of those, the bump map display can’t be brought back since it was actually a bug in V5, and the wire thickness is better now with the WireThicknessScale setting.

What is left is the antialiasing, which is nicer to my eyes in V5 (I tried fiddling with Rhino.Options.OpenGL.Antialiasmode and the standard AA settings under OpenGL), and especially the bounding-box display during view manipulation drives me nuts. V6 feels laggy compared to V5, which was smooth as silk.

thanks, Tobias

PS: My driver is up to date, AFAIK. Here are my system specs:

Windows 7 SP1 (Physical RAM: 32 GB)
GeForce GTX 260/PCIe/SSE2 (OpenGL ver:3.3.0)

OpenGL Settings
Safe mode: Off
Use accelerated hardware modes: On
Redraw scene when viewports are exposed: On

Anti-alias mode: 8x
Mip Map Filtering: Linear
Anisotropic Filtering Mode: Height

Vendor Name: NVIDIA Corporation
Render version: 3.3
Shading Language: 3.30 NVIDIA via Cg compiler
Driver Date: 11-14-2016
Driver Version: 21.21.13.4201
Maximum Texture size: 8192 x 8192
Z-Buffer depth: 24 bits
Maximum Viewport size: 8192 x 8192
Total Video Memory: 896 MB

OpenGL 3.3 is pretty old, which is why some features that improve performance aren’t being used. Specifically, GPU tessellation, which lets us draw wires much faster, requires OpenGL 4.1. Definitely try double-checking with NVIDIA for an updated driver, just in case a new one has become available.
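To make the version requirement concrete, here is a rough, illustrative sketch of the kind of capability check a display engine performs when a driver reports its OpenGL version. The function names are hypothetical, not Rhino’s actual API; the version strings are the ones reported in this thread.

```python
# Illustrative sketch (not Rhino's actual code): parse the OpenGL version
# string a driver reports and decide whether GPU tessellation, which
# requires OpenGL 4.1 or later, can be used for fast wire drawing.

def parse_gl_version(version_string):
    """Extract (major, minor) from a GL_VERSION-style string, e.g. '3.3.0'."""
    major, minor = version_string.split()[0].split(".")[:2]
    return int(major), int(minor)

def supports_gpu_tessellation(version_string):
    """GPU tessellation requires OpenGL 4.1+."""
    return parse_gl_version(version_string) >= (4, 1)

print(supports_gpu_tessellation("3.3.0"))                # False: the GTX 260 above
print(supports_gpu_tessellation("4.6.0 NVIDIA 388.08"))  # True: the GTX 965M later in the thread
```

On a 3.3-only driver the check fails and the engine falls back to slower wire drawing, which is exactly the degradation being described here.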

Switching to bounding-box display is due to performance. Is this happening in all display modes, including Wireframe and Shaded, or is it specific to the Rendered or Technical display?

We’ll fix the wire thickness and AA. It just takes time to get everything just right.

Bounding-box display happens in all modes. I noticed that when I start Rhino afresh, it’s OK for a (short) while. But after the display switches to bounding boxes the first time, it keeps doing it even with the slightest movements.

Good to hear that!

@jeff have you seen this? Is there a way to globally switch off BB display?

Did you try the recommendation that @wim made on a separate thread of yours about bounding box display?

Yeah, they should really add a “V5-style display mode” that just works fast. I have asked for that since day one. But upgrading your hardware is obviously part of the game if you want to benefit from new technology.

BTW. You can try this display mode and see if that works any better for you:
V5 Render mode.ini (11.3 KB)

You just have to download the installers for Rhino 6. Both Penguin and Flamingo are updated, and they are free.

My guess is that your Windows installation is 10 years old too and would benefit a lot from a fresh install. My home computer, a dual Xeon with a GTX 1070 that I use for the HTC Vive, was doing OK but not great because of the older CPU’s single-core speed. Then Windows crashed on me and I had to do a full install. (The previous OS was an upgraded Windows 10 Pro, so the new installation was done in less than an hour.) After that, everything just runs smoother: no lag in the VR demos that lagged before, and Rhino performs more smoothly too. So I’m just saying that blaming Rhino for slow hardware communication might not be the right call.


That is pretty much where we’ll probably end up as we find and fix issues with older systems. Not a V5 display mode specifically, but your display modes will feel a bit more V5-ish as we discover better ways to degrade for older systems.

Our first goal is just to do a better job of detecting older systems that can potentially be improved because of very old drivers. That problem affects many users switching to V6: V5 used such an old version of OpenGL that they never needed to update their drivers, even though newer ones existed. This probably won’t help Tobias, but it will help a bunch of other users.


For once I disagree with you.
If your system is old and doesn’t meet the minimum requirements, that’s your problem.
As professionals, you should factor in buying a newer computer from time to time.
Wasting development time to accommodate old machines is a shame.
IMO


This is great general advice. My personal practice is to check for NVIDIA driver updates two or three times a year. If I had any display issues, it’s also the first thing I would think of from following this forum, but I never have any, because my drivers are always reasonably up to date. My update process is more laborious than usual because my Rhino machine is offline. Even so, the whole process takes less than 30 minutes, and the actual driver update is less than 15 minutes. In over 10 years I have never had any glitches with the update process, and the drivers have always worked. I’m talking here about getting Quadro driver updates directly from NVIDIA for a Dell Precision workstation.

@tobias

V5’s frame-rate management was inaccurate… what you’re seeing in V6 is a more accurate representation of your system and its ability to “keep up”. If V5 had been accurate, you would have been seeing the same thing there; the fact that you weren’t is the result of what I would call a bug. If you do not want the degradation to occur, then set the FPS setting to something like 0.001.
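The degradation logic Jeff describes can be sketched roughly as follows. This is an illustrative model, not Rhino’s actual code: a frame that overruns the per-frame time budget triggers the bounding-box fallback, and a tiny FPS target like 0.001 makes the budget so large that the fallback effectively never fires.

```python
# Illustrative sketch (not Rhino's actual implementation) of frame-rate
# degradation: if drawing one frame takes longer than the target frame
# budget, the viewport drops to bounding-box display to keep up.

def should_degrade(frame_time_seconds, target_fps):
    budget = 1.0 / target_fps  # time allowed per frame
    return frame_time_seconds > budget

# A 50 ms frame misses a 30 FPS budget (~33 ms) ...
print(should_degrade(0.050, 30))     # True -> drop to bounding boxes
# ... but with the FPS setting at 0.001 the budget is 1000 s per frame:
print(should_degrade(0.050, 0.001))  # False -> degradation never kicks in
```

This is why lowering the FPS setting stops the bounding-box switching without actually making the frames draw any faster.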

Now, it’s possible that frame rates are actually slower than V5’s… in which case we need to figure out why. But don’t judge frame rates by whether or not bounding-box display kicks in… run some tests using TestMaxSpeed. Start with something simple, like an empty scene, and work up from there… but make sure all settings are exactly the same so it’s an apples-to-apples comparison.
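TestMaxSpeed reports how long a fixed number of viewport regens takes; turning that into frames per second makes the V5/V6 comparison direct. The numbers below are made up for illustration, not measurements from this thread.

```python
# Hypothetical helper for comparing TestMaxSpeed runs between V5 and V6.
# TestMaxSpeed reports something like "100 frames in N seconds"; the
# timings used here are invented example values.

def fps_from_testmaxspeed(frames, seconds):
    """Convert a TestMaxSpeed result into frames per second."""
    return frames / seconds

v5 = fps_from_testmaxspeed(100, 2.5)  # 40.0 FPS
v6 = fps_from_testmaxspeed(100, 4.0)  # 25.0 FPS
print(f"V5: {v5:.1f} FPS, V6: {v6:.1f} FPS, ratio {v5 / v6:.2f}x")
```

With identical scenes, display modes, and AA settings in both versions, a ratio well above 1.0 would point to a genuine V6 regression rather than just the degradation threshold kicking in.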

@gustojunk, all - the developers have cleaned out some significant bottlenecks in BoxEdit, for the SR6 release.
Also, it looks like the fix may well get into 6.5.

-Pascal


I have the same problem and my card is only a year old. To add insult to injury the most recent driver for my card makes things even worse. The sad tale of the upgrade cycle has never changed. sigh.

And what card would that be?

Note: buying a “new” card is not the solution… buying a “more capable” card is the solution. Buying something like a brand-new Quadro K420 will probably be a step backwards… Yes, it’s a new card; yes, it’s a “Quadro”… but it’s a very low-end Quadro with only 512 MB - 1 GB of video memory… which V6 will eat up in no time. So even though the K420 can handle pretty much anything you throw at it, once its memory is used up, it will have made no difference in any kind of “upgrade” scenario. The irony of all of this is that we’re told to place everything and anything we can in GPU memory because it will make things run and perform much faster… but once we do that and consume most of the memory, or fragment it, the results are completely the opposite.
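Some back-of-the-envelope arithmetic shows how quickly 512 MB disappears once textures and meshes live in GPU memory. The sizes below are assumptions for the sake of the example, not measurements of Rhino’s actual usage.

```python
# Rough, illustrative VRAM arithmetic (assumed sizes, not Rhino measurements):
# why a 512 MB card runs out of memory when everything is pushed to the GPU.

def texture_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Size of one RGBA texture; a full mipmap chain adds about one third."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mipmaps else base

def mesh_bytes(vertices, floats_per_vertex=8):
    """Position (3) + normal (3) + UV (2) floats per vertex, 4 bytes each."""
    return vertices * floats_per_vertex * 4

one_texture = texture_bytes(4096, 4096)  # ~85 MB with mipmaps
one_mesh = mesh_bytes(2_000_000)         # ~61 MB for a 2M-vertex mesh
print(f"4Kx4K texture: {one_texture / 2**20:.0f} MB")
print(f"2M-vertex mesh: {one_mesh / 2**20:.0f} MB")
# A handful of each already approaches the 512 MB of a low-end Quadro K420.
```

Three or four such textures plus a couple of dense meshes fill a low-end card, and after that every frame pays the cost of shuffling data back and forth over the bus.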

So again, which card do you have and how much memory does it have? If it’s a fast GPU with lots of memory, then we probably need to take a closer look and figure out why and where you’re having issues.

I’m so spoiled by you all. I don’t deserve this. thank you thank you thank you.

G


Nuthin’ like pressure… thanks.

-Pascal


And does this work with V-Ray?

GeForce GTX 965M/PCIe/SSE2 (OpenGL ver:4.6.0 NVIDIA 388.08)

OpenGL Settings
Safe mode: Off
Use accelerated hardware modes: On
Redraw scene when viewports are exposed: On

Anti-alias mode: 8x
Mip Map Filtering: Linear
Anisotropic Filtering Mode: Height

Vendor Name: NVIDIA Corporation
Render version: 4.6
Shading Language: 4.60 NVIDIA
Driver Date: 10-19-2017
Driver Version: 23.21.13.8808
Maximum Texture size: 16384 x 16384
Z-Buffer depth: 24 bits
Maximum Viewport size: 16384 x 16384
Total Video Memory: 2 GB

Hi @dmoyes,

Yep, that’s a pretty good card, and 2 GB should be plenty… although this is a laptop, and there are all kinds of other issues and bandwidth problems that can exist with laptops… Is this a dual-GPU laptop? (i.e., is there also an embedded Intel GPU?)

Given that, I’d like to get more details on the problems you’re seeing and what we can do here to try to replicate them.

Thanks,
-Jeff

Yes, it’s got two GPUs; the other is an Intel. I’m running Rhino on a second monitor, a 4K Dell. Rhino 5 ran perfectly; 6 feels clunky and sluggish, with slow redraws between viewport changes. I’ll try to get more specifics when I have some down time.