That's pretty much where we'll probably end up as we find and fix issues with older systems. It won't be a specific V5 display mode, but your display modes will feel a bit more V5-ish as we discover better ways to degrade gracefully on older systems.
Our first goal is simply to do a better job of detecting older systems that could potentially be improved just by updating very old drivers. That problem affects many users switching to V6: V5 used such an old version of OpenGL that they never needed to update their drivers, even though newer ones existed. This probably won't help Tobias, but it will help a bunch of other users.
For once I disagree with you.
If your system is old and doesn't meet the minimum requirements, that's your problem.
As professionals, you should budget for buying a newer computer from time to time.
Wasting development time to accommodate old machines is a shame.
IMO
This is great general advice. My personal practice is to check for Nvidia driver updates two or three times a year. If I had any display issues, it's also the first thing I would think of from following this forum, but I never have, because my drivers are always reasonably up to date. My update process is more laborious than usual because my Rhino machine is offline. Even so, the whole process takes less than 30 minutes, and the actual driver update is less than 15 minutes. In over 10 years I have never had any glitches with the update process, and the drivers have always worked. I'm talking here about getting Quadro driver updates directly from Nvidia for a Dell Precision workstation.
V5's frame rate management was inaccurate… what you're seeing in V6 is a more accurate representation of your system and its ability to "keep up". If V5 had been accurate, you would have been seeing the same thing; the fact that you're not is the result of what I would call a bug. If you do not want the degradation to occur, then set the FPS setting to something like 0.001 (i.e., a frame budget of 1000 seconds, so the degradation threshold is effectively never reached).
Now, it's possible that frame rates really are slower than V5's… in which case we need to figure out why. But don't judge frame rates by whether or not the bounding-box display kicks in. Run some tests using TestMaxSpeed. Start with something simple, like an empty scene, and work up from there… but make sure all settings are exactly the same so it's an apples-to-apples comparison.
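If you want to make those comparisons repeatable, here's a minimal rhinoscriptsyntax sketch that scripts the benchmark. It assumes the TestMaxSpeed test command prints its timing to the command history; echoing the history tail is just for convenience when pasting results into a V5-vs-V6 comparison:

```python
import rhinoscriptsyntax as rs
import Rhino

# Run the benchmark in the active viewport; TestMaxSpeed redraws the
# view a fixed number of times and reports the elapsed time and FPS.
rs.Command("_TestMaxSpeed", echo=True)

# The timing is printed to the command history; echo the last few
# lines so they're easy to copy into a comparison.
for line in Rhino.RhinoApp.CommandHistoryWindowText.splitlines()[-3:]:
    print(line)
```

Run the same script in both versions, with identical display-mode settings and the same viewport size; otherwise the numbers aren't comparable.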
@gustojunk, all - the developers have cleaned out some significant bottlenecks in BoxEdit for the SR6 release.
Also, it looks like the fix may well get into 6.5.
I have the same problem and my card is only a year old. To add insult to injury, the most recent driver for my card makes things even worse. The sad tale of the upgrade cycle has never changed. Sigh.
Note: buying a "new" card is not the solution… buying a "more capable" card is the solution. Buying something like a brand-new Quadro K420 will probably be a step backwards. Yes, it's a new card, and yes, it's a "Quadro"… but it's a very low-end Quadro with only 512 MB to 1 GB of video memory, which V6 will eat up in no time. So even though the K420 can handle pretty much anything you throw at it feature-wise, once its memory is used up, it will have made no difference in any kind of "upgrade" scenario. The irony of all of this is that we're told to place everything and anything we can in GPU memory because it will make things run and perform much faster… but once we do that, and consume most of the memory or fragment it, the results are completely the opposite.
So again, which card do you have, and how much memory does it have? If it's a fast GPU with lots of memory, then we probably need to take a closer look and figure out why and where you're having issues.
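If you're not sure what your machine actually reports, Rhino 6's SystemInfo command produces a report that includes the display adapter, driver version/date, and the video memory Rhino sees. A one-line sketch to trigger it from Python (rs.Command simply runs the named command):

```python
import rhinoscriptsyntax as rs

# SystemInfo (Rhino 6) generates a report including the GPU model,
# driver version/date, and how much video memory Rhino detects.
rs.Command("_SystemInfo", echo=True)
```

Pasting that report is usually the fastest way to answer the "which card, how much memory" question here on the forum.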
Yep, that's a pretty good card, and 2 GB should be plenty… Although this is a laptop, and there are all kinds of other issues and bandwidth problems that can exist with laptops… Is this a dual-GPU laptop? (i.e., is there also an embedded Intel GPU?)
Given that, I’d like to get more details on the problems you’re seeing and what we can do here to try to replicate them.
Yes, it's got 2 GPUs; the other is an Intel. I'm running Rhino on a 2nd monitor that's a 4K Dell. Rhino 5 ran perfectly; 6 feels clunky and sluggish, with slow redraws between viewport changes. I'll try to get more specifics when I have some downtime.
Well, that adds quite a bit to your resource requirements… Multiple monitors (especially 4K configurations) eat up video memory immediately. The only reason I asked about the Intel GPU is that we've seen certain issues where the Intel GPU drivers were getting in the way of the overall system's display performance… but I doubt that's what you're seeing, since it usually results in Rhino freezing up or viewports not responding at all.
What display modes do you mostly work in? Are you seeing this clunky and sluggish behavior in every mode, or only in certain ones? Is it worse in some modes than others, or is it pretty consistent across all of them?
To be honest… this sounds like an answer only a programmer could give. Probably technically correct, but of no help in real life. Personally, I don't care whether what I see accurately represents my system's abilities.
It's a bit like: "Hey, I invented a new way of taking pictures of you. Sorry, you look a bit ugly now, but it represents your beauty much more accurately."