I can’t help solve your problem I’m afraid, but I hope somebody can - that’s very similar to the spec of the new machine I’m currently considering. Everything I’ve read about GeForce cards suggests they work perfectly well with Rhino; I hope that’s the case. I’ve always bought low- to mid-level Quadro cards, and this time I was hoping to get more performance in V-Ray with a GeForce (or maybe two).
I have one 1080 Ti (and ran it alongside two 970s until I sold them) and have no trouble… well, I guess I had an issue with V6 not recognizing changes in hardware, but it was fixed. It sounded like the second poster in the other thread had them misconfigured.
What is your exact make and model of the GeForce 1080 Ti? (where did you get it, who is the OEM, what config is it, etc…)
I’m going to contact NVidia on this one, since I’m currently at a total loss on why some users are seeing what they’re seeing, however, this is a gaming card, so it may be somewhat of a roadblock or hurdle…we’ll see. I just want to make sure that if/when we start collecting any of these, we get ones that are known to work and ones that are known not to work…and hopefully they’re not the same ones.
@Everyone who owns a GeForce 1080 Ti, please provide your configuration stats as well, along with your experiences with the card and Rhino.
There is most likely nothing we’ll be able to do in terms of V5, so the only hope there is if a driver update solves the issues. However, we will be investigating possible solutions or workarounds that can be done in V6, so if you’re not on board the V6 Rhino train and you own one of these cards, you might want to consider hopping on now.
I don’t think that’ll be necessary, but thanks for the offer… Also, those appear to be water cooled, and I’m not sure if they’re standalone or require a water-cooled configuration…either way, I’d rather not deal with it … Now, if you’re having issues with them…well, then maybe we’ll need to take a look.
I spoke with NVidia, and they’re currently not sure why performance is being reported as “jerky”; they didn’t really have any suggestions other than to report a bug. As for the cards not being recognized, it sounds like what I thought: if you configure your cards to be “compute driven”, which I guess happens when you use them in parallel (note: this is not the same as SLI), then there is no Primary Display Adapter on the system, and Microsoft’s generic display driver takes over… and as of right now, I don’t see (understand) how a program can create an accelerated device context when there is no primary adapter.
Lastly, do not confuse CUDA and its uses with OpenGL and its uses…they are not really the same thing, and have different usage and system requirements.
Please let me know if you run into problems with these cards asap.
I’ve been using a “1600p” monitor since a few video card generations before I had a 1080ti. In 1999 (last century, before video cards really DID anything!) I was using more pixels than a 1080p display. For “work,” what matters is the biggest monitor, with basically as many pixels as it can possibly handle, that’s nice to look at for hours; fps is a distant second consideration.
Anything above 60fps is not needed for a CAD monitor, but can be nice if you game.
(I have a 144 Hz laptop, but I set it to 60, as I don’t need my GPU to calculate more frames than needed; that just wastes battery life, IMO.)
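The battery-life point above is really just frame-budget arithmetic: at 144 Hz the GPU renders 2.4x as many frames per second as at 60 Hz, each on a much tighter time budget. A quick sketch (plain arithmetic, no assumptions about any particular GPU or driver):

```python
def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds the GPU has to render each frame at a given refresh rate."""
    return 1000.0 / refresh_hz

for hz in (60, 144):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")
# 60 Hz  -> 16.67 ms per frame
# 144 Hz -> 6.94 ms per frame

# If every refresh gets a freshly rendered frame, the GPU does
# 144 / 60 = 2.4x the work per second at the higher rate.
print(f"Relative GPU work at 144 Hz vs 60 Hz: {144 / 60:.1f}x")
```

Capping the panel at 60 Hz simply gives the GPU a 16.67 ms budget instead of 6.94 ms, so it can clock down and draw less power between frames.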
A 1440p monitor is nice. I have an AOC 3440x1440 at home and a Dell 38??x1600 at work (both ultrawide monitors), and I love them for Rhino; the extra space on the sides gives more room for toolbars. Personally, I prefer them over 4K monitors just because of that width.
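For context on the width point: a 3440x1440 ultrawide is a ~21:9-class panel, while 4K UHD (3840x2160) is standard 16:9. The ultrawide actually has fewer total pixels, but a much wider shape, which is where the extra toolbar room comes from. The arithmetic, as a small sketch:

```python
def aspect(width: int, height: int) -> float:
    """Aspect ratio as width divided by height."""
    return width / height

ultrawide = (3440, 1440)  # the AOC ultrawide mentioned above
uhd_4k = (3840, 2160)     # standard 4K UHD

print(f"Ultrawide aspect: {aspect(*ultrawide):.2f}:1")  # ~2.39:1 (21:9 class)
print(f"4K aspect:        {aspect(*uhd_4k):.2f}:1")     # ~1.78:1 (16:9)

print(f"Ultrawide pixels: {3440 * 1440:,}")  # 4,953,600
print(f"4K pixels:        {3840 * 2160:,}")  # 8,294,400
```

So 4K pushes roughly 1.7x the pixels (more GPU load), while the ultrawide spends its pixels on horizontal real estate instead.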
I don’t know your budget, but IPS monitors are usually better at color reproduction than VA panels. IMO that’s worth the cost, since the monitor is what our eyes look at all day and what we evaluate the design on.
Good luck and happy shopping!