Jerky NVIDIA GeForce GTX 1080 Ti

Hello,

I have a new system. In 3D it runs pretty jerkily. Does Rhino support this graphics card yet? If not, when will it be supported? Thanks for your feedback!

System
Win 64
Version 5 SR12 64-bit (5.12.50810.13095, 10.08.2015)
V-Ray 34001
Machine

Dell T5600
2 CPU Xeon E5-2670
NVIDIA GeForce GTX 1080 Ti
64 GB RAM

Best regards,

Dennes

I can’t help solve your problem, I’m afraid, but I hope somebody can - that’s very similar to the spec of the new machine I’m currently considering. Everything I’ve read about GeForce cards suggests they work perfectly well with Rhino; I hope that’s the case. I’ve always bought low- to mid-level Quadro cards and was hoping to get more performance in V-Ray with a GeForce (or maybe two) this time.

I hope you get it working properly.

For Rhino WIP with Raytraced you can already use a multi-CUDA GPU setup (two or more NVIDIA cards).


Any other ideas in terms of the card?

The best advice is always to look for a newer driver from NVIDIA; they are typically pretty good about updating.
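For anyone unsure which driver they are currently on, the version can be checked from a command prompt with the `nvidia-smi` tool that ships with the NVIDIA driver (a sketch; the install path below is the assumed Windows default, yours may differ):

```shell
# nvidia-smi is bundled with the NVIDIA driver; on Windows it usually lives at
# C:\Program Files\NVIDIA Corporation\NVSMI\nvidia-smi.exe (assumed default path).
# This prints one line per GPU with its name and the installed driver version.
nvidia-smi --query-gpu=name,driver_version --format=csv

# Compare the reported driver_version against the latest driver
# listed for your card on nvidia.com before deciding to update.
```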

I’ve posted on this in a newer topic - as of writing (with the latest NVIDIA driver updates, released 10/6/17), I can’t get Rhino to recognize 1080 Ti cards in either V5 or the V6 WIP.

Oh oh. This is not good… I’m putting these in a box soon, hopefully next week.

Come on RMA team, get your shit together! :joy::stuck_out_tongue:


@jeff any ideas here?

I have one 1080 Ti (and ran it plus two 970s until I sold them) and have no trouble… well, I guess I had an issue with V6 not recognizing changes in hardware, but that was fixed. It sounded like the second poster in the other thread had them misconfigured.

Hi @JimCarruthers,

What is your exact make and model of the GeForce 1080 Ti? (where did you get it, who is the OEM, what config is it, etc…)

I’m going to contact NVIDIA on this one, since I’m currently at a total loss as to why some users are seeing what they’re seeing. However, this is a gaming card, so that may be somewhat of a roadblock or hurdle… we’ll see. I just want to make sure that if/when we start collecting any of these, we get ones that are known to work and ones that are known not to work… and hopefully they’re not the same ones.

@Everyone who owns a GeForce 1080 Ti, please provide your configuration stats as well, along with your experiences with the card and Rhino.

There is most likely nothing we’ll be able to do in terms of V5, so the only hope there is if a driver update solves the issues. However, we will be investigating possible solutions or workarounds that can be done in V6, so if you’re not on board the V6 Rhino train and you own one of these cards, you might want to consider hopping on now.

Thanks,
-Jeff

It’s a Gigabyte “Gaming OC,” 11 GB. The cheapest one you could get when I got it. On a hex-core i7… yeah, it’s fine in 5 and 6.

Jim

Hi @jeff,

I’m building a system w/ 4 of these. Waiting for the ram to ship to put it all together, hopefully this week.

The card I got is
EVGA GeForce GTX 1080 Ti SC2 HYBRID GAMING, 11GB GDDR5X, iCX Technology - 9 Thermal Sensors Graphics Card 11G-P4-6598-KR

Amazon link: https://www.amazon.com/gp/aw/d/B0713XX64Y/ref=ya_aw_oh_bia_dp?ie=UTF8&psc=1

I’ll be happy to lend you one if you want to try things on your end. I’d need your shipping address.

EDIT: maybe you want two of them to also see what happens when multiple cards are present?

G

Thanks Gustavo,

I don’t think that’ll be necessary, but thanks for the offer… Also, those appear to be water cooled, and I’m not sure if they’re standalone or require a water-cooled configuration… either way, I’d rather not deal with it :slight_smile: … Now, if you’re having issues with them… well, then maybe we’ll need to take a look.

I spoke with NVIDIA, and they’re currently not sure why performance is being reported as “jerky”; they didn’t really have any suggestions other than to report a bug. As for the cards not being recognized, it sounds like what I thought: if you configure your cards to be “compute driven”, which I guess happens when you use them in parallel (note: this is not the same as SLI), then there is no primary display adapter on the system, so Microsoft’s generic display driver is in effect… and as of right now, I don’t see (understand) how a program can create an accelerated device context when there is no primary adapter.
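If anyone wants to check whether one of their cards has ended up in that compute-only state, `nvidia-smi` can report per-GPU display attachment (a sketch, assuming the NVIDIA driver is installed; the field names are from nvidia-smi’s query interface):

```shell
# Lists each GPU with whether a display is enabled on it and actively driven by it.
# A card being used purely for CUDA compute typically reports "Disabled" for
# both display_mode and display_active, while the card acting as the primary
# display adapter reports "Enabled".
nvidia-smi --query-gpu=index,name,display_mode,display_active --format=csv
```

If every GPU in the list shows “Disabled”, nothing is acting as the display adapter, which matches the situation described above.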

Lastly, do not confuse CUDA and its uses with OpenGL and its uses…they are not really the same thing, and have different usage and system requirements.

Please let me know if you run into problems with these cards asap.

Thanks,
-Jeff

They are standalone, just a bit of water in a closed system. I’m not a fan of making a whole system water-cooled; too risky.

I’m unfamiliar with the setup options here. So I’ll see once we start poking around.

Our goal is to have one card used as the display adapter for the system, working with Rhino and other apps, and then have Octane and V-Ray see all the cards for GPU rendering.

I’ll report back soon.

G

I just bought a second-hand GTX 1080. I also plan to upgrade my monitor; would a 1440p gaming monitor be the best value, or should I stick with 1080p?

I’ve been using a “1600p” monitor since a few video card generations before I had a 1080 Ti. Back in 1999 (last century, before video cards really DID anything!) I was already using more pixels than a 1080p display. For “work,” what matters is the biggest monitor, with basically as many pixels as it can handle, that’s nice to look at for hours; fps is a distant second consideration.


Anything above 60 fps is not needed for a CAD monitor, but it can be nice if you game.
(I have a 144 Hz laptop, but I set it to 60 as I don’t need my GPU to calculate more frames than necessary; that just wastes battery life IMO :wink: )
A 1440p monitor is nice. I have an AOC 3440x1440 at home and a Dell 38??x1600 at work (ultra-wide monitors) and I love them for Rhino; the extra space on the sides gives more room for toolbars. Personally, I prefer them over 4K monitors just because of that width.
I don’t know your budget, but IPS monitors are usually better at color reproduction than VA panels. IMO worth the cost, since the monitor is what our eyes see and what we evaluate the design on.
Good luck and happy shopping!