I placed an order for a build with a 2080 Ti, i9-9900K, and a 27″ BenQ monitor. I’m hoping this is a good build for my purposes of Rhino 6 and rendering in V-Ray or KeyShot for at least 3 years of heavy usage. I can still amend some parts if I let them know by tomorrow, so if there’s a very good reason to change something, I would appreciate the feedback. I have read on some other threads that the Quadro is better for Rhino… and I could get the Quadro RTX 4000 for a bit less than the 2080 Ti. (But the comparison doesn’t look good: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-2080-Ti-vs-Nvidia-Quadro-RTX-4000/4027vsm716215)
What do you think?
Thank you.
The Ryzen 12-core 3900X is due out this month. I’d hold out for that. I’d also hold out for the RTX 2080 Super, which is also out this month. It’s rumored to be just as powerful as the old 2080 Ti for the cost of the non-Ti.
I am sorry, but the main office called, and they need you to send me that machine immediately after it comes in : )
Well, if you wait 6 more days, AMD might have a cheaper CPU by some $229.
Nice inexpensive case, so that money can go toward the parts. Good memory. I might go with a big Noctua fan cooler because it’s simpler, although a heatpipe/heatsink cooler is heavier on the motherboard. I prefer Corsair power supplies because mine has lasted 7 years so far, unless you can afford Seasonic. I hope that monitor has pretty accurate color for that size/price.
My MSI motherboard has also lasted 7 years, though I might check the UEFI/BIOS version to make sure the firmware has no must-have updates.
This is the best comparison I could find on nVidia performance:
So in some CAD packages the Quadro rules, but in Max and Maya it doesn’t. And the 2080 Ti has 2x the CUDA cores, which is important if CUDA rendering is part of your workflow.
I’m mostly going to do work in Rhino 6 with the V-Ray Next plug-in and possibly KeyShot. Some Photoshop as well. Is the RTX 4000 a much better performer for Rhino 6 with V-Ray compared to the 2080 Ti? Is there any way to confirm this without buying both cards and running tests?
I’m using a 1080 Ti and I’m pretty happy with it combined with a 4K resolution.
You can purchase one or two used at this point.
So the 2080 Ti is OK, but
most important is a big 4K 144 Hz monitor to look into the curve shape. Then, a GPU that can push and handle that.
I use a curved 48″ TV (because that’s what I could afford in this country), but a 144 Hz monitor is much better, so you can spend more time in front of the screen. To push that amount of pixels or refresh rate, a 2080 Ti is good. The 2080 Super is similar in frame rate to the 1080 Ti and is also worth considering.
An old, used 1080 Ti is a bit more powerful (in FPS) than the new 2080 Super. So consider upgrading your $400 monitor to a 4K 144 Hz one. If you purchase a small 4K monitor you will need to enlarge the icons and fonts, which defeats the purpose. A big 4K TV at 60 Hz can give you nausea after 7 hours of work.
I’m interested in this question. There seems to be NO or very little information around that directly compares ‘consumer’ grade GPUs with pro grade ones.
Some people say that pro GPUs are better for CAD and 3D modelling although I’ve never seen a strong argument as to why other than ‘driver optimisation’. My suspicion is that much of it is a con by nVidia and AMD to sell essentially the same gear at vastly inflated prices.
It would be really good to have some end user generated benchmarks of performance for different GPUs on PC and Mac.
Well, the short answer is that the differences are becoming less relevant as time goes on; it is a bit of a “market differentiation” ploy. You can Google benchmarks comparing any card to any other these days, so there is no reason to debate. My general advice is to only buy the absolute top end of “pro” products: they might not be much faster, but if you can afford to pay three times more for 10 percent more speed, go ahead. That makes more sense than paying about the same for less performance in most situations.
The userbenchmark website is the best I’ve seen but it does seem to be focussed on gaming. It would be really good to have something primarily looking at OpenGL, GPU raytracing etc.
Maybe I’ve missed something but I do periodically check.
Jumping from 1080p to 4K is like having four 1080p monitors pushed by one GPU.
So if your geometry is getting complex, you will have frame-rate problems, for example when rotating in perspective.
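The “4K is like four 1080p monitors” claim is just pixel arithmetic, using the standard resolutions of 1920×1080 and 3840×2160:

```python
# Pixel counts for 1080p vs 4K UHD: the GPU shades 4x the pixels per frame.
def pixels(width: int, height: int) -> int:
    return width * height

p_1080 = pixels(1920, 1080)  # 2,073,600 pixels
p_4k = pixels(3840, 2160)    # 8,294,400 pixels

print(p_4k / p_1080)  # → 4.0
```

Four times the shaded pixels per frame is why the same scene that rotates smoothly at 1080p can stutter at 4K on the same card.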
Increasing antialiasing settings?
It depends on the antialiasing type. There are a lot of different types, and you can customise it in Windows specifically for Rhino. If it is a fast, good, simple antialiasing, it can cost just 20% of performance. If it is an old type, x2 can be like rendering four monitors’ worth of pixels into one monitor; x4 and x8 are expensive in that case, so your frame rate can decrease by 2x, 4x, or 8x.
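The “x2 is like four monitors” figure matches a supersampling (SSAA) model, where the factor scales each screen dimension before downsampling. A rough sketch under that assumption (the function name is mine, not from any renderer’s API; MSAA and post-process AA like FXAA are much cheaper than this worst case):

```python
# Rough supersampling (SSAA) cost model: a factor of N renders the scene at
# N times the width AND N times the height, then downsamples to the screen,
# so the pixel cost grows with N squared. This is the old, expensive style
# of antialiasing; modern MSAA/FXAA variants cost far less per frame.
def ssaa_cost_multiplier(factor: int) -> int:
    return factor * factor

for f in (2, 4, 8):
    print(f"x{f} antialiasing ≈ {ssaa_cost_multiplier(f)}x the pixels")
```

So x2 already means four monitors’ worth of pixels rendered into one, which is why x4 and x8 get expensive so quickly.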
Usually, the monitor is fixed at 60 Hz, but the rotation of the object can slow down. Under 15 FPS (frames per second, or in this case rotations per second) it is not good for inspecting the object. The more post-processing you add, for example the Pen or Artistic display modes, the more GPU power you need. Also, rendered display modes with lots of big 4K colour textures can eat GPU power.
Monitor example screenshot: Rhino running at 768p with x4 antialiasing is not the same as 4K without it.
768p on the left, 4K on the right.