Architecture Workstation

My girlfriend is in architecture school at RPI and is finding that her school-provided XPS laptop (i7 - 1050) isn’t really as powerful as she would like for rendering. I am also in need of a new workstation, so we are thinking we could build one with the power she’s going to need and split the cost. I’ve put together a hardware list that I think will be adequate for what she needs.

The software she runs is V-Ray, Rhino, AutoCAD, and Adobe CC products, including Photoshop.

The GTX 980 Ti I have selected is in my current PC, and we would just swap it over to the new system. Will the 980 Ti be good enough for now, or should we not even consider it?

What monitor setup should we go with? Dual 4K 27"s or a 4K ultra-wide?
Other thoughts about this build?

Aaaaaanybody have thoughts? :slight_smile:

I don’t run Rhino very hard in my testing and tech support role.

I have a Quadro P2000 running two 27" displays at 2560 x 1440.
16GB RAM, i7 processor, and a medium size SSD.
The performance is really good.

This is in a desktop so the price was good too.
I don’t use any of the other applications you listed; just Rhino.


The list you provided will most likely be overkill for most architecture school work. You should be able to get a moderately powered workstation for under $3,000 if you are going to build it yourself from parts. The GeForce GTX 980 Ti will be more than enough to run the software listed above.

I would recommend looking at some of the bundled tower options that Newegg has.

Best Regards,

Do you by any chance know how my 980 Ti would stack up against something like a P2000?

Thank you for your input. :slight_smile:

I do not.
Maybe someone else that is watching this category will have an opinion.
We just don’t do much hardware testing.

Thank you for the advice! I was concerned the 980 Ti would not be the best option for modeling and rendering.

I am thinking I might bump the CPU down to an i7 Extreme Edition. I just like crazy hardware LOL

Thank you again!

That said, for V6 I would choose the Quadro, as the GeForce drivers have in the past been intentionally throttled when used for OpenGL applications like Rhino. The GeForce cards are intended for DirectX-based gaming.

I do not know if the throttling is still true or not.
I have a P2000 and it’s really sweet.

I’m speaking outside of my expertise here, but I believe the GPU isn’t what V-Ray draws on when processing a rendering. I recall the CPU cores playing the vital part in that operation.


That really depends on whether you have a GPU that can do the render processing, and choose that over a CPU.

If you are interested in comparing how different hardware compares with one another, I would suggest the following sites for benchmarks:
(GPU Comparison)
Note that each benchmark may or may not reflect what you intend to compare. For example, the GTX 980 TI is ranked #17, whereas a P2000 is ranked #42.

One thing to point out is that the GTX 980 Ti has a 250 W Thermal Design Power (TDP), whereas the Quadro P2000 has a 75 W TDP, so it’s not a fair comparison.
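The rank-versus-TDP point above can be made concrete by normalizing a benchmark score by power draw. This is only a sketch: the score values below are made-up placeholders, not real benchmark numbers; only the 250 W and 75 W TDP figures come from this thread.

```python
# Toy performance-per-watt comparison.
# NOTE: the "score" values are hypothetical placeholders for
# illustration only; the TDP figures are the ones quoted above.

def perf_per_watt(score: float, tdp_watts: float) -> float:
    """Normalize a raw benchmark score by the card's TDP."""
    return score / tdp_watts

cards = {
    "GTX 980 Ti": {"score": 100.0, "tdp": 250.0},   # score is made up
    "Quadro P2000": {"score": 55.0, "tdp": 75.0},   # score is made up
}

for name, c in cards.items():
    ratio = perf_per_watt(c["score"], c["tdp"])
    print(f"{name}: {ratio:.3f} score/W")
```

With those placeholder numbers, the lower-ranked P2000 would actually come out ahead on score per watt, which is the sense in which a raw ranking alone isn’t a fair comparison.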

(System Comparison/CPU comparison)

Best Regards,

With ray tracing in the new 20xx GeForce cards, would those be the way to go for cost/performance?

I’m not sure V-Ray has released a build that supports the RTX cores yet. They’ve shown some demos and preliminary benchmarks using it, but I don’t think it’s publicly available. Keep in mind they’ve done a major overhaul of the GPU engine since, so those figures probably don’t represent what RTX will be like once it’s publicly available.

Have you tried V-Ray Cloud? If you’re not rendering non-stop and only hit a few rendering crunch times a year, I think it’s a better solution. If your machine is good enough for look development, you can have multiple high-res renders running on the servers while you’re still working locally. It’s a pretty awesome workflow.


I jumped on the V-Ray GPU train now and it’s working quite well. I use two 2080 Tis and one 1080 Ti (via an extension cable). It works best if the 2080 Tis are used for rendering only and all other software is assigned to the 1080 Ti in the NVIDIA control panel (it’s only Photoshop and Rhino that are set to the 1080 Ti, and the monitor is plugged into the 1080 Ti too). Enscape can also be set to still use one of the 2080 Tis.

My goal was to use the 2080 Tis in NVLink mode to get 22 GB of VRAM, but it’s not working yet. I’m waiting for feedback from the development team, since it should work. But without NVLink more speed is available, and my setup allows approx. 11 GB for rendering only, so quite complex scenes can be rendered.
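The NVLink point above comes down to simple arithmetic: without memory pooling, each render GPU must hold its own full copy of the scene, so usable VRAM is limited by the smallest card rather than the sum of the cards. A minimal sketch of that assumption (the pooling behavior is as described in the post, not verified here):

```python
# Sketch of usable VRAM for GPU rendering, with and without
# NVLink memory pooling. Assumption (from the post above):
# without pooling, the scene is duplicated on every render GPU,
# so the smallest card is the limit; with pooling, the linked
# pair's memory is combined.

def usable_vram(cards_gb, pooled: bool) -> float:
    """Return usable VRAM in GB for a set of render cards."""
    return sum(cards_gb) if pooled else min(cards_gb)

print(usable_vram([11, 11], pooled=False))  # 11 GB: scene duplicated per card
print(usable_vram([11, 11], pooled=True))   # 22 GB: NVLink pools the pair
```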

Cost/performance: I use my old dual-Xeon machine and upgraded the cards only, so for me that was the best cost/performance route. I also like the GPU upgrade path; the system (CPU/RAM/OS) can stay untouched, you only need to change cards.

I would look for a good Threadripper (1950X or 2950X) PC and start with a 2080 Ti plus a smaller GPU for monitor use. Later, I would add a second 2080 Ti.

GPU rendering works best with LC+BF (light cache + brute force), and this is slower than the LC+IM (light cache + irradiance map) setup I used over the past years. But V-Ray Next on the GPU brings back the old speed, now with great, stable quality: fine details and no splotches anymore. My impression is that my dual Xeon (32 × 3.2 GHz) with LC+IM was about as fast as one 2080 Ti with LC+BF is now.

V-Ray Next: for me it has been stable enough since the VfR Next release. There are some limits and bugs, but I hope the next months will stabilize it. I also hope the speed will increase once the RTX features are better supported.


I’m currently on the fence between the i9 9900K and an AMD Threadripper for Rhino and V-Ray. What are the arguments in favour of one or the other? I hear the AMD Threadrippers have stability issues. Is that really a problem for Rhino users?