My first quad-GPU + Dual Xeons + Kitchen Sink Home Built PC


#1

Hi All,

Ok, it’s time to build a machine to run Octane fast. The goal is a system for setting up scenes for design reviews of CAD databases of final products, including exploded views of entire product assemblies. I need something that can do the following:

  • live in my home office (manageable heat and noise when I’ll be sitting next to it)
  • good Rhino performance
  • decent CPU rendering speed (a 4- or 6-core i7 will do)
  • main focus on GPU rendering for Octane.
  • runs Windows 10
  • drives one 4K monitor

Specs I have in mind so far:

  1. Intel Core i7-7700K 4.2GHz Quad-Core Processor
  2. Noctua NH-D15 82.5 CFM CPU Cooler
  3. Asus Z170-WS ATX LGA1151 Motherboard
  4. G.Skill Ripjaws V Series 64GB (4 x 16GB) DDR4-3333 Memory
  5. Samsung 960 PRO 512GB M.2-2280 Solid State Drive
  6. Asus GeForce GTX 1080 8GB TURBO Video Card (4 units)
  7. Fractal Design Define XL Black Pearl ATX Full Tower Case
  8. be quiet! DARK POWER PRO 11 1200W 80+ Platinum Certified Semi-Modular ATX Power Supply
  9. Noctua NF-S12A PWM 120mm Fan (4 units)

Here’s a link to the parts from PC part picker: https://pcpartpicker.com/list/GWyrWX
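To sanity-check the 1200W PSU against this parts list, here's a rough power-budget sketch in Python. The wattages are ballpark TDP/board-power figures, not measurements, so treat the result as an estimate only:

```python
# Rough power-budget sanity check for the parts list above.
# All wattages are ballpark TDP/board-power assumptions, not measurements.
parts_w = {
    "i7-7700K CPU": 91,            # Intel TDP spec
    "GTX 1080 x4": 4 * 180,        # NVIDIA reference board power per card
    "motherboard + RAM + SSD": 75, # rough allowance
    "fans + misc": 30,             # rough allowance
}
total = sum(parts_w.values())
psu = 1200
headroom = psu - total
print(f"estimated draw: {total} W, PSU: {psu} W, headroom: {headroom} W")
print(f"PSU load at full tilt: {total / psu:.0%}")
```

On these assumptions the 1200W unit has comfortable headroom, though overclocked cards can pull well past reference board power.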

I plan to share the progress here so we can all learn and maybe someone else wants to build something similar.
I’d love to hear your thoughts and if you would consider changing anything.

Thanks!

G


#2

Show off! Now I have GPU envy… Good work, better share some Octane stuff from that rig.


#3

This is very nice, I know, but want to see a show off system? Take a look at what Smicha built recently: https://www.youtube.com/watch?v=59piATARCXM :scream:

…and yeah, I will share what I do with Octane (all my work is confidential, but I’ll make some shareable bits, of course)


#4

Good choice on the memory. Most people don’t understand memory and cheap out.


#5

yeah I never cheap out on components anymore. There’s nothing, absolutely nothing, more expensive than downtime. I adjust my budget to ‘how much’ of something I can get, not ‘how good’.


#6

We wanna see pics of this beast when it’s done… so we can all :astonished:


#7

Whoa! Back up the bus here… make sure your CPU and mobo support 40 PCI Express lanes, or you might not get full use of 4 video cards. That CPU only has 16 lanes.
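The lane math behind this warning is simple. A quick sketch, assuming each card wants a x8 link for rendering (the 40-lane figure is typical of HEDT/Xeon-class chips):

```python
# PCIe lane-budget check: consumer LGA1151 CPUs like the i7-7700K expose
# 16 lanes from the CPU; each GPU ideally wants a x8 link.
cpu_lanes = 16         # i7-7700K
hedt_lanes = 40        # e.g. Xeon E5 / Broadwell-E class (assumption)
gpus = 4
lanes_wanted = gpus * 8
print(f"wanted: {lanes_wanted}, consumer CPU: {cpu_lanes}, HEDT: {hedt_lanes}")
print("consumer CPU is enough:", cpu_lanes >= lanes_wanted)
print("HEDT CPU is enough:", hedt_lanes >= lanes_wanted)
```

Four cards at x8 each want 32 lanes, which a 16-lane consumer CPU cannot provide even with a PLX-equipped board doing the switching.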


#8

Ok…

BTW, Smicha in the Otoy forums (Octane) suggested this:

Asrock Rack EPC612D8A
1x Xeon 1650v4
Noctua NH-U12DXi4
1x32GB ECC kingston
4x1080 hybrid or 1070 hybrid
EVGA 1600W P2 with eco semipassive mode and 9 gpu connectors - for 4 hybrids 8+8 connectors
Thermaltake Core X5
PL2 fans for extra airflow
Samsung PM961 NVMe SSD (or PRO 960 for more)
a smaller SSD for system (256GB MX300 crucial)
Win7


#9

Interesting system. I’m curious to see some photos. For my use I would go with 64GB RAM.


#10

I have a Titan X and a 980 Ti.

The Titan X is noticeably faster to load and render, despite the 980 Ti having a comparable CUDA core count… And of course there is 12GB of VRAM to load the scene onto.

Not sure how a 1080 would fare in comparison.


#11

The Titan X never crossed my mind as a viable option:

http://gpuboss.com/gpus/GeForce-GTX-TITAN-X-vs-GeForce-GTX-1080

I could build 2 complete systems of 4x1080s if I had that kind of money lying around. Or one with 7x1080s.


#12

Good call, I totally agree Micha, thx.


#13

Interesting… big price differential compared to U.K.


#14

Very nice system!

Each Nvidia GPU needs 8 PCIe lanes to work properly, and you have that with the above. You can also use an X99-based mobo if you want more options; the Gigabyte X99 is the one I use.

One thing you might consider is getting a quad memory kit (non-ECC; ECC is slower and you’re not launching the space shuttle :grin: ) to populate at least 4 memory slots and take advantage of quad-channel memory. FYI: there’s nothing special about a quad kit other than that it’s 4 sticks of RAM that match exactly. If you put in 2 sticks now and buy two more later, they might not work in quad channel. Otherwise, can’t wait to see your build when it’s done! :slight_smile:
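The quad-channel point can be put in numbers. A rough peak-bandwidth sketch, assuming DDR4-3333 with standard 64-bit channels; real-world throughput will be lower:

```python
# Peak theoretical bandwidth for dual- vs quad-channel DDR4-3333.
# Assumes standard 64-bit (8-byte) channels; sustained rates are lower.
mt_s = 3333             # mega-transfers per second (DDR4-3333)
bytes_per_transfer = 8  # 64-bit channel width
for channels in (2, 4):
    gbps = channels * bytes_per_transfer * mt_s / 1000
    print(f"{channels}-channel: ~{gbps:.1f} GB/s peak")
```

Doubling the populated channels doubles the theoretical ceiling, which is why matched 4-stick kits are worth it on quad-channel platforms.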

A side note for anyone building something similar: when using Nvidia GPUs they must all be the same card. AMD cards you can mix and match.


#15

Hi Freeke, you rock! Thanks so much for the advice. Some more questions if you don’t mind:

  1. I follow you on the support for all PCIe lanes on the motherboard, and how that Gigabyte X99 would do that. But does that also apply to the CPU? Is that why Smicha recommended the Xeon 1650v4 instead of the i7 I had chosen?

  2. If I drop the ECC and focus on 4 equal sticks, does going back to “G.Skill Ripjaws V Series 64GB (4 x 16GB) DDR4-3333 Memory” make sense?

  3. Is this a good hybrid GPU? https://www.newegg.com/Product/Product.aspx?Item=N82E16814487266
    Will noise be manageable?


#16

Yup, exactly. It’s like a 16-lane highway vs a 40-lane highway, and each of your monster-truck Nvidia cards uses 8 lanes. They won’t all fit at the same time on a 16-lane highway. Huge bottleneck!

Good choice on RAM. You’re obviously going for speed/near real time rendering so fast mem is a must.

Nice GPUs! Can’t say for sure about the noise level, but one tip to keep noise and heat down is to get a large, airy case with large fans. They move more air at slower speeds and are therefore much quieter.

Antec 900

That’s a 7" fan on top set to low speed. Two 4" fans on the front combined with the 7" on top, all set to low speed, move a lot of air, and it’s very quiet. Xeons also have a nice feature when it comes to heat: a built-in thermal cutoff that shuts them down if they get too hot, so you never have to worry about cooking your CPU.

You’d need a bigger case than this Antec 900, though. I just noticed how close the PSU is to that last PCI slot. Plus it looks like on both the ASRock and the Gigabyte some of the header pins will get covered up by the 4th GPU. Just some extra things to consider. Very tricky cramming 4 GPUs into a case.


#17

Very cool. I think this is why Smicha was recommending the Thermaltake Core X5: http://www.thermaltake.com/products-model.aspx?id=C_00002806


#18

Yeah, that’s the ticket! Nice! At $139 I might have to get one of those too! :grinning:


(Marc Gibeault) #19

A recent thread in V-Ray’s hardware forum from a guy with the same goal: http://forums.chaosgroup.com/showthread.php?92207-Render-farm-hardware&p=724580#post724580
Interesting conclusion: cheaper and less risky to go with 2 computers with i7s and 2 GPUs each.

In case you can’t access that forum:

I just finished my second farm, GPU based. I went with motherboards that could handle 2 cards each, regular PC cases, nothing fancy, i7 processors. PSUs are 1000 watts in case I need it, but I only measure 550 watts at the outlet during a max render.

Electricity is $0.40 per kilowatt-hour, so there is a small cost with that. I had to do some wiring in the building that runs the 5 nodes: a 15-amp circuit for 2 boxes, and a 20-amp circuit for 3 boxes.

Basically, I did a spreadsheet with options ahead of time to see what the end costs would be whether I went with one box with 4 GPUs or boxes with 2 GPUs… it was a little cheaper with 2 GPUs per box. Plus, if anything fails, I’m not spending $300 for a motherboard or $600 for a processor; it’s more like $100 and $330, respectively.

The video cards are EVGA 980 Ti Classified b-stock.
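Plugging the quoted numbers into a quick script shows where the running cost and circuit sizing land. The 120V mains voltage is an assumption; the post doesn't state it:

```python
# Back-of-envelope checks on the figures quoted above: $0.40/kWh,
# ~550 W measured per box at the wall, 15 A and 20 A circuits.
# The 120 V mains figure is an assumption (not stated in the post).
rate = 0.40             # dollars per kWh
watts_per_box = 550
hours = 24
cost_per_box_day = watts_per_box / 1000 * hours * rate
print(f"one box rendering 24 h: ${cost_per_box_day:.2f}")

volts = 120
amps_per_box = watts_per_box / volts
for boxes, breaker in [(2, 15), (3, 20)]:
    draw = boxes * amps_per_box
    print(f"{boxes} boxes: {draw:.1f} A on a {breaker} A circuit "
          f"({draw / breaker:.0%} of rating)")
```

Both circuits sit comfortably under their breaker ratings on these assumptions, which matches the wiring split described in the quote.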


#20

But the user on the V-Ray forum is building a render-farm machine only.

For example, I use a dual Xeon, and I’m glad I don’t have to turn the slave on and off often. Mostly I use my master machine only, and for large renderings I turn on the slave. So I prefer one powerful machine for working plus inexpensive slaves. Every extra machine means extra cost for OS, RAM, PSU, motherboard, and case, plus the time to build the system and keep its software updated.