Ok, it’s time to build a machine to run Octane fast. The goal is a system for setting up scenes for design reviews of CAD databases of final products, including exploded views of entire product assemblies. I need something that can do the following:
live in my home office (manageable heat and noise when I’ll be sitting next to it)
good Rhino performance
ok CPU rendering speed (a 4 or 6 core i7 will do)
main focus on GPU rendering for Octane.
runs Windows 10
it will drive one 4K monitor
Specs I have in mind so far:
Intel Core i7-7700K 4.2GHz Quad-Core Processor
Noctua NH-D15 82.5 CFM CPU Cooler
Asus Z170-WS ATX LGA1151 Motherboard
G.Skill Ripjaws V Series 64GB (4 x 16GB) DDR4-3333 Memory
Samsung 960 PRO 512GB M.2-2280 Solid State Drive
Asus GeForce GTX 1080 8GB TURBO Video Card (4 units)
Fractal Design Define XL Black Pearl ATX Full Tower Case
be quiet! DARK POWER PRO 11 1200W 80+ Platinum Certified Semi-Modular ATX Power Supply
I plan to share the progress here so we can all learn and maybe someone else wants to build something similar.
I’d love to hear your thoughts and if you would consider changing anything.
yeah I never cheap out on components anymore. There’s nothing, absolutely nothing, more expensive than downtime. I adjust my budget to ‘how much’ of something I can get, not ‘how good’.
Whoa! Back up the bus here… make sure your CPU and mobo support 40 PCI Express lanes, or you might not get full use of 4 video cards. That CPU only has 16 lanes.
The Titan X card is noticeably faster to load and render, despite the 980 Ti having a comparable CUDA core count… and of course there is 12GB of VRAM to load the scene onto.
Each nVidia GPU needs 8 PCIe lanes to work properly, and you have that with the above. You can also use an X99-based mobo if you want more options; a Gigabyte X99 is the one I use. One thing you might consider is getting a quad-channel memory kit (non-ECC; ECC is slower and you’re not launching the space shuttle) to populate at least 4 memory slots and take advantage of quad-channel memory. FYI: there’s nothing special about a quad kit other than that it’s 4 sticks of RAM that match exactly. If you put in 2 sticks now and buy two more later, they might not run in quad channel. Otherwise, can’t wait to see your build when it’s done!
A side note for anyone building something similar: when using nVidia GPUs, they must all be the same card. AMD cards you can mix and match.
Hi Freeke, you rock! Thanks so much for the advice. some more questions if you don’t mind:
I follow you on the support for all the lanes on the motherboard, and how that Gigabyte X99 would provide it. But does that also apply to the CPU? Is that why Smicha recommended a Xeon 1650 V4 instead of the i7 I had chosen?
If I drop the ECC and focus on 4 matched sticks, does going back to the “G.Skill Ripjaws V Series 64GB (4 x 16GB) DDR4-3333 Memory” make sense?
Yup, exactly. It’s like a 16-lane highway vs. a 40-lane highway, and each of your monster-truck nVidias uses 8 lanes. They won’t all fit at the same time on a 16-lane highway. Huge bottleneck!
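To make the lane math above concrete, here’s a quick sketch. The 8-lanes-per-GPU figure comes from the posts in this thread; the helper names are just illustrative:

```python
# Quick PCIe lane budget check: do N GPUs at 8 lanes each fit
# within what the CPU provides?

def lanes_needed(num_gpus, lanes_per_gpu=8):
    """Total PCIe lanes the GPUs want for full bandwidth."""
    return num_gpus * lanes_per_gpu

def fits(cpu_lanes, num_gpus, lanes_per_gpu=8):
    """True if the CPU can feed every GPU at the desired lane width."""
    return lanes_needed(num_gpus, lanes_per_gpu) <= cpu_lanes

# i7-7700K: 16 lanes. Four GPUs at x8 want 32, so it bottlenecks.
print(fits(16, 4))  # False
# A 40-lane CPU handles four cards at x8 with lanes to spare.
print(fits(40, 4))  # True
```

In practice the motherboard’s PLX switches and slot wiring also matter, but this is the basic budget the 16-vs-40-lane comparison is about.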
Good choice on RAM. You’re obviously going for speed/near real time rendering so fast mem is a must.
Nice GPUs! Can’t say for sure about noise level, but one tip to keep noise and heat down is to get a large, airy case with large fans. They move more air at slower speeds and are therefore much quieter.
Thats an 7" fan on top set to low speed. Two 4" fans on the front combined with the 7" on top all set to low speed and they move a lot of air and its very quiet. Xeons also have a nice feature when it comes to heat and that is they have a built in switch that shuts down if they get too hot so you never have to worry about cooking your CPU.
You’d need a bigger case than this Antec 900, though. I just noticed how close the PSU is to that last PCI slot. Plus, it looks like on both the ASRock and the Gigabyte some of the header pins will get covered up by the 4th GPU. Just some extra things to consider. Cramming 4 GPUs into a case is very tricky.
I just finished my second farm, GPU based. I went with motherboards that can handle 2 cards each, regular PC cases (nothing fancy), and i7 processors. The PSUs are 1000 watt in case I need it, but I only measure 550 watts at the outlet under max render load.
Electricity here is $0.40 per kilowatt-hour, so there is a small running cost. I had to do some wiring in the building that runs the 5 nodes: a 15 amp circuit for 2 boxes, and a 20 amp circuit for 3 boxes.
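For anyone budgeting power the same way, the numbers from the post work out like this. The 550 W draw and $0.40/kWh figures are from above; the 120 V mains figure is my assumption (implied by the 15/20 A circuits):

```python
# Back-of-the-envelope running cost for the farm described above:
# 550 W measured at the wall per node, $0.40/kWh, 5 nodes.

watts_per_node = 550
rate_per_kwh = 0.40      # dollars
nodes = 5
mains_volts = 120        # assumed US residential mains

kw_total = nodes * watts_per_node / 1000        # 2.75 kW whole farm
cost_per_hour = kw_total * rate_per_kwh
print(f"All {nodes} nodes rendering: ${cost_per_hour:.2f}/hour")

# Breaker sanity check: amps drawn per box vs. circuit rating.
amps_per_box = watts_per_node / mains_volts     # ~4.6 A per box
print(f"2 boxes on the 15 A circuit: {2 * amps_per_box:.1f} A")
print(f"3 boxes on the 20 A circuit: {3 * amps_per_box:.1f} A")
```

So the full farm costs a bit over a dollar an hour to run, and both circuits have headroom under the usual 80% continuous-load guideline.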
Basically, I did a spreadsheet with options ahead of time to see what the end costs would be with boxes of 4 GPUs versus boxes of 2 GPUs… it was a little cheaper with 2 GPUs per box. Plus, if anything fails, I’m not out $300 for a motherboard or $600 for a processor; it’s more like $100 and $330, respectively.
The video cards are 980 Ti Classified b-stock EVGAs.
But the user on the V-Ray forum is building a render-farm machine only.
For example, I use a dual Xeon, and I’m glad I don’t have to turn the slave on and off often. Mostly I use only my master machine, and for large renderings I turn on the slave. So I like one powerful machine for working plus inexpensive slaves. An extra machine means extra cost for the OS, RAM, PSU, motherboard, and case, plus the time to build the system and keep its software updated.