Rendering rig. Which video card(s)?

Hi all,

I’m repurposing a Dell 7500 workstation we had sitting around and not being used much into a rendering rig (mostly GPU).

Currently it has dual Xeons (12 physical cores) and 48 GB of RAM:

This is good for CPU rendering (KeyShot/Modo). But I want fast GPU rendering too (Octane), and this is where I need help. The PC currently has an older Quadro 5000; this PC is about 6 years old or so. The box is a large tower and it has 7 PCI slots (one being used with the existing Quadro):

I want to add 2 (or more?) fast GPUs, and I heard GeForce GTX 1080s are the best rendering value. Is this still the case?

Can I put in 2? 3? At what point does rendering speed (for Octane) stop scaling well?

Also what about heat? Noise? Should I consider water cooling? Is there a good plug-and-play option for it? Or do I have to be a plumber?

Power consumption is another consideration. The current power supply is 1100 watts. Do we need to upgrade that?

I see that EVGA, Asus, PNY, MSI, Gigabyte, and Zotac all make various models, so I have no idea what to get.

I’d appreciate any feedback to get going from anyone here who has built multi-GPU systems.

Thanks!

G

Drop a couple of 1080s in that beast and it should blow you away! I’m partial to Asus; I just got a new computer with a 1070 in it and rarely get it heated up with Octane. I have an older 780 Ti that I will add when I get a bigger power supply.

Like this model for example? Graphics Cards - All series|ASUS USA

That looks good, especially the fan setup.

Open up the case and take a photo of the motherboard if you can?

If I’m not mistaken the 7500 only has 2x PCIe slots; the rest are PCI (not suitable for modern GPUs).
Also, I think the dual-CPU 7500 uses a riser board for the second CPU which may overlap the PCIe slots…
You might be able to remove & sell the second CPU to fund part of the GPU purchase (& make room for it) if I’m right…?

That will probably happen once you take a closer look at the PSU, which does not appear to have a standard ATX layout :slight_smile: My guess is that it may be a redundant EPS12V PSU; hard to tell without seeing the case open. Even if this PSU is the full 1100W version, I doubt it has enough PCIe connectors to mount 2x GTX 1080, which would require at least 2x 8-pin PCIe connectors. The regular card (single-fan, Founders Edition) has 1x 8-pin PCIe, while the overclocked ones like the Zotac or Asus ROG Strix require at least 180W each under load and should be connected with 1x 8-pin PCIe and 1x 6-pin PCIe each.
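Just to make the connector arithmetic concrete, here’s a minimal sketch using the usual PCIe spec limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin); actual board-partner requirements vary, so treat the numbers as assumptions:

```python
# Per-card power available from the slot plus the attached PCIe power cables.
# The wattages below are the usual PCIe spec limits, not card-specific figures.
CONNECTOR_WATTS = {"slot": 75, "6-pin": 75, "8-pin": 150}

def available_watts(connectors):
    """Sum the rated delivery of everything feeding one card."""
    return sum(CONNECTOR_WATTS[c] for c in connectors)

# Founders Edition GTX 1080: slot + one 8-pin
print(available_watts(["slot", "8-pin"]))            # 225 W vs. ~180 W board power
# Overclocked card (e.g. ROG Strix): slot + 8-pin + 6-pin
print(available_watts(["slot", "8-pin", "6-pin"]))   # 300 W of headroom
```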

Taking a look at the mainboard or its specs as @Prehabitat suggested would help. I think this Xeon chipset only supports 1-2 PCIe slots, which are all PCIe 2.0 x16. I’ve seen some of these Dell cases open and would suggest measuring the slot lengths (and widths), as these are often not within the regular standards. Sometimes various air shrouds are installed to get the heat out, which might complicate mounting oversized cards (e.g. triple-slot) or triple-fan cards…

c.

I opened it up. It looks like:

  • slot 1 (PCIe2 x8): open, below
  • slot 2 (PCIe2 x16, 75W): Quadro is in here
  • slot 3 (PCIe2 x8): open, above (unless the Quadro is also connected to slot 3; it’s occluded from view)
  • slot 4 (PCIe2 x16, 75W): open
  • slot 5 (short PCI?): open
  • slot 6 (PCIe2 x4): open
  • slot 7 (long PCI?): open

Ok, I took the Quadro out. It was using just slot 2. Here’s a better view of all the slots

[quote=“clement, post:6, topic:40887”]
while the overclocked ones like the Zotac or Asus ROG Strix require at least 180W each under load and should be connected with 1x 8-pin PCIe and 1x 6-pin PCIe each.
[/quote]

Clement, based on my picture, if I understand you correctly, I could put the first Asus ROG in slots 1+2 and the second Asus ROG in slots 3+4?

EDIT: …I think I know what you mean regarding the 6/8 pins now. PSU connectors. …well, I think I’m one 8-pin short :frowning:

Just from looking at your image I’d say you’ll be able to put two large cards in there, but three may be pushing it. Don’t worry about noise. Octane scales amazingly well; every time you add a GPU it gets that much faster. Make sure you calculate for the maximum watt draw of the cards. I don’t know of cards that go over 400 W at full load, and most are at 200 to 300 W at full load. So with two big GPUs your power supply may still just be enough; don’t bother with water cooling yet.
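To make that budget concrete, here’s a quick back-of-the-envelope sketch; the per-component numbers are assumptions (roughly 300 W per card worst case, about 140 W per Xeon, and a generous allowance for everything else), not measurements:

```python
# Rough power budget for the 1100 W supply -- all numbers are assumptions.
loads_watts = {
    "2x big GPUs (worst case)": 2 * 300,  # ~250-300 W each at full load
    "2x Xeon CPUs":             2 * 140,  # ~140 W each
    "board, RAM, drives, fans":     150,  # generous allowance for the rest
}

total = sum(loads_watts.values())
print(f"{total} W estimated peak vs. 1100 W PSU")  # 1030 W -- tight but workable
```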

It’s true that you need to be sure you have enough connectors coming out of your power supply; once you get up to three or four GPUs you are maxing out the number of connectors on even a big supply. With 2 you’re fine. But anyway, that board and case are not going to support three or four. Also, pay no attention to SLI, and it doesn’t matter whether a card is running in a 4x or an 8x slot, because that only affects the time it takes to load textures on and off the card. DO buy GPUs with the most RAM you can afford; six gigs would be a good start. You are limited by the card with the least amount of RAM!!!

You are not gaming with this rig, so the considerations are very different; don’t let gamers tell you otherwise. All that matters is raw GPU processing power and RAM.

1080s have the most value at this point, but 780 Tis are still incredible for the money.

I just edited the post above, and based on what clement mentioned, it looks like I’m one 8-pin connector short.

I’m considering adding an extra PSU next to the case; that should work, right?

@gustojunk, looks good. The PCIe x16 2.0 slots are the ones with the light blue latches. Both slots are double-width, which is required for the GTX 1080 reference cards (the single-fan Founders Edition). Note that the triple-fan cards are often labeled “dual slot”, but sometimes they are slightly thicker, which makes it a pain to mount them above each other since the airflow gets completely blocked.

You might measure the length you have to the black plastic part, which is a card holder above the round air shroud with the “Precision” lettering on it, and compare it with the length of a GTX. If it is too tight, the black card holder can usually just be taken out.

If this is not a problem, take a look at the cables coming out of the PSU and try to locate more PCIe cables. They should look like the one plugged into the Quadro 5000. How many do you have? (btw, post a closeup of the PSU sticker so you can find out the required specs.) I think you would need 33 amps on the 12V rail the PCIe cables run on, to be safe.
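For what it’s worth, here’s one way to arrive at a figure in that ballpark; it assumes the full ~180 W board power of each card comes off the 12V rail, which is a simplification:

```python
# 12 V current needed for two ~180 W cards, assuming everything comes off 12 V.
gpu_watts = 180
rail_volts = 12.0

amps_per_card = gpu_watts / rail_volts                 # 15 A per card
two_cards_with_margin = 2 * amps_per_card * 1.1        # +10% safety margin

print(f"{amps_per_card:.0f} A per card")
print(f"{two_cards_with_margin:.0f} A for two cards with margin")  # ~33 A
```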

btw, funny upside-down layout on this Dell mobo, and many, many air shrouds :yum:

c.

Speak to Simcha at the Octane forums; he has some good real-world power data on 4-14x 1080 rigs (yep, 14x).
I have an overclocked i7-3770K machine with 1x 590 & 2x 780 powered by a 1 kW PSU, and it’s never been much over 600 W at the wall while using Octane (much lower than the calculated figure suggests).
Note: that includes boot time…

I’ve even tried dual instancing and CPU rendering while CUDA rendering, and didn’t go much over 700 W at the wall.
I think I’d need to be running stress-testing algorithms on everything to actually hit the theoretical power draw as calculated…

Not saying the 7500 is sufficient, but check the figures from Simcha first.


Closeup of PSU sticker here…

But as I mentioned in the ‘EDIT’ of my first response to you, it looks like I’m one 8-pin short.

Regarding the length to the back plastic card holder, I think I have enough. Here is how much extra length there is past the Quadro card (notice the extra metal spacer bracket connecting it to the back holder). The Asus card is 11.73" long, and the space here is past 13":

IMO an 1100 W PSU is plenty for 2x 1080s. Pretty sure you can get adapters to get the extra pins you need,
i.e.:
6-pin to 8-pin
2x Molex to 8-pin

If it were me, I’d try to get some info on where all those 12 V rails go, so you can isolate each GPU onto its own 12 V rail…

But it could save you a lot of money just getting the 2x 1080s and an adapter rather than shelling out for a 1200 W PSU too…


Besides the 1x 8-pin and 2x 6-pin connectors (so, one 8-pin short), all I have left are these (only 1 Molex):

If I have to use slots 2 and 4 (blue latches) then I’m stacking them back to back, so the lower card will be fanning into nowhere. The card’s HEIGHT is 1.57", the same as the gap between those two slots.

Can’t I use slots 1 and 4 instead, to give them a bit of a gap? Or 2 and 6?

Gustavo, the fans suck air in, so if you go for a triple-fan design (and find a way to connect it if using the Molex adapter), it might be that the blocked card only gets the heat from the backplate of the other card. Your system has optimized airflow, using various shrouds to make sure all the hot air gets out of the case. The triple-fan cards are not able to push air out of the case; the Founders Edition does, but the single-fan design is louder.

If you’re unsure about the PSU, look at the specs. The label says it has 18.0 A for each of the 12V rails. That is not too much, but in the year this system was manufactured graphics cards were using much more power, e.g. mine is from the same year and draws 280 W max, while the GTX 1080 is rated at 180 W. If each PCIe cable gets its own 18.0 A, the combined power should be sufficient, but I would try this out with a single card first.

Consider that you also have 2 Xeon CPUs, each drawing 140 W, plus some other devices. To calculate what is required, you might enter your specs here. I get around 822 W of load wattage if I enter 2x GTX 1080, your CPUs and some standard stuff like 1x HDD, 3x USB devices, FireWire etc.
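If you want to sanity-check that calculator result by hand, the sum comes out in the same ballpark; the breakdown below is my own assumption, not the calculator’s actual output:

```python
# Hand-summed load estimate, roughly matching the ~822 W calculator figure.
estimate_watts = {
    "2x GTX 1080 at rated board power": 2 * 180,  # 360 W
    "2x Xeon at 140 W each":            2 * 140,  # 280 W
    "HDD, USB, FireWire, RAM, etc.":        180,  # padded guess for the rest
}

print(f"{sum(estimate_watts.values())} W estimated load")  # ~820 W
```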

What concerns me slightly is that I do not see a fan in the PSU. If there is one, it might happen that you get some noise under full load (all components maxed out). If there is no fan, you might get some nice fireworks one day when running this 24/7.

I guess you should try to use the two x16 slots. Do you have a manual for the mainboard? From what I can find, you have:

2x PCI Express 2.0 x16 (the ones with blue latches)
1x PCI - full length, full height
2x PCI Express 2.0 x16 (one in x4 mode, one in x8 mode)
1x PCI-X (64-bit/100MHz)

c.

I am not sure a standard ATX PSU will fit into this case, looking at the mounting holes.

c.


Agreed. It looks like the graphs I’ve seen also included CPU/RAM, so 250-300 W is a more appropriate per-card max load.