Adding Multiple GPUs - Good Advice Needed


Recommended Posts

OK, I want to get properly into GPU rendering (I use CPU mostly). I have little money but might be able to invest in an additional GPU every few months. The dilemma is how to add them to my setup.

 

I have been scratching my head about this for ages, so am making a concerted effort to reach out to the community for a solution. (I can't believe there aren't a lot of people in my position, but it seems you're all happily rendering away on your multi-GPU set-ups!)

 

Required: A solution providing...

 

1) Multiple GPUs (want to add 4-8 over time)

2) External to workstation (no space in case)

3) Avoiding need for additional software (licences)

4) Avoiding unnecessary bottlenecks (I/O)

5) Avoiding complicated solutions (I'm more "Artist" than "Geek")

6) Avoiding expensive options (such as the Netstore NA255)

7) Available in the UK (import will add to cost)

 

 

My Workstation - "Workstation Specialists WSX6.1" (ATX)

CPU: i7-3930K Six-Core @ 4.20GHz

CPU COOLER: Antec Kuhler h2o 920

MOBO: Asus P9X79 WS

RAM: 32GB

GPU: Quadro 6000 (Fermi)

PSU: EVGA 1000 T2 (1000W 80Plus Titanium)

OS: Win 7 Pro on SSD

 

I've looked at Amfeltec, and the quote I got seemed expensive for what it is. Also, apparently the I/O would cause a delay at the start of rendering.

 

The Netstore NA255 looks great to me; fast, all-in-one, etc., but it's soooo expensive here (I could buy nearly four GPUs for that!)

 

The prebuilt solutions are extravagant; I need to focus on buying GPUs, not paying someone to build me a rig.

 

Conversely, the bewildering array of adapters online is a minefield: the specs, the prices, the stupidly short splitter cables that wouldn't reach outside the case, etc. Will one of these be the solution?

 

I just want to get a bunch of GPUs and render. Is it really this difficult?

 

PSUs are easy enough; I have a spare 750W for now, and will add/replace in time. Enclosure: I could make a case or cage, if the electrical components all do the job regardless. But I am no computer builder.

 

Dear GPU renderfolks, please... "HELP!!!" ;D


Hi Thomas,

 

Shouldn't you begin with just one GPU and take it from there?

 

Well, I didn't mention it before, but I have other CPU machines doing DR, so the GPU solution needs to be significantly quicker in order to make test renders and real-time feedback profoundly faster than at present.

 

As it happens, I bought a 980ti a couple of months back, and it was faulty, so I sent it back the other day (and got a refund yesterday, actually). Anyway, I bought another 980ti last night (different brand), delivered tomorrow, so I'll fit it into the mobo then.

 

Thing is, that doesn't do much for me really. I need the new card to run three monitors, as the old one only runs two screens. So I'm just replacing the old 6000.

 

I can get a second 980ti in a few weeks, so I'll then have two 980tis, leaving one slot that I wouldn't want to cram another GPU into because of cooling (no space etc). (I'm sticking to reference-design cards without watercooling, btw.) Anyway, I'd have one spare PCIe slot (the fourth is in use), and that would probably be best used to link to a cluster of GPUs in order to start seeing a significant improvement in render times. After the first two GPUs, I imagine I'll start seeing the benefits of near-linear scaling across multiple GPUs.
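For anyone curious what "near-linear scaling" means for render times, here's a back-of-envelope sketch. The baseline time and the per-card efficiency figure are made-up numbers for illustration, not benchmarks; plug in your own measurements:

```python
# Back-of-envelope estimate of render time under near-linear GPU scaling.
# All numbers here are hypothetical examples, not measured benchmarks.

def render_time(baseline_minutes: float, num_gpus: int, efficiency: float = 0.9) -> float:
    """Estimated render time with num_gpus cards.

    efficiency < 1.0 models per-card overhead (scheduling, PCIe
    transfers) that keeps real-world scaling just below linear.
    """
    if num_gpus < 1:
        raise ValueError("need at least one GPU")
    # First card runs at full speed; each extra card adds `efficiency`
    # of a card's worth of throughput.
    speedup = 1 + efficiency * (num_gpus - 1)
    return baseline_minutes / speedup

if __name__ == "__main__":
    # Example: a 60-minute single-card frame across 1, 2, 4 and 8 cards.
    for n in (1, 2, 4, 8):
        print(f"{n} GPU(s): ~{render_time(60, n):.1f} min")
```

The point of the `efficiency` knob is that doubling cards rarely exactly halves render time; anything around 0.9 per extra card is usually counted as "linear enough".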

 

I hope I explained that well enough.

 

Since posting here, I have been talking to some of the guys in the GPU rendering groups on Facebook, and they've been very helpful, so I'm getting there. But it is more complicated when you want power on a budget. That's why I'm picking people's brains: if I'm canny, I may be able to get up and running without throwing too much money at it. It can get really expensive, quickly.

 

I may have to accept that building a GPU node with mobo and extra software licence will be the most economical way, depending on the options I find I am presented with. But I was hoping to focus on just getting more GPUs, and cracking on with some visuals.


Check out the octane render forums, the guys out there discuss a lot of those builds. They even have benchmark sheets for gpus, etc.

 

You can follow and talk with this guy on twitter. He's very active on the octane forums and he's building a beast with 7 gtx 980ti :-O

https://twitter.com/tomglimps/status/722020152045383680

 

Good luck with your project. And watch out for the gtx 1080, I just saw photos on the net. Looks sexy!


 

Cheers Philippe, I'm there already! - say hello if you see me :)


Well, I'm not the one with the brains, but I'll see if I can help you.

 

I think the best thing to do is get two GTXs in the case, let them run at full throttle for a long time, and carefully monitor your temps. See how the two cards perform with the software, and whether the performance is to your liking and adequate.

 

If performance is good but temps are too high, you may first want to try adding as many case fans as possible, in the case or next to the GPUs, to get all of that heat out of the case.

 

If temps are okay, around 80 degrees Celsius, you may want to add more GTXs and see how the temps are with the cards packed closely next to each other. It may be okay temps-wise.
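On the monitoring side: for NVIDIA cards, a small script polling nvidia-smi is an easy way to log temps during a long burn-in, rather than watching a GPU-Z window. This is a rough sketch; it assumes the driver's nvidia-smi tool is on your PATH, and the 85 C warning threshold is just an arbitrary example, not a spec limit:

```python
import subprocess

def parse_temps(csv_output: str) -> dict:
    """Parse 'index, temperature' CSV lines from nvidia-smi into {gpu_index: temp_c}."""
    temps = {}
    for line in csv_output.strip().splitlines():
        index, temp = (field.strip() for field in line.split(","))
        temps[int(index)] = int(temp)
    return temps

def read_gpu_temps() -> dict:
    """Query current GPU core temperatures via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=index,temperature.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return parse_temps(out)

if __name__ == "__main__":
    import time
    while True:  # Ctrl-C to stop
        for gpu, temp in read_gpu_temps().items():
            flag = "  <-- hot!" if temp >= 85 else ""
            print(f"GPU {gpu}: {temp} C{flag}")
        time.sleep(5)
```

Run it in a console while a long RT session hammers the cards, and you'll see straight away whether adding a second or third card pushes things past where you're comfortable.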

 

If temps are too high and, on top of that, you still want to add more GPUs, you may want to google mining rigs to get some ideas. Also have a look at the PCIe extension cables they use. The motherboard and the rest can be placed loose (outside the case), but the GPUs must have some rail to fix them to. As an extra, you can point some additional fans at the GPUs for more cooling during long render sessions, especially during those hot summers.

 

Watercooling will become expensive, I think. Dimitris has a very nice watercooled rig.


Got the 6000 and 980ti working together. Two monitors on the 6000, the third on the 980ti. Testing in RT at 99% load with fans at 100%, temps were around 70c on the Quadro and 45c on the GeForce. Taking the fans down to an acceptable noise level, the temps go up to 90c and 70c respectively. So all in all it's hot but acceptable; another card in the mobo wouldn't be a good idea, though. Very cramped, and no room for more fans. I could move a monitor over to the 980ti to try to even up the temps a bit.

 

But keeping everything in one case was never on the agenda. The WS is still pulling its weight, so best to stick with it and get some sort of GPU platform running in tandem.

 

There are several options, and I'm getting a handle on the pros and cons. I could start collecting components for a new-build GPU node, or save for a Netstore NA255. Not a lot in it price-wise. But by the time you've bought all the other components, a case (as opposed to a cage) isn't worth skimping on, by the looks of it.

 

The main thing to consider, I suppose, is whether to run a separate node on the LAN or an extender unit via PCIe. For the price of another licence, the benefits of an independent GPU node might be worth it. Food for thought still.

Edited by TomasEsperanza

What gpu renderer do you use? vray or octane?

 

If I had to build a multi gpu rack I would do something like that...maybe even homemade

 

http://tomglimps.com/custom-pc-for-otoys-octane-render-by-polder-animation/

 

V-Ray CPU mostly, with some GPU RT for feedback. Interested in Octane. Will probably compare the two and pick one once I've got a few more cards going. Am conversing with the main V-Ray and Octane GPU groups on Facebook too.


Good suggestion from Philippe.

 

Just watch out with the temps, because heat is a killer.

 

Swapped over a monitor, so the more powerful card takes two of the three. That has evened up the temps. Between now and the next 980ti, I should have a plan pinned down. Thanks for all your suggestions guys :)


I don't want to be a grouch, Thomas, but if GPU rendering (via RT GPU) is only for feedback, and final production is passed through CPU render nodes as you mentioned above, then I don't think it's wise to focus so much on an external GPU node of any kind. If, let's say, two 980Tis (the fastest GPUs on the market right now) don't already give real-time feedback in satisfying times, then there must be something wrong with either the software or the hardware in your system, and I suspect that a total of 4 or 5 980Tis wouldn't live up to your expectations either.

 

I'm not a gpu rendering guru, for sure (I'm not a guru of any kind in general). I just sense that you're focusing on something that's not the best for your field of interest.



Thanks man, I appreciate that. I would be using GPU for production if I had more cards; that's why I say I'm only using GPU for occasional feedback.

 

The GPU feedback is actually OK with two cards, but the response to exposure and shaders is rather different to CPU, so I feel it's necessary to choose between CPU and GPU for production. At the moment, if I prepare a scene for CPU but then use GPU to assess progress on exposure, lighting and materials, it doesn't always translate well. I can change things to make it look good one way or the other, but I find the scene generally won't sit well for both workflows simultaneously. Looking at the developments in GPU rendering technology, it seems logical to move over to a GPU workflow at some point soon. That's my rationale, anyway.

 

I'm not happy about the limitations of GPU, but I can see these being overcome in the next year or two. I won't pretend to understand all the technical jargon, but I am keeping an eye on developments to ascertain what capability is available. With a GPU infrastructure and workflow in place this year, I may have to replace the cards I'm buying now to take advantage of the shared system memory capability promised by manufacturers and developers. But hopefully it will simply be a matter of swapping out GPUs when that capability arrives (hopefully!). I read that Pascal cards would facilitate this, so maybe shared system RAM will be available before I've built up my proposed node of 6GB cards. Then it could just be a case of replacing the two cards I have now.

 

If people with a deeper understanding of the state of affairs regarding imminent GPU technology can simplify that for me and other end-users here, it would no doubt be very helpful.

