
Rackmount servers: i7 970 vs i7 2600k



Hi, I am planning on building rackmount render servers, and I have these two platforms at the top of my list.

 

Purpose: 100% dedicated to 3D rendering. Apps: Maya 2011 and V-Ray. Will render animation. GPU rendering is capped by the available GPU RAM, and from what I have seen the memory is not pooled when multiple GPUs are used, so it is not an option.

 

i7 970 build (per unit):

 

CPU: 622 euro, i7 970
RAM: 128 euro, 12GB Kingston HyperX (3x4GB, 1600MHz, CL9)
HDD: 32 euro, Western Digital Caviar Blue 500GB
PSU: 68 euro, SuperFlower Golden Green Pro 550W, 80 Plus Gold
Motherboard: 143 euro, Asus P6X58D-E
4U rackmount chassis: 98 euro, X-Case RM 400/10 V2 (up to 3 front 120mm fans and 2x 80mm exhaust fans)
Hot-swap SATA bay (single 3.5"): 13 euro
Cooler: 62 euro, Noctua NH-D14, or a Megahalems if the Noctua doesn't fit (case height limit: 176mm)
GPU: 30 euro

 

Total: 1196 euro
Cinebench @ 4GHz: 10.2 points
Power consumption: 290-320W at full load
Cores: 48 total across 8 builds (6 per CPU)

 

i7 2600k build (per unit):

 

CPU: 282 euro, i7 2600k
RAM: 131 euro, 12GB Mushkin Blackline (2x2GB + 2x4GB, 1600MHz, CL9)
HDD: 32 euro, same as above
PSU: 45 euro, Cougar 450W (86% efficiency at 50% load)
Motherboard: 127 euro, Gigabyte UD3P/UD3R or Asus P8P67
4U rackmount chassis: 98 euro, same X-Case as above (includes rails)
Hot-swap SATA bay: 13 euro
Cooler: 55 euro, Megahalems with 2 fans
GPU: 30 euro

 

Total: 813 euro

 

Planned OC: 4.2-4.5GHz, depending on stability testing results and temps
Cinebench: ~8.5 ±0.5 points, depending on the OC
Power consumption: 190-220W at full load
Cores: 44 total across 11 builds (4 per CPU)

 

To put things into perspective, my budget is ~9600 euro.

 

8x 970 builds: 9568 euro

11x 2600k builds: 8943 euro

*Note: the 2600k option will take up more space, so a second server cabinet will be needed; the ~600 euro saved compensates for buying two smaller server cabinets as opposed to one.
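For anyone who wants to sanity-check the totals, here is a quick sketch of the farm-level math (prices and core counts are from the lists above; the aggregate Cinebench numbers are just my per-build estimates multiplied out):

```python
# Rough farm-level totals from the per-build figures above.
# Cinebench scores are per-build estimates; the 2600k figure
# assumes the planned OC works out.
builds = {
    "i7 970":   {"price": 1196, "count": 8,  "cores": 6, "cb": 10.2},
    "i7 2600k": {"price": 813,  "count": 11, "cores": 4, "cb": 8.5},
}

for name, b in builds.items():
    cost = b["price"] * b["count"]
    cores = b["cores"] * b["count"]
    points = b["cb"] * b["count"]
    print(f"{name}: {cost} euro, {cores} cores, "
          f"~{points:.1f} points, ~{cost / points:.0f} euro/point")
```

On those estimates the 2600k farm delivers roughly 15% more aggregate throughput for about 600 euro less, before cabinets are factored in.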

 

Server cabinets: Digitus, with 6x 120mm top exhaust fans; both front and rear doors are meshed for airflow.

 

I have been planning this for some time, and I will start ordering parts within a 2-4 week timeframe. Also, I am sure many of you are screaming at me "*** wait for Bulldozer!!!" However, I have a deadline for my rendering, and I would not meet it by the time Bulldozer comes out.

 

Thank you for reading my post! Suggestions and opinions would be appreciated. :)


You know, when I saw your thread title I thought it was going to be a question requiring a cost/power/electricity/space analysis that I don't have time to think through right now, but it looks like you've already done that, and the 2600k comes out on top (the same money gets you a bit more power, assuming your OC plans work out, while using less electricity).


Yeah, it would most likely outperform the 970s for less money. However, the money saved on the computers would be offset by the extra space required. For example:

 

970s: 8 builds x 4U = 32U. For ~700 euro I could have a premium server cabinet with a built-in 6x120mm exhaust fan setup, and it would all be in one unit.

 

2600ks: 11 builds x 4U = 44U. The maximum is 42U, so I would need to buy two separate cabinets, which for the same money would realistically take up twice the space. Also, the cabinets would be of medium build quality, without a top-mount exhaust system included.
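The space math in one place, assuming 42U of usable height per cabinet as above:

```python
# How many 42U cabinets does each option need at 4U per build?
import math

CABINET_U = 42  # usable height per cabinet

for name, count in [("970s", 8), ("2600ks", 11)]:
    used = count * 4  # 4U chassis per build
    print(f"{name}: {used}U used, {math.ceil(used / CABINET_U)} cabinet(s)")
```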

 

Power-consumption-wise, I cannot help but think that the 2600ks might end up at basically the same total draw because of the extra fans and the extra RAM module per unit.
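Putting rough numbers on that hunch: the per-build draw figures above should already include each box's own fans and RAM, so the main extra for the 2600k option is the second cabinet's fan bank. Assuming ~3W per 120mm fan (my guess at a typical figure, not something I have measured):

```python
# Farm draw at full load, plus an allowance for cabinet exhaust fans.
# FAN_W is an assumed per-fan draw for a 120mm fan.
FAN_W = 3
FANS_PER_CABINET = 6  # the Digitus cabinets have 6x 120mm top exhausts

farms = [("i7 970", 8, 290, 320, 1), ("i7 2600k", 11, 190, 220, 2)]

for name, count, lo, hi, cabinets in farms:
    fan_w = cabinets * FANS_PER_CABINET * FAN_W
    print(f"{name}: {count * lo + fan_w}-{count * hi + fan_w} W "
          f"(incl. ~{fan_w} W of cabinet fans)")
```

So the ranges overlap, but even with the second cabinet's fans the 2600k farm still comes out a bit lower.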

 

In the end, I think it comes down to user convenience, experience, and available space. I know the 970 will work for me because my 980X with a Noctua NH-D14 does not exceed 63°C after 12+ hours of rendering. I guess another big factor is what OC I can achieve on the 2600ks, what temps I get, and how stable they would be.

 

It's such a hard decision!! It is, after all, over 10,000 euros...

 

Thanks for your input.

 

Edit: Another factor: RAM expandability.

 

The 2600k's four RAM slots would be fully loaded to reach 12GB (2x4GB + 2x2GB), while the 970 uses 3x4GB modules and keeps three spare slots for future expansion. If I were to add more RAM later, the 970 would make it easy; with the 2600k, I would have 22x 2GB modules to sell off just to reach its 16GB maximum, while the 970 can handle 24GB. The 2600k can handle 8GB modules, but they are way too expensive and would make the whole platform not cost-effective once I need more RAM.
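To lay the upgrade paths out side by side (treating 4GB sticks as the affordable ceiling, per the pricing point above):

```python
# RAM upgrade headroom per build, with 4GB sticks as the affordable ceiling.
platforms = {
    "i7 970 (X58, 6 slots)":   {"fitted": [4, 4, 4],    "slots": 6},
    "i7 2600k (P67, 4 slots)": {"fitted": [2, 2, 4, 4], "slots": 4},
}

for name, p in platforms.items():
    now = sum(p["fitted"])
    free = p["slots"] - len(p["fitted"])
    max_gb = p["slots"] * 4  # every slot holding a 4GB stick
    note = "just add sticks" if free else "must replace the 2GB sticks"
    print(f"{name}: {now}GB now, {free} free slot(s), {max_gb}GB max ({note})")
```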

 

On the other hand, if the 2600ks run stable at 4.5GHz, that would make them the clear winner, and starting with 4x4GB modules would not be too bad.

 

By the way, these days I am using 10GB on average when rendering my animations, and occasionally I have to optimize a scene to stay below 12GB, as I like to use huge 5000+ pixel textures with a ridiculous polygon count... Gotta love it :o

 

Suggestions welcome!


Personally, if it were me, I'd stay away from self-built rackmount servers.

 

I'm not averse to custom PC building in general; in fact, I build all of our 3D workstations.

 

It's just that with racks, getting hold of reliable rack-specific components and decent rack chassis is too much of a pain, and it always seems to result in unreliable servers. Even ones from the smaller manufacturers.

 

Of course, it typically comes down to cost, which I do understand, and I guess it depends on how much you rely on them for your work.

 

But when you can have 1U dual-processor purpose-built machines (Dell, HP, etc.), you're saving on both rack space and electricity. Plus, with dual processors you only need half the number of servers, and you get a host of rack-specific monitoring niceties that make life so much easier.

 

The most important thing is reliability. Having been through about six render farm generations now, all I can say is that the purpose-built machines from the major manufacturers are, without a doubt, orders of magnitude more reliable than the ones from small manufacturers or self-builds.

 

It's a peace-of-mind factor that is easily worth the price premium, and typically you will make the cost back by spending more time on chargeable work rather than fiddling with reluctant servers.

 

Now, I'm sure there are many people who will say "rubbish, my farm's rock solid", and that may be true, but in my experience that's unusual.

 

As for the cost, don't take the web configurators as gospel. If you're buying multiple machines you can always negotiate, and believe me, you might be surprised at the savings you can make. The last eight we bought allowed us to negotiate a healthy 20%-odd discount.

 

One other thing to remember is the heat and noise eight machines will generate. Even though you have fans exhausting the heat from the machines, if it's going into your working space it's going to get pretty uncomfortable pretty quickly. :)


Thank you for your opinion, tsmithf.

 

In terms of reliable rack-specific components, I can understand where you are coming from with regard to heat production. However, if one can find a decent chassis with plenty of cooling (3x 120mm front intakes and 2x 80mm exhaust fans), I don't see a problem. The chassis also comes with matching rails, so I would not have to worry about making my own fit.

 

I cannot speak for the reliability of the Sandy Bridge platform, but the 980X I have sitting at 4GHz is actually a very mild overclock, since these chips turbo to 3.6GHz out of the box, and the voltages are only slightly raised for rock-solid stability. The 970 is basically the same chip without the unlocked multiplier. "Mainstream" parts have come a long way, it seems; if we were talking 2-3 years ago I would agree with all your points 100%, but given the performance and reliability I have experienced from single X58 platforms (the 980X especially) with mild overclocking and good coolers, I don't see a reason not to pull the trigger.

 

I did research dual Xeon setups, especially the dual E5620 variants. However, one of those builds would likely cost ~300 euro more, and it actually scores lower in Cinebench, around ~9.2 versus ~10.2 for the 980X. You might be surprised to hear that the two options have basically the same power consumption: 2x 80W for the Xeons is 160W, which is about what a 970/980X puts out at 4GHz (comparing CPUs only), while the single-socket build provides more performance for less money.
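To make that comparison concrete, a rough cost-per-point check using the figures quoted above (the ~300 euro figure is the Xeon build's estimated extra cost over the 1196 euro 970 build):

```python
# Rough value check: dual Xeon E5620 vs an overclocked i7 970/980X.
# Costs and Cinebench scores are the figures quoted in this thread.
options = [
    ("dual Xeon E5620", 1196 + 300, 9.2),   # ~300 euro over the 970 build
    ("i7 970 @ 4GHz",   1196,       10.2),
]

for name, cost, score in options:
    print(f"{name}: {cost} euro, {score} points, {cost / score:.0f} euro/point")
```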

 

You do make excellent points on reliability, though, and that makes me want to steer clear of Sandy Bridge more and more, as it really is a shot in the dark.

 

And yeah, the cabinet will be placed in my office; since it is now winter, I don't have to worry about an AC unit until summer arrives :). Also, because these 4U chassis use 120mm/80mm fans, it is very possible to make a very quiet server, as I can swap out the fans. 1U servers, on the other hand, use 40mm fans, correct? Those are extremely loud.

 

Thank you for your honest explanation; I appreciate it.


2 weeks later...

I'm late to this party, but I would like to point out something that often gets neglected in these types of cost/performance analyses: the "human time" factor. It would take quite a bit of additional savings in hardware before I would consider implementing a solution that involves double the number of physical computers for the same performance. Over the lifespan of this setup, how many hours do you think will be lost to configuring, maintaining, and troubleshooting these machines? Monitoring Windows updates alone would drive me insane... :) Anyway, cost/performance is tricky because it often fails to factor in those activities outside of processing and power consumption. There's also the additional equipment to deal with (cabling, power supplies, switches, hard drives, etc.) that can drive up the cost over the long run more than is evident at the start.

 

Good luck!


I have two comments. One: the point above about maintaining them is not really an issue, as long as you're not continually tinkering with new software or installing plug-ins. Once your system is up and running, you don't need to do much other than the occasional MS update, and you can even automate that; most of my machines update as they power down.

 

My second point: don't get 4U racks, you'll regret it. My first farm used horizontal desktop boxes in a rack, roughly 4U apiece, and I regret going that route. I had to buy a massive 48U rack enclosure, and it's just a total waste of space. My advice is to get these CPUs into the smallest boxes you can; if you can't afford 1U, then go for the 2U from X-Case, as you can use mostly full-size components in them anyway. Also, be prepared for a LOT of noise from the fans (my X-Cases have 5 fans each) and a LOT of heat. I would always go for fewer boxes with more power; it's amazing how quickly these PCs become obsolete.

 

I reckon on about a three-year lifespan for a farm; after that, you start to notice things need to speed up.

