NVIDIA Quadro 6000 6GB or GTX 780 6GB?


simonm

Hi all

 

I have purchased the GTX 780 6GB and now I'm wondering if it was a wise decision. It's a great card and all, but I feel it struggled a little in some of my scenes, especially where there are multiple instances of Forest Pack objects. The Quadro mentioned is a few hundred dollars more, and I'm confident it will serve my purposes a little better.

 

Does anyone have any insight into this?

 

Thanks


Sorry, I'm referring to this Quadro 6000 - it's not the current model:

http://www.amazon.com/PNY-DisplayPort-Profesional-Graphics-VCQ6000-PB/dp/B0044XUD1U/ref=sr_1_1?ie=UTF8&qid=1413891146&sr=8-1&keywords=quadro+6000

 

A few hundred dollars? The current generation quadro 6000 is something to the tune of £4,000 (around 7,000 AUD), with the previous generations still costing around £2,000 to £3,000.

 

Where are you getting these prices?


Well, it's probably the old Fermi architecture, which was powerful in its day.

 

If it's purely viewport performance you're after, it probably will be better, given that the main difference is that the drivers are specifically geared towards polygon-heavy CAD usage.

 

If, however, you're after GPU rendering performance, I'd definitely stick with what you've got: the 780 has 2304 CUDA cores, whilst the old Quadro 6000 has 448. Quite a difference.

 

http://www.nvidia.com/object/product-quadro-6000-us.html

 

http://www.geforce.co.uk/hardware/desktop-gpus/geforce-gtx-780/specifications
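To put that core-count gap in rough numbers, here's a small back-of-the-envelope sketch. The shader clocks below are taken as assumptions from the published spec sheets, and the estimate ignores the big architectural differences between Fermi and Kepler, so treat the result as a rough indicator only:

```python
# Rough peak single-precision throughput estimate:
# GFLOPS ~= CUDA cores * shader clock (GHz) * 2
# (one fused multiply-add counted as two floating-point operations).
# Clock figures are assumptions based on the published spec sheets.
cards = {
    "GTX 780 (Kepler)":    {"cuda_cores": 2304, "shader_clock_ghz": 0.863},
    "Quadro 6000 (Fermi)": {"cuda_cores": 448,  "shader_clock_ghz": 1.147},
}

for name, spec in cards.items():
    gflops = spec["cuda_cores"] * spec["shader_clock_ghz"] * 2
    print(f"{name}: ~{gflops:.0f} GFLOPS peak single precision")
```

That works out to roughly a 4x raw-compute advantage for the GTX 780, which is what matters for GPU rendering; viewport feel, on the other hand, is driven far more by drivers than by these numbers.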


I agree with Chris: if you are looking mostly for viewport performance, I'd go with the Quadro. That's what we use at the office, and that's what I use at home. I've tried different GTXs, but they all fall short on precision and on very heavy line work. If you are into GPU rendering, though, a GTX will give you better performance for the money.

The Quadro 6000 has Fermi cores; yes, they are older than Kepler, but they are more efficient per core, which is why the GTX has far more of them. The latest Maxwell cores are much better still.


How are you (and others) confident it will be "better" for the viewport? The current K6000 might be, in particular scenes (heavy wireframe views), but otherwise it would be quite even. The old 6000? Absolutely not. It was mostly a rendering workhorse anyway.

 

Your 3dsMax version has more to do with performance nowadays than the card itself.


Your 3dsMax version has more to do with performance nowadays than the card itself.

 

This is true. Since Max 2009 the viewport performance of 3ds Max went down the drain, and I feel it only came back to life in the last two releases. 2015 seems pretty good. So much time wasted, anyway.

 

Regarding your question, you are right: moving from an expensive GTX to a Quadro won't be like night and day, but there are small details that make the difference. Besides, NVIDIA always cripples the drivers for the GTX, so they have less pixel precision than the Quadro drivers. For example: selecting faraway lines, double-face flicker in the viewport with very close faces, consistency in anti-aliasing, ghosting, FPS and others.

 

Viewport performance is always very subjective; we all have different requirements, and every scene is different. But with several years of experience, I can see the difference. I would love to pay the price of a GTX and get Quadro performance, but NVIDIA won't let you. A long time ago you could hack a GTX into a Quadro, but not anymore, and that's the whole deal: it hurts all of us to pay the hefty Quadro prices.


Agreed. A few friends have told me the visual quality (namely anti-aliasing) of linework is superior with a Quadro.

Outside of the drivers crippled by NVIDIA, Max isn't particularly well optimized either; performance is nothing to boast about with the AMD counterparts (Radeons or FirePros) too.

 

You find 2015 to be good? I installed the trial and it's miserable, just horrible. You open the Layer Manager, and big archviz scenes become unusable at that moment. Worst thing I have ever seen. I literally haven't heard a single positive thing from anyone around me. It's such a shame, as the features are great... but the performance with the idiotic Layer Manager is abysmal.


I have 2015 at home for freelancing; at the office we still use 2014, so my personal projects are not that large in comparison. But I just finished a large building in the forest (south of Chile), and performance-wise it was very good. It was a Revit building import, and the rest was Forest Pro.

I am talking only about the viewport, not the toolbars and menus; those still suck. Is that what you meant?

I was happy when I heard about nested layers; it's a great concept to me. I also use Cinema 4D, and the Max implementation is very lame by comparison, plus they hid other "natural" functions of the old layer system... oh well, nothing new from Autodesk, really.


I mean manipulating objects, or doing anything at all, when the following conditions apply:

 

Layer manager is open

and

Scene has many objects (1000+, not counting Forest Pack instances or proxies).

 

The same scene that is absolutely flawless in 2014 causes freezing, lag and overall slow behaviour in 2015. You close the Layer Manager, and it's fine.

 

Everyone I've talked with has this issue, yet with SP2 nothing has changed. Reminds me of the ex-Autodesk guy who wrote on this forum about how he rewrote the code to work with big scenes. Sure, it works "amazingly".

