Quadro FX 1700 SLI?


I was helping to spec some new workstations and found that a few of our older workstations have dual Quadro FX 1700 cards, but as far as I know these are not SLI-compatible.

 

It's only the higher-specced Quadros that will work with SLI.

 

Can someone confirm this so we can rip out the useless second cards and chuck them into some of our slower machines?

 

Bloody Dell, speccing workstations with non-SLI Quadros. Anyway, isn't SLI redundant in 3ds Max? I haven't seen a definitive answer.

 

Dell - FAIL


Looks like the FX 3700 is the lowest model that is capable of SLI...

http://www.nvidia.com/object/quadro_sli.html

 

The easiest thing to do would be to check the NVIDIA Control Panel and see if SLI is enabled (or even available in your case).
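If you'd rather check from a command prompt than dig through the control panel, the rough sketch below just asks Windows which video adapters it sees, which at least confirms both Quadros are detected before you worry about SLI modes. It shells out to wmic (which ships with XP and later); treat the exact output layout as an assumption.

```python
# Rough sketch: list the video adapters Windows reports, to confirm
# both Quadros are actually detected before worrying about SLI modes.
# Assumes a Windows machine with wmic on the PATH (XP and later).
import subprocess

output = subprocess.check_output(
    ["wmic", "path", "win32_VideoController", "get", "Name,DriverVersion"],
    universal_newlines=True,
)

# wmic prints a header row followed by one adapter per line.
rows = [line.strip() for line in output.splitlines() if line.strip()]
for adapter in rows[1:]:
    print(adapter)
```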

 

Also keep in mind that it is not possible to run dual-card SLI and dual monitors at the same time, so if you have two monitors you definitely aren't using SLI. On the other hand, with dual cards and dual monitors you can run in multi-head mode, with each card dedicated to one monitor, and get full resolution on both. Most single low-to-mid-range dual-headed cards only put out half resolution on the second head.


I would have thought you could get better performance by running in OpenGL with dual-card SLI, since that's what it's made for... but I just tested it and OpenGL is awful!

 

System specs: dual quad-core Xeons (2.6 GHz)

64-bit XP / 8 GB RAM

dual Quadro FX 4600s

 

Max test scene with 9.5 million polys...

 

In Direct3D, I get a minimum of 12 fps when the whole model is in view, and it jumps up to 70 fps when only part of the model is in view.

 

In OpenGL I get 0.2 fps when the whole model is in view and can't break 10 fps even when only part of it is in view.

 

I tested the scene with SLI on and off and got pretty much the same results, so maybe it really doesn't have any advantage. I know it has an advantage in ArchiCAD running OpenGL, which is why we configure all our systems this way, but if it's just Max it looks like it's a waste of money.

 

Can anyone else test to confirm?


I would be interested in hearing how/why those older workstations even had two FX 1700 cards to begin with.

... another example of pin-headed IT types ordering 3D equipment they know nothing about, or executives buying the most expensive workstation because, after all, the more it costs, the better it must be? :rolleyes:

