pedrom Posted May 5, 2003

Hi to all, I was wondering if anyone could tell me the weak points of the 3Dlabs cards, especially the Wildcat VP series, as no one seems to recommend them. I've been browsing the sites of some workstation vendors (Boxxtech, HP, Dell, etc.) and the only cards they seem to use are the high-end Wildcat 4, but even then, NVIDIA Quadro is always the default choice. Any help appreciated! Best regards, Pedro Meireles.
Greg Hess Posted May 5, 2003

> I was wondering if anyone could tell me the weak points of the 3Dlabs cards, especially the Wildcat VP series.

Expensive. Slower than the competition. Video overlay problems. Miscellaneous driver issues.
pedrom (Author) Posted May 6, 2003

I've seen some good reviews of the Wildcat VP870. I've even read somewhere that these cards have better 2D image quality than Matrox. And what about the Wildcat 4? I know it's expensive, but a lot of people seem to like these cards. In the end I think what counts is the type of application you'll be using (and of course the amount of money you can spend!). What I'm trying to understand is for what types of applications the 3Dlabs cards can be better than the NVIDIA ones. I've already had a discussion with garethace about this at the Ace's Hardware forum and he was very helpful. Thanks! Pedro Meireles.
Greg Hess Posted May 6, 2003

If the application of choice is 3dsmax, there is no better solution than an NVIDIA Quadro (at the moment). Why are NVIDIA cards always the default recommendation for 3dsmax/Discreet products? To put it simply... OpenGL in 3dsmax is a piece of crap. If you compare Max's OGL/D3D to any of the other major players in the business (XSI/Maya), their viewport implementations are FAR faster than 3dsmax's. Take the same size poly model, the same video card, and the same resolution/color depth, compare across packages, and 3dsmax will lose every time. Not just lose, but be shamed. Why is this? Max uses the same basic OGL base all the way from versions 1 and 2, while other programs have since moved on.

So with all this Max/Discreet bashing, what's the point? The point is that NVIDIA cards tend to be extremely forgiving about the type of OGL implementation the program/game is using. If you take a Wildcat 4 or 5 and throw it into 3dsmax, you've just wasted a huge amount of money. Why? Because Max can't make full use of the card's capabilities. You'll notice this in most reputable online reviews. See http://www.aceshardware.com/read.jsp?id=45000366 (since you were just talking about them). Look how badly the Wildcats lose. Those cards are very expensive compared to the Quadros in the test, and they get slaughtered in Max. BUT look what happens if you switch applications: http://www.aceshardware.com/read.jsp?id=45000363

The key questions are:
1) What is the budget?
2) What applications are you using?
3) When are you buying?

If your answers are:
1) Under 1k for the graphics card
2) 3dsmax5
3) Now

Then it's either a Quadro4 980 XGL or a Quadro FX 1000. I've worked personally with Wildcats before, and they're just not worth the price of admission if you're dealing with Discreet products (under 3.5k). As for the VPs... they are just slower versions of the respective Quadro cards.
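The apples-to-apples test described above (same poly model, same card, same resolution and color depth, different application) comes down to timing redraws of an identical scene. A minimal sketch of such a timing harness in Python; `fake_draw` is a hypothetical stand-in for whatever actually issues the application's viewport redraw:

```python
import time

def viewport_fps(draw_frame, frames=100):
    # Redraw the same scene `frames` times and return frames per second.
    # For a fair cross-application comparison, the model, resolution and
    # color depth must be identical on every run; only the package's
    # OGL/D3D implementation should vary.
    start = time.perf_counter()
    for _ in range(frames):
        draw_frame()
    elapsed = time.perf_counter() - start
    return frames / elapsed

# Hypothetical stand-in for a real draw call; a real harness would issue
# the application's actual OpenGL/Direct3D redraw here instead.
def fake_draw():
    sum(i * i for i in range(1000))

print(f"{viewport_fps(fake_draw, frames=50):.1f} fps")
```

A real comparison would swap `fake_draw` for each application's redraw while keeping everything else fixed, which is essentially what the benchmark reviews linked above do.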
pedrom (Author) Posted May 6, 2003

I think I'm starting to get the picture... So, if money was no object, would you recommend a Wildcat card for, say, Maya 5? Also, do you know if the Wildcat 4 cards are intended to replace the Wildcat III in the long run, or are they supposed to co-exist? Thanks for your answers. You've been very helpful! Pedro Meireles.
garethace Posted May 6, 2003

Go Greg!!!!! You are correct about MAX too; it's not really worth throwing extra dollars/euros at its craaaap OGL implementation. Mind you, if you want to use a bit of Lightscape, a good card is useful? What about Lightscape and the Wildcat?
Greg Hess Posted May 6, 2003

> So, if money was no object, would you recommend a Wildcat card for, say, Maya 5?

No. I'd recommend a Quadro FX 2000. Quite possibly the fastest card currently available in a wide variety of "CG animation" programs.

> Also, do you know if the Wildcat 4 cards are intended to replace the Wildcat III in the long run, or are they supposed to co-exist?

Supposed to replace. Hey Gare, I don't know enough about Lightscape to make an informed recommendation on that subject.
garethace Posted May 6, 2003

Fair enough. Lightscape is software without a home right now too, if I understand the forums correctly. Development stopped around 1999 with the 3.2 edition, and that was the end of it. I was wondering about installing a Wildcat card and the power supply/motherboard requirements, particularly since you do mention a 400 watt+ power supply for a dual system. Even though Quadro/GeForce cards have been much easier to install and work with from a system-building point of view, I wonder whether that is still the case today with the GeForce FX? I mean, how are motherboards generally coping with the power draw of a beast like that? P.S. Hope you got something from my few CAD software tips in the other thread, Greg.
Greg Hess Posted May 6, 2003

> I mean, how are motherboards generally coping with the power draw of a beast like that?

The GeForce/Quadro FXs use an internal power connector to draw additional power, so the AGP slot doesn't have to carry the whole load. The Wildcats require specific boards with specific voltage requirements (aka they don't work in every motherboard).

> P.S. Hope you got something from my few CAD software tips in the other thread, Greg.

Aye I did. I learned it's another wonderful world of applications which follows different rules and different configurations for optimum performance. Thanks for all the info.
garethace Posted May 6, 2003

Well, the basic and most important point of all is about these guys who model, I mean really model. They cannot understand the concept of polygon counts and polygon-count ceilings; to the best CAD modellers out there, there is no such thing as a polygon-count ceiling. Which makes it a real pity that software like Lightwave is not more popular, at least for presentation work. The original Itanium could open large models in roughly 3 days, whereas Itanium 2 can run the same 'opening the model' benchmark in 8-9 hours. This represented a huge stride in CAD modelling performance. So basically, CAD models do tend to stress the single-workstation, single-user idea to a very high degree. But then again, the world around us is designed in CAD now: space ships, trains, power stations, bridges... My experience has basically been that CAD engineers have tried to go from letter A to letter Z too fast, faster than computer performance could keep pace with (within acceptable price parameters).
garethace Posted May 12, 2003

Allplan is one very good modelling environment, and it has pretty good shaded dynamic graphics acceleration in the viewport. I picked up this comment on the Allplan forum hardware thread:

> One thing that is really obvious in comparing the NVIDIA FX 2000 with the Wildcat 6110, Wildcat 5000 and Gloria III is how vector redraw speed is getting slower in newer cards. These new cards seem to emphasise shading and rendering at the expense of line drawing speed. The Wildcat 5000 is very good at line redraws in Allplan; the Wildcat 6110 is only faster when you assign all the memory to 'offscreen', otherwise it is much slower than the 5000; and the FX 2000 is so bad at line redraws that the new Allplan 2003 feature where you can move the viewport images around is unusable (this may improve when the latest promised drivers are released, but I doubt it).

I find this comment interesting. A strength of a better graphics board for intensive 3D CAD work is the responsiveness of the GUI and dialogue boxes, together with line/mesh redraw acceleration. You tend to see this more with faster hard drives like SCSI, with more memory, and with multiple CPUs on a good operating system like NT, W2K or XP. The GPU should help too, but apparently it doesn't with the latest batch of gaming rasteriser-type cards re-packaged to fit intensive CAD requirements.

I've been looking at mobile graphics today, which has always been a bit hit-and-miss with Allplan. While an NVIDIA Go graphics laptop would have been sensible, they don't seem to be available from Sony, and so in the end I bought a new Sony Vaio GRV616G: half the price of the 'professional' GRX616SP, and with ATI Mobility Radeon 9000 64 MB graphics rather than the 7500 of the GRX. I used to avoid ATI cards for their buggy drivers, but rumour has it that things have improved. We'll see! The all-round fastest machine for Allplan, however, is still the Sony laptop!!
Given my experiences with the ATI card in the Sony, I think it might be worth giving the ATI Radeon 9xxx range a try for a budget workstation card, although I have been told that older ATI cards had problems with Allplan.