Workstation Graphics: AGP Cross Section 2004
by Derek Wilson on December 23, 2004 4:14 PM EST - Posted in GPUs
Final Words
The sheer amount of data contained in this review is overwhelming, and if you've made it this far, congratulations. Architecturally, ATI and NVIDIA both base their workstation-level parts on consumer-level boards, while the 3Dlabs workstation-only approach is tried and true in the marketplace. The similarities between the architectures serve to validate all of these parts as high-quality workstation solutions.
Among the disappointments that we suffered during testing was the lack of a GLSL benchmark that could balance out the picture we saw with ShaderMark. The consumer-based architectures of ATI and NVIDIA have a natural bias toward HLSL support, while 3Dlabs has little need to put much effort into optimizing its HLSL path. The firm grasp that OpenGL has as a standard among workstation applications goes well beyond inertia: the clean, state-driven approach of OpenGL is predictable, well defined, and powerful. It is only natural for 3Dlabs to support GLSL first and foremost, while NVIDIA and ATI cater to Microsoft before anyone else. We are working to solve this problem and hope to bring a solution to our next workstation article.
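By way of illustration, here is a minimal sketch of how a GLSL shader reaches the hardware: the application hands raw GLSL source straight to the driver, which compiles it itself, so any GLSL benchmark measures the vendor's compiler as much as the silicon. (This is our own illustrative sketch, not part of any test suite; it assumes OpenGL 2.0 entry points via an extension loader such as GLEW.)

    #include <GL/glew.h>
    #include <cstdio>

    // Compile a GLSL fragment shader at run time. Assumes a current GL
    // context and a prior successful glewInit() call. The driver itself
    // compiles the source text.
    GLuint compileFragmentShader(const char* source)
    {
        GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
        glShaderSource(shader, 1, &source, 0);
        glCompileShader(shader);

        GLint compiled = GL_FALSE;
        glGetShaderiv(shader, GL_COMPILE_STATUS, &compiled);
        if (!compiled) {
            char log[1024];
            glGetShaderInfoLog(shader, sizeof(log), 0, log);
            std::fprintf(stderr, "GLSL compile failed: %s\n", log);
            glDeleteShader(shader);
            return 0;
        }
        return shader;
    }

Contrast this with Direct3D, where HLSL is compiled to standard bytecode by Microsoft's tools before the driver ever sees it.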
We also ran into an issue while testing our Quadro FX 4000 on the DK8N board. Running SPECviewperf without setting the affinity of the process to a single processor resulted in a BSOD (stop 0xEA) error. We are working with NVIDIA to determine the source of this issue.
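In the meantime, pinning the benchmark to a single CPU sidesteps the crash. Here is a minimal launcher sketch using the Win32 API; the viewperf.exe command line is a stand-in rather than our exact invocation:

    #include <windows.h>
    #include <cstdio>

    int main()
    {
        STARTUPINFOA si = { sizeof(si) };
        PROCESS_INFORMATION pi;
        char cmd[] = "viewperf.exe";   // placeholder benchmark command line

        // Start suspended so the affinity mask is in place before any
        // benchmark code runs.
        if (!CreateProcessA(0, cmd, 0, 0, FALSE, CREATE_SUSPENDED,
                            0, 0, &si, &pi)) {
            std::fprintf(stderr, "launch failed (%lu)\n", GetLastError());
            return 1;
        }
        SetProcessAffinityMask(pi.hProcess, 0x1);  // pin to CPU 0
        ResumeThread(pi.hThread);

        WaitForSingleObject(pi.hProcess, INFINITE);
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
        return 0;
    }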
In tallying up the final results of our testing today, we have to take a look at the situation from a couple of different perspectives.
The largest market in workstation graphics is the CAD/CAM market, and most large-scale engineering and design firms have a very large budget for workstation components. In those cases, top productivity is sought at all times, so the top-performing part for the application in use will be purchased with little regard for cost. As most of our benchmarks show, the NVIDIA Quadro FX 4000 is able to push ahead of the competition; notable exceptions are the EnSight and SolidWorks SPECviewperf viewsets. Generally speaking, an engineer who needs the highest-performing AGP workstation part on the market today will need the Quadro FX 4000, and cost will be no object.
The DCC workstation market is smaller than the CAD/CAM segment, and it sees more small to mid-sized design houses. Here, cost is more of a factor than it is at a company that would, for instance, design cars. When looking at a workstation part, productivity is still important, but price/performance matters much more. With the 3Dlabs Wildcat Realizm 200 coming in just behind the Quadro FX 4000 in most cases, its significantly lower cost makes it a much better value for those on a budget. The street price of the Quadro FX 4000 is at least $700 more than either the Realizm 200 or the FireGL X3-256. That's almost enough to pick up a second 3Dlabs or ATI solution.
The ATI FireGL X3-256 is really targeted at an upper mid-range workstation position, and its performance numbers hit that target very solidly. The ATI part is, after all, a 12-pixel-pipe solution clocked at 490MHz, while the high-end consumer part from ATI is a 16-pixel-pipe part clocked at 500MHz. Bringing out an AGP-based solution derived from the XT line with 1.6ns GDDR3 (rather than the 2.0ns memory that the X3 has) would very likely push ATI up in performance against its competition. It might simply be that ATI doesn't want to step on its FireGL V7100 PCI Express part, which is just about what we want to see in a high-end workstation solution. When all is said and done, the FireGL X3-256 is a very nice upper mid-range workstation card that is even able to top the high-end AGP workstation parts in a benchmark or two. Its antialiased line support is faster and smoother-looking than the competition in most cases, but when a lot of lines are piled on top of one another, the result can look a little blurrier than on the other two cards.
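For those curious, antialiased lines are plain fixed-function OpenGL state that workstation drivers accelerate. A minimal illustrative sketch of what a CAD viewport might enable (standard OpenGL 1.x calls, nothing vendor-specific):

    #include <GL/gl.h>

    // Draw antialiased (smoothed) lines; AA lines require blending to
    // feather their edges against the background.
    void drawSmoothWireframe()
    {
        glEnable(GL_LINE_SMOOTH);
        glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

        glBegin(GL_LINES);
        glVertex3f(-1.0f, 0.0f, 0.0f);
        glVertex3f( 1.0f, 0.0f, 0.0f);
        glEnd();

        glDisable(GL_BLEND);
        glDisable(GL_LINE_SMOOTH);
    }

The blurring that we saw on heavily overlapping FireGL lines is consistent with this blend-based approach: each smoothed line blends against the ones beneath it.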
The real downside for the FireGL X3-256 is that we were able to find Wildcat Realizm 200 cards for lower prices. The FireGL parts are currently selling for very nearly their MSRP, which may indicate that ATI is having some availability issues even on the workstation side. With the 3Dlabs solution priced at the same level as the ATI solution, there is almost no reason not to go with the higher-performing Wildcat Realizm 200.
But if your line of work requires the use of HLSL shaders, or you are a game developer hoping to do double duty with DCC applications and in-engine tools, the 3Dlabs Wildcat Realizm 200 is not for you. GLSL shaders are quite well supported on the Realizm line, but anything having to do with HLSL runs very slowly. Many of the ShaderMark shaders looked fine, but the more complex ones seemed to break down. This can likely be fixed through driver updates if 3Dlabs is able to address its HLSL issues in a timely and efficient manner. If price/performance is an issue, a workstation part is called for, and HLSL is needed (say, you're with a game design firm and you want to test and run your HLSL shaders in your DCC application), then we can give a thumbs up to the FireGL X3-256.
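For context, the Direct3D HLSL path works differently from GLSL: the application compiles HLSL to standard shader bytecode with Microsoft's D3DX compiler, and the vendor driver then translates that bytecode for its hardware, which is where per-card HLSL performance is won or lost. A minimal illustrative sketch of our own, assuming the Direct3D 9 D3DX utility library; "main" and "ps_2_0" are placeholder entry point and profile choices:

    #include <d3dx9.h>

    // Compile HLSL source to bytecode with D3DX, then hand the bytecode
    // to the driver as a pixel shader.
    IDirect3DPixelShader9* createPixelShader(IDirect3DDevice9* device,
                                             const char* hlslSource,
                                             UINT sourceLen)
    {
        ID3DXBuffer* bytecode = 0;
        ID3DXBuffer* errors = 0;

        HRESULT hr = D3DXCompileShader(hlslSource, sourceLen, 0, 0,
                                       "main", "ps_2_0", 0,
                                       &bytecode, &errors, 0);
        if (FAILED(hr)) {
            if (errors) errors->Release();
            return 0;
        }

        IDirect3DPixelShader9* shader = 0;
        device->CreatePixelShader(
            (const DWORD*)bytecode->GetBufferPointer(), &shader);
        bytecode->Release();
        return shader;
    }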
We were also disappointed to see that the Wildcat Realizm didn't produce the expected line stippling under 3D Studio Max 6 SP1. There are line stipple tests in the SPECviewperf 8.0.1 benchmark that appeared to run fine, so we are rather surprised to see this. A fix for the flickering viewports when using the custom driver is also something that we want to see.
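Line stippling itself is basic fixed-function OpenGL, which makes the failure all the more puzzling. An illustrative sketch of the feature in question:

    #include <GL/gl.h>

    // Draw a dashed line via OpenGL line stippling. The 16-bit pattern
    // 0x00FF gives 8 pixels on, 8 pixels off, repeated along the line.
    void drawDashedEdge()
    {
        glEnable(GL_LINE_STIPPLE);
        glLineStipple(1, 0x00FF);   // repeat factor 1, on/off bit pattern

        glBegin(GL_LINES);
        glVertex3f(0.0f, 0.0f, 0.0f);
        glVertex3f(1.0f, 1.0f, 0.0f);
        glEnd();

        glDisable(GL_LINE_STIPPLE);
    }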
The final surprise of the day was how poorly the consumer-level cards performed in comparison to the rest of the lineup. Even though we took the time to select the highest-clocked monstrosities that we could find, we could not push past the workstation parts most of the time. There were some cases where individual tests were faster, but not the types of tests that are most used in workstation settings. Generally speaking, pushing vertices and lines, accelerating OpenGL state and logic operations, supporting overlay planes, handling multiple clip regions, supporting hardware two-sided lighting in the fixed-function pipeline, and all the other extra goodies that workstation-class hardware has just make these applications run a lot faster.
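To make a couple of those concrete, here is what two of the named states look like as plain OpenGL calls; an illustrative sketch, not code from any of our benchmarks:

    #include <GL/gl.h>

    // Fixed-function states that workstation drivers accelerate in
    // hardware while consumer drivers often take a slower path.
    void enableWorkstationStates()
    {
        // Two-sided lighting: back faces are lit with flipped normals,
        // common in CAD models with open surfaces.
        glLightModeli(GL_LIGHT_MODEL_TWO_SIDE, GL_TRUE);

        // Color logic ops, e.g. the XOR rubber-banding many CAD
        // packages use for selection rectangles.
        glEnable(GL_COLOR_LOGIC_OP);
        glLogicOp(GL_XOR);
    }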
On the high end of performance in the AGP workstation market, we have the NVIDIA Quadro FX 4000. The leader in price/performance for AGP workstations at the end of 2004 is the 3Dlabs Wildcat Realizm 200. Hopefully, 2005 and our first PCI Express workstation graphics review will be as exciting as this one.
25 Comments
Sword - Friday, December 24, 2004 - link
Hi again, I want to add to my first post that there were 2 parts and a complex assembly (>110 very complex parts without simplified rep).
The amount of data to process was pretty high (XP shows >400 MB, and it can go up to 600 MB).
About the specific features, I believe that most CAD users do not use them. People like me, mechanical engineers and other engineers, are using software like Pro/E, UGS, SolidWorks, Inventor and Catia for solid modeling without any textures or special effects.
My comment was really to point out that the high-end features seem useless in real-world engineering applications.
I still believe that for 3D multimedia content, there is a place for high-end workstations, and the SPECviewperf benchmark is a good tool for that.
Dubb - Friday, December 24, 2004 - link
How about throwing in soft-Quadro'd cards? When people realize that with a little effort they can take a $350 6800GT to near-Quadro FX 4000 performance, that changes the pricing issue a bit.

Slaimus - Friday, December 24, 2004 - link

If the Realizm 200 performs this well, it will be scary to see the 800 in action.

DerekWilson - Friday, December 24, 2004 - link
dvinnen, workstation cards are higher margin -- selling consumer parts may be higher volume, but the competition is harder as well. Creative would have to really change their business model if they wanted to sell consumer parts.

Sword, like we mentioned, the size of the data set tested has a large impact on performance in our tests. Also, Draven31 is correct -- a lot depends on the specific features that you end up using during your normal work day.
Draven31, 3Dlabs drivers have improved greatly with the Realizm compared to what we've seen in the past. In fact, the Realizm does a much better job of video overlay playback as well.
Since one feature of the Quadro and Realizm cards is their ability to run genlock/framelock video walls, perhaps a video playback/editing test would make a good addition to our benchmark suite.
Draven31 - Friday, December 24, 2004 - link
Coming up with the difference between the SPECviewperf tests and real-world 3D work means finding out which "high-end card" features the test is using and then turning them off in the tests. With NVIDIA cards, this usually starts with antialiased lines. It also depends on whether the application you are running even uses these features... in LightWave 3D, the 'pro' cards and the consumer cards are very comparable performance-wise because it doesn't use these so-called 'high-end' features very extensively.

And while they may be faster in some Viewperf tests, 3Dlabs drivers generally suck. Having owned and/or used several, I can tell you any app that uses DirectX overlays as part of its display routines is going to either be slow or not work at all. For actual application use, 3Dlabs cards are useless. I've seen 3Dlabs cards choke on DirectX apps, and that includes both games and applications that do windowed video playback on the desktop (for instance, video editing and compositing apps).
Sword - Thursday, December 23, 2004 - link
Hi everyone, I am a mechanical engineer in Canada and I am a fan of AnandTech.
Last year, I made a very big comparison of mainstream vs. workstation video cards for our internal use (the company I work for).
The goal was to compare the different systems (and mainly the video cards) to see if, in Pro/ENGINEER and the kind of work we do, we could take real advantage of a high-end workstation video card.
My conclusion is very clear: in SPECviewperf, there is a huge difference between mainstream and workstation video cards. BUT, in day-to-day work, there is no real difference in our results.
To summarize, I made a benchmark in Pro/E using trail files with 3 of our most complex parts. I made comparisons in shading, wireframe and hidden line, and I also verified the regeneration time for each part. The benchmark was almost 1 hour long. I compared 3Dlabs, ATI professional, NVIDIA professional and NVIDIA mainstream products.
My point is: do not believe SPECviewperf!! Make your own comparison with your actual day-to-day work to see if you really have to spend $1000 per video card. Also, take the time to choose the right components so you minimize the calculation time.
If anyone at Anandtech is willing to take a look at my study, I am willing to share the results.
Thank you
dvinnen - Thursday, December 23, 2004 - link
I always wondered why Creative (they own 3Dlabs) never made a consumer edition of the Wildcat. Seems like a smallish market when it wouldn't be all that hard to expand into consumer cards.

Cygni - Thursday, December 23, 2004 - link

I'm surprised by the power of the Wildcat, really... great for the dollar.

DerekWilson - Thursday, December 23, 2004 - link
mattsaccount, glad we could help out with that :-)
There have been some reports of people getting consumer-level drivers to install on workstation-class parts, which should give better performance numbers for the ATI and NVIDIA parts under games. But keep in mind that the trend in workstation parts is to clock them at lower speeds than the current highest-end consumer-level products for heat and stability reasons.
If you're a gamer who's insane about performance, you'd be much better off paying $800 on eBay for the ultra-rare, uberclocked parts from ATI and NVIDIA than going out and getting a workstation-class card.
Now, if you're a programmer, having access to the workstation-level features is fun and interesting, but probably not worth the money in most cases.
Only people who want workstation class features should buy workstation class cards.
Derek Wilson
mattsaccount - Thursday, December 23, 2004 - link
Yes, very interesting. This gives me and lots of others something to point to when someone asks why they shouldn't get the multi-thousand dollar video card if they want top gaming performance :)