ATI Radeon X700 XT: More PCIe Midrange
by Derek Wilson on September 21, 2004 5:58 AM EST - Posted in GPUs
The Test
The same general guidelines apply to this review as applied to our previous 6600 GT review. We are still comparing many PCI Express Intel system numbers to AGP 8x AMD based systems. Numbers for cards other than the 6600, X700, and X600 were run with older drivers, as they are taken from previous reviews we have done. We are in the process of updating all of our performance numbers with the latest drivers (and then some), but we were unable to do so before this review. The 6600 GT, X700 XT, and X600 XT numbers can all be reliably compared, but please use the other numbers for reference only. For most people, the key thing to take away will be the relative performance between the competing $200 ATI and NVIDIA parts anyway, so we hope that this setup won't be an inconvenience.

Processor(s): AMD Athlon 64 3400+
RAM: 2 x 512MB OCZ 3500 Platinum Ltd (2:3:2:10)
Hard Drives: Seagate 120GB 7200 RPM (8MB Buffer)
Video AGP & IDE Bus Master Drivers: VIA Hyperion 4.51
Video Card(s): ATI Radeon X700 XT
Video Drivers: ATI Catalyst 4.6
Operating System(s): Windows XP Professional SP2
Motherboards: MSI MS-6702E (VIA K8T800 Pro Chipset)
As previously stated, older drivers were used for the numbers taken from previous reviews; only the 6600 GT, X700 XT, and X600 XT were run with the latest drivers. These three cards were also run on the Intel based PCI Express system, while the other cards were run on AMD based AGP systems. The AGP cards were run in the Athlon 64 FX-53 system for the Doom 3 and Source engine tests, and in the 3400+ system for everything else.
We wanted to include Aquamark 3 numbers, but we were unable to get the X700 XT to the point where it would spit out a score for us. The test ran all the way through, but simply never displayed the result. This is most likely an issue on Aquamark's end, as the card gets through the entire benchmark just fine every time.
We should also mention that we did not use the Catalyst Control Center for these tests, as we did not want to install the .NET Framework on our test system. All of the driver's functionality is still there, with the exception of being able to disable certain optimizations that ATI is now rolling into a package called Catalyst AI (which we will talk more about in a separate article). Thus, all of the default Catalyst AI settings are used, along with our standard control panel setup (all defaults with vsync disabled).
40 Comments
Entropy531 - Tuesday, September 21, 2004 - link
Didn't the article say the pro (256mb) was the same price as the XT (128mb)? It does seem odd that the 6600s are only pci-e. Especially since nVidia only makes motherboards with AGP slots, right?

Drayvn - Tuesday, September 21, 2004 - link
However, on this site http://www.hothardware.com/viewarticle.cfm?article... it shows the X700XT edged out a win overall.

What I think is ATi are doing what nVidia did in the high end market: they brought out the X700Pro, which is very close to the X700XT, but cheaper, and probably highly moddable.
Buy a X700Pro with 5 - 10% loss of performance for $60 less?
blckgrffn - Tuesday, September 21, 2004 - link
What mystifies me (still) is the performance discrepancy between the 6800 and 6600 GT. In some cases, the 6600 GT is whooping up on it. The 6600GT preview article made some allusions to 12 pipes not being as efficient as 8 and 16, etc. But if the performance is really so close between them, the 6800 is probably going to go the way of the 9500 Pro. That's too bad; my 6800 clocked at 400/825 is pretty nice. If anyone could clear up why the 6600 GT is faster than the 6800, that would be nice. The fill rates should be nearly identical, I guess. But doesn't the 6800 retain its 6 vertex shaders, and wouldn't the extra memory bandwidth make a noticeable difference?
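(A quick back-of-the-envelope sketch of that fill rate point, assuming roughly the reference specs rather than anything measured in the article: 8 pipes at ~500 MHz with 1000 MHz effective memory on a 128-bit bus for the 6600 GT, versus 12 pipes at ~325 MHz with 700 MHz effective memory on a 256-bit bus for the vanilla 6800.)

# Hypothetical reference specs, not figures from this review
cards = {
    "GeForce 6600 GT": {"pipes": 8,  "core_mhz": 500, "mem_mhz_eff": 1000, "bus_bits": 128},
    "GeForce 6800":    {"pipes": 12, "core_mhz": 325, "mem_mhz_eff": 700,  "bus_bits": 256},
}

for name, c in cards.items():
    fill_rate = c["pipes"] * c["core_mhz"] / 1000.0            # theoretical gigapixels/sec
    bandwidth = c["bus_bits"] / 8 * c["mem_mhz_eff"] / 1000.0   # theoretical GB/sec
    print(f"{name}: {fill_rate:.1f} Gpix/s fill, {bandwidth:.1f} GB/s bandwidth")

On those assumed clocks, the theoretical fill rates land within a few percent of each other (4.0 vs 3.9 Gpix/s), while the 256-bit bus gives the 6800 roughly 40% more theoretical bandwidth (22.4 vs 16.0 GB/s).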
Resh - Tuesday, September 21, 2004 - link

Just wish nVidia would come out with the NF4 NOW with PCI-Express, etc. A board with two 16x slots, one 6600GT now, and one later is looking pretty awesome.

rf - Tuesday, September 21, 2004 - link
Looks like ATI dropped the ball - 12 months or more kicking nVidia's ass and now they are the ones lagging behind.

Oh well, I am not in the market for a graphics card at the moment (bought a 9800XT last year) but if I was, I'd be switching to nVidia.
I do have to say that the move away from AGP is annoying. What about the people that want to upgrade their components? Are we supposed to ditch kit that is less than 6 months old?
ZobarStyl - Tuesday, September 21, 2004 - link
I must agree, all things considered, the 6600GT really comes out the winner... I mean, look at the x800/6800 launch: the x800Pro looked like it just massacred the 6800GT, and now no one thinks twice about which is better at the $400 price point because nV put out some massive driver increases. Considering the 6600GT already has the performance AND feature advantage over the x700, there's just no contest when you add in what the nV driver team is going to do for its perf. Can't wait to dual up two 6600GT's (not SLI, multimonitor =) )

LocutusX - Tuesday, September 21, 2004 - link
Just to be clear, I think #3's statement was invalid simply because Nvidia is winning half the Direct3D games as well as all the OGL games.

LocutusX - Tuesday, September 21, 2004 - link
#3: "Again we see ATI=DX, nVidia=OpenGL. "Nah, don't think so. Here are the notes I took while I read the article;
6600gt
d3 (big win) - OGL
far cry (with max AA/AF) - DX9
halo - DX9
jedi academy (big win) - OGL
UT (tie) - DX8/DX9
x700xt
far cry (with NO aa/af) - DX9
source engine (small win) - DX9
UT (tie) - DX8/DX9
I'm sorry to say it, but the X700XT is a disappointment. I'm not an "nvidiot"; check my forum profile, I'm an ATI owner.
Shinei - Tuesday, September 21, 2004 - link
#11: Probably because you won't have much money left for a video card after you buy all the new crap you need for a Prescott system. ;)

Anyway, this quote made me wonder a bit.
"From this test, it looks like the X700 is the better card for source based games unless you want to run at really high quality settings."
Er, if I can get great graphics at a decent framerate (42fps is pretty good for 16x12 with AA/AF, if you ask me (beats the hell out of Halo's horribly designed engine)), why WOULDN'T I turn on all the goodies? Then again, I used to enable AA/AF with my Ti4200 too, so my opinion may be slightly biased. ;)
Woodchuck2000 - Tuesday, September 21, 2004 - link
#10 - I agree entirely! These are midrange cards, yet they're released first as PCIe parts, which are only available in high-end Intel solutions. Why does this make sense?