NVIDIA GeForce 7800 GT: Rounding Out The High End
by Derek Wilson & Josh Venning on August 11, 2005 12:15 PM EST - Posted in GPUs
Doom 3 Performance
One of the most demanding games that we test in terms of graphics, Doom 3 shows some impressive gains here. Let's take a look.

We'll start by comparing the 6800 Ultra and the 7800 GT. The most notable increase is at 2048x1536 with AA enabled, where the 7800 GT delivers a 43% improvement in fps. We see a similar increase (48.4%) at that resolution without AA enabled, but with AA, we went from 19.3 fps, an unplayable framerate, to 27.6 fps, which is borderline playable. At 1600x1200, both the AA and no-AA tests show only about a 14% increase.
As expected, we see higher gains when we compare a single 6800 Ultra to two in SLI mode. Without AA, the framerates at both resolutions increase by around 30 fps: a 34% gain at 16x12 and a 77.6% gain at 20x15. The gains are even more impressive with AA enabled: 16x12 with AA goes from 41.6 to 75.4 fps, an increase of 81.3%, and 20x15 with AA goes from 19.3 to 38.8 fps, an impressive 101% increase.
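As a quick sanity check, the percentages above follow directly from the raw framerates. Here's a minimal sketch recomputing them, using only the fps figures quoted in this section:

```python
# Recompute the percentage gains quoted above from the raw framerates.
def pct_gain(before_fps, after_fps):
    """Relative improvement of after_fps over before_fps, in percent."""
    return (after_fps / before_fps - 1) * 100

# 2048x1536 with AA: single 6800 Ultra -> 7800 GT
print(f"7800 GT over 6800 Ultra, 20x15 AA: {pct_gain(19.3, 27.6):.1f}%")  # ~43%
# 1600x1200 with AA: single 6800 Ultra -> 6800 Ultra SLI
print(f"6800U SLI over 6800U, 16x12 AA:    {pct_gain(41.6, 75.4):.1f}%")  # ~81.3%
# 2048x1536 with AA: single 6800 Ultra -> 6800 Ultra SLI
print(f"6800U SLI over 6800U, 20x15 AA:    {pct_gain(19.3, 38.8):.1f}%")  # ~101%
```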
The gains that we see with the 7800 GT will definitely make a difference in performance in this game, but unfortunately, the GT still struggles at 20x15 with AA enabled. Two 6800 Ultras in SLI mode don't have this problem; in fact, they handle 20x15 with AA fairly well. This might not matter, however, to those who don't care about AA at high resolutions.
It's interesting to note that Doom 3 appears more dependent on GPU memory bandwidth than GPU processing speed, at least in certain scenarios. Notice how the 6800 Ultra SLI configuration actually beats the 7800 GT SLI configuration in several of the tests. The 6800 cards do seem to have more problems with the 20x15 resolution, however.
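That bandwidth gap is easy to put a number on. Here's a back-of-the-envelope sketch assuming the cards' commonly listed reference memory specs (1.1GHz effective GDDR3 for the 6800 Ultra, 1.0GHz for the 7800 GT, both on 256-bit buses); these figures are standard specs, not measurements from our test systems:

```python
# Peak memory bandwidth = (bus width in bytes) x (effective memory clock).
# The clocks below are assumed reference specs, not measured values.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return (bus_width_bits / 8) * effective_clock_mhz * 1e6 / 1e9

print(f"6800 Ultra: {bandwidth_gb_s(256, 1100):.1f} GB/s")  # 35.2 GB/s
print(f"7800 GT:    {bandwidth_gb_s(256, 1000):.1f} GB/s")  # 32.0 GB/s
```

On those assumed specs, each 6800 Ultra holds roughly a 10% peak bandwidth advantage over a 7800 GT, which is consistent with the pattern above.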
77 Comments
Quiksel - Thursday, August 11, 2005
Like I mentioned in one of the other articles: "(1) I understand that taking new tech and reviewing it on launch day, etc., is important. (2) Then comes the mass production of the tech by different manufacturers, so there's a need for the readers to be informed on the differences between the different products. (3) Then there's the difference between the interim releases after the initial launch of the new tech that also need reviewing and explanation. From those three different stages of a piece of new tech, I would typically expect 3 articles or so for each piece of said new tech. From my initial post, I have just been surprised that what seems to be happening is lots of reviews centered around the second phase of your review cycle, and so that's why I was asking whether this is really what readers want to see on AT all the time (i.e., $500 graphics cards to ogle and wish a relative would die so that we could afford it)."
"Can't tell you how weird I felt last night to read the new article about the $3000 desk. I guess it helps to have some off-the-wall review about such a nice piece of desk. But is that really what the readers want to see? More hardware that they can't afford? One poster above me here mentioned that you've lost touch with your readers, and sometimes, I wonder whether you're really just trying to fill a niche that no one else is really pursuing in an effort to either drive the industry in that direction or just cater to a crowd that may or may not even visit here. Who knows. I sure got confused with such an article. These 7800GTX articles have done the same for me."
"I don't know what to tell ya to do, because I'm not in your position. But I certainly don't feel as at home on this site as I used to. Am I getting too old to appreciate all this nice shiny new expensive hardware?? :)"
4 out of the last 5 articles on AT are all this high-end tech! Where's the sweet spot? The budget? ANYTHING ELSE BUT THE HIGH-END??
flame away, thanks :)
coldpower27 - Thursday, August 11, 2005
What else is there to review? I mean, it's not like Nvidia has released the 7600 Series yet??? Neither is RV530 anywhere to be found. And typically a high-end piece of hardware is new, and remember, Anandtech did review the Athlon 64 X2 3800+. Though I would like to see a review of the recently announced Sempron 3400+. I would also like to see how the new Celeron D 351 stacks up as well. I am not sure it's all that interesting to review the same video card over and over again, like a reference 6600 GT vs a new one with a more advanced heatsink, then a new one with a better bundle of software, etc...
JarredWalton - Friday, August 12, 2005
I have my doubts as to whether a 7600 type card will even *BE* launched in the next six months. Think about it: why piss off all the owners of 6800GT cards by releasing a new card that isn't SLI compatible? From the customer support standpoint, it's better to keep the older SLI-capable cards in production and simply move them to the mid-range and value segments. Which is exactly what NVIDIA did with the 6800 and 6800GT with this launch. Now if the 6800U would just drop to $350, everything would be about right.
jkostans - Thursday, August 11, 2005
The 7800GT is slightly slower than a 6800 Ultra SLI setup, and the GTX is on par or faster. The GT and GTX both cost less than the additional 6800 Ultra needed to upgrade to SLI, so SLI is rather useless. Why opt for an extra power-hungry 6800 Ultra when you can just swap for a lower-power 7800 GT, or a better-performing and lower-power GTX, for less money? This will happen with the 7800 GTX SLI setup too. SLI should only be a consideration as an initial buy (for rich gamers who want the absolute best), not as an upgrade path for later. Gotta love nVIDIA "rendering" their own technology useless lol!
JNo - Thursday, August 11, 2005
Hear Hear! Good point, well made and I think intelligent people realised this from the off. Let me think - 2x 6800U dustbusters causing a racket or 1 new 7800GT(X)...
Anemone - Thursday, August 11, 2005
Hi there,
I'd like to suggest maybe using 1920x1200 for high-res tests. The popularity of widescreen gaming (where possible) is growing, and this provides a more commonly used "extreme resolution" than 2048x1536 - thus, imo, a bit more relevant.
Just my $.02
Thanks
JNo - Thursday, August 11, 2005
I second this motion for 1920x1200!! Why test at 2048x1536 when most people who could afford these monitors (albeit CRTs) would likely go for widescreen instead? Slightly fewer pixels but better visual impact... (nb love watching other CS players not spotting an enemy on the periphery of my screen, presumably cos their monitors are not widescreen!)
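For what it's worth, the pixel-count difference is easy to put a number on; a quick back-of-the-envelope calculation (illustrative only):

```python
# Pixel counts of the two "extreme" resolutions under discussion.
wide = 1920 * 1200  # 2,304,000 pixels
full = 2048 * 1536  # 3,145,728 pixels
print(f"1920x1200 renders {(1 - wide / full) * 100:.0f}% fewer pixels than 2048x1536")
# -> about 27% fewer
```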
adonn78 - Thursday, August 11, 2005
First off, no gamer plays videogames at resolutions above 1600x1200! Most of us stick to 1024x768 so that we can get high framerates, enable all the features, and play the game on the highest settings. In addition, you did not show how the GT and GTX stacked up against the previous generation, such as the 6800 Ultra, GT, and the 5950 Ultra. And where is the AGP version? My computer is 2 years old and I am upgrading my graphics card soon. I guess I'll wait to see if ATI makes AGP cards for their next generation. And where the heck is the R520? ATI is really lagging this time around. Hopefully we will get some AGP love. AGP still has a good 2 years of life left in it.
DerekWilson - Thursday, August 11, 2005
I play games at 1920x1080 and 1920x1200, depending on what room I'm in... and I like to have at least 8xAF on and 4xAA if I can. When I'm not playing at those resolutions, I'm playing at 1600x1200 with 4xAA/8xAF, period. Any lower than that and I feel like I'm back in 1996.
But that may just be me :-)
If I ran benchmarks at 1024x768, no matter the settings, all of these cards would give me the same number (barring EverQuest 2 on extreme quality, which would probably still be slow).
I also play with vsync on so I don't get tearing... but we test with it off so that we can remove the limits and see the cards' potential.
neogodless - Thursday, August 11, 2005
Hey, that's good to know about the vsync... back when I played Doom III, I noticed some of that, but didn't know much about it. I just felt "robbed" because my Geforce 6800GT was giving me tearing... thought maybe it couldn't keep up with the game. But everywhere I went I saw people saying "Vsync off! Two legs good!"
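To spell out why vsync caps things the way Derek describes: with a plain double-buffered swap chain, a frame that misses a refresh waits for the next one, so the measured framerate snaps down to integer divisors of the refresh rate and hides real differences between cards. Here's a minimal sketch of that quantization (the double-buffering assumption and the 60Hz refresh rate are illustrative, not from the comments above):

```python
import math

# With double-buffered vsync, a frame that misses a refresh waits for the
# next one, so the displayed rate snaps to refresh/1, refresh/2, refresh/3...
# (Assumes no triple buffering.)
def vsync_fps(raw_fps, refresh_hz=60):
    if raw_fps >= refresh_hz:
        return refresh_hz
    intervals = math.ceil(refresh_hz / raw_fps)  # whole refreshes per frame
    return refresh_hz / intervals

for fps in (75.4, 55.0, 38.8, 27.6):
    print(f"{fps:5.1f} fps raw -> {vsync_fps(fps):4.1f} fps with vsync @ 60Hz")
```

At 60Hz, anything rendering between 30 and 60 fps reports as 30 fps, which is exactly the kind of limit you have to remove to see what a card can really do.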