NVIDIA GeForce 7800 GT: Rounding Out The High End
by Derek Wilson & Josh Venning on August 11, 2005 12:15 PM EST - Posted in
- GPUs
Final Words
With the insertion of the NVIDIA GeForce 7800 GT into the high end landscape, it becomes difficult for us to provide our readers with a one-size-fits-all answer. Recommendations are complex because we must take into account current hardware, performance needs, price, and future upgrade prospects. Buying a graphics card right now poses quite a difficult situation.
From the numbers, it is very clear that the 7800 GT is a better card than the 6800 Ultra. For the price, there is no contest between the two. The 6800 Ultra is no longer a viable solution except as an upgrade to SLI from a previously owned Ultra. That being said, paying about $400 for a second 6800 Ultra may be a good solution if everything else is already in place and ready to go. For those who have a non-SLI PCI Express system and only a 450W power supply, spending the money on a 7800 GTX is a much better upgrade than spending it on a decent PSU, an SLI motherboard, and a second 6800 Ultra; the best case scenario there will cost something like $650. If a system upgrade is in your future, it may be better to save your money until the new AMD processors and sockets come along.
We feel that the verdict for 6800 Ultra owners is to hold their cards and wait rather than go for the SLI upgrade. Those with less than last year's high end may be interested in upgrading to the 7800 GT, but there are some cautions we would make in recommending this card. Current street prices show the 7800 GTX hitting as low as $500 (and a few OEM models are even lower). With many of the 7800 GT cards on the market hovering at $450, throwing the extra $50 at a GTX is worth it for the performance gained if you need the extra push to reach the quality and resolution you want. If high resolution and solid performance are needed, the GTX will not disappoint.
Here is where everything gets a little hairy. Will you be monitor limited by an upgrade? Many very good flat panels will only display up to 1280x1024 or 1600x1200. Owning one of these panels really negates the necessity for a 7800 GTX right now. From the numbers we have seen, the highest end card we would recommend for owners of a 1600x1200 flat panel (who don't get paid for gaming) is the 7800 GT. We would also recommend finding one for $400 rather than paying full price.
Owners of 1280x1024 panels don't really need to spend the money here as even the 7800 GT will run into walls at this resolution. 6800 GT prices are now dropping to $300 and below. At this price point, owners of older systems who haven't taken the high end plunge and don't want to spend the money on 1600x1200 and up panels would do very well to upgrade to a 6800 GT. Even the venerable 128MB 6800 is available at nearly $200.
We have entered an era where, at the high end, the graphics solution is often limited by the monitor more than anything else. It is important to pay attention to what is needed for a particular system rather than just grabbing the fastest thing out there. Even with all the bells and whistles enabled, current games do not stress the highest end cards enough unless they are run at very high resolutions.
Granted, when games based on technology like the Unreal Engine 3 come along, that 7800 GTX SLI system will come in handy at any resolution. However, we don't like recommending buying now for some promise of longevity or future performance. When games that push these cards come along, new cards will be out (or just around the corner). It is always best to buy for current needs.
That brings us back to the SLI situation, and we have to stress that using SLI as an upgrade path isn't the best idea. SLI is best exploited as an ultra high end technology. For that reason, we won't recommend going out and buying a 7800 GT SLI system: a single GTX costs less, two GTXes cost only a little more, and both are more valuable options. This time around, we can recommend 6800 Ultra SLI as a special case upgrade for current owners of SLI systems with a single 6800 Ultra who want added performance and need to choose between a 7800 GT and a second 6800 Ultra. This recommendation is based on the fact that the GeForce 7 Series adds no API features or technologies that make it more desirable than a 6 Series setup of the same speed.
Overall, our feeling is that the 7800 GT is a suitable replacement for the 6800 Ultra in the market. Keeping the relatively new, lower yield G70 silicon relegated to the very high end and eliminating (or severely reducing production of) the lowest yield NV4x part will help NVIDIA to maintain higher margins. The fact that plenty of 7800 cards are available now (including the GT) is a huge improvement over what we saw last year on the high end from NVIDIA with very low availability of high end parts. Keeping this trend rolling as long as they have is absolutely wonderful. We are very impressed with NVIDIA and we hope ATI will be able to follow this trend when they finally release Crossfire and their R520 based parts.
For now, NVIDIA's 7800 series owns the high end graphics space. Now it's time for monitor makers to catch up and start releasing panels that can make use of the available dual-link DVI port at reasonable prices.
77 Comments
IdBuRnS - Tuesday, November 29, 2005 - link
I just placed an order for an eVGA 7800GT with their free SLI motherboard. This will be replacing my ATI X800 Pro and MSI K8N Neo2 Platinum.
A554SS1N - Tuesday, September 6, 2005 - link
Ok, now for my views on the 7600GT... The 6600GT had exactly half the pixel pipelines and memory bus width of the 6800GT/Ultra, and this makes me think the 7600GT will be the same in relation to the 7800GTX. To add weight to this theory, a 128-bit bus would be much cheaper to produce and, with a smaller die size, more economical and cooler. By being cooler, smaller fans can be used, saving more money. Also, NVIDIA would probably want to keep the PCB smaller for mainstream components (something that I would like myself).
So basically, my suggestion is that the 7600GT would be a 12-pipe, 128-bit card, probably with those 12 pipes matched to 8 ROPs (like the 6600GT was 8 pipes matched to 4 ROPs). Around five or six vertex pipelines would sound about right too. If the core were at 450MHz or even 500MHz with 12 pipes, and paired with 1100MHz memory, it would likely turn out somewhere in between a 6800 and a 6800GT in performance (a rough back-of-the-envelope check of those numbers follows at the end of this comment), but importantly, it would be:
- Cooler
- Potentially Quieter
- More energy efficient
- Smaller PCB
- Potentially cheaper
- Easier to produce, and therefore able to supply lots of cores to the mass market
Just my opinions, but I can believe in a 12-pipe card more than a 16-pipe mainstream card, which I consider to be a "pipe dream".
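As a rough sanity check on that guess, here is a minimal back-of-the-envelope sketch comparing theoretical pixel fill rate and memory bandwidth. The 6800 and 6800 GT clocks are approximate stock values, and the "speculated 7600 GT" row is simply the spec guessed at in the comment above, not anything NVIDIA has announced.
```python
# Back-of-the-envelope throughput comparison (all clocks approximate).
# The "Speculated 7600 GT" row is purely the spec guessed at above,
# not anything NVIDIA has announced.

def fill_rate_mpix(pipes, core_mhz):
    """Theoretical pixel fill rate in Mpixels/s (pipes x core clock)."""
    return pipes * core_mhz

def bandwidth_gbs(bus_bits, effective_mem_mhz):
    """Theoretical memory bandwidth in GB/s (bus width x effective data rate)."""
    return (bus_bits / 8) * effective_mem_mhz / 1000

cards = {
    #                      pipes, core MHz, bus bits, effective mem MHz
    "GeForce 6800":        (12, 325, 256,  700),
    "GeForce 6800 GT":     (16, 350, 256, 1000),
    "Speculated 7600 GT":  (12, 450, 128, 1100),
}

for name, (pipes, core, bus, mem) in cards.items():
    print(f"{name:20s} {fill_rate_mpix(pipes, core):5d} Mpix/s, "
          f"{bandwidth_gbs(bus, mem):4.1f} GB/s")
```
Under those assumptions, the speculated part's raw fill rate lands near a 6800 GT (roughly 5400 vs. 5600 Mpixels/s) while its bandwidth trails even a plain 6800 (17.6 vs. 22.4 GB/s), which is broadly consistent with the "somewhere in between" performance guess above.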
Pythias - Friday, August 12, 2005 - link
You know it's time to quit gaming when you have to have a card that costs as much as a house payment and a PSU that could power a small city to run it.
smn198 - Friday, August 12, 2005 - link
Marketing
DerekWilson - Friday, August 12, 2005 - link
We stand corrected ... After reading the comments on this article, it is abundantly clear that your suggestion would be a compelling reason to release lower end G70 parts.
Evan Lieb - Friday, August 12, 2005 - link
I think maybe some of you are taking the article a little too seriously. Most hardware articles nowadays are geared toward high end tech for good reason: it's interesting technology and a lot of people want to read about it. It's useful information to a lot of people, and a lot of people are willing to pay for it. You want entry level and mid range video cards reviewed too? That's fine, but you'll have to wait like everyone else; AT can't force NVIDIA to push out their 7xxx entry level/mid range tech any faster. When it's ready, you'll probably see some type of review or roundup.
Regs - Thursday, August 11, 2005 - link
Well, there is still no AGP option for Socket 939 AGP owners, and the performance difference between the Ultra and the GT this year is a lot more significant than last year's. I would hate to spend 500 dollars on a "crippled" 7800 GTX. Not to mention ATI is still a bench warmer in this competition. It just seems like upgrading this year is not even worth it to a 939 AGP owner, no matter how much of a gamer you are. I'm disappointed in the selection this year. The performance is there, but the price/value and inconvenience are above and beyond. Last year was a great time to upgrade, while this year seems more like a money pit with no games to fill it.
bob661 - Friday, August 12, 2005 - link
Next year is probably a better time to upgrade for the AGP owners, I agree. For me, I want a 7600GT. If there is no such animal, then maybe a 7800GT at Xmas.
dwalton - Thursday, August 11, 2005 - link
I initially agreed with that statement until I thought about 90nm parts. Correct me if I am wrong, but NVIDIA has no 90nm parts. While NVIDIA's current line of 7xxx and 6xxx parts provides a broad range of performance, I'm sure NVIDIA can increase profit margins by producing 90nm parts.
NVIDIA can simply shrink the 6800 GT and Ultra to 90nm and rebadge them as the 7600 vanilla and GT. Since this involves a simple process shrink and no tweaking, these new 90nm chips could possibly be clocked higher and draw less power, increasing profit margins without the cost of designing new 7600 chips based on the G70 design. That would make everyone happy.
coldpower27 - Friday, August 12, 2005 - link
I would like G70 technology on 90nm ASAP. I have a feeling NVIDIA didn't do a shift to 90nm for NV40 for a reason: that core is still based on AGP technology, and NVIDIA currently doesn't have a native PCI-E part for the 6800 line; they are all using HSI on the GPU substrate from the NV45 design.
NV40 on 0.13 micron is 287mm2, as pointed out by a previous poster. A full optical node shrink from 0.13 micron to 0.09 micron, without any changes whatsoever, would bring NV40's 287mm2 die size down to ~172mm2, since a full node optical shrink generally gives a die around 60% of the original size. That die size may not be enough to maintain a 256-bit memory interface, which is why NVIDIA is rumored to do only a 0.11 micron process shrink (NV48) on the NV40; that would bring the core down to about 230mm2, which is 80% of the original size and still large enough to maintain the 256-bit memory interface with little problem.
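A minimal sketch of that die-size arithmetic, using the 287mm2 figure cited by the earlier poster and the rule-of-thumb area ratios assumed in this comment (roughly 60% for a full node shrink, 80% for a half node shrink), not foundry-published numbers:
```python
# Rough NV40 die-area estimates under the shrink ratios assumed above.
# The 60%/80% area ratios are rule-of-thumb values from the comment,
# not foundry-published scaling figures.

NV40_AREA_130NM = 287.0   # mm^2 at 0.13 micron (figure cited by a previous poster)

FULL_NODE_RATIO = 0.60    # assumed area ratio for 0.13 -> 0.09 micron
HALF_NODE_RATIO = 0.80    # assumed area ratio for 0.13 -> 0.11 micron

area_90nm = NV40_AREA_130NM * FULL_NODE_RATIO    # ~172 mm^2
area_110nm = NV40_AREA_130NM * HALF_NODE_RATIO   # ~230 mm^2 (the rumored NV48)

print(f"NV40 shrunk to 90nm:  ~{area_90nm:.0f} mm^2")
print(f"NV40 shrunk to 110nm: ~{area_110nm:.0f} mm^2")
```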
Making a 90nm G7x part for the mainstream segment directly would be very nice.
Let's say it has 16 pipelines and 8 ROPs to help save transistor space, plus the enhanced HDR buffers and transparency AA. It would be fairly close to the 170mm2 range, I believe. It would probably still be limited to a 128-bit memory interface, but the use of 1.6ns GDDR3 at 600MHz could help alleviate the bandwidth problems somewhat. Remember, large amounts of memory bandwidth combined with high fillrate are reserved for the higher segments; it's very hard to have your cake and eat it too in the mainstream.
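For reference, a quick sketch of what that memory configuration would mean for theoretical bandwidth, assuming 600MHz (1200MHz effective) GDDR3 on a 128-bit bus, versus a 6600 GT style 128-bit/1000MHz effective setup and a 6800 GT style 256-bit/1000MHz effective setup (both approximate stock specs):
```python
# Theoretical memory bandwidth of the hypothetical 128-bit 90nm G7x part
# compared with existing 128-bit and 256-bit configurations (approximate clocks).

def bandwidth_gbs(bus_bits, effective_mhz):
    # bytes transferred per cycle * effective transfer rate, in GB/s
    return (bus_bits / 8) * effective_mhz / 1000

print(f"6600 GT style, 128-bit @ 1000MHz: {bandwidth_gbs(128, 1000):.1f} GB/s")  # 16.0
print(f"Hypothetical,  128-bit @ 1200MHz: {bandwidth_gbs(128, 1200):.1f} GB/s")  # 19.2
print(f"6800 GT style, 256-bit @ 1000MHz: {bandwidth_gbs(256, 1000):.1f} GB/s")  # 32.0
```
Faster GDDR3 closes some of the gap over today's 128-bit mainstream parts, but the 256-bit cards remain well ahead, which is the point about high bandwidth staying a high end luxury.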
Let's face it: for the time being, we're not going to be getting fully functional high end cores at the 199US price point with a 256-bit memory interface. So far we have gotten things like the Radeon X800, GeForce 6800, 6800 LE, X800 SE, X800 GT, etc. It just doesn't seem profitable to do so.
From what we have seen, mainstream parts are usually based on tweaked technology: the RV410 Radeon X700 and NV36 GeForce FX 5700 are mainstream cores based on the third and second generations of R300 and NV30 technology, respectively.
The 6800 @ 199US, 6800 GT @ 299US, and 6800 Ultra @ 399US pricing is a temporary measure, and production should slow on these cards as NVIDIA ramps up the 90nm G7x based parts.