In an unusual move, NVIDIA took the opportunity earlier this week to announce a new 600 series video card days before it would be shipping. Based on a pair of Kepler GK104 GPUs, the GeForce GTX 690 would be NVIDIA's new flagship dual-GPU video card. And by all metrics it would be a doozy.

Packing a pair of high-clocked, fully enabled GK104 GPUs, NVIDIA was targeting GTX 680 SLI performance in a single card, the kind of dual-GPU card we haven't seen in quite some time. The GTX 690 would be a no-compromise card: quieter and less power hungry than GTX 680 SLI, as fast as the GTX 680 in single-GPU performance, and as fast as GTX 680 SLI in multi-GPU performance. And at $999 it would be the most expensive GeForce card yet.

After the announcement, and based on the specs, it was clear that the GTX 690 had the potential to deliver on that promise, but could NVIDIA really pull it off? They could, and they did. Now let's see how they did it.

                      | GTX 690        | GTX 680        | GTX 590        | GTX 580
Stream Processors     | 2 x 1536       | 1536           | 2 x 512        | 512
Texture Units         | 2 x 128        | 128            | 2 x 64         | 64
ROPs                  | 2 x 32         | 32             | 2 x 48         | 48
Core Clock            | 915MHz         | 1006MHz        | 607MHz         | 772MHz
Shader Clock          | N/A            | N/A            | 1214MHz        | 1544MHz
Boost Clock           | 1019MHz        | 1058MHz        | N/A            | N/A
Memory Clock          | 6.008GHz GDDR5 | 6.008GHz GDDR5 | 3.414GHz GDDR5 | 4.008GHz GDDR5
Memory Bus Width      | 2 x 256-bit    | 256-bit        | 2 x 384-bit    | 384-bit
VRAM                  | 2 x 2GB        | 2GB            | 2 x 1.5GB      | 1.5GB
FP64                  | 1/24 FP32      | 1/24 FP32      | 1/8 FP32       | 1/8 FP32
TDP                   | 300W           | 195W           | 365W           | 244W
Transistor Count      | 2 x 3.5B       | 3.5B           | 2 x 3B         | 3B
Manufacturing Process | TSMC 28nm      | TSMC 28nm      | TSMC 40nm      | TSMC 40nm
Launch Price          | $999           | $499           | $699           | $499

As we mentioned earlier this week during the unveiling of the GTX 690, NVIDIA is outright targeting GTX 680 SLI performance with the GTX 690, unlike the GTX 590, which was notably slower than GTX 580 SLI. Because GK104 is a much smaller and less power hungry GPU than GF110 was from the get-go, NVIDIA doesn't have to do nearly as much binning to get suitable chips that keep power consumption in check. The consequence, of course, is that much like the GTX 680, the GTX 690 is a smaller step up than NVIDIA's previous dual-GPU cards (e.g. GTX 295 to GTX 590), as GK104's smaller size means it isn't the same kind of massive monster that GF110 was.

In any case, for the GTX 690 we're looking at a base clock of 915MHz, a boost clock of 1019MHz, and a memory clock of 6.008GHz. Compared to the GTX 680 this is 91% of the base clock, 96% of the boost clock, and the same memory bandwidth; this is the closest a dual-GPU NVIDIA card has ever been to its single-GPU counterpart, particularly when it comes to memory bandwidth. Furthermore the GTX 690 uses fully enabled GPUs, with every last CUDA core and every last ROP active, so the difference between the GTX 690 and GTX 680 is outright the clockspeed difference and nothing more.
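For the curious, those ratios are easy to verify. Here's a quick back-of-the-envelope check as a minimal Python sketch; the figures come from the spec table above, while the arithmetic and variable names are mine:

    # Sanity-checking the GTX 690's clocks against the GTX 680's
    # (clock and bus figures from the spec table above).
    base_690, base_680   = 915, 1006    # core (base) clocks, MHz
    boost_690, boost_680 = 1019, 1058   # boost clocks, MHz

    print(f"Base clock ratio:  {base_690 / base_680:.1%}")    # ~91.0%
    print(f"Boost clock ratio: {boost_690 / boost_680:.1%}")  # ~96.3%

    # Per-GPU memory bandwidth: a 256-bit bus at 6.008Gbps per pin,
    # divided by 8 bits per byte. Identical to the GTX 680's.
    bus_width_bits = 256
    data_rate_gbps = 6.008
    print(f"Memory bandwidth: {bus_width_bits / 8 * data_rate_gbps:.1f} GB/s")  # ~192.3 GB/s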

Of course this does mean that NVIDIA had to make a clockspeed tradeoff to get the GTX 690 off the ground, but their ace in the hole is GPU Boost, which significantly eats into that deficit. As we'll see when we get to our look at performance, in spite of NVIDIA's conservative base clock the real-world gap is frequently closer to the smaller boost clock difference.

As another consequence of using the smaller GK104, power consumption has also come down for this product range. Whereas the GTX 590 was a 365W TDP product and definitely used most of that power, the GTX 690 in its stock configuration steps back to 300W. And even that is a worst-case scenario, as NVIDIA's 263W GPU Boost power target means that power consumption in a number of games (basically anything with boost headroom) is well below 300W. For the adventurous, however, the card is overbuilt to the same 365W specification as the GTX 590, which opens up some interesting overclocking opportunities that we'll get into in a bit.
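To put those power figures in perspective, here's a small, purely illustrative Python sketch of the budget; the wattages are as quoted above, while the headroom percentages are my own arithmetic:

    # GTX 690 power budget, using the figures quoted above.
    boost_target_w = 263   # GPU Boost power target
    tdp_w          = 300   # stock TDP (worst case)
    board_spec_w   = 365   # what the board itself is built for (same as GTX 590)

    # The boost power target sits well under the TDP...
    print(f"Boost target: {tdp_w - boost_target_w}W ({1 - boost_target_w / tdp_w:.0%}) below TDP")
    # ...while the board leaves sizable room above the TDP for overclocking.
    print(f"Board spec:   {board_spec_w - tdp_w}W ({board_spec_w / tdp_w - 1:.0%}) above TDP")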

For these reasons the GTX 690 should (and does) reach near-parity with the GTX 680 SLI, so NVIDIA has no reason to be shy about pricing and has shot for the moon. The GTX 680 is $499, a pair of GTX 680s in SLI is $998, and since the GTX 690 is meant to stand in for a pair of GTX 680s, it too is priced at $999. This makes the GTX 690 the single most expensive consumer video card of the modern era, surpassing even 2007's GeForce 8800 Ultra. It's incredibly expensive and that price is going to raise considerable ire, but as we'll see when we get to our look at performance NVIDIA has reasonable justification for it, at least if you consider $499 for the GTX 680 reasonable.

Because of its $999 price tag, the GTX 690 has little competition. Besides the GTX 680 in SLI, its only other practical competition is AMD's Radeon HD 7970 in CrossFire, which at MSRP would be about $40 cheaper at $958. We've already seen that the GTX 680 has a clear lead over the 7970 in single-GPU performance, but differences in CrossFire/SLI scaling will throw a wrench into that logic. More on that later.
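For those keeping score, a quick sketch of the pricing math; the MSRPs are as listed in this article, and the comparison code is mine:

    # Price stacks at the $999 tier, per the MSRPs in this article.
    gtx_680, hd_7970, gtx_690 = 499, 479, 999

    sli_cost = 2 * gtx_680   # $998, effectively the GTX 690's $999 tag
    cf_cost  = 2 * hd_7970   # $958

    print(f"GTX 680 SLI: ${sli_cost} | HD 7970 CF: ${cf_cost} | GTX 690: ${gtx_690}")
    print(f"7970 CF undercuts the GTX 690 by ${gtx_690 - cf_cost}")  # ~$40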

Finally, there’s the elephant in the room: availability. As it stands NVIDIA cannot keep the GTX 680 in stock in North America, and while the GTX 690 may be a very low volume part due to its price, it requires 2 binned GPUs, which are going to be even harder to get. NVIDIA has not disclosed the specific number of cards that will be available for the launch, but after factoring the fact that OEMs will be sharing in this stockpile it’s clear that the retail allocations are certainly going to be small. The best bet for potential buyers is to keep a very close eye on Newegg and other e-tailers, as like the GTX 680 it’s unlikely these cards will stay in stock for long.

The one bit of good news is that while cards will be rare, you won't need to hunt across many vendors. As with the GTX 590 launch, NVIDIA is using only a small number of partners to distribute cards. For North America this will be EVGA and Asus, and that's it, so unlike the GTX 680 you will only need to watch two products instead of a dozen. Longer term I have no reason to doubt that NVIDIA can produce these cards in sufficient volume once they have plenty of GPUs, but until TSMC's capacity improves NVIDIA has no chance of meeting the demand for GK104 or any of the products based on it.

Spring 2012 GPU Pricing Comparison
AMD            | Price | NVIDIA
               | $999  | GeForce GTX 690
               | $499  | GeForce GTX 680
Radeon HD 7970 | $479  |
Radeon HD 7950 | $399  | GeForce GTX 580
Radeon HD 7870 | $349  |
               | $299  | GeForce GTX 570
Radeon HD 7850 | $249  |
               | $199  | GeForce GTX 560 Ti
               | $169  | GeForce GTX 560
Radeon HD 7770 | $139  |


Comments
  • theSeb - Thursday, May 3, 2012

    I must say I found it quite odd and hilarious to see people accusing Anandtech of favouring AMD by using a monitor with a 1200 vertical resolution. 16:10 monitors are not that uncommon and we really should be showing the industry what we think by not purchasing 16:9 monitors.

    Anyway, if anything this review seems to be Nvidia biased, in my opinion. The 7970 CF does not do too badly; in fact, it beats the 690 / 680 SLI in many games and only loses out in the games where it's "broken". I am not sure why you cannot recommend it based on the numbers in your benchmarks, since it hardly embarrasses itself.
  • silverblue - Thursday, May 3, 2012

    It's not "people", it's "person"... and he's only here to troll graphics card articles.

    When AMD gets it right, CrossFire is absolutely blistering. Unfortunately, the sad state of affairs is that AMD isn't getting it right with a good proportion of the games in this review.

    NVIDIA may not get quite as high scaling as AMD when AMD does get it right, but they're just far more consistent at providing good performance. This is the main gripe about AMD; with a few more resources devoted to the project, surely they can overcome this?
  • CeriseCogburn - Friday, May 4, 2012

    Yes, of course, call names forever, but never dispute the facts.
    I will agree with you though, amd drivers suck especially in CF, and they suck for a lot of games for a long long time.
  • silverblue - Friday, May 4, 2012

    No, I said AMD's drivers have issues with Crossfire, not that they suck in general.

    I've also checked three random British websites and there are no issues whatsoever in finding a 1920x1200 monitor. I also looked at Newegg and found eight immediately. It's really not difficult to find one.
  • CeriseCogburn - Saturday, May 5, 2012

    1920x1200 all of you protesteth far too much.
    The cat is out of the bag and you won't be putting it back in.
    Enjoy the bias, you obviously do, and leave me alone, stop the stalking.
  • seapeople - Saturday, May 5, 2012

    I'm with ya bro. Forget these high resolution monitor nancies who don't know what they're missing. I'm rockin' games just fine with 60+ fps on my 720p plasma tv, and that's at 600hz! Just you try to get 24xAAAA in 3D (that's 1200hz total) on that 1920x1200 monitor of yours!

    Framerate fanboys unite!
  • CeriseCogburn - Sunday, May 6, 2012

    Ahh, upped the ante to plasma monitors? ROFL - the desperation of you people knows no bounds.
  • saf227 - Thursday, May 3, 2012

    On page 2 of the review - where you have all the pictures of the card - we have no real basis for figuring out the card's true size. Could you include a reference in one of those photos? Say, a ruler or a pencil, so we have an idea of how big the card truly is?
  • Ryan Smith - Thursday, May 3, 2012

    The card is 10" long, the same length as the GTX 590 (that should be listed on page 2). But I'll take that under consideration for future articles.
  • ueharaf - Thursday, May 3, 2012

    Why did they go back to 256 bits when the GTX 590 had 384 bits?!?!
    Because they don't want to have too much of an advantage?
    Maybe the next GTX 790 will have 384 bits again, and it would be better than the GTX 690... come on!!!
