Crysis: Warhead

Kicking things off as always is Crysis: Warhead, still one of the toughest games in our benchmark suite. Even three years since the release of the original Crysis, “but can it run Crysis?” is still an important question, and for three years the answer was “no.” Dual-GPU halo cards can now play it at Enthusiast settings at high resolutions, but for everything else max settings are still beyond the grasp of a single card.

Unlike NVIDIA, AMD doesn’t advertise their cards around specific resolutions; from Crysis, however, it’s quickly apparent that the 6790 is better suited to 1680 than to 1920, particularly when anti-aliasing is involved.

Overall the 6790 is quite competitive with the 5830, the GTX 285, and the GTX 460 768MB here; 36.9fps at 1680 isn’t great, but it’s going to be playable. The problem for the 6790 is that the 6850 is 20% faster for around $10 more, and this will be a recurring theme. If AMD dropped the price by $20, the 6790 would be a much better fit between the 6850 and the 5770, and it would easily vanquish the GTX 550 Ti at that price.
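For what it’s worth, the price/performance gap is easy to quantify. Below is a rough sketch (in Python) of the frames-per-dollar math, using the review’s 36.9fps figure and the “20% faster for ~$10 more” relationship; the ~$150 and ~$160 prices are assumptions for illustration only, not quoted street prices.

```python
# Rough price/performance sketch for the Crysis: Warhead numbers above.
# The 36.9 fps figure and the "20% faster for ~$10 more" relationship come
# from the review; the dollar prices below are assumptions for illustration.

cards = {
    "Radeon HD 6790": {"fps_1680": 36.9, "price": 150.0},          # assumed ~$150
    "Radeon HD 6850": {"fps_1680": 36.9 * 1.20, "price": 160.0},   # ~20% faster, ~$10 more
}

for name, c in cards.items():
    fps_per_dollar = c["fps_1680"] / c["price"]
    print(f"{name}: {c['fps_1680']:.1f} fps at ${c['price']:.0f} "
          f"-> {fps_per_dollar:.3f} fps per dollar")

# Under these assumptions the 6850 delivers roughly 12% more frames per dollar,
# which is why a ~$20 price cut would make the 6790 a much easier recommendation.
```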

The story with minimum framerates is much the same as it is with the averages. The 6790 actually manages to edge out the 5830 here, but the 6850 is still 20% ahead.

Comments

  • silverblue - Tuesday, April 5, 2011 - link

    Until we shift from 40nm, probably.
  • mgl888 - Tuesday, April 5, 2011 - link

    Not much they can do without a process shrink. Architectural improvements can only go so far.
  • Taft12 - Tuesday, April 5, 2011 - link

    No, there is no wall to speak of. The 4870 was a "first tier" part and this 6790 is "third tier". Compare the performance of the 4870 with a 6970 instead (and indeed the launch price of the 4870 with the launch price of the 6970) and you'll see we are doing just fine thank you very much.
  • tno - Tuesday, April 5, 2011 - link

    But not long after release the 4890 was retailing, after discounts, for $150-160. That's what I bought mine at, and I have yet to find a compelling card to take its place. Part of that does have to do with a decrease in my gaming, but if I were a budget gamer, I would look long and hard at a used 4890.
  • tno - Tuesday, April 5, 2011 - link

    +1
  • pandemonium - Wednesday, April 6, 2011 - link

    Only if you don't consider the broad picture and are looking at performance individually.
  • marc1000 - Tuesday, April 5, 2011 - link

    I have a 5770, and even with the desire to upgrade to a 6950/GTX 570, I try to stay honest with myself and say: "I don't need it."

    This level of performance is perfectly fine for playing all current games, because we are all stuck with console ports...
  • jordanclock - Tuesday, April 5, 2011 - link

    I have a 5770 as well, and at 1680x1050 almost everything runs flawlessly, even with 2x or 4x AA. The situation with multiplatform development is starting to really agitate PC gamers, I think. Crysis 2 looks infinitely better on PC (and ultra high-end setups run tri-monitor quite well!), Dragon Age 2 has a texture pack that consoles wouldn't even have the memory to use, and yet so many games look identical on PC and console, meaning that while they run at high framerates on modest hardware, there is no option to increase visual fidelity to take advantage of the extra horsepower.
  • tno - Tuesday, April 5, 2011 - link

    I concur. I have a 4890 that I picked up for $160, after discounts, not long after its release. It was a top-tier part, and a pretty unique product, coming out just as ATI started treating their multi-GPU single-card solutions as their true halo products. And for all its flaws (noisy, power hungry, and no DX11), it still competes on performance with cards at its original price.

    This is mirrored, frankly, in the PC market, where the effective performance increase, that is, the performance the average PC user (not us) will notice, has remained fairly flat since Conroe. What has improved is features: for that same budget dual-core Conroe price you get an integrated GPU worth its salt, improved efficiency, improved encoding/decoding performance (the thing users might notice most) and, possibly, more cores.
  • kmmatney - Tuesday, April 5, 2011 - link

    I also have an HD 4890, bought for $170, with an Accelero S1 cooler attached, so it's virtually silent (just a slow-spinning fan across the heatsink that I can't hear). I'm still amazed that after so much time I cannot get a better card for the same price - things just haven't progressed much in the bang-for-buck department.
