New Testing Methodology

Every twelve to eighteen months it makes sense to upgrade our test beds in order to best represent what is available on the market. How the upgrade occurs depends on what is being tested, and in the case of our APU reviews, the wide range of graphics options available at different price points means we have to adjust our gaming tests.

For 2015 our CPU performance testing regime remains untouched aside from the late 2014 addition of Linux-Bench for a glimpse into Linux-based performance. On the gaming side, our games have been updated to the following:

  • Alien Isolation (First Person Survival-Horror)
  • Total War: Attila (Strategy)
  • Grand Theft Auto V (Open World Sandbox)
  • GRID: Autosport (Driving)
  • Middle-Earth: Shadows of Mordor (Action-Adventure)

Because budgets for gaming graphics cards can vary, and some users keep the same card for several generations, we will be testing each of these titles with low-end, mid-range, and high-end graphics setups. This lets us see where the CPU bottlenecks lie at each stage. We have also been able to source both AMD and NVIDIA cards for most of these tiers, in case one side of the equation scales better than the other.

The GPU selections are split into three tiers based on where each card sits in its own product stack, rather than on direct competition between vendors:

Low-end:
 - Integrated Graphics
 - ASUS R7 240 2GB DDR3 ($70)
 - Dual Graphics (where applicable)

Mid-range:
 - MSI GTX 770 Lightning 2GB ($245-$255 on eBay/Amazon, $330 new)
 - MSI R9 285 Gaming 2GB ($240)

High-end:
 - ASUS GTX 980 Strix 4GB ($560)
 - MSI R9 290X Gaming LE 4GB ($380)

On the low end, we have selected settings that put the current best integrated graphics solutions between 45 and 60 frames per second. For the mid-range and high-end setups, we typically run 1080p at maximum or near-maximum settings.

The Shadows of Mordor (SoM) benchmark also throws up an interesting extra thanks to its Dynamic Super Resolution technique, which allows us to render at 3840x2160 (Ultra-HD, or '4K') with our settings despite using a 1080p monitor. As a result, we also test SoM at 4K Ultra with our mid-range and high-end graphics setups.

For the high-end setups, we have managed to source two cards of each, which means that where applicable we can also test SLI and CrossFire configurations. We apply this to Shadows of Mordor at 4K as an extra data point.

For clarity, with the low-end (integrated graphics, R7 240, Dual Graphics), mid-range (GTX 770, R9 285) and high-end (GTX 980, R9 290X) setups listed above, this means:

  • Alien Isolation: 720p Ultra on the low-end setups, 1080p Ultra on the mid-range and high-end setups; average frame rate reported.
  • Total War: Attila: 720p Performance on the low-end setups, 1080p Quality on the mid-range and high-end setups; average frame rate reported.
  • Grand Theft Auto V: 720p Low on the low-end setups, 1080p Very High on the mid-range and high-end setups; average frame rate and percentage of frames under 60 FPS reported.
  • GRID: Autosport: 1080p Medium on the low-end setups, 1080p Ultra on the mid-range and high-end setups; average and minimum frame rates reported.
  • Middle-Earth: Shadows of Mordor: 720p Low on the low-end setups, 1080p Ultra and 4K Ultra on the mid-range setups, and 1080p Ultra, 4K Ultra and 4K SLI/CFX on the high-end setups; average and minimum frame rates reported.
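As a side note on how these figures are produced: the average, minimum, and percentage-of-frames-under-60-FPS numbers all fall out of a per-frame frame-time log. The snippet below is only a minimal illustration of that arithmetic, not our actual capture tooling; the function name and sample frame times are invented for the example.

    # Minimal sketch: deriving frame-rate metrics from a per-frame frame-time log.
    # frame_times_ms is a list of render times in milliseconds; all names here are
    # illustrative and not part of any real capture tool.
    def frame_metrics(frame_times_ms):
        per_frame_fps = [1000.0 / t for t in frame_times_ms]
        average_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)  # frames / total time
        minimum_fps = min(per_frame_fps)                                  # worst single frame
        pct_under_60 = 100.0 * sum(f < 60.0 for f in per_frame_fps) / len(per_frame_fps)
        return average_fps, minimum_fps, pct_under_60

    # Example: 95 frames at 16 ms plus 5 long frames at 40 ms
    avg, minimum, under60 = frame_metrics([16.0] * 95 + [40.0] * 5)
    print("Avg %.1f FPS, Min %.1f FPS, %.1f%% under 60 FPS" % (avg, minimum, under60))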

For drivers, we locked down the 350.12 WHQL versions from NVIDIA soon after the launch of GTA V. Similarly, the 15.4 Beta drivers from AMD are also being used. These will remain consistent over the next 12-18 months until the next update.

All of our old (and new) benchmark data, both for CPU and graphics performance, can be found in our benchmark database, Bench.

We have a variety of benchmarks here, including legacy tests such as CineBench 11.5 and TrueCrypt, which are not published in the main review. All CPUs/APUs that have been tested in our new 2015 style will be labeled in the dropdown menus with their launch price listed, e.g. 'AMD A10-7850K (95W, $173)'. With any luck, over the course of the next six months we will be adding new data and re-testing older processors for the database so that our readers can compare old with new.

Comments

  • Gigaplex - Tuesday, May 12, 2015 - link

    What happened to the DX12 benchmarks? Do we need to remind you that DX12 hasn't even been released yet, so it is completely unsuitable for comparing hardware?
  • akamateau - Tuesday, May 12, 2015 - link

    Porting a CURRENT game designed and CODED to DX11 MAX SPEC to DX12 does not mean that it will automatically look better or play better if you do not consider faster fps as the main criterion for quality gameplay. In fact, DX11 game benchmarks will not show ANY increase in performance using Mantle or DX12.
    And logically, continuing to write to this DX11 MAXSPEC will NOT improve gaming community-wide in general. Let’s be clear, a higher spec game will cost more money. So the studio must balance cost and projected sales. So I would expect that incremental increases in game quality may occur over the next few years as studios become more confident with spending more of the gaming budget on a higher MINSPEC DX12 game. Hey, it is ALL ABOUT THE MONEY.
    If a game was written with the limitations or, better, say the maximums or MAXSPEC of DX11 then that game will in all likelihood not look any better with DX12. You will run it at faster frame rates but if the polygons, texture details and AI objects aren't there then the game will only be as detailed as the original programming intent will allow.
    However, what DX12 will give you is a game that is highly playable with much less expensive hardware.
    For instance, using the 3dMark API Overhead test, it is revealed that with DX11 an Intel i7-4960 with a GTX 980 can produce 2,000,000 draw calls at 30 fps. Switch to DX12 and it is revealed that a single $100 AMD A6-7400 APU can produce 4,400,000 draw calls and get 30 fps. Of course these aren't rendered, but you can't render the object if it hasn't been drawn.
    If you are happy with the level of performance that $1500 will get you with DX11, then you should be ecstatic to get very close to the same level of play that DX12 and a $100 A6 AMD APU will get you!!!!
    That was the whole point behind Mantle, er (cough, cough) DX12. Gaming is opened up to more folks without massive amounts of surplus CASH.
  • silverblue - Tuesday, May 12, 2015 - link

    Yes, yes, I see your point about AMD's iGPUs benefitting a lot from DirectX 12/Mantle, however I don't think you needed so many posts to make it. Additionally, not benchmarking a specific way doesn't make somebody a liar, it just means they didn't benchmark a specific way.

    Draw calls don't necessarily mean better performance, and if you're memory or ROP limited to begin with... what's more, the performance difference between the 384-shader 7600 and the 512-shader 7850K is practically nothing. Based on this, why would I opt for the 7850K when the 7600 performs similarly for less power? The 7400K is only a little behind but is significantly slower in DX11 testing. Does that mean we don't need the 7600 either if we're playing DX12 titles? Has the test highlighted a significant memory bottleneck with the whole Kaveri product stack that DX12 simply cannot solve?

    In addition, consider the dGPU results. Intel still smokes AMD on a per-FPU basis. By your own logic, AMD will not gain any ground on Intel at all in this area if we judge performance purely on draw calls.

    DirectX 11 is still current. There aren't many Mantle games out there to provide much for this comparison, but I'm sure somebody will have those results on another site for you to make further comparisons.
  • akamateau - Tuesday, May 12, 2015 - link

    There is ONLY ONE BENCHMARK that is relevant to gamers.

    3dMark API Overhead Test!

    If I am considering a GPU purchase I am not buying it because I want to calculate Pi to a BILLION decimal places. I want better gameplay.

    When I am trying to decide on an AMD APU or Intel IGP then that decision is NOT based on CineBench but rather on what silicon produces QUALITY GAMEPLAY.

    You are DELIBERATELY IGNORING DX12 API Overhead Tests and that makes you a liar.

    The 3dMark API Overhead Test measures the draw calls that are produced when the FPS drops below 30. As the following numbers will show the AMD APU will give the BEST GAMING VISUAL EXPERIENCE.

    So what happens when this benchmark is run on AMD APU’s and Intel IGP?
    AMD A10-7700k
    DX11 = 655,000 draw calls.
    Mantle = 4,509,000 Draw calls.
    DX12 = 4,470,000 draw calls.

    AMD A10-7850K
    DX11 = 655,000 draw calls
    Mantle = 4,700,000 draw calls
    DX12 = 4,454,000 draw calls.

    AMD A8-7600
    DX11 = 629,000 draw calls
    Mantle = 4,448,000 draw calls.
    DX12 = 4,443,000 draw calls.

    AMD A6-7400k
    DX11 = 513,000 draw calls
    Mantle = 4,047,000 draw calls
    DX12 = 4,104,000 draw calls

    Intel Core i7-4790
    DX11 = 696,000 draw calls.
    DX12 = 2,033,000 draw calls

    Intel Core i5-4690
    DX11 = 671,000 draw calls
    DX12 = 1,977,000 draw calls.

    Intel Core i3-4360
    DX11 = 640,000 draw calls.
    DX12 = 1,874,000 draw calls

    Intel Core i3-4130T
    DX11 = 526,000 draw calls.
    DX12 = 1,692,000 draw calls.

    Intel Pentium G3258
    DX11 = 515,000 draw calls.
    DX12 = 1,415,000 draw calls.

    These numbers were gathered from an AnandTech piece written on March 27, 2015.
    Intel IGP is hopelessly outclassed by AMD APUs using DX12. AMD outperforms Intel by 100%!!!
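(For context, the test those numbers come from issues progressively more draw calls per frame until the frame rate falls below 30 FPS, then reports the last sustainable count. The sketch below is a simplified, hypothetical illustration of that principle rather than Futuremark's actual code; submit_frame is an assumed callback that renders one frame with a given draw-call count and returns the frame time in seconds.)

    # Simplified illustration of an API-overhead style test: raise the draw-call
    # count per frame until the frame rate drops below 30 FPS, then report the
    # last count that stayed at or above the floor. Not Futuremark's actual code.
    def api_overhead_score(submit_frame, start=1000, step=1000, fps_floor=30.0):
        draw_calls, best = start, 0
        while True:
            frame_time = submit_frame(draw_calls)   # seconds to render one frame
            if 1.0 / frame_time < fps_floor:        # frame rate fell below the floor
                return best
            best = draw_calls
            draw_calls += step

    # Toy stand-in: assume each draw call costs 10 microseconds of CPU submission time.
    print(api_overhead_score(lambda n: n * 10e-6), "draw calls per frame at 30+ FPS")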
  • JumpingJack - Wednesday, May 13, 2015 - link

    "There is ONLY ONE BENCHMARK that is relevant to gamers.

    3dMark API Overhead Test!"

    NO, that is a synthetic; it simply states how many draw calls can be made. It does not measure the capability of the entire game engine.

    There is only ONE benchmark of concern to gamers -- actual performance of the games they play. Period.

    Get ready for a major AMD DX12 let down if this is your expectation.
  • akamateau - Tuesday, May 12, 2015 - link

    Legacy benchmarks?????? I am going to spend money based on OBSOLETE BENCHMARKS???

    CineBench 11.5 was released in 2010 and is obsolete. It is JUNK.
    TrueCrypt???? TrueCrypt development ended in May 2014. Another piece of JUNK.

    Where is 3dMark API Overhead Test? That is brand new.

    Where Is STARSWARM?? That is brand new.
  • akamateau - Tuesday, May 12, 2015 - link

    Where are your DX12 BENCHMARKS?
  • rocky12345 - Tuesday, May 12, 2015 - link

    Instead of whining about no DX12 tests, just take the info that was given, learn from it, and wait for a released DX12 program that can truly be tested. Testing DX12 at this point has very little to offer because it is still a beta product and the code is far from finished; by the time it is done, all the tests you are screaming to have done will not be worth a pinch of raccoon crap.
  • galta - Tuesday, May 12, 2015 - link

    Back when DX11 was about to be released, AMD fans said the same: nVidia is better at DX10, but with DX11, Radeon's superior I-don't-know-what will rule.
    Time passed and nVidia smashed Radeon's new (and rebranded) GPUs.
    I suspect it will be the same this time.
