New Testing Methodology

Every twelve to eighteen months it makes sense to upgrade our test beds so that they best represent what is available on the market. How the upgrade occurs depends on what is being tested, and in the case of our APU reviews, the wide range of graphics options available at different price points makes it clear that we have to adjust our gaming tests.

For 2015, our CPU performance testing regime remains untouched aside from the late-2014 addition of Linux-Bench, which gives a glimpse into Linux-based performance. On the gaming side, our titles have been updated to the following:

  • Alien: Isolation (First-Person Survival-Horror)
  • Total War: Attila (Strategy)
  • Grand Theft Auto V (Open World Sandbox)
  • GRID: Autosport (Driving)
  • Middle-earth: Shadow of Mordor (Action-Adventure)

Because budgets for gaming graphics cards vary, and some users keep the same card for several generations, we will be testing each of these titles with low-end, mid-range, and high-end graphics setups. This lets us see where the bottlenecks for CPU performance lie at each stage. We have also been able to source both AMD and NVIDIA cards for most of these tiers, in case one vendor's cards scale differently with CPU performance than the other's.

The GPU setups are split into three tiers based on where each card sits in its vendor's product stack, rather than as direct head-to-head competitors:

Low-end:
 - Integrated Graphics
 - ASUS R7 240 2GB DDR3 ($70)
 - Dual Graphics (where applicable)

Mid-range:
 - MSI GTX 770 Lightning 2GB ($245-$255 on eBay/Amazon, $330 new)
 - MSI R9 285 Gaming 2GB ($240)

High-end:
 - ASUS GTX 980 Strix 4GB ($560)
 - MSI R9 290X Gaming LE 4GB ($380)

On the low end, we have selected settings that let the current best integrated graphics solutions score between 45 and 60 frames per second. For the mid-range and high-end cards, we typically run 1080p at maximum or near-maximum settings.

The Middle-earth: Shadow of Mordor (SoM) benchmark throws in an interesting extra thanks to its Dynamic Super Resolution technique, which allows us to render at 3840x2160 (Ultra HD, or '4K', four times the pixels of 1080p) despite using a 1080p monitor. As a result, we also test SoM at 4K Ultra with our mid-range and high-end graphics setups.

For the high-end setups we managed to source two cards of each, which means that where applicable we can also test SLI and CrossFire configurations. We apply these to Shadow of Mordor at 4K as an extra data point.

For clarity, this means:

Alien: Isolation:
 - Low-end (Integrated, R7 240, Dual Graphics): 720p Ultra
 - Mid-range (GTX 770, R9 285): 1080p Ultra
 - High-end (GTX 980, R9 290X): 1080p Ultra
 - Recorded: average frame rate

Total War: Attila:
 - Low-end: 720p Performance
 - Mid-range: 1080p Quality
 - High-end: 1080p Quality
 - Recorded: average frame rate

Grand Theft Auto V:
 - Low-end: 720p Low
 - Mid-range: 1080p Very High
 - High-end: 1080p Very High
 - Recorded: average frame rate, percentage of frames under 60 FPS

GRID: Autosport:
 - Low-end: 1080p Medium
 - Mid-range: 1080p Ultra
 - High-end: 1080p Ultra
 - Recorded: average and minimum frame rates

Middle-earth: Shadow of Mordor:
 - Low-end: 720p Low
 - Mid-range: 1080p Ultra, 4K Ultra
 - High-end: 1080p Ultra, 4K Ultra, 4K SLI/CFX
 - Recorded: average and minimum frame rates
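As a side note for readers who process their own logs, the metrics above are straightforward to derive from raw per-frame data. Below is a minimal sketch, assuming a hypothetical frametimes.csv file with one per-frame render time in milliseconds per line; the file name, format, and the reading of "% of frames under 60 FPS" are illustrative assumptions, not a description of our actual capture pipeline.

    # Minimal sketch: deriving benchmark metrics from a per-frame log.
    # Assumption: "frametimes.csv" holds one frame time in milliseconds per
    # line, as tools such as FRAPS can produce. Names/format are illustrative.

    def load_frame_times_ms(path):
        with open(path) as f:
            return [float(line) for line in f if line.strip()]

    def metrics(frame_times_ms):
        fps_per_frame = [1000.0 / t for t in frame_times_ms]
        # Time-weighted average: total frames divided by total seconds rendered.
        avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
        # Minimum frame rate: the single slowest frame.
        min_fps = min(fps_per_frame)
        # Interpreting "%FPS <60 FPS" as the share of frames below 60 FPS,
        # i.e. frames that took longer than 1000/60 = 16.7 ms to render.
        pct_under_60 = 100.0 * sum(1 for f in fps_per_frame if f < 60.0) / len(fps_per_frame)
        return avg_fps, min_fps, pct_under_60

    if __name__ == "__main__":
        avg, mn, pct = metrics(load_frame_times_ms("frametimes.csv"))
        print(f"Average: {avg:.1f} FPS, Minimum: {mn:.1f} FPS, "
              f"Frames under 60 FPS: {pct:.1f}%")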

For drivers, we locked in the 350.12 WHQL release from NVIDIA soon after the launch of GTA V, and similarly we are using the 15.4 Beta drivers from AMD. These will remain consistent over the next 12-18 months, until the next test-bed update.

All of our old (and new) benchmark data, both for CPU and graphics performance, can be found in our benchmark database, Bench.

We have a variety of benchmarks there, including legacy tests such as Cinebench R11.5 and TrueCrypt, which are not published in the main review. All CPUs/APUs that have been tested in our new 2015 style will be labeled in the dropdown menus with their launch price listed, e.g. 'AMD A10-7850K (95W, $173)'. Over the course of the next six months we hope to add new data and retest older processors for the database, so that our readers can compare old with new.

Comments

  • Gigaplex - Tuesday, May 12, 2015

    Mantle for AMD discrete GPUs runs on Intel CPUs, so it is a completely valid test of CPU gaming performance.
  • CPUGPUGURU - Tuesday, May 12, 2015

    Mantle was developed as an AMD GCN API, so don't go telling us it's optimized for Intel or Nvidia, because it's NOT! Mantle is DOA, dead and buried; stop pumping a zombie API.
  • silverblue - Wednesday, May 13, 2015

    You've misread Gigaplex's comment, which was stating that you can run an AMD dGPU on any CPU and still use Mantle. It wasn't about using Mantle on Intel iGPUs or NVIDIA dGPUs, because we know that functionality was never enabled.

    Mantle isn't "dead and buried"; sure, it may not appear in many more games, but considering it's at the very core of Vulkan... though that could be just splitting hairs.
  • TheJian - Friday, May 15, 2015

    Incorrect. The core of Mantle's sales pitch was HLSL. You only think Mantle is Vulkan because you read the Mantle/Vulkan articles on AnandTech... LOL. Read PCPer's take on it, and understand how VASTLY different Vulkan (headed by Nvidia's Neil Trevett, who also came up with OpenGL ES, BTW) is from Mantle. At best AMD ends up equal here, and at worst Nvidia always has an inside track, with the president of Khronos also being the head of Nvidia's mobile team. That's pretty much like BAPCo being written by Intel software engineers and living on Intel land across the street from Intel itself... ROFL. See Van Smith's articles on BAPCo/SYSmark etc. and why Tom's Hardware SHAMEFULLY dismissed him and removed his name from his articles ages ago.

    AnandTech seems to follow this same path of favoritism for AMD these days since the 660 Ti article - having an AMD portal but no Nvidia portal, Mantle lovefest articles, etc. - the same reason I left Tom's years ago, circa 2001 or so. It's not the same team at Tom's Hardware now, but the damage done then is still in many minds today (and shows at times in forum posts). AnandTech would be wise to change course, but Anand isn't running things now and doesn't even own the site today. I'd guess stock investors in the company that bought AnandTech probably hold massive shares in sinking AMD ;) But that's just a guess.

    http://www.pcper.com/reviews/General-Tech/GDC-15-W...
    Real scoop on Vulkan. A few bits of code don't make Vulkan into Mantle... LOL. If it were based completely on HLSL you might have a valid argument, but that is far from the case here. It MIGHT be splitting hairs if HLSL was in, but it's NOT.

    http://www.pcper.com/category/tags/glnext
    The articles on glNext:
    "Vulkan is obviously different than Mantle in significant ways now, such as its use of SPIR-V for its shading language (rather than HLSL)."
    CORE? LOL. The core of Vulkan would then be HLSL, and that ignores all the major changes due to the GROUP effort now.

    Trevett:
    "Being able to start with the Mantle design definitely helped us get rolling quickly – but there has been a lot of design iteration, not the least making sure that Vulkan can run across many different GPU architectures. Vulkan is definitely a working group design now."

    Everything that was AMD-specific is basically gone, as is the case with DX12 (Mantle ideas, but not direct usage). Hence NV showing victories now in AMD's own Mantle showcase (Star Swarm)... ROFL. How bad is that? Worse, NV was chosen for the DX12 Forza demo, which is an AMD console game. Why didn't MS choose AMD?

    They should have spent the time they wasted on Mantle making DX12/Vulkan driver advances, not to mention DX11 driver improvements, which affect everything on the market now and probably for a while into the future (until Win10 takes over, at least - if ever, if Vulkan lands on billions of other devices first), rather than on a few Mantle games. Nvidia addressed the entire market with their R&D while AMD wasted theirs on Mantle, consoles & APUs. The downfall of AMD started with a badly overpriced ATI acquisition and has been killing them ever since.
  • TheJian - Friday, May 15, 2015

    Mantle is almost useless for FAST CPUs and is dead now (wasted R&D). It was meant to help AMD's weak CPUs, which only needed helping because they let guys like Dirk Meyer (who in 2011 said it was a mistake to spend on anything but CORE CPU/GPU, NOT APUs) & Keller go ages ago. Adding Papermaster might make up for missing Meyer though. IF they had NOT made these mistakes, we wouldn't even have needed Mantle, because they'd still be in the CPU race with much higher IPC, as we see with ZEN. You have no pricing power in APUs, as they feed poor people and are being crushed by ARM coming up and Intel going down to stop them. GAMERS (and power users) will PAY a premium for stuff like Intel and Nvidia, and AMD ignored engineers who tried to explain this to management. It is sad they're now hiring them back to create again what they never should have left to begin with. The last time they made money for the year was the Athlon era and high IPC. Going into consoles instead of spending on CORE products was a mistake too, which is why Nvidia said they ignored it. We see they were 100% correct, as consoles have made AMD nothing, and AMD lost the CPU & GPU race while dropping R&D on both, screwing the future too. The years spent on this crap caused AMD's current problems: 3yrs of zero pricing power on CPU/GPU, selling off fabs and land, laying off 1/3 of employees, etc. You can't make a profit on low-margin junk without having massive share. Now, if AMD had negotiated 20%+ margins from the get-go on consoles, maybe they'd have made money over the long haul. But as it stands they may not even recover the R&D and time wasted, as mobile kills consoles halfway through their life with die shrinks and yearly revisions, far cheaper games, and massive numbers sold yearly that are drawing devs away from consoles.

    Even now with the 300s coming (and only the top few cards are NOT rebadges, which will probably just confuse users and piss them off), Nvidia just releases a faster rehash of existing tech, waiting to answer and again keep a great product down in pricing. AMD will make nothing from the 300s. IF they had ignored consoles/APUs they would have had ZEN out already (2yrs ago? maybe 3?), and the 300s could have been made on optimized 28nm, much like Maxwell squeezed more perf out of the same process 6 months ago. Instead NV has had nearly a year to just pile up profits on an old process, with an answer waiting in the wings (980 Ti) to make sure AMD's new GPU has no pricing power.

    Going HBM when it isn't bandwidth-starved is another snafu that will keep costs higher, especially with low yields on that and the new process. But again, because of a lack of R&D (after blowing it on consoles/APUs), they needed HBM to help drop the wattage instead of having a great 28nm low-watt alternative like Maxwell that can still milk very cheap old GDDR5, which has more than enough bandwidth as speeds keep increasing. HBM is needed at some point, just not today for a company needing profits that has no cash to burn on low yields etc. They keep making mistakes and then having to make bad decisions to compensate, which stifles much-needed profits. They also need to follow Nvidia in splitting FP32 from FP64, as that will further cement NV's GPUs if they don't. When you are a professional at both things instead of a jack-of-all-trades loser at both, you win in perf and can price accordingly while keeping die size appropriate for each.

    Intel hopefully will be forced back to this on the CPU side due to ZEN as well. Zen will force Intel to respond, because they won't be able to shrink their way to keeping the GPU (not with other fabs catching Intel's fabs) while beating AMD against a die fully dedicated to CPU and IPC. Thank god too; I've been saying for ages that AMD needed to do this, and that without doing it they would never put out another Athlon that would win for 2-3yrs. I'm not even sure Zen can do this, but at least it's a step in the right direction for profits. Fortunately for AMD, an opening has been created by Intel massively chasing ARM and ignoring CPU enthusiasts and desktop pros. We have been getting crap on the CPU side since AMD exited, while Intel just piled on the GPU side, which again hurt any shot of AMD making profits here... LOL. They don't seem to understand they make moves that screw themselves longer term. Short-term thinking kills you.
  • ToTTenTranz - Wednesday, May 13, 2015

    Yes, and the APU being reviewed, the A8-7650K, also happens to be "AMD only", so why not test Mantle? There's a reasonable number of high-profile games that support it:

    - Battlefield 4 and Hardline
    - Dragon Age: Inquisition
    - Civilization: Beyond Earth
    - Sniper Elite III

    Plus another bunch coming up, like Star Wars Battlefront and Mirror's Edge.

    So why would it hurt so much to show at least one of these games running Mantle with a low-specced CPU like this?

    What is AnandTech so afraid to show by refusing to test Mantle comparisons with anything other than >$400 CPUs?
  • V900 - Thursday, May 14, 2015

    There isn't anything to be scared of, but Mantle is only available in a handful of games, and beyond those it's dead and buried.

    Anandtech doesn't run Mantle benchmarks for the same reason they don't review AGP graphics cards: It's a dead technology aside from the few people who currently use it...
  • chizow - Tuesday, May 12, 2015

    I seriously considered an A10-7850K Kaveri build last year around this time for a small power-efficient HTPC to stream DVR'd shows from my NAS, but in the end a number of issues steered me away:

    1) Need for chassis, PSU, cooler.
    2) Lack of good mini-ITX options at launch.
    3) Not good enough graphics for gaming (not a primary consideration anyways, but something fast enough might've changed my usage patterns and expectations).

    Sadly, this was the closest I've come to buying an AMD CPU product in a long, long time, but ultimately I went with an Intel NUC that was cheaper to build, smaller in form factor, and used much less power. And all I gave up was GPU performance that wasn't realistically good enough to change my usage patterns or expectations anyways.

    This is the problem AMD's APUs face in the marketplace today, though. That's why I think AMD made a big mistake in betting their future on Fusion; people just aren't willing to trade fast, efficient, top-of-the-line CPUs for a mediocre CPU/GPU combo.

    Today, there are even bigger challenges out there for AMD. Alienware offers the Alpha with an i3 and a GTX 860M that absolutely destroys these APUs in every metric for $500, $400 on sale, and it takes care of everything from chassis, PSU, and cooling to even the Windows licensing. That's what AMD is facing now in the low-end PC market, and I just can't see them competing with that kind of performance and value.
  • silverblue - Tuesday, May 12, 2015

    I would have opted for the A8-7600 instead of the 7850K, though I do admit it was very difficult to source back then. 65W mode doesn't perform much faster than 45W mode. I suppose it's all about what you want from a machine in the end, and AMD don't make a faster CPU with a weaker iGPU, which might make more sense.

    The one thing stopping AMD from releasing a far superior product, in my eyes, was the requirement to at least try to extract as much performance as possible from a flawed architecture, so they could say it wasn't a complete waste of time.
  • galta - Tuesday, May 12, 2015

    +1
    Fusion was not only poor strategy, it was poor implementation.
    Leaving aside the discussion of the merits of integrated GPUs: if AMD had done it right, we would have seen Apple adopting their processors in the MacBook line, given Apple's obsession with slim hardware with no discrete graphics.
    Have we seen that? No.
    You see, even though Intel never said that integrated graphics were the future, it was Intel who claimed the single most important customer in that market segment.
