Test Setup

Processor: AMD A8-7650K (2 modules / 4 threads, 3.3 GHz base, 3.7 GHz turbo, 95W TDP, MSRP $105)
Motherboard: GIGABYTE F2A88X-UP4
DRAM: G.Skill RipjawsZ 4x4GB DDR3-2133 9-11-10
Low-End GPUs: Integrated graphics; ASUS R7 240 2GB DDR3; Dual Graphics (APU + R7 240)
Mid-Range GPUs: MSI R9 285 Gaming 2GB; MSI GTX 770 Lightning 2GB
High-End GPUs: MSI R9 290X Gaming LE 4GB; ASUS GTX 980 Strix 4GB
Power Supply: OCZ 1250W Gold
Storage Drive: Crucial MX200 1TB
Operating System: Windows 7 SP1 64-bit (Build 7601)
CPU Cooler: Cooler Master Nepton 140XL CLC

Many thanks to...

We must thank the following companies for kindly providing hardware for our test bed:

Thank you to AMD for providing us with the R9 290X 4GB GPUs.
Thank you to ASUS for providing us with GTX 980 Strix GPUs and the R7 240 DDR3 GPU.
Thank you to ASRock and ASUS for providing us with some IO testing kit.
Thank you to Cooler Master for providing us with Nepton 140XL CLCs.
Thank you to Corsair for providing us with an AX1200i PSU.
Thank you to Crucial for providing us with MX200 SSDs.
Thank you to G.Skill and Corsair for providing us with memory.
Thank you to MSI for providing us with the GTX 770 Lightning GPUs.
Thank you to OCZ for providing us with PSUs.
Thank you to Rosewill for providing us with PSUs and RK-9100 keyboards.

AMD A8-7650K Overclocking

Methodology

Our standard overclocking methodology is as follows: we first select the automatic overclock options and test for stability with PovRay and OCCT to simulate high-end workloads. These stability tests aim to catch any immediate memory or CPU errors.

Manual overclocking, based on the information gathered from previous testing, starts at a nominal voltage and CPU multiplier. The multiplier is increased until the stability tests fail, then the CPU voltage is increased gradually until the tests pass again, and the process is repeated until the motherboard reduces the multiplier automatically (due to safety protocols) or the CPU temperature reaches a stupidly high level (100ºC+). Our test bed is not in a case, and the fresher (cooler) air should push overclocks slightly higher.
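To make the search loop explicit, here is a minimal, self-contained sketch in Python. Every number in it is an assumption for illustration only: the voltage step, the 1.55 V ceiling, and the toy is_stable/temperature functions merely stand in for the hours of PovRay/OCCT runs and hand-made BIOS changes described above.

```python
VOLTAGE_STEP = 0.025   # assumed vcore increment per retry, in volts
MAX_VOLTAGE = 1.55     # assumed safety ceiling; not from the article
TEMP_LIMIT_C = 100     # abort threshold noted in the methodology


def is_stable(multiplier: int, voltage: float) -> bool:
    """Stand-in for a PovRay/OCCT run: this toy model demands roughly
    0.05 V more for each multiplier step past 37 (3.7 GHz)."""
    return voltage >= 1.24 + 0.05 * (multiplier - 37)


def temperature(voltage: float) -> float:
    """Toy thermal model: load temperature rises with voltage."""
    return 40 + 120 * (voltage - 1.20)


def find_max_overclock(multiplier: int = 38, voltage: float = 1.25):
    best = None
    while True:
        # Raise the voltage until the stability tests pass.
        while not is_stable(multiplier, voltage):
            voltage = round(voltage + VOLTAGE_STEP, 3)
            if voltage > MAX_VOLTAGE:
                return best          # hit the safety ceiling
        # Stop once temperatures reach a stupidly high level.
        if temperature(voltage) > TEMP_LIMIT_C:
            return best
        best = (multiplier, voltage)
        multiplier += 1              # stable here, so try the next step


print(find_max_overclock())  # -> (43, 1.55) under this toy model
```

The structure mirrors the methodology: voltage rises until the tests pass, and the search ends at the first sign of a safety limit, returning the last known-stable multiplier/voltage pair.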

Overclock Results

The base frequency of the A8-7650K goes up to 3.7 GHz in its highest turbo mode, and we were able to jump straight to 4.0 GHz without much of a problem. That being said, our sample did not move much beyond that: 4.1 GHz was achievable, but at 4.2 GHz we noticed the CPU frequency would decrease during sustained workloads, resulting in zero overall performance gain.


177 Comments


  • TrackSmart - Wednesday, May 13, 2015 - link

This comment is for Ian Cutress:

First, thank you for the review, which was rich with performance figures and information. That said, something seems missing from the conclusion: the article doesn't really end with a clear verdict or recommendation, which is what many of us come here for.

    It's nice to hear about your cousin-in-law's good experiences, but the conclusion doesn't clearly answer the key question I think many readers might have: Where does this product fit in the world of options to consider when buying a new processor? Is it a good value in its price range? Should it be ignored unless you plan to use the integrated graphics for gaming? Or does it offer enough bang-for-the-buck to be a viable alternative to Intel's options for general non-gaming usage, especially if motherboard costs are considered? Should we consider AMD again, if we are in a particular niche of price and desired features?

    Basically, after all of your time with this chip and with your broader knowledge of the market offerings, what is your expert interpretation of the merits or demerits of considering this processor or its closely related AMD peers?
  • Nfarce - Thursday, May 14, 2015 - link

    " Ultimately AMD likes to promote that for a similarly priced Intel+NVIDIA solution, a user can enable dual graphics with an APU+R7 discrete card for better performance."

I have *long* wondered why Intel and Nvidia don't get together and figure out a way to pair the on-board graphics of Intel's CPUs with a discrete Nvidia GPU. It just seems such a waste for those of us who build our rigs around discrete video cards and simply disable the CPU's on-board graphics. Game developers could code their games to take advantage of this as well for better performance. Right now, game developer Slightly Mad Studios claims their Project Cars racing simulation draws PhysX from the CPU and not a dedicated GPU. However, I have yet to find that definitively true based on benchmarks... I see no performance difference in that game between assigning PhysX to my GPUs (970 SLI) or CPU (4690K 4.7GHz) in the Nvidia control panel.
  • V900 - Thursday, May 14, 2015 - link

    Something similar to what you're describing is coming in DX12...

But the main reason they haven't is that it doesn't make any sense unless you're one of the few people who got an AMD APU because your total CPU+GPU budget is around $100.

First of all, the performance you get from an Intel iGPU in a desktop system is minimal compared to even a $200-300 Nvidia card. And secondly, if you crank up the iGPU on an Intel CPU, it may take away some of the CPU's performance/overhead.

If we're talking about a laptop, taking watts away from the CPU and negatively impacting overall battery life are even bigger drawbacks.
  • Nfarce - Thursday, May 14, 2015 - link

    "But the main reason they haven't is because unless youre one of the few people who got an AMD APU because your total CPU+GPU budget is around 100$ it doesn't make any sense."

Did you even read the hardware I have? Further, reading benchmarks of the built-in HD 4600 graphics of i3/i5/i7 CPUs shows me that it is a wasted resource. And regarding impact on CPU performance: considering that higher resolutions (1440p and 4K) and higher quality/AA settings are more dependent on GPU performance than CPU performance, the theory that utilizing onboard CPU graphics alongside a dedicated GPU would decrease overall performance is debatable. In most games I see little difference between my 4690K highly overclocked to 4.7GHz and running at its stock 3.9GHz turbo frequency.

    All we have to go on currently is 1) Intel HD 4600 performance alone in games, and 2) CPU performance demands at higher resolutions on games with dedicated cards.
  • UtilityMax - Friday, May 15, 2015 - link

I am guessing that they didn't get together because dual-graphics is very difficult to make work right. AMD puts effectively the same type of GPU cores in its discrete GPUs and integrated APUs, and it still took them a while to make it work at all.
  • V900 - Thursday, May 14, 2015 - link

I guess one thing we all learned today, besides the fact that AMD's APUs still kinda blow, is that there is a handful of people who are devoted enough to their favorite processor manufacturer to seriously believe that:

A: Intel is some kind of evil and corrupt empire à la Star Wars.

    B: They're powerful enough to bribe/otherwise silence "da twooth" among all of Anandtech and most of the industry.

    C: 95% of the tech press is corrupt enough to gladly do their bidding.

    D: Mantle was an API hardcoded by Jesus Christ himself in assembler language. It's so powerful that if it got widespread, no one would need to buy a new CPU or GPU the rest of this decade. Which is why "they" forced
  • V900 - Thursday, May 14, 2015 - link

    Which is why "they" forced AMD to cancel Mantle. Then Microsoft totally 110% copied it and renamed it "DX12".

Obviously all of the above is 100% logical, makes total sense, and is much more likely than AMD having released shoddy CPUs for the last decade and the press acknowledging that.
  • wingless - Thursday, May 14, 2015 - link

    AMD DOMINATION!!!!! If only the charts looked like that with discrete graphics as well....
  • Vayra - Friday, May 15, 2015 - link

I still really can't see a scenario where the APU would be the best choice. Well, there may be one: those on a very tight budget who wish to play games on PC regardless. But this would mean that AMD has designed and reiterated a product that only finds its market in the least interesting group of consumers: those who want everything for nothing... Not really where you want to be.
  • UtilityMax - Friday, May 15, 2015 - link

Well, arguably, if one has $500 or less for a gaming PC build right now, it would be better to buy a PlayStation 4. High-end builds are where the money is in the enthusiast gaming market.
