At the time of our Skylake review of the i7-6700K and the i5-6600K, the infancy of the platform and other constraints meant we were unable to probe how performance scaled as the processors were overclocked. Our overclock testing showed that 4.6 GHz was a reasonable marker for our processors; fast forward two weeks, however, and that has all changed as updates have been released. With a new motherboard and the same liquid cooler, the same processor that previously topped out at 4.6 GHz reached 4.8 GHz with relative ease. In this mini-test, we ran our short-form CPU workload as well as integrated and discrete graphics tests at several frequencies to see where the real gains are.

In the Skylake review we stated that 4.6 GHz still represents a good target for overclockers to aim for, with 4.8 GHz indicative of a better-than-average sample. Both ASUS and MSI stated similar expectations in the press guides that accompanied our samples, although as with any launch, those expectations evolve as understanding of the platform matures over time.

In this mini-test (performed initially in haste pre-IDF, with extra testing added after analysing the iGPU data), I called on a pair of motherboards - ASUS's Z170-A and ASRock's Z170 Extreme7+ - to provide a four-point scale in our benchmarks. Starting at the 4.2 GHz turbo frequency of the i7-6700K, we tested every 200 MHz step up to 4.8 GHz in both our shortened CPU testing suite and in iGPU and GTX 980 gaming. Enough of the babble – time for fewer words and more results!


We actually got the CPU to 4.9 GHz, as shown on the right, but it was pretty unstable for even basic tasks.
(Voltage is read incorrectly on the right.)

OK, a few more words before results – all of these numbers can be found in Bench, our overclocking database, alongside the stock results, and can be compared to other processors.

Test Setup

Processor Intel Core i7-6700K (ES, Retail Stepping), 91W, $350
4 Cores, 8 Threads, 4.0 GHz (4.2 GHz Turbo)
Motherboards ASUS Z170-A
ASRock Z170 Extreme7+
Cooling Cooler Master Nepton 140XL
Power Supply OCZ 1250W Gold ZX Series
Corsair AX1200i Platinum PSU
Memory Corsair DDR4-2133 C15 2x8 GB 1.2V or
G.Skill Ripjaws 4 DDR4-2133 C15 2x8 GB 1.2V
Memory Settings JEDEC @ 2133
Video Cards ASUS GTX 980 Strix 4GB
ASUS R7 240 2GB
Hard Drive Crucial MX200 1TB
Optical Drive LG GH22NS50
Case Open Test Bed
Operating System Windows 7 64-bit SP1

The dynamics of CPU Turbo modes, both Intel's and AMD's, can cause concern in environments with a variably threaded workload. There is also the added issue of motherboard consistency, depending on how each motherboard manufacturer layers its own boosting technologies over the ones that Intel would prefer it used. In order to remain consistent, we implement an OS-level high-performance mode on all the CPUs we test, which should override any motherboard manufacturer's performance mode.

Many thanks to...

We must thank the following companies for kindly providing hardware for our test bed:

Thank you to AMD for providing us with the R9 290X 4GB GPUs.
Thank you to ASUS for providing us with GTX 980 Strix GPUs and the R7 240 DDR3 GPU.
Thank you to ASRock and ASUS for providing us with some IO testing kit.
Thank you to Cooler Master for providing us with Nepton 140XL CLCs.
Thank you to Corsair for providing us with an AX1200i PSU.
Thank you to Crucial for providing us with MX200 SSDs.
Thank you to G.Skill and Corsair for providing us with memory.
Thank you to MSI for providing us with the GTX 770 Lightning GPUs.
Thank you to OCZ for providing us with PSUs.
Thank you to Rosewill for providing us with PSUs and RK-9100 keyboards.

Frequency Scaling and the Handbrake Problem
107 Comments

  • V900 - Saturday, August 29, 2015 - link

    I seriously doubt any animation shop that actually employs people for a real salary would do that.

    They won't be in business for long, that's for sure.

    If the boss is too dumb to realize that it's way easier and cheaper, due to tax write-offs, to just go out and buy the extra computing power they need, rather than let an intern futz around with overclocking CPUs, then I have no doubt whatsoever that he could manage a hot dog stand into bankruptcy in a few weeks, let alone an animation shop!
  • npz - Sunday, August 30, 2015 - link

    First, Intel officially supports, markets, and demos the overclocking of its K-series processors, so one expects some form of actual stability. Second, "stable" is not stable.

    Finally, I also hope you realize "professional software" is NOT limited to people whose line of business allows them to write off their computers as a capital expense! Do you think everyone who owns Adobe Creative Suite can claim their computer as an expense? The last survey I read stated that the majority of users were hobbyists or prosumers (who may occasionally make money on the side with freelance or commissioned work, but whose income is not derived from their PC).
  • npz - Friday, August 28, 2015 - link

    It is also a bigger deal than most people think. "Stable enough" is not good enough since you can have silent data corruption.

    It is possible that you can run a long rendering job (video, 3D, other apps, etc.) or gaming session where the instability only momentarily affected a non-OS-critical instruction or piece of data, but still affected the program's data. So the OS continues to run, and maybe the program itself too, but now you have an unknown glitch or silently corrupted output.

    That's why I think a week of Prime95 and other methods that *verify* CPU output are critical when overclocking.
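The verification point above can be illustrated with a short Python sketch (an illustrative example, not the commenter's actual method): run a deterministic floating-point workload several times and compare checksums. On a stable system every run produces the same digest; a marginal overclock that flips bits mid-computation can produce a mismatch of exactly the silent kind described.

```python
import hashlib
import struct

def workload_checksum(iterations=100_000):
    """Run a deterministic floating-point workload and hash every result.

    On a stable system this always produces the same digest; computation
    errors from an unstable overclock would change it.
    """
    h = hashlib.sha256()
    x = 1.0
    for i in range(1, iterations + 1):
        x = (x * 1.000001 + i % 7) % 1e6   # deterministic FP arithmetic
        h.update(struct.pack('<d', x))      # fold each result into the hash
    return h.hexdigest()

def verify_run(runs=3):
    """Repeat the workload and report whether every run matched."""
    digests = {workload_checksum() for _ in range(runs)}
    return len(digests) == 1

if __name__ == "__main__":
    print("consistent" if verify_run() else "MISMATCH: possible instability")
```

A real stress test would of course use heavier, cache- and AVX-stressing arithmetic; the point is only that the output is *checked*, not just that the machine survives.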
  • npz - Friday, August 28, 2015 - link

    For example, how do we know that POV-Ray rendered its output correctly in these tests?

    For the same input and settings you should get the same output, so a final image comparison could be done between the output at stock frequency and at OC frequencies.
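The byte-for-byte comparison suggested here can be sketched in a few lines of Python. This is a hedged sketch, not anything used in the review: the file names are hypothetical, and it assumes the renderer's output is deterministic for identical input and settings.

```python
import hashlib

def file_digest(path):
    """SHA-256 of a file's contents, read in chunks to bound memory use."""
    h = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(65536), b''):
            h.update(chunk)
    return h.hexdigest()

def renders_match(stock_render, oc_render):
    """True if the two render outputs are byte-identical."""
    return file_digest(stock_render) == file_digest(oc_render)

# Hypothetical file names for illustration:
# renders_match("povray_stock_4200.png", "povray_oc_4800.png")
```

A mismatch would not say *where* the overclocked run went wrong, but it would turn silent corruption into a visible failure.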
  • Impulses - Saturday, August 29, 2015 - link

    The problem is that many systems can now pass dozens of hours of Prime95 but choke on a Handbrake encode, even at lower temps, so we can't really rely on a single test to verify an OC (not that I ever did, but many did).
  • Beaver M. - Saturday, August 29, 2015 - link

    Exactly. The fact of the matter is that proper overclocking takes a LONG, LONG time to get stable, unless you get extremely lucky. I sometimes spend months getting it stable. Even when testing with Prime95 like there's no tomorrow, it still won't prove that the system is 100% stable. You also have to test games for hours over several days, and of course other applications. But you can't really play games 24/7, so it takes quite some time.
  • sonny73n - Sunday, August 30, 2015 - link

    If you have all power saving features disabled, you only have to worry about stability under load. Otherwise, as CPU voltage and frequency fluctuate depending on the application, it may be a pain. Also, most mobos have issues with RAM once the CPU is overclocked past a certain point.
  • V900 - Saturday, August 29, 2015 - link

    That's an extremely theoretical definition of "production software".

    No professional or production company would ever overclock their machines to begin with.

    For the hobbyist overclocker who on a rare occasion needs to encode something in 4K60, the problem is solved by clicking a button in his settings and rebooting.

    I really don't see the big deal here.
  • Oxford Guy - Saturday, August 29, 2015 - link

    The problem is that overclocks should NEVER be called stable if they aren't.

    And I don't like the way AnandTech pumps ridiculous amounts of voltage into chips (like they did with the 8320E).
  • Gigaplex - Sunday, September 27, 2015 - link

    Production software, in my book, is any released software that completes a useful task, rather than just running synthetic tests.
