The Claims

As with any launch, Intel has numbers in abundance to explain how Skylake's performance and experience improve on previous designs as well as on the competition.

As with Haswell and Broadwell, Skylake is a mobile-first design for Intel. As with any processor development, the central idea is to pick one power point as the most efficient and extend that efficiency window as far in either direction as possible. During IDF, Intel stated that covering an efficiency window from 4.5W to 91W is a significant challenge, and we agree, as is improving both performance and power consumption over Broadwell at every stage.

Starting at 4.5W, we spoke extensively with several groups at Intel during IDF as a result of our Broadwell-Y coverage. From Intel's perspective, Broadwell-Y designs were almost too wide-ranging, especially for what is its premium low-power, high-performance product; vendors placing the chip in ill-defined chassis far from Intel's recommended designs raised concerns about the final performance and user experience. As a result, Intel's guidelines to OEMs have been tightened this generation so that designers building the cheaper plastic Core M implementations can tune their designs to get the best out of the silicon. Intel has been working with a few of these (both entry Core M and premium models) to put that user experience model into practice.

Overall, however, Intel is claiming 40% better graphics performance for Core M with the new Generation 9 (Gen9) implementation, along with battery savings and compatibility with new features such as RealSense. Because Core M will find its way into products from tablets to 2-in-1s and clamshells, we have been told that the Skylake design should hit a home run against the best-selling tablets on the market, paired with an appropriate Windows 10 experience. When we get units in for review, we will see how that claim holds up from our perspective.

For the Skylake-Y to Skylake-U transition (and in part, Skylake-H), Intel is claiming a 60% gain in efficiency over Haswell-U. This means either 60% less active power during media consumption or 60% more CPU performance at the same power, as measured by synthetics, specifically SPECint_base_rate2006. The power consumption gains come from updates to the Gen9 graphics, such as multi-plane overlay and fixed-function decoders, as well as additional power/frequency gating between the unslice and the slices; we will cover this later in the review. The GPU itself, due to the new functionality, is claimed to deliver 40% better graphics performance for Core M in 3DMark synthetic tests.
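To put that headline figure in concrete terms, here is a minimal sketch (in Python, using hypothetical baseline numbers rather than measured SPECint_rate_base2006 scores or Intel power data) of the two readings of a 60% efficiency claim: the same work at lower power, or more throughput at the same power.

```python
# Hypothetical Haswell-U baseline: illustrative numbers only,
# not measured benchmark results or Intel power figures.
haswell_power_w = 15.0          # active power during the workload
haswell_score   = 100.0         # arbitrary throughput units

efficiency_gain = 0.60          # the claimed 60% improvement

# Reading 1: same performance, lower power (e.g. media playback).
skylake_power_same_perf = haswell_power_w * (1 - efficiency_gain)

# Reading 2: same power, higher performance (e.g. SPEC throughput).
skylake_score_same_power = haswell_score * (1 + efficiency_gain)

print(f"Same work:  {haswell_power_w:.1f} W -> {skylake_power_same_perf:.1f} W")
print(f"Same power: {haswell_score:.1f} -> {skylake_score_same_power:.1f} (perf units)")
```

Which reading applies depends on the scenario Intel quotes (media playback power versus synthetic CPU throughput), so the sketch simply shows both interpretations side by side.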

While not launching today, Intel’s march on integrated graphics is also set to continue. With the previous eDRAM parts, Intel took the crown for absolute IGP performance from AMD, albeit in a completely different price band. With Skylake, the introduction of a 4+4e model means that Intel’s modular graphics design now extends from GT1 to GT4, with GT4e offering 72 execution units and 128MB of eDRAM in tow. This leads to the claim that GT4e is set to match or beat a significant proportion of the graphics market today.
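As a rough back-of-the-envelope check on that 72 EU figure, the sketch below estimates peak FP32 throughput. It assumes the commonly cited Gen9 rate of 16 FP32 operations per EU per clock (two SIMD-4 FMA units) and a hypothetical ~1.0 GHz graphics clock; neither number comes from this article, so treat the result as an approximation only.

```python
# Back-of-the-envelope peak FP32 estimate for a 72 EU Gen9 part.
# Assumptions (not from the article): 16 FP32 ops/EU/clock
# (2x SIMD-4 ALUs, an FMA counted as 2 ops) and a ~1.0 GHz clock.
eus             = 72
flops_per_clock = 16          # per EU, FP32, with FMA
clock_ghz       = 1.0         # hypothetical sustained graphics clock

peak_gflops = eus * flops_per_clock * clock_ghz
print(f"Estimated peak: {peak_gflops:.0f} GFLOPS FP32")   # ~1152 GFLOPS
```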

Back in our Skylake-K review, we were perhaps unimpressed with the generational gain in clock-for-clock performance, although improved multi-threading and frequency ranges helped push the out-of-the-box experience. The other side of that performance is power draw, and because Skylake is another mobile-first processor, the power aspect becomes especially important in mobile devices. We will go through some of the developments made to improve power consumption in this article.

Comments

  • jimmy$mitty - Thursday, September 3, 2015 - link

    Is it your love of AMD that makes you say this? Think about it. The XB1 uses DDR3 for its GPU. This will use DDR4. The XB1 has a small eDRAM cache. Skylake has a small eDRAM cache. The XB1 has a very weak AMD Jaguar based CPU. This will have a much stronger Skylake based CPU.

So why is it so far-fetched to think that Skylake could get close to matching the XB1? It won't outright beat it, not this one, maybe the next one, but it could get close with proper optimizations and DX12.

    http://www.anandtech.com/show/6993/intel-iris-pro-...

    http://www.anandtech.com/show/9320/intel-broadwell...

    Haswell beat the top end AMD APU at the time and Broadwell makes the current A10 look even worse.

    AMD is great if you are on a budget. But if you are looking simply for performance they are lagging behind in a lot of ways.
  • JKflipflop98 - Sunday, September 6, 2015 - link

    Ah, I wondered who would make an actually well-reasoned posting. I am not surprised to see it's you.
  • tipoo - Wednesday, September 2, 2015 - link

    I didn't say it was a good value. Just interesting how times have changed, that Intel integrated graphics are this close to a two year old console already.
  • eddman - Thursday, September 3, 2015 - link

    Yes, they "could" care less.
  • MobiusPizza - Friday, September 4, 2015 - link

As ArsTechnica and TechReport (http://arstechnica.co.uk/gadgets/2015/09/intels-sk...) have noted, eDRAM has a performance advantage even for people with discrete GPUs.
  • anubis44 - Tuesday, September 8, 2015 - link

    "I guarantee it that anyone interested in PC gaming could care less about Intel's IGP as any serious gamer will be getting a Skylake laptop with a Maxwell and next year a Pascal GPU."

I would argue that anyone interested in PC gaming will avoid laptops like the plague and buy/build a desktop PC so they can replace graphics/RAM/CPU easily and pay a lot less for a DX12 card. On that note, anyone wanting to build a DX12-ready gaming machine right now will be getting a Radeon 290/390(X) series card and skipping Maxwell altogether, as it doesn't support hardware asynchronous shaders.
  • ered - Sunday, February 14, 2016 - link

Well, when the MacBook gets it, you can stream your screen to the Apple TV, connect an Xbox One/PS4 controller, and play like you're on a console, having similar graphics and at the same time a computer for school etc. But of course these devices are not competitors to consoles; it's just interesting what is possible.
  • TallestJon96 - Wednesday, September 2, 2015 - link

You actually make a great point. Despite the fact that on a desktop an i5 paired with a $200 GPU will crush integrated graphics, on a laptop a 72 EU CPU could do some serious work. This, paired with DDR4, could kick integrated graphics up a notch, which is good for everyone, as it raises the lowest common denominator.

Like you say, it probably won't be long until integrated graphics catch up with the Xbone, especially as they have a CPU advantage in many cases, and with DDR4 they have VERY similar system memory. It'll be a few more years after that until the PS4 is caught up with. I would add that tablets will probably catch the Xbone before the end of this generation. It could be an interesting future, where games could come to tablet, PC, and consoles simultaneously.
  • Stochastic - Wednesday, September 2, 2015 - link

    "... as it raises the lowest common denominator." That's the important bit. One reason there aren't more PC gamers is simply that there aren't that many people who have modern PCs powerful enough to run today's games. This limits the technical ambition of PC games as developers have to keep in mind the wider PC audience and not just us tech enthusiasts. If integrated graphics can continue improving generation to generation, in a few years time even $600 laptops will be capable of running games at comparable fidelity to the Xbox One. Adding substantive amounts of eDRAM to all integrated GPUs would go a long ways towards making that dream a reality.
  • flashpowered - Wednesday, September 2, 2015 - link

I am hoping to replace my Arrandale laptop with an ultrabook, and really hope that the 15W or 28W Iris with eDRAM can give me something with a high-resolution display and a smoother-running UI than Retina Haswell/Broadwell.
