Bill Kircos, Intel’s Director of Product & Technology PR, just posted a blog on Intel’s site entitled “An Update on our Graphics-Related Programs”. In the blog Bill addresses future plans for what he calls Intel’s three visual computing efforts:

The first is the aforementioned processor graphics. Second, for our smaller Intel Atom processor and System on Chip efforts, and third, a many-core, programmable Intel architecture and first product both of which we referred to as Larrabee for graphics and other workloads.

There’s a ton of information in the vague but deliberately worded blog post, including a clear stance on Larrabee as a discrete GPU: "We will not bring a discrete graphics product to market, at least in the short-term." Kircos goes on to say that Intel will increase funding for integrated graphics and pursue Larrabee-based HPC opportunities, effectively validating both AMD's and NVIDIA's strategies. As different as Larrabee appeared when it first arrived, Intel now seems content to go with the flow.

My analysis of the post as well as some digging I’ve done follows.

Intel Embraces Fusion, Long Live the IGP

Two and a half years ago Intel put up a slide indicating that the company was willing to take 3D graphics more seriously:

By 2010, on a 32nm process, Intel's integrated graphics was supposed to deliver roughly 10x the performance of its 2006 parts. Sandy Bridge was supposed to be out in Q4 2010, but we'll see it shipping in Q1 2011, and it'll offer a significant boost in integrated graphics performance. I've heard it may finally be as fast as the GPU in the Xbox 360.

Intel made huge improvements to its integrated graphics with Arrandale/Clarkdale. This wasn't an accident; the company is taking graphics much more seriously. The first point in Bill's memo makes this clear:

Our top priority continues to be around delivering an outstanding processor that addresses every day, general purpose computer needs and provides leadership visual computing experiences via processor graphics. We are further boosting funding and employee expertise here, and continue to champion the rapid shift to mobile wireless computing and HD video – we are laser-focused on these areas.

Conspicuously absent from this statement is any mention of the gaming market. A laser focus on mobile wireless computing and HD video sounds a lot like an extension of what Intel integrated graphics does today, not what we hope it will do tomorrow. Intel does have a fairly aggressive roadmap for integrated graphics performance, so perhaps omitting the word gaming was intentional, a way to downplay the importance of the market its competitors currently play in.


The current future of Intel graphics

The second point is this:

We are also executing on a business opportunity derived from the Larrabee program and Intel research in many-core chips. This server product line expansion is optimized for a broader range of highly parallel workloads in segments such as high performance computing. Intel VP Kirk Skaugen will provide an update on this next week at ISC 2010 in Germany.

In a single line Intel completely validates NVIDIA's Tesla strategy. Larrabee will go after the HPC space much as NVIDIA has been doing with Fermi and previous Tesla products, and leveraging x86 can be a huge advantage there. If both Intel and NVIDIA see this much potential in parallel architectures for HPC, there must be serious money at stake.

NVIDIA Tesla   Seismic   Supercomputing   Universities   Defence   Finance
GPU TAM        $300M     $200M            $150M          $250M     $230M

NVIDIA's calculated TAM for GPUs in HPC applications

The third point is the one that drives the nail in the coffin of the Larrabee GPU:

We will not bring a discrete graphics product to market, at least in the short-term. As we said in December, we missed some key product milestones. Upon further assessment, and as mentioned above, we are focused on processor graphics, and we believe media/HD video and mobile computing are the most important areas to focus on moving forward.

Intel wasn’t able to make Larrabee performance competitive in DirectX and OpenGL applications, so we won’t be getting a discrete GPU based on Larrabee anytime soon. Instead, Intel will be dedicating its resources to improving its integrated graphics. We should see a nearly 2x improvement in Intel integrated graphics performance with Sandy Bridge, and greater than 2x improvement once more with Ivy Bridge in 2012.

All isn’t lost though. The Larrabee ISA, specifically the VPU extensions, will surface in future CPUs and integrated graphics parts. And Intel will continue to toy with the idea of using Larrabee in various forms, including a discrete GPU. However, the primary focus has shifted from producing a discrete GPU to compete with AMD and NVIDIA, to integrated graphics and a Larrabee for HPC workloads. Intel is effectively stating that it sees a potential future where discrete graphics isn’t a sustainable business and that integrated graphics will become good enough for most of what we want to do, even from a gaming perspective. In Intel’s eyes, discrete graphics would only serve the needs of a small niche if we reach this future where integrated graphics is good enough.
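To put the ISA point in perspective, Larrabee's VPU was a 512-bit vector unit operating on 16 single-precision floats per instruction, four times the width of the 128-bit SSE units in today's CPUs. The C sketch below is purely illustrative and of my own construction, not Intel code: it shows the shape of the data-parallel loop such a wide unit is built for, with the 16-float blocking mirroring the VPU's lane count.

    #include <stddef.h>

    /* Illustrative sketch: a data-parallel loop shaped for a 16-lane,
     * 512-bit vector unit like Larrabee's VPU. On a compiler targeting
     * such an ISA, the entire inner loop would collapse into a single
     * 16-wide fused multiply-add instruction. */
    #define VPU_LANES 16 /* Larrabee's VPU: 16 x 32-bit floats per operation */

    void scale_and_add(float *dst, const float *a, const float *b,
                       float k, size_t n)
    {
        /* Assumes n is a multiple of VPU_LANES, purely for brevity */
        for (size_t i = 0; i < n; i += VPU_LANES) {
            for (size_t lane = 0; lane < VPU_LANES; lane++) {
                dst[i + lane] = a[i + lane] * k + b[i + lane];
            }
        }
    }

The same lane count maps naturally onto shader work in an integrated GPU, which is presumably part of why Intel sees the VPU extensions as worth salvaging.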

Intel expects the GPU to follow the same path as the cache controller and the FPU, both of which were eventually integrated into the CPU; the days of discrete coprocessors have always been numbered. One benefit of a tightly coupled CPU-GPU is the bandwidth between the two, an advantage game consoles have exploited for years.
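A quick back-of-envelope calculation shows why that bandwidth matters. The figures below are rough, illustrative numbers of my own (theoretical PCIe 2.0 x16 and dual-channel DDR3-1333 rates), not specs for any announced product:

    #include <stdio.h>

    int main(void)
    {
        /* Rough, illustrative numbers circa 2010 */
        const double pcie2_x16_gbs  = 8.0;  /* PCIe 2.0 x16: ~8 GB/s per direction */
        const double ddr3_dual_gbs  = 21.0; /* dual-channel DDR3-1333: ~21 GB/s */
        const double working_set_gb = 0.25; /* hypothetical 256MB of per-frame data */

        /* Time to move the working set across each path, in milliseconds */
        printf("Over PCIe 2.0 x16:     %.1f ms\n",
               working_set_gb / pcie2_x16_gbs * 1000.0);
        printf("Through shared memory: %.1f ms\n",
               working_set_gb / ddr3_dual_gbs * 1000.0);
        return 0;
    }

Moving a hypothetical 256MB working set over PCIe takes well over twice as long as streaming it through shared system memory, and an on-die GPU can avoid many of those copies altogether.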

This does conflict (somewhat) with AMD's vision of a functional Holodeck in six years, but that's why Intel put the "at least in the short-term" qualifier in its statement. I believe Intel plans on making integrated graphics good enough for nearly all 3D gaming over the next five years. I'm not sure AMD's Fusion strategy is much different.

For years Intel made a business case for delivering cheap, barely accelerated 3D graphics on aging process technologies. The company has apparently recognized the importance of the GPU and is changing direction: it will commit more resources (both in development and in actual transistor budget) to the graphics portion of its CPUs going forward. Sandy Bridge will be the beginning, and the ramp from there will probably mimic what ATI and NVIDIA did with their GPUs over the years, with a constant doubling of transistor count. Intel has purposefully limited the GPU transistor budget in the past; from what I've heard, that limit is now gone. It starts with Sandy Bridge, but I don't think we'll be impressed until 2013.

What About Atom & Moorestown?

Anything can happen, but by specifically calling out the Atom segment, Intel gives the impression that it wants to build its own low-power GPU core for use in SoCs. Currently that IP is licensed from Imagination Technologies, a company in which Intel holds a 16% stake, but eventually Intel may replace it with an in-house design.

Previous Intel graphics cores haven't been efficient enough to scale down to the smartphone SoC level. I get the impression that Intel plans (if it isn't doing so already) to create an Atom-like GPU team to work on extremely low power graphics cores, which would ultimately eliminate the need to license third-party graphics IP for Intel's SoCs. Planning and succeeding are two different things, so only time will tell whether Imagination has a long-term future at Intel. The next three years, at least in Intel's smartphone/SoC products, are pretty much guaranteed to be full of Imagination graphics.

Final Words

Intel cancelled plans for a discrete Larrabee graphics card because it could not produce one that was competitive with existing GPUs from AMD and NVIDIA in current games. Why Intel lacked the foresight to avoid getting to this point is tough to say. The company may have been too optimistic, or it may simply have lacked the experience of building a discrete GPU, something it hadn't done in more than a decade. Maybe Larrabee truly was Pat Gelsinger's baby.

This also validates AMD's and NVIDIA's strategies and their public responses to Larrabee. Both companies often said that the most efficient approach to 3D graphics was not through x86 cores but through their own specialized, yet still programmable, hardware. The x86 tax would effectively always put Larrabee at a disadvantage; it would matter less when running Larrabee-native code, but DirectX and OpenGL performance is another situation entirely. Intel executed too poorly, and NVIDIA and most definitely AMD executed too well. Intel couldn't put out a competitive Larrabee quickly enough; it fell too far behind.

A few years ago Intel attempted to compete in the ARM SoC space with an ARM-based chip of its own: XScale. Intel eventually admitted defeat and sold off XScale, stating that it was too late to the market. It has since focused on the future of the SoC market with Moorestown. Rather than compete in a maturing market, Intel is attempting to get a foot in the door on the next evolution of that market: high-performance SoCs.

I believe Intel may be attempting the same thing with its graphics strategy. Seeing little hope for a profitable run at discrete graphics, Intel is turning its eye to unexplored territory: the hybrid CPU-GPU. Focusing its efforts there, if successful, would be far easier and far more profitable than struggling to compete in the discrete GPU space.

The same goes for using Larrabee in the HPC space. NVIDIA is the most successful GPU company in HPC and even its traction has been limited. It’s still early enough that Intel could show up with Larrabee and take a big slice of the pie.

Clearly AMD sees value in the integrated graphics strategy; it spent over $5 billion acquiring ATI in order to bring Fusion to market, and next year we'll see the beginnings of that merger come to fruition. Intel's announcement today validates not only NVIDIA's HPC strategy but also AMD's acquisition of ATI. While Larrabee as a discrete GPU cast a shadow of confusion over the future of the graphics market, Intel's focus on integrated graphics and HPC is much more harmonious with AMD's and NVIDIA's roadmaps. We used to not know who had the right approach; now we have one less approach to worry about.

Whether Intel is committed enough to integrated graphics remains to be seen. NVIDIA has no current integrated graphics strategy (unless it can work out a DMI/QPI license with Intel). AMD's strategy is very similar to what Intel is proposing today, and has been for some time, but AMD at least has far more mature driver and ISV compatibility teams behind its graphics cores. Intel has a lot of catching up to do in this department.

I'm also unsure what AMD and Intel see as the future of discrete graphics. Based on the current trajectory you wouldn't have high hopes for it, but as today's announcement shows, anything can change. Plus, I doubt the Holodeck will run optimally on an IGP.

Comments

  • ImSpartacus - Tuesday, May 25, 2010 - link

    I'm kinda liking these shorter articles. Adaboy Ryan!

    ...just don't completely kill off your in-depth reviews, OK AT?
  • formulav8 - Tuesday, May 25, 2010 - link

    I'm not even a little bit surprised. Not much more to say I guess...

    Jason
  • RaiderJ - Tuesday, May 25, 2010 - link

    My guess is that AMD is going to have some major design wins with Fusion. Having a single chip that will meet the needs of a large section of the computing population will be very valuable, in both cost and power consumption.

    Apple especially may be interested in Fusion for their products. They do not need a large number of chips compared to a vendor such as Dell, but they demand a certain amount of performance and battery life (in the case of their laptops). Look at the current MacBooks: they have three chips that could potentially be replaced with a single chip (Intel CPU + Intel GPU + NVIDIA GPU). Not hard to imagine a Fusion chip replacing all three.

    With Intel and NVIDIA not playing nicely, it's not hard to imagine that their products will be even harder to integrate in the future - Optimus isn't exactly an ideal approach. Makes me wonder if Intel would eventually buy out NVIDIA, even if that isn't something NVIDIA would probably ever go for.

    As for integrated GPU products eliminating the need for discrete GPUs, I think that is very unlikely. With technology such as Crossfire or SLI, it doesn't seem hard to imagine a situation where you could add in a discrete GPU and have it work in tandem with an integrated GPU.
  • stalker27 - Tuesday, May 25, 2010 - link

    How does "We will not bring a discrete graphics product to market, at least in the short-term" translate to "Intel Kills Larrabee GPU"?

    Killing a product means terminating it completely... Larrabee might not show its face now or next year, but by 2015 it should have evolved a few shrinks and revisions to be considered something.
  • Anand Lal Shimpi - Tuesday, May 25, 2010 - link

    From what I have heard the folks working on Larrabee graphics are being transitioned to integrated graphics. The Larrabee HPC product still has all of the graphics stuff in it but there is no active development on a discrete GPU product. There will be some path finding missions and experiments for sure but no products planned.

    The Larrabee add in card is off the roadmaps.
  • MamiyaOtaru - Thursday, May 27, 2010 - link

    but how is this any different from what we heard in December? ( http://arstechnica.com/hardware/news/2009/12/intel... ). Is it just more official now?
  • philosofool - Tuesday, May 25, 2010 - link

    "Intel is effectively stating that it sees a potential future where discrete graphics isn’t a sustainable business and that integrated graphics will become good enough for most of what we want to do, even from a gaming perspective."

    This is key. Intel is predicting a future in which the discrete graphics card goes the way of the discrete sound card. Remember those? They're basically a niche solution today and only a handful of people need them.

    I think Intel is right to think that graphics will be similar. As PC gaming loses ground to consoles and integrated solutions become more and more capable of handling PC games, the size of the market for discrete cards will shrink. The vast majority of all discrete cards sold are in the sub-$100 segment; that segment will cease to exist in about five years. I'm not saying that discrete cards won't exist--they will, at least for the next few years--but Intel doesn't see this as a market worth investing in, because only highly specialized applications with a fairly small audience will have an interest in these products. The barriers to entry in the discrete card market are large because it takes a lot to get a graphics platform working, and the future is neither long nor bright for discrete cards in the sector where they currently make money. There's just not much money to be made competing with two established manufacturers like Nvidia and ATI.
  • AlexWade - Wednesday, May 26, 2010 - link

    Discrete graphics aren't going to go anywhere because integrated graphics cannot be as good as add-on graphics. Discrete graphics are going to exist so long as computer game producers keep pushing the boundaries of realism with each new game. Sound cards died because the integrated ones were just as good as the add-on ones. The same can never be said of integrated graphics.

    Until people stop wanting to play computer games at the highest settings, discrete graphics will not go away. The people who demand the best may be a small market, but there are enough of them to be very profitable.
  • TinyTeeth - Wednesday, May 26, 2010 - link

    Just want to say that I agree. What philosofool is saying is completely off and Intel's take on this is obviously influenced by them just having shut down their own venture into discrete graphics, possibly because of technological challenges that simply became too much to handle.

    Larrabee was an interesting project and I'm sorry to see it go, but the GPU market will do pretty well even without Intel.
  • philosofool - Thursday, May 27, 2010 - link

    You misunderstood. I didn't say that they were going away, I said that they would become a decreasingly important segment of the market--it's *not* about the continued existence of discrete cards, it's about the continued existence of a large and profitable section of the market.

    Currently, about 90% of the graphics cards Nvidia and ATI sell are in the sub-$100 range. That means most of the money in the discrete card business is in cheap cards, not gamer cards. But integrated graphics will compete at that non-gamer performance level within the next few years. As the memory bandwidth of new DDR generations increases and integrated graphics move on-chip, where they can share a cooling solution with the CPU, the limitations of integrated graphics will shrink, and the gap between an Intel IGP and a low-end discrete card will shrink with it. That means a smaller market, and a smaller market means less money.
