AMD Radeon HD Mobile Graphics Introduction

While AMD's Radeon HD cards are extremely competitive on the desktop, the notebook space is far more complicated. Mobile Radeons and GeForces are both fairly common with neither one owning the market more aggressively than the other; this is actually an unusual equilibrium, as each generation of notebooks has typically seen a massive swing favoring one vendor over the other.

So what is AMD offering that you can't find on NVIDIA hardware? Arguably superior anti-aliasing and image quality, typically slightly higher performance than competing mobile parts, and support for Eyefinity. You'll find GDDR5 more frequently employed with AMD's chips to help mitigate narrow memory bus widths, too.

The essential problem with Radeons right now is that outside of Eyefinity they're still playing catch-up with NVIDIA's mobile solutions. Performance may be excellent in some cases, but NVIDIA leverages Optimus across their 500M line while support for switchable graphics in the Radeon HD 6000M series is spotty. NVIDIA's Verde mobile graphics driver initiative is also very mature, while support for AMD's mobile graphics driver across vendors is again spotty. That last point isn't entirely AMD's fault: vendors like Toshiba and Sony inexplicably opt out of the program despite the drivers working fine on their hardware. Finally, there are going to be niche cases where NVIDIA's support for CUDA and PhysX is relevant. OpenCL may eventually become the standard, but professional grade applications like Adobe Premiere Pro CS5 and CS5.5 can get a substantial boost from NVIDIA kit (provided you hack the "validated GPU" list to include yours).

There's one more comparative problem with AMD's lineup: while NVIDIA took their 500M series (largely an exercise in rebranding) as an opportunity to do some housekeeping, AMD basically integrated the entire Mobility Radeon HD 5000 line into the 6000Ms. Feature-wise this isn't a major issue, but it results in an incredibly bloated mobile lineup, with mobile chips from the Evergreen line occupying the same series as newer chips from the Northern Islands refresh.

AMD Radeon HD 6300M
80 Shaders, 8 TMUs, 4 ROPs, Core Clocks: 500MHz (6330M/6350M) or 750MHz (6370M)
64-bit Memory Bus, DDR3, Effective Memory Clocks: 1.6GHz (6330M/6350M) or 1.8GHz (6350M/6370M)
Desktop Counterpart: Radeon HD 5450 (Cedar)

The 6300M series is the carryover/rebadging of the Mobility Radeon HD 5400 line. This is roughly the same graphics core as is integrated into Brazos, featuring a memory bus that's honestly just too narrow to really handle any serious gaming. With the advent of Sandy Bridge, it's also outclassed by Intel's integrated graphics hardware and as a result remains more of a solution for corner cases where an inexpensive dedicated graphics processor is needed. (No review available, but the Mobility Radeon HD 5470 in the Dell Studio 14 is comparable.)

AMD Radeon HD 6400M
160 Shaders, 8 TMUs, 4 ROPs, Core Clocks: 480MHz-800MHz
64-bit Memory Bus, DDR3 or GDDR5 (6490M only), Effective Memory Clocks: 1.6GHz (DDR3) or 3.2GHz (GDDR5)
Desktop Counterpart: Radeon HD 6450 (Caicos)

Doubling the shader count of Cedar helps the mobile Caicos reach parity with Sandy Bridge's IGP in the 6430M and 6450M and then beat it with the 6470M and GDDR5-equipped 6490M. What the 6400M brings to the table is what AMD as a whole brings to the table compared to Intel's graphics: better game compatibility and Eyefinity multi-monitor support. Hardware with 64-bit memory buses should still be confined to running games at 1366x768, and heavier games are going to be off limits, but the 6400M series should satisfy more casual players. (Toshiba Tecra R850 for the HD 6450M; HP EliteBook 8460p for the HD 6470M.)
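The bandwidth penalty of these 64-bit buses is easy to quantify with the standard formula: peak bandwidth equals bus width in bytes times the effective transfer rate. A minimal sketch using the spec-table figures above (the figures for the 128-bit parts are pulled from the 6500M/6700M tables that follow):

```python
# Theoretical peak memory bandwidth: bus width (bits) / 8 = bytes per
# transfer, times effective transfer rate (GT/s), gives GB/s.

def bandwidth_gbps(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * effective_clock_ghz

# 6400M-class DDR3: 64-bit bus at 1.6 GT/s effective
print(bandwidth_gbps(64, 1.6))   # 12.8 GB/s
# 6490M with GDDR5: same narrow bus, doubled data rate
print(bandwidth_gbps(64, 3.2))   # 25.6 GB/s
# A 128-bit bus with 3.6 GT/s GDDR5 (6500M/6700M class) for comparison
print(bandwidth_gbps(128, 3.6))  # 57.6 GB/s
```

Even with GDDR5, the 64-bit parts top out at less than half the bandwidth of their 128-bit siblings, which is why they remain confined to 1366x768 gaming.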

AMD Radeon HD 6500M
400 Shaders, 20 TMUs, 8 ROPs, Core Clocks: 500-650MHz
128-bit Memory Bus, DDR3 or GDDR5 (6570M only), Effective Memory Clocks: 1.8GHz (DDR3) or 3.6GHz (GDDR5)
Desktop Counterpart: Radeon HD 5570/5670 (Redwood)

AMD opted to employ a very close derivative of this core for Llano, and it should really be the minimum for gamers looking to play on a Radeon. A GDDR5-equipped model will go a long way towards improving performance at higher resolutions, but generally speaking the 6500M series will at least be fine for pushing settings at 1366x768 and most games at 1600x900. This is a rebadge of the Mobility Radeon 5600/5700 series. (No review available, but the Mobility Radeon HD 5650 in the Compal NBLB2 is comparable.)

AMD Radeon HD 6600M/6700M
480 Shaders, 24 TMUs, 8 ROPs, Core Clocks: 500-725MHz
128-bit Memory Bus, DDR3 or GDDR5, Effective Memory Clocks: 1.6GHz (6630M/6730M) or 1.8GHz (6650M) or 3.6GHz (6750M/6770M)
Desktop Counterpart: Radeon HD 6570/6670 (Turks)

Bifurcating a single chip into two lines and then not even using the class of memory as a signifier is one of the more baffling decisions you'll find in this guide (though the prize has to go to NVIDIA's GT 555M), but AMD did the same thing with the 5600M/5700M series. GDDR5 is always going to be preferable to allow the graphics core to stretch its legs, but generally speaking this is a more minor, incremental improvement on its predecessor than Caicos was on Cedar, and the same rules for the 6500M apply here. (Look at the results for the 6630M in our Llano review.)

AMD Radeon HD 6800M
800 Shaders, 40 TMUs, 16 ROPs, Core Clocks: 575MHz-675MHz
128-bit Memory Bus, DDR3 or GDDR5, Effective Memory Clocks: 1.6GHz (DDR3) or 3.2GHz (6850M GDDR5) or 4GHz (6870M)
Desktop Counterpart: Radeon HD 5770 (Juniper)

The astute reader is going to notice that, once again, AMD has rebranded their last generation, this time the 5800M series. While there are specs for DDR3-powered versions, the GDDR5-based ones are far more common in the wild. That's good, because the 128-bit memory bus is too anemic on its own to feed 800 of AMD's shader cores. Serious gamers are going to want to look at the 6800M as a minimum for gaming at 1080p. It's important to note that the 6800M is still going to be consistently slower than the desktop 5750 and 5770 due to substantially reduced core clocks (the desktop chips start at 700MHz). The 6870M is also just 25MHz slower than the Mobility Radeon HD 5870, so as I mentioned before, these are going to be a solid choice for gamers. (No review available, but the Mobility Radeon HD 5870 in the ASUS G73Jh is comparable.)
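The clock-driven gap between the 6870M and the desktop 5770 can be sketched with a back-of-the-envelope throughput estimate: shaders times two FLOPs per clock (multiply-add) times core clock. This is peak theoretical throughput only, not real-world performance, and the 850MHz desktop HD 5770 clock is taken from AMD's desktop specs rather than the tables above:

```python
# Rough single-precision throughput: shader count * 2 FLOPs per clock
# (multiply-add) * core clock. Both chips have 800 shaders, so the clock
# deficit alone accounts for the mobile part's lower peak throughput.

def gflops(shaders: int, core_clock_mhz: int) -> float:
    """Peak single-precision throughput in GFLOPS."""
    return shaders * 2 * core_clock_mhz / 1000

print(gflops(800, 675))  # 6870M at 675MHz: 1080.0 GFLOPS
print(gflops(800, 850))  # desktop HD 5770 at 850MHz: 1360.0 GFLOPS
```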

AMD Radeon HD 6900M
960 Shaders, 48 TMUs, 32 ROPs, Core Clocks: 580MHz (6950M) or 680MHz (6970M)
256-bit Memory Bus, GDDR5, Effective Memory Clocks: 3.6GHz
Desktop Counterpart: Radeon HD 6850 (Barts)

This is as powerful as it gets on the AMD side. The 6970M is going to be somewhat outclassed by the GTX 580M, but should tangle just fine with the 570M and thoroughly trounce anything slower. Likewise, you're apt to see these employed in a mobile Crossfire solution, leveraging the improvements in Crossfire scaling that AMD brought with the Barts core (along with the rest of Northern Islands.) While it'll never be as fast as a desktop 6850 due to the reduced core clocks, the 6900M series is an extremely potent mobile gaming solution. (Alienware M17x R3 Review)

Comments

  • anotherfakeaccount - Wednesday, July 6, 2011 - link

    If anyone is buying a laptop, the best deal you can get is the HP Dv6t or dv7t. 6770m, 2630qm processor, matte 1080p screen, you can't beat it and it's under 1000 or barely over. Yes there is a graphics switching problem but it should not affect a typical gamer.

The Dell XPS 17 is comparable but costs more. Other good choices are ASUS G53/G73, and MSI Force 16F2 for those with bigger budgets who do not care if their laptop looks ugly and is bulky.
  • anotherfakeaccount - Wednesday, July 6, 2011 - link

    "This, or AMD's Radeon HD 6800M, will be the bare minimum for gaming comfortably at 1080p, but honestly the GTX 560M is liable to be the sweet spot in offering the very best balance in form factor favoring performance before you start getting into the huge, heavy desktop replacement notebooks."

The GTX 560m can hardly be called portable. A 6850m can be put in a laptop of comparable size. And neither laptop is truly portable.
  • Stuka87 - Wednesday, July 6, 2011 - link

    I don't see any mention of the Quadro series of chips? I realize they are somewhat a duplicate of consumer series chips, but they are probably worth a mention.
  • DanNeely - Wednesday, July 6, 2011 - link

    Adding another level of WTF to what's already in the article would cause the servers to explode.
  • Drizzt321 - Wednesday, July 6, 2011 - link

    Heh, yea, I was just asking about that. I have a Lenovo w520 on the way with the 1000m.
  • Arbie - Wednesday, July 6, 2011 - link

    I think you hit the target - pulling together a lot of hard-to-find info and boiling down the choices. This is exactly what I need to even get started on choosing a game-capable laptop / netbook. Thanks.
  • MrTeal - Wednesday, July 6, 2011 - link

    I know that you can't buy these chips yourself, and that OEMs might be able to work out better deals than the list price, but it would be interesting to know what each GPU is listed at in 1000 unit quantities, just to get an idea of the relative cost between them.
  • scook9 - Wednesday, July 6, 2011 - link

    Price is EXTREMELY relevant here. Something that cannot be ignored. Reason being that nvidia prices are out of this world high compared to ATI and that pushes my hand rather often

    I am painfully knowledgeable on notebook hardware (over 10k posts on notebookreview forums under the same username) so I like to think I have some credibility

When wondering why price matters....just look at the pricing on graphics options for the Alienware M18x (bear in mind these are prices for 2 cards not 1, but it shows the differences)
    -Upgrade from stock to CF 6970m $400
    -Upgrade from stock to SLI GTX 580m $1200

    That is WAY too big of a difference for the spread in performance (5-10% real world?). I know that I have the CF 6970m's (GTX 580m's were not available when I ordered mine so was a very easy choice) with a 2920xm and that laptop screams. And for the gaming laptop haters out there....I get 4.5 hours battery life on the HD 3000 :D
  • randomusername3242 - Wednesday, July 6, 2011 - link

    So you're complaining about prices for upgrades when you bought a 2920xm which you probably paid an exorbitantly high price for? I wouldn't be surprised if you paid over 400 to upgrade from a 2630qm for that.

    I think it's idiotic to buy any high end mobile part, GTX 580m or 2920xm.

    There's a sweet spot in price/performance. It's with the 2630qm + GTX 460m (maybe the 2720qm + 560m). Go any higher and you're throwing money, go any lower and you don't get enough performance.

    And I'll bite. I think it's also dumb to buy a gaming laptop because even if you get 4.5 hours battery life, with the specs that you say you have your laptop is not portable at all. Sure, you might not have a tower and many wires, but you're overpaying for a big and often ugly piece of metal that will not move around. (You really think you can move around 10 lbs?)

    And how much did you pay? You don't get 2920xm + crossfire 6970ms for less than 2000.

    I'll make a distinction between a gaming laptop and a desktop replacement. Gaming laptops are feasible, sometimes affordable, and moderately portable. Desktop replacements are not portable, not affordable, and considerably inferior to a desktop.
  • seapeople - Wednesday, July 6, 2011 - link

    Wow, you sound somewhat disillusioned. There are millions of people out there spending significantly more money on things they don't need that don't even give them performance benefits (such as a city-slicker buying an F150 or Cadillac SUV, or Joe Schmoe spending $3000/yr just so he can get his daily Starbucks coffee).

    In fact, if you are the type of person who can afford such luxury items, spending an extra $500 so your processor can turbo 20% higher and not slow you down wouldn't even register on your radar as being excessive, and rightfully so.

    Finally, you and so many others are completely wrong on the portability of big laptops. I like to watch movies or tv shows while, say, cooking dinner. Picking up a 10 pound laptop and bringing it to the kitchen with me is not even difficult in the slightest, whereas even the smallest portable desktop would require a 10 minute shutdown, transfer, and setup time.
