ASUS' Transformer Prime: The First Tegra 3 Tablet

With Tegra 2, Motorola was NVIDIA's primary launch partner for both smartphones and tablets. Since then, ASUS has risen in the ranks and is now a serious competitor in the Android tablet space. It's no surprise that the first Tegra 3 tablet out of the gate is ASUS' Transformer Prime.

ASUS will launch the Transformer Prime in the US before the end of the year. The tablet's specs are below:

Tablet Specification Comparison
| | ASUS Eee Pad Transformer | ASUS Eee Pad Transformer Prime | Apple iPad 2 | Samsung Galaxy Tab 10.1 |
|---|---|---|---|---|
| Dimensions | 271 x 175 x 12.95 mm | 263 x 180.8 x 8.3 mm | 241.2 x 185.7 x 8.8 mm | 256.6 x 172.9 x 8.6 mm |
| Display | 10.1-inch 1280 x 800 | 10.1-inch 1280 x 800 Super IPS+ | 9.7-inch 1024 x 768 IPS | 10.1-inch 1280 x 800 PLS |
| Weight | 675g | 586g | 601g | 565g |
| Processor | 1GHz NVIDIA Tegra 2 (2 x Cortex A9) | 1.3GHz NVIDIA Tegra 3 (4 x Cortex A9) | 1GHz Apple A5 (2 x Cortex A9) | 1GHz NVIDIA Tegra 2 (2 x Cortex A9) |
| Memory | 1GB | 1GB | 512MB | 1GB |
| Storage | 16GB + microSD slot | 32GB/64GB + microSD slot | 16GB | 16GB |
| Pricing | $399 | $499/$599 | $499 | $499 |

Final Words

At a high level Tegra 3 doesn't surprise us much. The improved GeForce GPU should deliver tangible performance gains through both increased operating frequency and more pixel shader hardware. CPU performance should also be better than Tegra 2-based designs thanks to the increase in clock speed, the inclusion of MPE, and the availability of more cores for threaded applications. In the move from one to two cores we saw significant performance increases across the board in Android. I don't expect gains of a similar magnitude in moving from two to four cores, but there will be some benefit.

For the majority of use cases I believe NVIDIA has done the hardware homework necessary to extend battery life. Individual cores can now be power gated and the companion core should do most of the lifting while your device is locked or mostly idle, processing background tasks.
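NVIDIA's description boils down to a simple policy: keep light background work on the low-power companion core, and wake the power-gated main cores only when demand rises. A toy model of that decision logic is sketched below; the threshold, names, and core-count logic are illustrative assumptions, not NVIDIA's actual firmware behavior.

```python
# Toy model of the companion-core switching policy described above.
# COMPANION_MAX_MHZ is an assumed ceiling for the low-power core.
COMPANION_MAX_MHZ = 500
MAIN_CORES = 4            # Tegra 3's four Cortex A9s

def active_cores(load_mhz, runnable_threads):
    """Return (use_companion, main_cores_online) for a given demand level.

    Light, mostly-idle workloads (screen off, background sync) stay on
    the companion core; the main cores remain power-gated until load or
    thread count forces them on.
    """
    if load_mhz <= COMPANION_MAX_MHZ and runnable_threads <= 1:
        return True, 0    # companion core handles everything
    # Wake only as many main cores as there are runnable threads.
    return False, min(runnable_threads, MAIN_CORES)

print(active_cores(200, 1))    # light load  -> (True, 0)
print(active_cores(1200, 3))   # heavy load  -> (False, 3)
```

The interesting property is the asymmetry: per-core power gating means the idle cost of the quad-core complex can approach zero while the companion core handles background tasks.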

How much of an impact we'll actually see from all of this remains to be seen. We hope to have our hands on the first Tegra 3 hardware in the coming weeks, so before the year is up we'll hopefully have some answers.

Comments

  • jcompagner - Thursday, November 10, 2011 - link

    Does the OS not do the scheduling?

    I think there are loads of things built into the OS that schedule processor threads. For example, the OS must be NUMA-aware on NUMA systems so that it keeps processes/threads on the cores attached to the same CPU/memory banks.

    If I look at Windows, it schedules everything all over the place, but it does know about Hyper-Threading: the logical sibling cores are skipped when I don't use more than four cores at the same time.
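The commenter's point holds: thread placement is ultimately the scheduler's job, and mainstream kernels do understand topology (NUMA nodes, Hyper-Threading siblings). Applications can also steer placement themselves through affinity calls. A minimal sketch using Python's wrapper around Linux's sched_setaffinity (the call only exists on Linux, so this sketch degrades gracefully elsewhere):

```python
import os

def pin_to_one_core():
    """Restrict the current process to a single CPU core.

    Uses the Linux-only os.sched_setaffinity; on macOS/Windows the
    function is absent, so we return an empty set instead.
    """
    if not hasattr(os, "sched_setaffinity"):
        return set()
    allowed = os.sched_getaffinity(0)   # cores we may currently run on
    target = min(allowed)               # pick the lowest-numbered one
    os.sched_setaffinity(0, {target})   # pin ourselves to it
    return os.sched_getaffinity(0)

print(pin_to_one_core())   # e.g. {0} on Linux
```

On a hypothetical Tegra 3 device the same mechanism lets platform software bias background threads toward low-power operation, though NVIDIA's companion-core switching is described as transparent to the OS.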
  • DesktopMan - Wednesday, November 9, 2011 - link

    Seems risky to launch with a GPU that's weaker than those of existing SoCs. Compared to the Apple A5 it looks more like a 2009 product... Exynos also has it beat. The main competitor it beats is Qualcomm, which isn't far from launching new SoCs itself.
  • 3DoubleD - Wednesday, November 9, 2011 - link

    At least it looks more powerful than the SGX540 in the Galaxy Nexus. I'll wait and see what the real-world performance is before writing it off. I suspect it will have "good enough" performance. I doubt we will see much improvement in Android devices until 28nm, as die size seems to be the limiting factor. Fortunately NVIDIA has its name on the line here, and it seems to be viciously optimizing its drivers to get every ounce of performance out of this thing.
  • DesktopMan - Wednesday, November 9, 2011 - link

    Totally agree on the Galaxy Nexus. That GPU is dinosaur-old, though. Very weird to use it in a phone with that display resolution. Any native 3D rendering will be very painful.
  • eddman - Wednesday, November 9, 2011 - link

    "Exynos also has it beat"

    We don't know that. On paper kal-el's geforce should be at least as fast as exynos. Better wait for benchmarks.
  • mythun.chandra - Wednesday, November 9, 2011 - link

    It's all about the content. While it would be great to win GLBench and push out competition-winning benchmark scores, what we've focused on is high-quality content that fully exploits everything Tegra 3 has to offer.
  • psychobriggsy - Friday, November 11, 2011 - link

    I guess it depends on the clock speed the GPU is running at, and the efficiency it achieves when running. Whilst not as powerful per-clock (looking at the table in the article), a faster clock could make up a lot of the difference. Hopefully NVIDIA's experience with GPUs also means it is very efficient. Certainly the demos look impressive.

    But they're going to have to up their game soon considering the PowerVR Series 6, the ARM Mali 6xx series, and so on, as these are far more capable.
  • AmdInside - Wednesday, November 9, 2011 - link

    Anyone else getting an error when opening the Asus Transformer Prime gallery?
  • skydrome1 - Wednesday, November 9, 2011 - link

    I am still quite underwhelmed by its GPU. I mean, come on, NVIDIA. A company with roots in GPU development having the lowest GPU performance?

    They need to up their game. Otherwise everyone's just going to license others' IP and develop their own SoCs. LG got an ARM license. Sony got an Imagination license. Samsung even has its own SoCs shipping. Apple is sticking to in-house design. HTC acquired S3.

    After telling the whole world that by the end of next year, there will be phones that will beat consoles in raw graphical performance, I feel like an idiot.

    Please prove me right, NVIDIA.
  • EmaNymton - Wednesday, November 9, 2011 - link

    REALLY getting tired of Anandtech articles being overly focused on performance while ignoring battery life, or making statements about technologies that will theoretically increase it. Total ACTUAL battery life matters, and increases in perf shouldn't come at the expense of it.

    This over-emphasis on perf, and the refusal to hold MFGRs accountable for battery life, borders on irresponsible and is driving this behavior in the hardware MFGRs.

