NVIDIA's Tegra 3 Launched: Architecture Revealed
by Anand Lal Shimpi on November 9, 2011 12:34 AM EST
Originally announced in February of this year at MWC, NVIDIA is finally officially launching its next-generation SoC. Previously known under the code name Kal-El, the official name is Tegra 3 and we'll see it in at least one product before the end of the year.
Like Tegra 2 before it, NVIDIA's Tegra 3 is an SoC aimed at both smartphones and tablets built on TSMC's 40nm LPG process. Die size has almost doubled from 49mm^2 to somewhere in the 80mm^2 range.
The Tegra 3 design is unique in the industry as it is the first to implement four ARM Cortex A9s onto a chip aimed at the bulk of the high end Android market. NVIDIA's competitors have focused on ramping up the performance of their dual-core solutions either through higher clocks (Samsung Exynos) or through higher performing microarchitectures (Qualcomm Krait, ARM Cortex A15). While other companies have announced quad-core ARM based solutions, Tegra 3 will likely be the first (and only) to ship in an Android tablet and smartphone in 2011 - 2012.
NVIDIA will eventually focus on improving per-core performance with subsequent iterations of the Tegra family (perhaps starting with Wayne in 2013), but until then Tegra 3 attempts to increase performance by exploiting thread level parallelism in Android.
GPU performance also gets a boost thanks to a larger, more efficient GPU in Tegra 3, but first let's talk about the CPU.
Four, Make That Five, Cores
The Cortex A9 implementation in Tegra 3 is an improvement over Tegra 2; each core now includes full NEON support via an ARM MPE (Media Processing Engine). Tegra 2 lacked any support for NEON instructions in order to keep die size small.
NVIDIA's Tegra 2 die
NVIDIA's Tegra 3 die, A9 cores highlighted in yellow
L1 and L2 cache sizes remain unchanged. Each core has a 32KB/32KB L1 and all four share a 1MB L2 cache. Doubling core count over Tegra 2 without a corresponding increase in L2 cache size is a bit troubling, but it does indicate that NVIDIA doesn't expect the majority of use cases to saturate all four cores. L2 cache latency is 2 cycles faster on Tegra 3 than on Tegra 2, while L1 cache latencies haven't changed. NVIDIA isn't commenting on L2 frequencies at this point.
The A9s in Tegra 3 can run at a higher max frequency than those in Tegra 2. With one core active, the max clock is 1.4GHz (up from 1.0GHz in the original Tegra 2 SoC). With more than one core active, however, the max clock is 1.3GHz. Each core can be power gated in Tegra 3, which wasn't the case in Tegra 2. This should allow lightly threaded workloads to execute on Tegra 3 in the same power envelope as Tegra 2. It's only in applications that fully utilize more than two cores that you'll see Tegra 3 drawing more power than its predecessor.
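The core-count-dependent clock caps described above amount to a very simple policy. The sketch below is based only on the figures quoted here; the constant and function names are invented for illustration and are not NVIDIA's:

```python
# Hypothetical sketch of Tegra 3's core-count-dependent clock cap.
# Figures come from the article; names are illustrative, not NVIDIA's.

MAX_CLOCK_SINGLE_CORE_MHZ = 1400  # one core active
MAX_CLOCK_MULTI_CORE_MHZ = 1300   # two or more cores active

def max_cpu_clock_mhz(active_cores: int) -> int:
    """Return the maximum allowed clock for the main A9 cluster."""
    if active_cores <= 0:
        raise ValueError("at least one core must be active")
    if active_cores == 1:
        return MAX_CLOCK_SINGLE_CORE_MHZ
    return MAX_CLOCK_MULTI_CORE_MHZ
```

The point of the asymmetry is thermal/power headroom: a lone core can be pushed harder than four cores running simultaneously.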
The increase in clock speed and the integration of MPE should improve performance a bit over Tegra 2 based designs, but obviously the real hope for performance improvement comes from using four of Tegra 3's cores. Android is already well threaded, so we should see gains in the threaded portions of tasks like web page rendering.
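As a toy illustration of the kind of thread-level parallelism meant here, consider fanning independent subtasks (say, decoding a page's images) across four workers, one per A9 core. The workload below is entirely made up:

```python
# Illustrative only: a well-threaded workload spread across four cores.
# decode_image is a stand-in for real work (e.g. image decoding).
from concurrent.futures import ThreadPoolExecutor

def decode_image(data: bytes) -> int:
    # placeholder for real decoding: just report the payload size
    return len(data)

images = [b"\x00" * n for n in (100, 200, 300, 400)]

# one worker per A9 core in the quad cluster
with ThreadPoolExecutor(max_workers=4) as pool:
    sizes = list(pool.map(decode_image, images))
```

Workloads like this scale with core count; serial portions of the same task see no benefit, which is why per-core performance still matters.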
It's an interesting situation that NVIDIA finds itself in. Tegra 3 will show its biggest performance advantage in applications that can utilize all four cores, yet it will be most power efficient in applications that use as few cores as possible.
There's of course a fifth Cortex A9 on Tegra 3, limited to a maximum clock speed of 500MHz and built using LP transistors like the rest of the chip (and unlike the four-core A9 cluster). NVIDIA intends for this companion core to be used for the processing of background tasks, for example when your phone is locked and in your pocket. In light use cases where the companion core is active, the four high performance A9s will be power gated and overall power consumption should be tangibly lower than Tegra 2.
Despite Tegra 3 featuring a total of five Cortex A9 cores, only four can be active at one time. Furthermore, the companion core cannot be active alongside any of the high performance A9s. Either the companion core is enabled and the quad-core cluster disabled or the opposite.
NVIDIA handles all of the core juggling through its own firmware. Depending on the level of performance Android requests, NVIDIA will either enable the companion core or one or more of the four remaining A9s. The transition should be seamless to the OS and as all of the cores are equally capable, any apps you're running shouldn't know the difference between them.
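The switching rule described above might look something like the following sketch. The real policy lives in NVIDIA's firmware; the threshold and names here are invented for illustration:

```python
# Hedged sketch of Tegra 3's cluster-switching rule as described in the
# text. Actual logic is NVIDIA firmware; names/thresholds are invented.

COMPANION_MAX_MHZ = 500  # companion core tops out at 500MHz

def select_cluster(requested_mhz: int, runnable_threads: int):
    """Return (cluster, active_cores).

    The companion core and the main quad-A9 cluster are mutually
    exclusive, and at most four cores are ever active at once.
    """
    if requested_mhz <= COMPANION_MAX_MHZ and runnable_threads <= 1:
        return ("companion", 1)  # main cluster fully power gated
    cores = min(max(runnable_threads, 1), 4)
    return ("main", cores)       # companion core disabled
```

Because every core is an equally capable A9, the OS and apps need no awareness of which cluster is currently running, exactly as the article describes.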
dagamer34 - Wednesday, November 9, 2011
Technically Sony's been planning on having 4 Cortex A9 CPUs inside the Playstation Vita since it was announced in January (plus the very powerful PowerVR SGX 543MP4).
Klinky1984 - Wednesday, November 9, 2011
...and where might I obtain a Playstation Vita? What other four core ARM chip is available in mainstream products as of today?
Klinky1984 - Wednesday, November 9, 2011
Maybe I should have phrased that as "available for use in mainstream products as of today". I don't think Sony is going to let anyone use their SoC in a phone or tablet.
MrMilli - Wednesday, November 9, 2011
Both NEC (now Renesas Electronics) and Marvell beat nVidia to it. NEC focuses more on the industrial side of things and Marvell (Armada XP) more on storage/server applications. But NEC chips often find their way into automotive electronics (I believe some GPS systems use their ARM11 quad core from years ago).
I don't know what you count as a mainstream product, but GPS systems and low-end servers can be seen as mainstream.
Klinky1984 - Wednesday, November 9, 2011
Enterprise, industrial & embedded products are not mainstream. The Tegra 3 is going to be a selling point for the products that use it, and those products will be advertised prominently on TV, in print & on the Internet. I highly doubt you'll see that for the chips you mentioned; I don't think I've ever seen a car commercial tout that their GPS is powered by NEC or Marvell or whatever platform they're using. I've seen plenty of phone commercials touting Tegra 2 or Snapdragon.
Stuka87 - Wednesday, November 9, 2011
I disagree, and I am not sure you are aware of the meaning of mainstream, as it's all dependent on point of view.
Mainstream is something which is purchased, used or accepted broadly rather than by a tiny fraction of population or market; common, usual or conventional.
Nowhere is it stated that something has to be on TV to be mainstream. It simply has to be popular in the market that it is aimed at. And NEC is most definitely mainstream in the markets that they target.
eddman - Wednesday, November 9, 2011
But the question was: can you buy a phone or tablet with an NEC or Marvell quad-core SoC?
Maybe Tegra 3 wasn't the first quad ARM chip, but it is the first quad for mobile devices.
TI doesn't have any in its roadmap for now.
Same goes for ST-Ericsson.
Qualcomm's quads won't appear until Q4 2012.
Samsung hasn't announced any yet.
Stuka87 - Wednesday, November 9, 2011
This is true. The NEC chip would not be suitable for a mobile device, and is not available for them. So nVidia is first in that market space.
Penti - Sunday, November 20, 2011
Renesas does offer quad-core mobile CPUs, as does everyone else; it's just a matter of when they are actually available. They do have a more impressive overall offering in the mobile space, though.
Klinky1984 - Thursday, November 10, 2011
Almost 1/3rd of the US population has a smartphone. How many of those consumers have an enterprise NAS, SAN, or ARM-based cloud server, or even know what one is? And even if they do know, do they actually have a desire to buy one? As for embedded GPS devices, which devices used the prototype quad-core ARM11 from Renesas Electronics?
Additionally, I am not finding actual products containing a Marvell Armada XP or Renesas Electronics R-Car H1; the R-Car H1 doesn't even start mass production until almost 2013. Renesas Electronics' quad-ARM11 NaviEngine looks like it was used by Alpine Car Information Systems in 2010, but finding out which Alpine product uses it, and where I could buy it, is a challenge.