A glimpse into ATI's future?

A little over a year ago ATI completed its acquisition of a company called ArtX. You may have heard us mention ArtX in the past, as a couple of years back they produced the integrated graphics core for an ALi Super7 chipset, the Aladdin 7.

While the chipset was too little, too late for the Super7 market, ArtX had caught the eye of Nintendo and was contracted to produce the graphics core for its next-generation gaming console. This little-known company ended up being acquired by a much better-known player in the industry: ATI.

However, at the time of the acquisition in mid-2000, the design for the GameCube was complete, as were the designs for ATI's next-generation desktop chips. So although the two companies were now under one roof, their technologies could not mix in the production of the GameCube graphics core. One thing ATI did very wisely was secure the agreement to place an ATI sticker on the front of the GameCube. This sort of branding is unbeatable: ATI is essentially cultivating a group of users, one that will eventually grow into the millions, to support the ATI name. When a satisfied GameCube owner goes to buy a new video card, it's more than likely that the ATI name will catch their eye first.

The GameCube motherboard is home to only two chips outside of the memory on the board. The first you already know of: IBM's Gekko processor. The second is the integrated North Bridge, I/O controller, and graphics processor produced by the ArtX team that is now part of ATI. This chip not only dwarfs Gekko in size, it is also much more interesting to talk about; this chip is named Flipper.

Flipper is a 51 million transistor chip, again built on a 0.18-micron manufacturing process (this time without copper interconnects) and produced by NEC. In spite of the massive transistor count and the tremendous amount of functionality in the core, Flipper is about 106 mm^2 in size, making it just about as big as the Xbox CPU. The only letdown here is that Flipper is still built on a 0.18-micron process, while ATI's new GPUs as well as the Xbox IGP are built on a 0.15-micron process, which reduces their die size by approximately 30% over their 0.18-micron counterparts. There is also the potential for the Xbox IGP to move down to a 0.13-micron process during the course of next year, since TSMC's 0.13-micron process will have matured considerably by the second half of 2002. We'll explain why Flipper is stuck at 0.18-micron later.
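To put that process comparison in perspective, here is a back-of-the-envelope sketch of our own (purely illustrative, not figures from ATI or NEC) showing how ideal die area scales with the square of the feature size; real shrinks rarely hit the theoretical number exactly, but it shows where the roughly 30% figure comes from.

```python
# Rough sketch of how a process shrink affects die area, using the figures
# quoted above (0.18-micron vs. 0.15-micron). Linear dimensions scale with
# feature size, so area scales with its square.
old_process = 0.18  # microns
new_process = 0.15  # microns

area_ratio = (new_process / old_process) ** 2   # ~0.69
reduction = 1 - area_ratio                      # ~0.31, i.e. roughly 30%

flipper_die = 106  # mm^2 on 0.18-micron, from the article
print(f"Area ratio: {area_ratio:.2f}, reduction: {reduction:.0%}")
print(f"Hypothetical 0.15-micron Flipper: ~{flipper_die * area_ratio:.0f} mm^2")
```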

Flipper plays the role of the North Bridge in that it features a 64-bit interface to the Gekko CPU running at 162MHz. The entire Flipper chip runs at 162MHz, which lends itself to lower latency operation since all bus clocks operate in sync with one another. This also means that the graphics core in Flipper runs at 162MHz as well.
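For a rough idea of what that 64-bit, 162MHz interface buys Gekko, the quick sketch below computes the peak bandwidth, assuming one transfer per clock; we have not confirmed whether the bus is double-pumped, so treat this as an estimate rather than an official figure.

```python
# Back-of-the-envelope peak bandwidth for the 64-bit, 162MHz CPU bus described
# above, assuming a single transfer per clock (an assumption on our part).
bus_width_bits = 64
bus_clock_hz = 162_000_000

peak_bytes_per_sec = (bus_width_bits // 8) * bus_clock_hz
print(f"Peak CPU-to-Flipper bandwidth: {peak_bytes_per_sec / 1e9:.2f} GB/s")  # ~1.30 GB/s
```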

The Flipper graphics core is a fairly simple fixed-function GPU aided by a very healthy amount of memory bandwidth, but first onto the architecture of the graphics core. Flipper always operates on 4 pixels at a time using its 4 pixel pipelines; each of those pipelines is capable of applying one texture per pass, which immediately tips you off that the ArtX design wasn't influenced by ATI at all. Since the Radeon and GeForce2, both ATI's and NVIDIA's cores have been able to process a minimum of two textures per pixel in each of their pipelines, which comes in quite handy since none of today's games are single textured anymore.
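The practical cost of a single texture per pipeline is extra passes on multi-textured surfaces. The snippet below is a simplified model of our own, ignoring loopback details and memory stalls, that shows how effective pixel throughput drops once every pixel carries two texture layers at the clock and pipeline counts quoted above.

```python
# Simplified model (ours, not ATI's numbers) of how per-pipeline texturing
# limits throughput: each additional texture layer beyond what a pipeline can
# apply in one pass costs another pass, dividing the effective pixel rate.
def effective_pixel_rate(clock_hz, pipelines, textures_per_pixel, textures_per_pipe=1):
    """Peak pixels/sec when every pixel needs `textures_per_pixel` layers."""
    passes = -(-textures_per_pixel // textures_per_pipe)  # ceiling division
    return clock_hz * pipelines / passes

clock, pipes = 162_000_000, 4
print(effective_pixel_rate(clock, pipes, 1) / 1e6, "Mpixels/s single textured")  # 648.0
print(effective_pixel_rate(clock, pipes, 2) / 1e6, "Mpixels/s dual textured")    # 324.0
```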

The fact that Flipper's T&L unit is fixed function is a bit of a disappointment as well, but it would have been impossible for ArtX to implement ATI's SmartShader programmable pixel and vertex shaders in their design and still meet Nintendo's strict deadlines. The one thing working in the GameCube's favor is that the Flipper GPU was designed solely with console gaming in mind, and developer input shaped its T&L unit much more closely than it did some of the earlier T&L units for desktop PC graphics cards. Although it may be better suited to its target use than the earliest T&L units for PCs, there is no skirting the fact that a fixed-function T&L pipeline limits exactly what game developers will be able to do. After seeing what over two years of fixed-function T&L support in PC games amounted to, we'd hope for much more out of developers' use of Flipper's GPU.
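For readers unfamiliar with the distinction, the sketch below illustrates what "fixed function" means in practice: the T&L math is hardwired and developers can only feed it parameters, whereas a programmable unit lets them replace the math entirely. The lighting model shown is the generic textbook one, and none of the names come from the GameCube SDK; this is purely an illustration, not Flipper's actual equations.

```python
# Illustrative only: a generic fixed-function style vertex transform + light.
# Developers can change the inputs (matrices, lights, materials) but not the
# equations themselves -- that is the limitation a programmable vertex shader
# removes.
import numpy as np

def fixed_function_vertex(position, normal, mvp, light_dir, ambient, diffuse):
    """One vertex through a canonical (non-programmable) T&L stage.

    All arguments are NumPy arrays: position/normal/light_dir are 3-vectors,
    mvp is a 4x4 matrix, ambient/diffuse are RGB colors.
    """
    clip_pos = mvp @ np.append(position, 1.0)              # hardwired transform
    n_dot_l = max(0.0, float(np.dot(normal, -light_dir)))  # hardwired Lambert term
    color = np.clip(ambient + diffuse * n_dot_l, 0.0, 1.0) # hardwired combine rule
    return clip_pos, color

pos, col = fixed_function_vertex(np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                                 np.eye(4), np.array([0.0, 0.0, -1.0]),
                                 np.array([0.1, 0.1, 0.1]), np.array([0.8, 0.8, 0.8]))
print(pos, col)  # lit vertex color of (0.9, 0.9, 0.9)
```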

Anti-aliasing is very important when it comes to console games, so it's critical that Flipper offer AA support. The core does support a 7-sample multi-sample AA algorithm, but it's clear that turning on full 7-sample AA isn't exactly the most realistic option. ATI informed us that the number of samples is adjustable and can be set by the developer, but as was the case with the Xbox, we did not see AA implemented in any of the launch titles.
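As a generic illustration (not Flipper-specific) of what a multi-sample resolve does, the snippet below simply averages a pixel's stored samples; more samples smooth edges further, but color storage and bandwidth grow in proportion, which is why enabling all 7 samples is a tall order.

```python
# Generic multi-sample resolve (not Flipper-specific): the resolved pixel is
# the average of its stored samples. 7 samples means ~7x the color storage
# per pixel, which is why turning them all on is costly.
def resolve_pixel(samples):
    n = len(samples)
    return tuple(sum(channel) / n for channel in zip(*samples))

# 7 samples straddling a polygon edge: 4 from a white triangle, 3 background.
edge_pixel = [(1.0, 1.0, 1.0)] * 4 + [(0.0, 0.0, 0.0)] * 3
print(resolve_pixel(edge_pixel))  # ~(0.57, 0.57, 0.57): a smoothed edge pixel
```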

Based on the operating frequency of the core (162MHz), you can tell that the Flipper graphics core isn't a fill-rate monster, but it does manage to be a very efficient GPU. That efficiency comes from the use of embedded DRAM.


6 Comments


  • cubeguy2k5 - Monday, December 20, 2004

    I feel that AnandTech's article on Xbox vs PS2 vs GameCube didn't go in depth enough, guessed at too many things, and intentionally got others wrong. Not sure where to discuss this, but I'd like to get a thread going.....

    "However details on this processor are sketchy at best but the information we've been able to gather points at a relatively unmodified PowerPC 750CXe microprocessor " - where did they gather this from? gekko isnt a PPC 750CXE or it would be marked as such.

    "The Flipper graphics core is a fairly simple fixed function GPU aided by some very powerful amounts of memory bandwidth, but first onto the architecture of the graphics core. Flipper always operates on 4 pixels at a time using its 4 pixel pipelines; each of those pipelines is capable of applying one texture per pipeline which immediately tips you off that the ArtX design wasn't influenced by ATI at all. Since the Radeon and GeForce2, both ATI and NVIDIA's cores have been able to process a minimum of two textures per pixel in each of their pipelines which came quite in handy since none of today's games are single textured anymore." - who told them that gamecube only has one texture unit per pipeline? it wasnt nintendo, i could just as easily say it has 2, doubling texel bandwidth....... who said it was fixed function?

    "Planet GameCube: In a recent IGNinsider article, Greg Buchner revealed that Flipper can do some unique things because of the ways that the different texture layers can interact. Can you elaborate on this feature? Have you used it? Do you know if the effects it allows are reproducible on other architectures (at decent framerates)?

    Julian Eggebrecht: He was probably referring to the TEV pipeline. Imagine it like an elaborate switchboard that makes the wildest combinations of textures and materials possible. The TEV pipeline combines up to 8 textures in up to 16 stages in one go. Each stage can apply a multitude of functions to the texture - obvious examples of what you do with the TEV stages would be bump-mapping or cel-shading. The TEV pipeline is completely under programmer control, so the more time you spend on writing elaborate shaders for it, the more effects you can achieve. We just used the obvious effects in Rogue Leader with the targeting computer and the volumetric fog variations being the most unusual usage of TEV. In a second generation game we’ll obviously focus on more complicated applications."

    "The TEV pipeline is completely under programmer control, so the more time you spend on writing elaborate shaders for it, the more effects you can achieve." COMPLETELY UNDER PROGRAMMER CONTROL MEANS NOT FIXED FUNCTION, and on fixed function GPUs you cannot do advanced shader effects in realtime, can you? Rogue Leader and Rebel Strike use them EXTENSIVELY.... AnandTech.... where's your explanation?

    I'll provide more examples later....



    "Julian Eggebrecht: Maybe without going into too much detail, we don’t think there is anything visually you could do on X-Box (or PS2) which can’t be done on GameCube. I have read theories on the net about Flipper not being able to do cube-mapped environment maps, fur shading, self-shadowing etc... That’s all plain wrong. Rogue does extensive self-shadowing and both cube-maps and fur shading are not anymore complicated to implement on GameCube than on X-Box. You might be doing it differently, but the results are the same. When I said that X-Box and GameCube are on par power-wise I really meant it. " looks like a PROVEN DEVELOPER just proved anandtech is WRONG... nice..... factor5 was involved in the creation of cube, they know it better than ANYONE else, including anandtech....


    Come on AnandTech, I know you see this article... what about this?

    You clearly state that you believe Xbox is a generation ahead of GameCube technically, when you COULD NOT do any of the shader effects nor the amount of bump mapping that's in Rogue Leader, let alone Rebel Strike, on a pre-GF3 GPU..... What about the water effects in Rebel Strike, Mario Sunshine, Wave Race? I do believe that in 2001 not one game, even on PC, had water even CLOSE to Wave Race in terms of how it looked and the physics behind it, and in 2002 there wasn't one game close to Mario Sunshine as far as water goes, wow!..... What about all the nice fully dynamic lighting in RE4 and Rebel Strike? You couldn't pull that off on a fixed function GPU, could you? Apparently they can't even pull it off on Xbox, when Halo 2 has massive slowdown, mostly static lighting, an abysmal polygon count, coupled with LOD pop-in, and various other problems/faked effects.... Nice. What about Ninja Gaiden? Same story: good character models, very bad textures, non-existent lighting, shadows that seem to react to non-existent light sources that exist inside of walls..... Cute.....

    http://www.geocities.com/cube_guy_2k5/ng3.jpg

    Nice textures and lack of lighting... low poly count and invisible light sources that seem to only allow Ryu to cast shadows, not the environment, wow.... What about the faked reflections used in the game?... Neat.
  • Cooe - Tuesday, August 18, 2020

    The fanboy delusions are strong with this one...
  • Arkz - Saturday, September 17, 2011

    "the other incorrectly labeled digital AV (it's still an analog signal) for component connections."

    Wrong, it's purely digital. The component cable has a DAC chip in the connector block. Technically they could make a DVI cable for it.
  • Arkz - Saturday, September 17, 2011

    And the GC CPU is 485, not 500.
  • ogamespec - Thursday, August 8, 2013

    Actually, Gekko's speed is 486 (162 x 3) MHz.

    And the GameCube GPU's (Flipper) TEV is fixed-stage. No custom shaders.
  • techFan1988 - Wednesday, May 4, 2022

    Mmmm I understand that now we have much better information than back then, but I find this piece of the article a bit skewed towards the Xbox (or against the GC).
    There are a couple of aspects that are factually wrong, for example:
    "However from all of that data that we have seen comparing the PowerPC 750 to even the desktop Intel Celeron processor, it does not seem that the Gekko can compete, performance-wise."

    The original PowerPC 750 didn't even have on-die L2 cache, so saying "it doesn't compete with a Coppermine Celeron processor" is absolutely unfair (it would be like comparing the first versions of the P3 -the ones running at 500MHz- with the Coppermine ones).

    To grab the original PPC 750 and compare it to a Coppermine Celeron 128 (the ones based on the P3 architecture, and the one feeding the Xbox, although with a faster bus comparable to that of a regular P3) is not a fair comparison.

    At least, since this was a modification of the PPC750 CXe (and not the original PPC750) the author of the article should have compared that CPU to the Celeron and not the original PPC 750.

    I mean, the difference between the first-gen P3 and the Coppermine P3 was even bigger than the difference between the P2 and P3, just because of the integrated L2 cache!
    How could this factor be ignored when comparing the GC's and Xbox's CPUs?
