23 Comments
RaistlinZ - Friday, May 7, 2010 - link
It's the Amy Winehouse of video cards.
AstroGuardian - Monday, May 10, 2010 - link
Damn right dude.......... hahahhaha
eddyg17 - Friday, May 7, 2010 - link
This card is a monstrosity. 3 slots? Too big.
jordanclock - Friday, May 7, 2010 - link
I care about performance. I don't have a windowed case or any glowing lights except my power and HDD activity lights (and even those annoy me sometimes! haha), so I'm only interested in whether this massive card handles heat any better (or worse?) than a reference card, or even compared to the Galaxy card.
Also, I don't see much issue in the card taking up an extra slot. The GTX 470 uses so much power, who would bother putting any other peripherals in their computer? (I kid, I kid. I know it's not THAT bad...)
bunnyfubbles - Friday, May 7, 2010 - link
Sure, it doesn't look appealing, but at least this card doesn't look like a cheap plastic toy spaceship you'd give to a 3-5 year old.
Any smart buyer should value function before form, and this card looks like it will function much better than the Galaxy one in terms of cooling and noise.
As for the slot issue? It's about time we started seeing some official 3-slot cooling designs. I'm actually a little surprised nVidia didn't make the reference design 3 slots, although its lackluster performance (relative to its hype) wouldn't have mixed well with a 3-slot design on top of that.
More and more peripheral parts are being integrated into the motherboard; very few users actually need or bother to use the extra slots anymore. And to top it off, the GPU is becoming more and more important (particularly nVidia's, with their pushing of CUDA and PhysX). With GPGPU, the GPU is evolving into more of a co-processor to the CPU. The more important it becomes, the more space it should be able to consume.
Sure, it might hurt SLI options, but I (and many others) would rather have a much faster single GPU than multiple slower ones (assuming they'd have to be slower to fit in the same amount of space). This would definitely merit larger cooling solutions, and if Fermi were as fast as it was originally hyped/rumored to be, it would definitely merit a 3-slot design by default.
nubie - Saturday, May 8, 2010 - link
Well, then we will be one step from socketed GPUs.
Personally, I don't understand why motherboards couldn't be mATX or ITX and have the card on an edge connector so it can be bolted down and use standard CPU tower coolers. Then we get the bonus of using the extra ATX standoffs to hold down the video card.
Is there a good physical reason the market doesn't go this way? Besides the longer PCI-E traces, that is (I don't think that's much of an issue, for a few reasons: an ITX board wouldn't/couldn't have much longer traces, and many Crossfire and SLI setups already have long PCI-E runs).
I would love to have dual 6-pipe coolers on a GPU and CPU packed into an mATX case. Dream come true.
GullLars - Monday, May 10, 2010 - link
Why not just go water cooling? Then it won't be a problem. You could make 1-slot GPUs with 250-500W thermal envelopes, provided you have enough flow and a good radiator (which can be placed outside the case, along with the reservoir and pump, if space is an issue).
You could make a really compact µATX build with a Core i7 980X OC'd past 4GHz and an NV 480 or ATI 5970, or two cards in SLI/CF if the board has 2x PCIe x16 slots (they'd only need to be 1 slot wide each).
Throw in 3x X25-M/SF-1200/C300 SSDs in RAID-0 from the ICH and you have a compact rig able to easily reach 20K+ PCMark Vantage points.
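(A hedged aside on the "enough flow" point above: at steady state the coolant flow needed for a given heat load follows from Q = m_dot * c_p * dT. The 400W load and 10°C coolant rise in the sketch below are illustration numbers only, not figures from the comment.)

```python
# Back-of-envelope check of the "enough flow" claim: at steady state a water
# loop carries heat according to Q = m_dot * c_p * dT, so the flow needed for
# a given GPU heat load and allowable coolant temperature rise is easy to bound.

HEAT_LOAD_W = 400.0     # assumed GPU + VRM heat load, watts (illustration only)
DELTA_T_C = 10.0        # assumed allowable coolant temperature rise, deg C
CP_WATER = 4186.0       # specific heat of water, J/(kg*K)

mass_flow_kg_s = HEAT_LOAD_W / (CP_WATER * DELTA_T_C)
litres_per_hour = mass_flow_kg_s * 3600.0   # water is roughly 1 kg per litre

print(f"required flow: {mass_flow_kg_s:.4f} kg/s (~{litres_per_hour:.0f} L/h)")
# ~0.0096 kg/s, about 34 L/h -- far below what a typical loop pump delivers,
# so the practical limit is the radiator shedding the heat, not the flow rate.
```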
GeorgeH - Friday, May 7, 2010 - link
A cooler inspired by a Transformers toy in a McDonald's Happy Meal, featuring plastic colors that scream "Walmart special," is good looking.
A simple black frame housing three fans, with a functional design that yields incredibly good acoustics and temperatures, is ugly.
Be honest - right now you're wearing white tube socks, brown shoes, black belt, Superman briefs and a faded Power Rangers t-shirt, aren't you?
;)
rqle - Friday, May 7, 2010 - link
Looks better than the Galaxy GTX to me, personally.
Operandi - Saturday, May 8, 2010 - link
Agreed. I think it has a "technical" look to it, the same way some DOHC engines can look cool without engine covers.
The Galaxy card, on the other hand, is the dumbest-looking graphics card I've seen in a long time.
IanCutress - Friday, May 7, 2010 - link
Almost everyone loves performance, but certain market segments also cater for style - especially the LAN gamer segment. This card will look odd in most cases in most scenarios, so for a LAN gamer this may not be the card of choice.
In terms of SLI, again, there are market segments, particularly those focused on performance or GPU computing, that prefer two smaller cards rather than one big one. For example, I'm currently running 2x 5850s. Here in the UK, these together cost the same as a GTX 480; however, in all benchmarks the 2x 5850 setup performs better. If they were triple-slot cards, I'd have to make sure I had the right board. The board I have, which I bought before the 5xxx series came out, has dual-slot PCI-E spacing - so I really wouldn't like to buy another board just for triple-slot cards.
It's always a case of seeing which areas of the consumer base would preferentially use the card. What may or may not be important to you may or may not be important to others.
;)
All the best,
Ian
Mr Perfect - Saturday, May 8, 2010 - link
That could make for an interesting reader poll, actually. See just how many people are using SLI/Crossfire and/or overclocking their cards. The general assumption (by everyone) is that readers of tech sites automatically have multiple video cards overclocked to the raggedy edge.
Personally, I was rather surprised by Anandtech's reader poll just before the launch of, if I remember correctly, the i7s. Judging from the emphasis many review sites put on overclocking in their reviews, I expected to be in the minority when I cast a vote saying CPU overclocking tests weren't important to me. It turned out about 75% of the reader base agreed!
Leyawiin - Friday, May 7, 2010 - link
I've had two cards I've put tri-fan AC Accelero coolers on (an 8800 GTX and a GTX 260) and have been very happy with the results. I don't need the other expansion slots, so the three-slot size isn't a problem either. What I do care about is noise and cooling, and they have excelled at those two functional requirements. If this card does as well, I'd be happy to give it three slots.
JonnyDough - Saturday, May 8, 2010 - link
I can imagine the only people buying these cards live in cold regions, pay little for electricity, and only have expensive gas heat (or maybe they hate cutting wood for the stove).
3 fans? Am I the only one who thinks this is ridiculous? I think Nvidia must really want to see how far they can push the "too much heat/too much electricity needed" envelope and find out exactly how dumb consumers will be. I would NEVER buy this card. Not when it would force me to jump into my neighbor's pool every five minutes during the summer just so I don't die of heat exhaustion. Fans might cool the card, but they don't magically get rid of all that wasted electricity and thermal energy being blown off the card into the room. Your air conditioner has to do that, which increases your utility bill, which is already high just from powering the video card. No thanks.
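(A hedged sense of scale for the utility-bill point above: the sketch below assumes a 215W load draw, three hours of gaming a day, $0.12 per kWh, and an air conditioner with a coefficient of performance of about 3. All of these are illustration numbers, not measurements from the card.)

```python
# Rough running-cost estimate for a ~215 W card under load. All of the power the
# card draws ends up as heat in the room; in summer an air conditioner then spends
# extra energy pumping it back out (modelled here with an assumed COP of 3).

CARD_DRAW_W = 215.0     # assumed load power draw of the card, watts
HOURS_PER_DAY = 3.0     # assumed daily gaming time
DAYS_PER_MONTH = 30.0
PRICE_PER_KWH = 0.12    # assumed electricity price, USD per kWh
AC_COP = 3.0            # assumed air-conditioner coefficient of performance

card_kwh = CARD_DRAW_W / 1000.0 * HOURS_PER_DAY * DAYS_PER_MONTH
ac_kwh = card_kwh / AC_COP          # extra AC energy to remove that heat
monthly_cost = (card_kwh + ac_kwh) * PRICE_PER_KWH

print(f"card: {card_kwh:.1f} kWh, AC: {ac_kwh:.1f} kWh, ~${monthly_cost:.2f}/month")
# ~19.4 kWh for the card plus ~6.5 kWh of AC: on the order of $3 a month under
# these assumptions -- real money, but small next to the cost of the card itself.
```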
JonnyDough - Saturday, May 8, 2010 - link
I would add, however, that with the crazy coolers people buy for their CPUs, and with games today still being somewhat GPU limited, this does sort of make sense. However, when you consider the competition's offerings, I think the answer is clear.
Cullinaire - Saturday, May 8, 2010 - link
Turn that frown upside down and make lemonade of the situation! If I had this card, I would duct the exhaust to an oven. Make dinner by playing games - what could be better?
Leyawiin - Saturday, May 8, 2010 - link
People like to game. Decent gaming hardware produces heat, regardless of the CPU or GPU chosen to do the job. If you're that worried about electrical consumption or taxing your air conditioner, I doubt you would understand a gamer's mindset anyway.
JonnyDough - Tuesday, May 11, 2010 - link
LOL? Understand a gamer's mindset? I'm 30. I've been a "gamer" for well over a decade. I played Oregon Trail on an Apple II, Empire Deluxe off a single floppy, the original Alone in the Dark off of about eight floppies, etc. You think all gamers don't care about the environment, or have unlimited funds to spend? Get real.
Mr Perfect - Saturday, May 8, 2010 - link
I wouldn't harp on this particular card for having three fans. This GTX 470 produces the same amount of heat as any other GTX 470, even though those cards only have one fan. Given the choice between a GTX that will slowly roast itself and one that effectively drops the heat into your room, this would be the better choice.
nsx241 - Monday, May 10, 2010 - link
You obviously missed the part where they said this was just an aftermarket cooler (Arctic Cooling Accelero Xtreme) on a reference GTX 470. The triple-fan Accelero Xtreme has been around since the HD2900/9800GTX days, so this is nothing new. And given how effective these coolers are, it's even more appropriate for cards like the GTX 470/480.
JonnyDough - Tuesday, May 11, 2010 - link
Nope. Didn't miss a thing. I was putting this in the same "ridiculous" category as those cards. I'm excited to see what AMD comes out with next, and am hoping that Nvidia does some real work and plays a little catch-up. It would be great to see them leap ahead and slash their little heat issue for good. Maybe you're forgetting all the overheating laptops, the G80, etc... Nvidia doesn't seem to be able to keep their temps under control, and that's why we have such ridiculous cards like these. There's a little word I like to throw around called sustainability, and I'm sorry, but with the dependence on foreign oil, our little Chevy Volt being released, etc... it's really time to start paying attention to power consumption. Nvidia is a tad behind the curve.
Ikshaar - Tuesday, May 11, 2010 - link
I want something powerful and silent... I don't care about looks or even the use of other PCIe slots... the GFX card is the only card in my PC.
I'll wait for confirmation that it can really stay silent on load, but that would be the winner for me.
Hrel - Tuesday, May 11, 2010 - link
As long as it's quiet, cools efficiently, and fits in the case, it's fine by me.
I guess I just like that industrial look.