In what has now become a bona fide tradition for NVIDIA, at their GDC event this evening the company announced their next flagship video card, the GeForce GTX 1080 Ti. Something of a poorly kept secret – NVIDIA’s website accidentally spilled the beans last week – the GTX 1080 Ti is NVIDIA’s big Pascal refresh for the year, finally rolling out their most powerful consumer GPU, GP102, into a GeForce video card.

The Ti series of cards isn’t new for NVIDIA. The company has used the moniker for their higher-performance flagship cards since the GTX 700 series back in 2013. However, no two generations have really been alike. For the Pascal generation in particular, NVIDIA has taken the almighty Titan line in a more professional direction, so whereas a Ti card would have been a value Titan in past generations – and this is still technically true here – it serves as more of a flagship for the Pascal generation GeForce lineup.

At any rate, we knew that NVIDIA would release a GP102 card for the GeForce market sooner or later, and at long last it’s here. Based on a not-quite-fully-enabled GP102 GPU (more on this in a second), like its predecessors the GTX 1080 Ti is meant to serve as a mid-generation performance boost for the high-end video card market. In this case NVIDIA is aiming for what they’re calling their greatest performance jump yet for a Ti product – around 35% on average – which would translate into a sizable upgrade for GeForce GTX 980 Ti owners and others for whom GTX 1080 wasn’t the card they were looking for.

NVIDIA GPU Specification Comparison

Spec | GTX 1080 Ti | NVIDIA Titan X | GTX 1080 | GTX 980 Ti
CUDA Cores | 3584 | 3584 | 2560 | 2816
Texture Units | 224 | 224 | 160 | 176
ROPs | 88 | 96 | 64 | 96
Core Clock | ? | 1417MHz | 1607MHz | 1000MHz
Boost Clock | 1582MHz | 1531MHz | 1733MHz | 1075MHz
Memory Clock | 11Gbps GDDR5X | 10Gbps GDDR5X | 10Gbps GDDR5X | 7Gbps GDDR5
Memory Bus Width | 352-bit | 384-bit | 256-bit | 384-bit
FP64 | 1/32 | 1/32 | 1/32 | 1/32
FP16 (Native) | 1/64 | 1/64 | 1/64 | N/A
INT8 | 4:1 | 4:1 | N/A | N/A
TDP | 250W | 250W | 180W | 250W
GPU | GP102 | GP102 | GP104 | GM200
Transistor Count | 12B | 12B | 7.2B | 8B
Die Size | 471mm² | 471mm² | 314mm² | 601mm²
Manufacturing Process | TSMC 16nm | TSMC 16nm | TSMC 16nm | TSMC 28nm
Launch Date | 03/2017 | 08/02/2016 | 05/27/2016 | 06/01/2015
Launch Price | $699 | $1200 | $599 (MSRP) / $699 (Founders) | $649

We’ll start as always with the GPU at the heart of the card, GP102. With NVIDIA’s business now supporting a dedicated compute GPU – the immense GP100 – GP102 doesn’t qualify for the “Big Pascal” moniker like past iterations have. But make no mistake, GP102 is quite a bit larger than the GP104 GPU at the heart of the GTX 1080, and that translates to a lot more hardware for pushing pixels.

GTX 1080 Ti ships with 28 of GP102’s 30 SMs enabled. For those of you familiar with the not-quite-consumer NVIDIA Titan X (Pascal), this is the same configuration as that card, and in fact there are a lot of similarities between the two cards. Though for this generation the situation is not as cut and dried as in the past; the GTX 1080 Ti is not strictly a subset of the Titan.

The big difference on the hardware front is that NVIDIA has stripped GP102 of some of its memory/ROP/L2 capacity, which was fully enabled on the Titan. Of the 96 ROPs we get 88; the last ROP block, its memory controller, and 256KB of L2 cache have been disabled.

However what the GTX 1080 Ti lacks in functional units it partially makes up in clockspeeds, for both the core and the memory. While the base clock has not yet been disclosed, the boost clock of the GTX 1080 Ti is 1582MHz, about 50MHz higher than its Titan counterpart. More significantly, the memory clock on the GTX 1080 Ti is 11Gbps, a 10% increase over the 10Gbps clock found on the Titan and the GTX 1080. Combined with the 352-bit memory bus, we’re looking at 484GB/sec of memory bandwidth for the GTX 1080 Ti.
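For those keeping score, the quoted figure falls straight out of the bus width and the per-pin data rate. A quick back-of-the-envelope check (my own arithmetic, not NVIDIA’s):

```python
# Peak memory bandwidth = bus width (in bytes) x effective per-pin data rate.
bus_width_bits = 352   # GTX 1080 Ti memory bus
data_rate_gbps = 11    # 11Gbps GDDR5X

bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/sec")  # prints "484 GB/sec"
```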

Taken altogether then, the GTX 1080 Ti offers just over 11.3 TFLOPS of FP32 performance. This puts the expected shader/texture performance of the card 28% ahead of the current GTX 1080, while the ROP throughput advantage stands at 26%, and the memory bandwidth advantage at a much greater 51.2%. Real-world performance will of course be influenced by a blend of these factors, so I’ll be curious to see how much the major jump in memory bandwidth helps given that the ROPs aren’t seeing the same kind of throughput boost. Otherwise, relative to the NVIDIA Titan X, the two cards should end up quite close, trading blows now and then.
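For the curious, all three percentages can be reproduced from the paper specs in the table above. This is a best-case sketch at boost clocks, my own math rather than NVIDIA’s:

```python
# Paper throughput comparison at boost clocks: GTX 1080 Ti vs. GTX 1080.
ti   = {"cores": 3584, "rops": 88, "boost_mhz": 1582, "bw_gbs": 484}
p104 = {"cores": 2560, "rops": 64, "boost_mhz": 1733, "bw_gbs": 320}

def pct_lead(a, b):
    """Percentage advantage of a over b."""
    return (a - b) / b * 100

shader_lead = pct_lead(ti["cores"] * ti["boost_mhz"], p104["cores"] * p104["boost_mhz"])
rop_lead    = pct_lead(ti["rops"]  * ti["boost_mhz"], p104["rops"]  * p104["boost_mhz"])
bw_lead     = pct_lead(ti["bw_gbs"], p104["bw_gbs"])

# Lands on roughly +28% shader, +26% ROP, +51% memory bandwidth
print(f"shader +{shader_lead:.1f}%, ROP +{rop_lead:.1f}%, bandwidth +{bw_lead:.1f}%")
```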

Speaking of the Titan, on an interesting side note, it doesn’t look like NVIDIA is going to be doing anything to hurt the compute performance of the GTX 1080 Ti to differentiate the card from the Titan, which has proven popular with GPU compute customers. Crucially, this means that the GTX 1080 Ti gets the same 4:1 INT8 performance ratio as the Titan, which is critical to the cards’ high neural network inference performance. As a result the GTX 1080 Ti actually has slightly greater compute performance (on paper) than the Titan. And NVIDIA has been surprisingly candid in admitting that unless compute customers need the last 1GB of VRAM offered by the Titan, they’re likely going to buy the GTX 1080 Ti instead.
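The 4:1 figure reflects Pascal’s dp4a instruction, which packs a four-element INT8 dot product into a single FP32-rate issue slot, so peak INT8 throughput scales directly off the FP32 number. As a rough peak-rate sketch (my own math from the specs above, not an official NVIDIA figure):

```python
# Peak-rate sketch for GP102 at GTX 1080 Ti clocks.
cuda_cores = 3584
boost_ghz  = 1.582

# FP32: each CUDA core can retire one FMA (2 FLOPs) per clock.
fp32_tflops = cuda_cores * 2 * boost_ghz / 1000

# INT8: the 4:1 ratio (dp4a) gives four INT8 ops per FP32-rate op.
int8_tops = fp32_tflops * 4

print(f"FP32 {fp32_tflops:.1f} TFLOPS -> INT8 {int8_tops:.1f} TOPS")
# FP32 11.3 TFLOPS -> INT8 45.4 TOPS
```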

Speaking of memory, as I mentioned before, the card will be shipping with 11GB of 11Gbps GDDR5X, spread across 11 memory chips. The faster memory clock comes courtesy of a new generation of GDDR5X memory chips from partner Micron, who, after a bit of a rocky start with GDDR5X development, is finally making the kind of progress on memory speeds that definitely has NVIDIA pleased. NVIDIA’s GPUs and boards have been ready for the higher frequency memory for some time now; the memory itself is only just catching up.

Moving on, the card’s 250W TDP should not come as a surprise. This has been NVIDIA’s segment TDP of choice for Titan and Ti cards for a while now, and the GTX 1080 Ti isn’t deviating from that.

However the cooling system has seen a small but important overhaul: the DVI port is gone, opening up the card to be a full-slot blower. In order to offer a DVI port along with a number of DisplayPorts/HDMI ports, NVIDIA has traditionally blocked part of the card’s second slot to house the DVI port. But with the GTX 1080 Ti that port is finally gone, giving the GTX 1080 Ti the interesting distinction of being the first unobstructed high-end GeForce card since the GTX 580. The end result is that NVIDIA is promising a decent increase in cooling performance relative to the GTX 980 Ti and similar designs. We’ll have to see how NVIDIA has tuned the card to understand the full impact of this change, but it will likely further improve on NVIDIA’s already great acoustics.

Meanwhile the end result of removing the DVI port means that the GTX 1080 Ti’s display I/O has been pared down to just a mix of HDMI and DisplayPorts. Altogether we’re looking at 3x DisplayPort 1.4 ports and 1x HDMI 2.0 port. As a consolation to owners who may still be using DVI-based monitors, the company will be including a DisplayPort to DVI adapter with the card (presumably DP to SL-DVI and not DL-DVI), but it’s clear that DVI’s days are now numbered over at NVIDIA.

Moving on, for card designs NVIDIA is once again going to be working with partners to offer a mix of reference and custom designs. The GTX 1080 Ti will initially be offered in a Founder’s Edition design, while partners are also bringing up their own semi- and fully-custom designs to be released a bit later. Importantly however, unlike the GTX 1080 & GTX 1070, NVIDIA has done away with the Founder’s Edition price premium for the GTX 1080 Ti: the card’s MSRP applies to both the Founder’s Edition and partners’ custom cards. This makes pricing more consistent, though I’m curious to see how this plays out with partners, as they previously benefitted from the premium in the form of more attractive pricing for their own cards.

Finally, speaking of pricing, let’s talk about the launch date and availability. Just in time for Pi Day, NVIDIA will be launching the card on the week of March 5th (Update: an exact date has finally been revealed: Friday, March 10th). As for pricing, long-time price watchers may be surprised. NVIDIA will be releasing the card at $699, the old price of the GTX 1080 Founder's Edition (which itself just got a price cut). This does work out to a bit higher than the GTX 980 Ti - it launched at $649 two years ago - but it's more aggressive than I had been expecting given the GTX 1080's launch price last year.

In any case, at this time the high-end video card market is NVIDIA’s to command. AMD doesn’t offer anything competitive with the GTX 1070 and above, so the GTX 1080 Ti will stand alone at the top of the consumer video card market. Long-term here AMD isn’t hesitating to note their work on Vega, but that’s a bridge to be crossed only once those cards get here.

Comments

  • Flunk - Wednesday, March 1, 2017

    I feel like you might have a fundamental misunderstanding of GPU pricing. A large percentage of the cost of building GPUs is development cost, not cost of materials. Because of this, the GPU maker is always trying to sell their top-end cards for as much money as the market will bear, and the cost of building a card is not particularly tied to its price.

    A 1080, for example, doesn't cost 4x as much to build as a 1050 does. But in order to provide a variety of price points, the GPU maker builds a variety of stripped-down designs to fit all the price points they think they can sell. This is how the GPU industry prices its products, and also why, the moment competition shows up, the prices for the current cards all fall.
  • usernametaken76 - Wednesday, March 1, 2017

    It's their prerogative to sell whatever they want to sell at whatever price they think they can sell it for. It's not a ludicrous proposition. They are a business, and it's the job of the CEO and sales at the business to determine what the market will bear. It's kind of obvious that, by now, most people who were considering a TITAN X are already in possession of one or more. So, they move down the food chain to sell at another price point to a different audience. There's nothing wrong with this tactic.
  • Laststop311 - Wednesday, March 1, 2017

    That Titan tax is the price you pay if you don't want to wait months for the performance. Also, Vega's performance may be influencing NVIDIA right now to preemptively strike and get as many people over to green before Vega even releases.
  • Laststop311 - Wednesday, March 1, 2017

    I'm more and more interested in making my next build all AMD: AMD CPU, GPU, and RAM, with an Asus motherboard matching an Asus GPU board, plus the Asus Essence STX II sound card.

    (side note: the Asus Essence STX II sounds freakishly amazing connected via its RCA output to a pair of KRK Expose E8Bs. Each speaker has its own dedicated bi-amping with 240 watts RMS of power, 480 watts total for the pair: one amp for the highs, and one amp for the upper midrange, lower midrange, and midbass of the 8-inch Kevlar driver, with a fully adjustable 2-way crossover on each speaker. The 8-inch Kevlar cone midrange/midbass driver and 1-inch aluminum-beryllium inverted dome tweeter extend the range from 40Hz to 30kHz; both material choices give super lightweight, stiff, responsive cones with no harsh resonances. Each speaker weighs in at over 67 pounds; totally impressive beasts.

    Pair that sound card and those awesome speakers with a KRK Rokit 12sHO subwoofer, a 400-watt-RMS beast with a strengthened version of the Kevlar cone to increase rigidity under high SPL for tight, accurate bass even when it's pushing all its watts. It can play a range of 29-60Hz, 29-160Hz, or 29-211Hz; wherever your midbass begins to pick up is where you cut the sub off. The 8-inch cones go down to 120Hz no problem, and even though they're rated down to 40Hz they sound best cut off at 120-140Hz, letting the sub do what it does best and freeing up a lot of the midrange capability that gets sacrificed when power is spent driving the 8-inch woofers all the way down to 40Hz. Run the sub from 29Hz up to 120-140Hz, without adding frequencies that are too high, so it can concentrate its power on the big booms. Plus, what I like about KRK is that they give you real, usable specs, not hypothetical laboratory bests that can't be reproduced.

    People hate on the 12-inch sub for only being rated down to 29Hz, but it plays sound below 29Hz just fine; there is simply roll-off below that point, so KRK publishes the truer, sound-quality-accurate number instead of padding their spec sheets with BS. It has no problem playing down to the 20Hz limit of human hearing; 29Hz is just the lowest it can go while still playing at its highest sound level. Adding a high-quality parametric equalizer is akin to a color calibration device for a monitor: it lets you fine-tune until you have flat frequency response over the entire range of human hearing, and a set of these speakers can be a super-warm bass junkie's machine for hip hop and electronic while also retaining the ability of complete neutrality.)
  • Laststop311 - Wednesday, March 1, 2017

    Oh, and for those waiting to hate on the supposedly weak 400 watts for a sub: this single 12-inch sub system weighs in at a whopping 110 pounds. It's moving serious amounts of air.
  • Laststop311 - Wednesday, March 1, 2017

    And before you say 15s and 18s are so much better: for sound quality applications, no they aren't. 12-13 inches is the best size for reproducing 25-150Hz. The bigger subs can edge it out at 20-30Hz (or the 18s at 16-40Hz), but they muddy up the upper bass that the sub is taking care of.
  • Laststop311 - Wednesday, March 1, 2017

    Sorry for the crazy tangent. I just built my hi-res 24-bit/192kHz listening station and got too excited.
  • Alex0826 - Thursday, March 2, 2017

    Listening station? Zeta Reticuli, planet X or maybe seismic activity?
  • extide - Friday, March 3, 2017

    That's why the biggest W7 is 13.5"
  • extide - Friday, March 3, 2017

    Lol, 480w rms, I could be wrong but I think you are dreaming
