The NVIDIA GeForce RTX 2070 Founders Edition Review: Mid-Range Turing, High-End Price
by Nate Oh on October 16, 2018 9:00 AM EST

When NVIDIA first announced their Turing-based GeForce RTX 20 series, they unveiled three GeForce RTX models: the 2080 Ti, 2080, and 2070. As we've seen previously, Turing and the GeForce RTX 20 series as a whole are designed on both the hardware and software level to enable real-time ray tracing in games, along with other new specialized features, though all of these have yet to launch in games. Nevertheless, last month's release of the GeForce RTX 2080 Ti and 2080 finally revealed where those cards land on the traditional performance spectrum. As the 'value'-oriented enthusiast offering, the RTX 2070 is arguably the more important card for most prospective buyers. And so, ahead of tomorrow's launch, today we take a look at the GeForce RTX 2070 Founders Edition.
Even as the value option, which has historically been the role of the x70 part, the RTX 2070 Founders Edition comes in at $599, with a standard MSRP of $499. For all intents and purposes, the lower $499 price won't be seen in the near future, as board partners will stay aligned with NVIDIA to avoid cannibalization and lower ASPs. Either way, the $500 mark makes it clear that 'value' and 'cheap' don't necessarily mean the same thing.
| NVIDIA GeForce Specification Comparison | RTX 2070 Founders Edition | RTX 2070 | GTX 1070 | RTX 2080 |
|---|---|---|---|---|
| CUDA Cores | 2304 | 2304 | 1920 | 2944 |
| ROPs | 64 | 64 | 64 | 64 |
| Core Clock | 1410MHz | 1410MHz | 1506MHz | 1515MHz |
| Boost Clock | 1710MHz | 1620MHz | 1683MHz | 1710MHz (FE: 1800MHz) |
| Memory Clock | 14Gbps GDDR6 | 14Gbps GDDR6 | 8Gbps GDDR5 | 14Gbps GDDR6 |
| Memory Bus Width | 256-bit | 256-bit | 256-bit | 256-bit |
| VRAM | 8GB | 8GB | 8GB | 8GB |
| Single Precision Perf. | 7.9 TFLOPs | 7.5 TFLOPs | 6.5 TFLOPs | 10.1 TFLOPs |
| "RTX-OPS" | 45T | 45T | N/A | 60T |
| SLI Support | No | No | Yes | Yes |
| TDP | 185W | 175W | 150W | 215W (FE: 225W) |
| GPU | TU106 | TU106 | GP104 | TU104 |
| Transistor Count | 10.8B | 10.8B | 7.2B | 13.6B |
| Architecture | Turing | Turing | Pascal | Turing |
| Manufacturing Process | TSMC 12nm "FFN" | TSMC 12nm "FFN" | TSMC 16nm | TSMC 12nm "FFN" |
| Launch Date | 10/17/2018 | N/A | 06/10/2016 | 09/20/2018 |
| Launch Price | $599 | $499 | MSRP: $379, Founders: $449 | MSRP: $699, Founders: $799 |
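The single-precision figures in the table follow directly from the core counts and boost clocks: each CUDA core retires one FMA (two floating-point operations) per clock. A minimal sketch of that arithmetic, using the table's own numbers:

```python
# Single-precision throughput = CUDA cores x 2 FLOPs/clock (one FMA) x boost clock.
def sp_tflops(cuda_cores: int, boost_mhz: int) -> float:
    """Peak FP32 throughput in TFLOPs at the given boost clock."""
    return cuda_cores * 2 * boost_mhz * 1e6 / 1e12

# Core counts and boost clocks from the specification table above
cards = {
    "RTX 2070 FE": (2304, 1710),  # ~7.9 TFLOPs
    "RTX 2070":    (2304, 1620),  # ~7.5 TFLOPs
    "GTX 1070":    (1920, 1683),  # ~6.5 TFLOPs
    "RTX 2080":    (2944, 1710),  # ~10.1 TFLOPs
}
for name, (cores, mhz) in cards.items():
    print(f"{name}: {sp_tflops(cores, mhz):.1f} TFLOPs")
```

Note that these are theoretical peaks at rated boost clocks; in practice GPU Boost will run higher or lower depending on power and thermal headroom.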
For the RTX 2070, value will be measured by both traditional rasterization performance and hybrid rendering performance. For the former, the GeForce GTX 1080 sits at the $500 price point, so that is very much the card to beat, with the AMD Radeon RX Vega 64 and GeForce GTX 1070 Ti also offering similar levels of performance. Beating the GTX 1080 by a significant margin would in turn offer more to those still on older generation cards like the GTX 970 and 980. But merely trading blows with the GTX 1080 would leave the RTX 2070 priced higher, with less availability, for equivalent traditional performance. As an aside, HDR presents a wrinkle: the RTX 20 series incurs less of a performance hit under HDR, but the difference varies per game, and only a selection of games support HDR in the first place.
Unfortunately, accurate hybrid rendering performance remains somewhat of a mystery. Games have yet to ship support for RTX platform features, and DXR itself is only just starting to roll out as part of the Windows 10 October 2018 Update (1809), which has itself been delayed due to data-loss issues. RTX platform features like real-time ray tracing and DLSS come at a steep cost: currently the RTX 2080 Ti stands at $1200 and the RTX 2080 at $800, with the $600 RTX 2070 now the entry card for those features. So for gamers interested in real-time ray tracing, it's still unclear what to expect in terms of real-world hybrid rendering performance; in any case, the RTX 2070 does not support SLI, which precludes a future multi-GPU drop-in upgrade. That is to say, if the RTX 2070's real-time ray tracing performance target for resolution and framerate is significantly lower than that of the RTX 2080 Ti or 2080, there won't be an easy solution in the form of doubling up 2070s.
In any case, the RTX 2070 is built on its own GPU, TU106, rather than being a cut-down version of TU104, and by the numbers offers 75% of the RTX 2080's shading/texturing/tensor resources with the same ROP count and 256-bit memory bus. Considering the SM-heavy nature of ray tracing workloads, this would be interesting to investigate once real-time ray tracing and DXR are fully released to the public in production-ready games.
But as a straight upgrade, the RTX 2070 is in a delicate situation. We already know where the RTX 2080 Ti and RTX 2080 lie in terms of conventional gaming performance: the RTX 2080 Ti is roughly on par with the Titan V, while the RTX 2080 is comparable to the Titan Xp and GTX 1080 Ti. As the top two cards of the stack, those parts have some natural leeway for premiums, but the RTX 2070 does not have that luxury as the x70 part, and will be right in the mix of Pascal cards, with the GTX 1080 and GTX 1070 Ti in the $450 to $600 range and GTX 1080 Ti models at the $700 mark. GTX 1080s priced around $490 could act as a significant spoiler if there are issues with launch inventory, issues which have already caused delays for the RTX 2080 Ti.
Beyond that, the biggest open questions are all about the RTX platform features like real-time ray tracing and DLSS. Gamers considering taking the plunge will be looking at the RTX 2070 as the entry point, but right now there is no accurate and generalizable way to determine what that entry-level performance will look like in the real world.
Fall 2018 GPU Pricing Comparison

| AMD | Price | NVIDIA |
|---|---|---|
| | $1199 | GeForce RTX 2080 Ti |
| | $799 | GeForce RTX 2080 |
| | $709 | GeForce GTX 1080 Ti |
| | $599 | GeForce RTX 2070 |
| Radeon RX Vega 64 | $569 | |
| Radeon RX Vega 56 | $489 | GeForce GTX 1080 |
| | $449 | GeForce GTX 1070 Ti |
| | $399 | GeForce GTX 1070 |
| Radeon RX 580 (8GB) | $269/$279 | GeForce GTX 1060 6GB (1280 cores) |
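The pricing pressure described above can be quantified from the fall 2018 prices listed in the table. A minimal sketch, using only those listed figures (street/MSRP values at the time, not current prices):

```python
# Fall 2018 prices as listed in the pricing comparison table above.
prices = {
    "GeForce RTX 2080 Ti": 1199,
    "GeForce RTX 2080": 799,
    "GeForce GTX 1080 Ti": 709,
    "GeForce RTX 2070": 599,
    "Radeon RX Vega 64": 569,
    "GeForce GTX 1080": 489,
    "GeForce GTX 1070 Ti": 449,
}

def premium(card_a: str, card_b: str) -> float:
    """Percent price premium of card_a over card_b."""
    return (prices[card_a] / prices[card_b] - 1) * 100

# The RTX 2070's premium over the GTX 1080, the card it has to beat:
print(f"{premium('GeForce RTX 2070', 'GeForce GTX 1080'):.0f}%")  # ~22%
```

This is the crux of the review's argument: merely matching the GTX 1080 would leave the RTX 2070 carrying a roughly 22% premium for equivalent rasterization performance.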
121 Comments
Arbie - Tuesday, October 16, 2018 - link
Thanks for including Ashes Escalation in the results. I hope you will continue to do so. This is a unique game with great features.

abufrejoval - Tuesday, October 16, 2018 - link
I find a lot of the discussions around here odd: lots of people trying to convince each other that only their choice makes any sense… Please, let's just enjoy that there are a lot more choices, even if that can be difficult.

For me, compute pays the rent; gaming is a side benefit. So I aimed for maximum GPU memory and lowest noise, because training a neural network can take a long time and I don't have an extra room to spare. It was a GTX 1070 from Zotac: 150 Watts TDP, compact, low noise at high loads, not exactly a top performer in games, OK at 1080p, slightly overwhelmed here and there with my Oculus Rift CV1, although quite OK with the DK2. I added a GTX 1050 Ti on another box mostly because it would do video conversion just as fast, but run extremely quietly and at zero power on that 24x7 machine.
Then I made a 'mistake': I bought a 43” 4k monitor to replace a threesome of 24” 1080 screens.
Naturally now games wouldn’t focus on one of those, but the big screen, which is 4x the number of pixels. With a screen so big and so close, I cannot really discern all pixel together at all times, but when I swivel my head, I will notice if pixels in my focus are sharp or blurred, so cutting down on resolution or quality won’t really do.
I replaced the 1070 with the top gun available at the time, a GTX 1080ti.
Actually, it wasn’t really the top gun, I got a Zotac Mini which again was nicely compact and low noise, does perfectly fine for GPU compute, but will settle on 180Watts for anything long-term. It’s very hard to achieve better than 70% utilization on GPU machine learning compute jobs, so all of these GPUs (except a mobile 1070) tend to stay very quiet.
A desperate friend took the 1050 Ti off my hands, because he needed something that wouldn't require extra power connectors, so I chipped in some extra dosh and got a GTX 1060 (6GB) to replace it. Again, I went for a model recommended for low noise from MSI, but was shocked to see that it was vastly bigger than the 1080 Ti in every direction when I unpacked it. It was, however, very silent even at top gaming loads, a hard squeeze to fit inside the chassis, but a perfect fit for 'noise' and surprisingly adequate for 1080p gaming at 120 Watts.
The reason I keep quoting those Watts is my observation that they're perhaps a better sign of effective GPU power than the chip itself, as long as generation and process size are the same: there is remarkably little difference between the high-clocked 1060 at 120 Watts, the average-clocked 1070 at 150 Watts, and the low-clocked 1080 Ti at 180 Watts. Yes, the 1080 Ti will go to 250 Watts for bursts and deliver accordingly. But physics soon weighs in on that 1080 Ti, and increasing fan speed does nothing but add noise, because surface area, much like displacement in an engine, is hard to replace.
I got an RTX 2080ti last week, because I want to explore INT8 and INT4 for machine learning inference vs. FP16 or FP32 training: A V100 only gives me FP16 and some extra cores and bandwidth while it costs 4x as much, even among friends. That makes the Turing based consumer product an extremely good deal for my use case: I don’t care for FP64, ECC or GPU virtualization enough to pay the Tesla/Quadro premiums.
And while the ML stuff will take weeks if not months to figure out and measure, I am glad to report that the Palit RTX 2080 Ti (the only one available around here) turned out to be nicely quiet and finally potent enough to run ARK: Survival Evolved at full quality at 4K without lag. Physically it's a monster, but that also means it sustains 250 Watts throughout. That's exactly how much a GTX 980 Ti and an R9 290X gulped from the mains inside that very same 18-core Xeon box, but with performance increases harking back to the best days of Gordon Moore's prediction.
IMHO discussions about the 2xxx delivering 15% more speed at 40% higher prices vs. 1xxx GPUs are meaningless: 15 FPS vs. 9 FPS or 250 FPS vs. 180 FPS are academic. The GTX 1080 Ti failed at 4K; I had to either compromise quality or go down to 3K, and I liked neither. The RTX 2080 Ti won't deliver 100 FPS at 4K: I couldn't care less! But it never drops below 25 FPS either, and that makes it worth all the money to a gamer, while INT8 and INT4 compute will actually pay the bill for me.
I can’t imagine buying an RTX 1070 for myself, because I have enough systems and choices. But even I can imagine how someone would want the ability to explore ray-tracing or machine learning on a budget that offers a choice between a GTX 1080ti or an RTX 1070: Not an easy compromise to make, but a perfectly valid choice made millions of times.
Don't waste breath or keystrokes on being 'religious' about GPU choices: Enjoy a new generation of compute and bit of quality gaming on the side!
abufrejoval - Tuesday, October 16, 2018 - link
s/RTX 1070/RTX 2070 above: Want edit! It's this RTX 2070 which may not make a lot of sense to pure-blooded gamers, except if they are sure that they'll continue to run at 1920x1080 over the next couple of years (where a GTX 1080 Ti is overkill) *and* want to try next-generation graphics.

Flunk - Tuesday, October 16, 2018 - link
So Nvidia has decided to push all their card numbers down one because AMD isn't competitive at the moment. The 2060 is now the 2070, the 2070 is the 2080, and the 2080 is the 2080 Ti. This sort of hubris is just calling out for a competitor to arrive and sell a more competitively priced product.

As for ray tracing, I'll eat my hat if the 2070 can handle ray tracing in real games at reasonable frame rates and real resolutions when they arrive.
Kakti - Tuesday, October 16, 2018 - link
TBH... who gives a crap? With the advent of usable integrated GPUs from Intel and AMD, dGPU vendors are basically no longer making x20, x30 or x40 cards. So maybe they're just pushing up the product stack - instead of "enthusiasts" buying x60, x70 and x80 cards, we'll now be buying x50, x60, x70 and halo x80 products. I couldn't care less what the badge number is on my card; what I care about is performance vs. price.

That said, I don't think I'll ever buy a dGPU for more than $400. The highest I've ever paid was I think ~$350 for my 970 or 670. As long as there's a reasonably competitive card in the $300-$400 USD range, I don't care what they call it - it could be an RTX 2110 and I'd snap it up. Given the products NVidia has released so far under the RTX line, I'm going to wait and see what develops. Either I'll grab a cheap used 1080/1080 Ti or wait for smaller and cheaper 2100 cards. NV can ask whatever they want for a card, but at the end of the day most consumers have a price ceiling above which they won't purchase anything. It seems like a lot of people are in the $350-500 range, so either prices will have to come down or cheaper products will come out. I'm curious whether NV will make any more GTX cards, since Tensor cores not only aren't that usable right now, but dramatically increase the fab cost given their size and complexity.
Yojimbo - Wednesday, October 17, 2018 - link
Nahh, look at the die sizes. The 2080 is bigger than the 1080 Ti. The 2070 is bigger than the 1080. The price/performance changes are not because NVIDIA is pushing the cards down one; they're entirely because of the resources spent on ray tracing capabilities. As far as the 2070's ability to handle ray tracing goes, we won't really know for a few more months.

As for competitors, if AMD had a competitive product now they might be cleaning up. But since they don't, by the time they or some other competitor (Intel) does arrive, they will probably need real-time ray tracing to compete.
No one is forcing you to buy an RTX. If you're not interested in real time ray tracing you probably shouldn't be buying an RTX, and the introduction of RTX has forced the 10 series (and probably soon the Vega and Polaris series) prices down.
Voodoo2-SLI - Tuesday, October 16, 2018 - link
WQHD Performance Index for AnandTech's GeForce RTX 2070 Launch Review165.1% ... GeForce RTX 2080 Ti FE
137.5% ... GeForce RTX 2080 FE
115.3% ... GeForce RTX 2070 FE
110.6% ... GeForce RTX 2070 Reference
126.8% ... GeForce GTX 1080 Ti FE
100% ..... GeForce GTX 1080 FE
81.7% .... GeForce GTX 1070 FE
99.2% .... Radeon RX Vega 64 Reference
An index drawn from 15 other launch reviews, with an overall performance index for the GeForce RTX 2070 launch, is here:
https://www.3dcenter.org/news/geforce-rtx-2070-lau...
risa2000 - Wednesday, October 17, 2018 - link
What exactly is the "RTX 2080" that is bouncing around the tables? I did not find any reference to it in the test description chapter. I assumed it could be a stock-clocked RTX 2080 FE, but these cards are not always performing in the expected order (sometimes the 2080 beats the 2080 FE).

Also, in the temp and noise section, there are two cards, 2080 and "2080 (baseline)", which give quite different thermal and noise results.
Achaios - Wednesday, October 17, 2018 - link
Too much blabbering in the comments section. The way I see it:

The RTX 2070 offers the same performance as a GTX 1080, is significantly more expensive than the GTX 1080, and is less power efficient and hotter at the same time.
/Thread
milkod2001 - Wednesday, October 17, 2018 - link
Well and accurately said.+1