The Test

Because NVIDIA is not productizing any reference-quality GeForce RTX 2070 card besides the Founders Edition, which has non-reference specifications, we've emulated the true reference specifications with a 90MHz downclock and a roughly 10W TDP reduction. This keeps comparisons standardized and apples-to-apples, as we always look at reference-to-reference results.

CPU: Intel Core i7-7820X @ 4.3GHz
Motherboard: Gigabyte X299 AORUS Gaming 7 (F9g)
Power Supply: EVGA 1000 G3
Hard Disk: OCZ Toshiba RD400 (1TB)
Memory: G.Skill TridentZ DDR4-3200 4 x 8GB (16-18-18-38)
Case: NZXT Phantom 630 Windowed Edition
Monitor: LG 27UD68P-B
Video Cards: AMD Radeon RX Vega 64 (Air Cooled)
NVIDIA GeForce RTX 2080 Ti Founders Edition
NVIDIA GeForce RTX 2080 Founders Edition
NVIDIA GeForce RTX 2070 Founders Edition
NVIDIA GeForce GTX 1080 Ti Founders Edition
NVIDIA GeForce GTX 1080 Founders Edition
NVIDIA GeForce GTX 1070 Ti Founders Edition
NVIDIA GeForce GTX 1070 Founders Edition
NVIDIA GeForce GTX 980 Ti
NVIDIA GeForce GTX 980
NVIDIA GeForce GTX 970
Video Drivers: NVIDIA Release 416.33 Press
AMD Radeon Software Adrenalin Edition 18.9.1
OS: Windows 10 Pro (April 2018 Update)
Spectre/Meltdown Mitigations: Yes, both

  • Arbie - Tuesday, October 16, 2018 - link

    Thanks for including Ashes Escalation in the results. I hope you will continue to do so. This is a unique game with great features.
  • abufrejoval - Tuesday, October 16, 2018 - link

    I find a lot of the discussions around here odd: Lots of people trying to convince each other that only their choice makes any sense… Please, let’s just enjoy that there are a lot more choices, even if that can be difficult.

    For me, compute pays the rent; gaming is a side benefit. So I aimed for maximum GPU memory and lowest noise, because training neural networks can take a long time and I don't have an extra room to spare. It was a GTX 1070 from Zotac, 150 Watt TDP, compact, low noise at high loads, not exactly a top performer in games, OK at 1080p, slightly overwhelmed here and there with my Oculus Rift CV1, although quite OK with the DK2. I added a GTX 1050 Ti to another box mostly because it would do video conversion just as fast, but run extremely quietly and at near-zero power on that 24x7 machine.

    Then I made a 'mistake': I bought a 43” 4k monitor to replace a threesome of 24” 1080 screens.

    Naturally, games now wouldn't focus on one of those but on the big screen, which has 4x the number of pixels. With a screen so big and so close, I cannot really discern all pixels together at all times, but when I swivel my head I will notice whether the pixels in my focus are sharp or blurred, so cutting down on resolution or quality won't really do.

    I replaced the 1070 with the top gun available at the time, a GTX 1080ti.

    Actually, it wasn't really the top gun: I got a Zotac Mini, which again was nicely compact and low noise, does perfectly fine for GPU compute, but will settle at 180 Watts for anything long-term. It's very hard to achieve better than 70% utilization on GPU machine-learning jobs, so all of these GPUs (except a mobile 1070) tend to stay very quiet.

    A desperate friend took the 1050 Ti off my hands because he needed something that wouldn't require extra power connectors, so I chipped in some extra dosh and got a GTX 1060 (6GB) to replace it. Again I went for a model recommended for low noise, from MSI, but was shocked on unpacking to see that it was vastly bigger than the 1080ti in every direction. It was, however, very silent even at top gaming loads, a hard squeeze to fit inside the chassis but a perfect fit for 'noise', and surprisingly adequate for 1080p gaming at 120 Watts.

    The reason I keep quoting those Watts is my observation that wattage is perhaps a better sign of effective GPU power than the chip itself, as long as generation and process size are the same: there is remarkably little difference between the high-clocked 1060 at 120 Watts, the average-clocked 1070 at 150 Watts, and the low-clocked 1080ti at 180 Watts. Yes, the 1080ti will go to 250 Watts for bursts and deliver accordingly. But physics will soon weigh in on that 1080ti, and increasing fan speed does nothing but add noise, because surface area, much like displacement in an engine, is hard to replace.

    I got an RTX 2080ti last week because I want to explore INT8 and INT4 for machine-learning inference vs. FP16 or FP32 training: a V100 only gives me FP16 and some extra cores and bandwidth while costing 4x as much, even among friends. That makes the Turing-based consumer product an extremely good deal for my use case: I don't care for FP64, ECC or GPU virtualization enough to pay the Tesla/Quadro premiums.
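
    (For readers unfamiliar with the terms: INT8 inference means running a trained network at reduced integer precision. A toy sketch of the basic idea - symmetric quantization with illustrative values, not the commenter's actual workload or any particular framework's calibration scheme:)

```python
# Toy symmetric INT8 quantization: map float weights into [-128, 127]
# using a single scale factor, then map back. Real frameworks calibrate
# scales per tensor/channel; these values are purely illustrative.
def quantize_int8(weights):
    scale = max(abs(w) for w in weights) / 127.0
    q = [max(-128, min(127, round(w / scale))) for w in weights]
    return q, scale

def dequantize(q, scale):
    return [v * scale for v in q]

weights = [0.5, -1.27, 0.02, 1.0]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)  # close to the original weights
```

    The appeal on Turing is that the integer math path runs much faster than FP32 while the small rounding error is often tolerable for inference.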

    And while the ML stuff will take weeks if not months to figure out and measure, I am glad to report that the Palit RTX 2080ti (the only one available around here) turned out to be nicely quiet and finally potent enough to run ARK: Survival Evolved at full quality at 4K without lags. Physically it's a monster, but that also means it sustains 250 Watts throughout. That's exactly how much a GTX 980ti and an R9 290X gulped from the mains inside that very same 18-core Xeon box, but with performance increases harking back to the best times of Gordon Moore's prediction.

    IMHO discussions about the 2xxx delivering 15% more speed at 40% higher prices vs. 1xxx GPUs are meaningless: 15 FPS vs. 9 FPS or 250 FPS vs. 180 FPS are academic. The GTX 1080ti failed at 4K; I had to either compromise quality or go down to 3K, and I liked neither. The RTX 2080ti won't deliver 100 FPS at 4K: I couldn't care less! But it never drops below 25 FPS either, and that makes it worth all the money to a gamer, while INT8 and INT4 compute will actually pay the bill for me.

    I can’t imagine buying an RTX 1070 for myself, because I have enough systems and choices. But even I can imagine how someone would want the ability to explore ray-tracing or machine learning on a budget that offers a choice between a GTX 1080ti or an RTX 1070: Not an easy compromise to make, but a perfectly valid choice made millions of times.

    Don't waste breath or keystrokes on being 'religious' about GPU choices: enjoy a new generation of compute and a bit of quality gaming on the side!
  • abufrejoval - Tuesday, October 16, 2018 - link

    s/RTX 1070/RTX 2070 above: Want edit! It's this RTX 2070 which may not make a lot of sense to pure-blooded gamers, except if they are sure that they'll continue to run at 1920x1080 over the next couple of years (where a GTX 1080ti is overkill) *and* want to try next-generation graphics.
  • Flunk - Tuesday, October 16, 2018 - link

    So Nvidia has decided to push all their card numbers down one because AMD isn't competitive at the moment. The 2060 is now the 2070, the 2070 is the 2080, and the 2080 is the 2080 Ti. This sort of hubris is just calling out for a competitor to arrive and sell a more competitively priced product.

    As for ray tracing, I'll eat my hat if the 2070 can handle ray-tracing in real games at reasonable frame-rates and real resolutions when they arrive.
  • Kakti - Tuesday, October 16, 2018 - link

    TBH...who gives a crap? With the advent of usable integrated GPUs from Intel and AMD, dGPU vendors are basically no longer making x20, x30 or x40 cards. So maybe they're just pushing up the product stack - instead of "enthusiasts" buying x60, x70 and x80 cards, we'll now be buying x50, x60, x70 and halo x80 products. I couldn't care less what the badge number on my card is; what I care about is performance vs. price.

    That said, I don't think I'll ever buy a dGPU for more than $400. The highest I've ever paid was, I think, ~$350 for my 970 or 670. As long as there's a reasonably competitive card in the $300-$400 USD range, I don't care what they call it - it could be an RTX 2110 and I'll snap it up. Given the products NVidia has released so far under the RTX line, I'm going to wait and see what develops. Either I'll grab a cheap used 1080/1080ti or wait for smaller and cheaper 2100 cards. NV can ask whatever they want for a card, but at the end of the day most consumers have a price ceiling above which they won't purchase anything. Seems like a lot of people are in the $350-$500 range, so either prices will have to come down or cheaper products will come out. I'm curious whether NV will make any more GTX cards, since Tensor cores not only aren't that usable right now, but dramatically increase the fab cost given their size and complexity.
  • Yojimbo - Wednesday, October 17, 2018 - link

    Nahh, look at the die sizes. The 2080 is bigger than the 1080 Ti. The 2070 is bigger than the 1080. The price/performance changes are not because NVIDIA is pushing the cards down one, it's entirely because of the resources spent for ray tracing capabilities. As far as the 2070's ability to handle ray tracing, we won't really know for a few more months.

    As for competitors, if AMD had a competitive product now they might be cleaning up. But since they don't, by the time they or some other competitor (Intel) does arrive they will probably need real time ray tracing to compete.

    No one is forcing you to buy an RTX. If you're not interested in real time ray tracing you probably shouldn't be buying an RTX, and the introduction of RTX has forced the 10 series (and probably soon the Vega and Polaris series) prices down.
  • Voodoo2-SLI - Tuesday, October 16, 2018 - link

    WQHD Performance Index for AnandTech's GeForce RTX 2070 Launch Review

    165.1% ... GeForce RTX 2080 Ti FE
    137.5% ... GeForce RTX 2080 FE
    115.3% ... GeForce RTX 2070 FE
    110.6% ... GeForce RTX 2070 Reference
    126.8% ... GeForce GTX 1080 Ti FE
    100% ..... GeForce GTX 1080 FE
    81.7% .... GeForce GTX 1070 FE
    99.2% .... Radeon RX Vega 64 Reference

    An index from 15 other launch reviews, with an overall performance index for the GeForce RTX 2070 launch, here:
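
    (The index above normalizes each card's average WQHD result to the GTX 1080 FE at 100%. A minimal sketch of that normalization, using made-up frame rates rather than any measured data:)

```python
# Normalize average frame rates to a baseline card at 100%, as in the
# index above. The frame-rate values below are hypothetical placeholders,
# not AnandTech's measured results.
def performance_index(avg_fps, baseline):
    base = avg_fps[baseline]
    return {card: round(100 * fps / base, 1) for card, fps in avg_fps.items()}

avg_fps = {
    "GeForce RTX 2080 Ti FE": 132.0,
    "GeForce GTX 1080 FE": 80.0,  # baseline card
    "GeForce GTX 1070 FE": 66.0,
}
index = performance_index(avg_fps, baseline="GeForce GTX 1080 FE")
print(index["GeForce GTX 1080 FE"])  # 100.0
```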
  • risa2000 - Wednesday, October 17, 2018 - link

    What exactly is the "RTX 2080" that is bouncing around the tables? I did not find any reference in the test description chapter. I assumed it could be a stock-clocked RTX 2080 FE, but these cards are not always performing in the expected order (sometimes the 2080 beats the 2080 FE).

    Also, in the temp and noise section, there are two cards, "2080" and "2080 (baseline)", which again give quite different thermal and noise results.
  • Achaios - Wednesday, October 17, 2018 - link

    Too much blabbering in the comments section. The way I see it:

    The RTX 2070 offers the same performance as a GTX 1080, is significantly more expensive than the GTX 1080, and is less power efficient and hotter at the same time.

  • milkod2001 - Wednesday, October 17, 2018 - link

    Well and accurately said.
