Concluding their Gamescom festivities for the newly-introduced GeForce RTX 20-series, NVIDIA has revealed a bit more about the hardware, its features, and its expected performance this evening. Tonight NVIDIA is announcing new Ansel RTX features in GeForce Experience, as well as some game performance metrics pitting the GeForce RTX 2080 against the GeForce GTX 1080. After recent hands-on demos featuring real-time raytracing, NVIDIA is now offering some numbers for out-of-the-box and Deep Learning Super Sampling (DLSS) performance in traditionally rendered games.

NVIDIA RTX Support for Games
As of August 20, 2018
Game                          | Real-Time Raytracing | Deep Learning Super Sampling (DLSS)
Ark: Survival Evolved         | -                    | Yes
Assetto Corsa Competizione    | Yes                  | -
Atomic Heart                  | Yes                  | Yes
Battlefield V                 | Yes                  | -
Control                       | Yes                  | -
Dauntless                     | -                    | Yes
Enlisted                      | Yes                  | -
Final Fantasy XV              | -                    | Yes
Fractured Lands               | -                    | Yes
Hitman 2                      | -                    | Yes
Islands of Nyne               | -                    | Yes
Justice                       | Yes                  | Yes
JX3                           | Yes                  | Yes
MechWarrior 5: Mercenaries    | Yes                  | Yes
Metro Exodus                  | Yes                  | -
PlayerUnknown's Battlegrounds | -                    | Yes
ProjectDH                     | Yes                  | -
Remnant: From the Ashes       | -                    | Yes
Serious Sam 4: Planet Badass  | -                    | Yes
Shadow of the Tomb Raider     | Yes                  | -
The Forge Arena               | -                    | Yes
We Happy Few                  | -                    | Yes

Starting with NVIDIA’s DLSS – and real-time raytracing, for that matter – the list of supported games is already known. What NVIDIA is disclosing today are some face-value 4K performance comparisons and results. As for DLSS itself, for now we can only say that it uses tensor core-accelerated neural network inferencing to generate what NVIDIA promises will be high-quality, super sampling-like anti-aliasing. For further technical background, this is a project NVIDIA has been working on for a while, and they have published blogs and papers detailing some of the processes involved. At any rate, the provided metrics are sparse on settings and details, and notably the measurements include several games rendered in HDR (though HDR shouldn't have a performance impact).
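As a rough mental model of the data flow – a hand-wavy sketch, not NVIDIA's actual pipeline – the appeal of DLSS-style reconstruction is that rendering fewer pixels and inferring the final frame can be cheaper than natively shading every pixel of a 4K image:

```python
def infer_upscale_2x(low_res):
    """Toy stand-in for DLSS-style reconstruction: take a low-resolution
    render and produce a frame at 2x the resolution in each dimension.
    A real implementation would run a trained neural network on the
    tensor cores; nearest-neighbor duplication here only illustrates
    the data flow (fewer shaded pixels in, full-resolution frame out)."""
    return [[row[x // 2] for x in range(2 * len(row))]
            for row in low_res for _ in range(2)]

# "Render" a 2x2 frame, then reconstruct a 4x4 output from it:
frame = [[0.1, 0.9],
         [0.4, 0.6]]
output = infer_upscale_2x(frame)  # 4 rows of 4 pixels each
```

The point of the sketch is only that the shader workload scales with the internal render resolution, not the output resolution – the quality of the result then rests entirely on how good the inference step is.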

Otherwise, NVIDIA presented a non-interactive Epic Infiltrator 4K demo that was later shown on the floor, comparing Temporal Anti-Aliasing (TAA) to DLSS, where the latter provided near-identical-or-better image quality on average at a lower performance cost – in this case, directly improving framerates. To be perfectly honest, I spent the entire floor time talking with NVIDIA engineers and driver/software developers, so I have no pictures of the floor demo (not that anything less than a direct screenshot would really do it justice). Ultimately, the matter of DLSS is somewhat nuanced, and there isn’t much we can add at the moment.

Overall, the idea is that even in traditionally rasterized games without DLSS, the GeForce RTX 2080 brings around 50% higher performance than the GeForce GTX 1080 under 4K HDR 60Hz conditions. Because this excludes real-time raytracing and DLSS, it is tantamount to ‘out of the box’ performance. That said, no graphics settings or driver details accompanied these disclosed framerates, so I wouldn't suggest reading into these numbers and bar charts one way or another.

Lastly, NVIDIA announced several new features, filters, and supported games for GeForce Experience’s Ansel screenshot feature. Relating to GeForce RTX, one of the features is Ansel RT for supported ray-traced games, where a screenshot can be taken with a very high number of rays, unsuitable for real-time but not an issue for static image rendering.

Ansel RTX also leverages a concept similar to the tensor core-accelerated DLSS with its ‘AI Up-Res’ super resolution feature, which also works in games that have not integrated the Ansel SDK.

In terms of GeForce RTX performance, this is more-or-less a teaser of things to come. But as always with unreleased hardware, judgement should be reserved until objective measurements and further details are available. We will have much more to say when the time comes.

92 Comments

  • Santoval - Wednesday, August 22, 2018 - link

    DLSS is not an antialiasing technique, because antialiasing cannot have a frame rate performance benefit. DLSS will employ deep learning to super-sample games (I just described its acronym, btw) that are running internally at a lower-than-4K resolution (probably 1440p) and output them at 4K. Which is why Huang was showing those low-res cats etc. being super-sampled to a higher resolution in his presentation. If DLSS is effective enough, the results might be practically identical.
  • Alexvrb - Wednesday, August 22, 2018 - link

    What you just described is the opposite of supersampling.

    https://en.wikipedia.org/wiki/Supersampling

    Supersampling aka FSAA is the oldest form of AA. If they're *actually* supersampling (and not just using the term for marketing), they're rendering at a higher resolution, then downsampling to a lower res. This method uses the extra samples for the color calculation. With various types of AA there can be a performance benefit vs FSAA, but not a performance benefit vs running without any AA in the first place.

    Again, this assumes they're actually performing some form of supersampling and not just marketing it as such to be extremely obnoxious.
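The classic supersampling being described – shade at a higher resolution, then average blocks of samples down to the output resolution – can be sketched in a few lines (an illustrative toy, not any vendor's implementation):

```python
def downsample_2x(samples):
    """Box-filter a 2x-supersampled grid down to the target resolution:
    each output pixel is the average of its 2x2 block of samples,
    which is what smooths jagged edges in FSAA-style supersampling."""
    h, w = len(samples) // 2, len(samples[0]) // 2
    return [[(samples[2 * y][2 * x] + samples[2 * y][2 * x + 1]
              + samples[2 * y + 1][2 * x] + samples[2 * y + 1][2 * x + 1]) / 4.0
             for x in range(w)]
            for y in range(h)]

# A hard black/white vertical edge "rendered" at 2x resolution:
hi_res = [[0.0, 1.0, 1.0, 1.0],
          [0.0, 1.0, 1.0, 1.0]]
lo_res = downsample_2x(hi_res)  # 1x2 output; the edge pixel blends to 0.5
```

Note the cost structure: four shaded samples per output pixel, which is why this approach carries a performance penalty rather than a benefit relative to no AA at all.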
  • Yojimbo - Thursday, August 23, 2018 - link

    Yeah maybe it should be called deep learning upsampling. But let's try to make a case for calling it supersampling.

    One could argue that what you described is supersampling anti-aliasing, and not just supersampling. Supersampling would mean choosing more samples than exist in the render output target, taken from some other space. The space does not necessarily have to be a greater resolution render, that's just one possibility. In this case the space is an interpolation produced by a neural network operating on the initial render output. So they get these super samples and then, since they are not anti-aliasing, they don't downsample. Instead, they simply render at the new supersampled resolution. Deep learning supersampling?
  • Yojimbo - Thursday, August 23, 2018 - link

    One minor clarification. I should say ..."taken from some other higher resolution space", because otherwise the term "super" isn't justified. But the neural network interpolation is such a higher resolution space that was made without rendering at a higher resolution.
  • Yojimbo - Thursday, August 23, 2018 - link

    You know what, it seems they are comparing this to Temporal Anti-Aliasing. I guess they are doing temporal super sampling in an interpolated space created by deep learning inference on the rendered output. But I dunno, I'm just guessing here. I don't really know how temporal anti-aliasing works. Maybe someone with knowledge of anti-aliasing and temporal anti-aliasing can help.
  • Yojimbo - Wednesday, August 22, 2018 - link

    Prior to DLSS I used to render at lower quality and then manually super-sample through interactive yes-no queries. But it was hard to do and maintain good game play. I bought a 6-button mouse just for that purpose, though.
  • Yojimbo - Wednesday, August 22, 2018 - link

    The 1080 Ti came out over 9 months later, so it's not a fair comparison. It's definitely not an "apples-to-apples comparison", unlike what you claim.

    The GTX 1080 launched at $699 for the founders edition. The RTX 2080 is going to launch at $799 for the founders edition. So it's more expensive but not nearly as much as you're making it sound to be.
  • Kvaern1 - Thursday, August 23, 2018 - link

    "The GTX 1080 launched at $699 for the founders edition. The RTX 2080 is going to launch at $799 for the founders edition. So it's more expensive but not nearly as much as you're making it sound to be."

    Furthermore, $699 was the MSRP for the 1080 FE, but the retail price was considerably higher for months after the launch.
  • RSAUser - Tuesday, August 28, 2018 - link

    That's not even the problem.
    A 1080 in SLI is not even close to the performance of two cards.
    Many of these games (especially the more extreme results) use HDR, which hammers the 1080 by around 20% for some reason, and run at 4K, where the 1080 is known to lack bandwidth.
    These are heavily cherry-picked results; the 2080 is probably only 15-20% faster in truth, matching or slightly beating the 1080 Ti depending on the workload, while carrying a far higher price.
  • Samus - Thursday, August 23, 2018 - link

    I think what happened is the last generation (really the current generation of cards) launched at a time when the mining thing went ape shit and prices were all inflated. So now things are returning to semi-normal; that's why the 2080 is priced at the MSRP of the 1080 Ti.
