Tuesday, October 16th 2018

NVIDIA Releases Comparison Benchmarks for DLSS-Accelerated 4K Rendering

NVIDIA released comparison benchmarks for its new AI-accelerated DLSS (Deep Learning Super Sampling) technology, one of the new Turing architecture's main claims to fame. Using the Infiltrator benchmark with its stunning real-time graphics, NVIDIA showcased the performance benefits of DLSS-improved 4K rendering over the usual 4K rendering with TAA (Temporal Anti-Aliasing). On a testbed with a Core i9-7900X 3.3 GHz CPU, 16 GB of Corsair DDR4 memory, Windows 10 (v1803) 64-bit, and version 416.25 of the NVIDIA drivers, the company showed the tremendous performance improvements that can be achieved by pairing Turing's architectural strengths with DLSS's prowess in putting Tensor cores to work on more typical graphics processing workloads.

The results speak for themselves: with DLSS at 4K resolution, the upcoming NVIDIA RTX 2070 convincingly beats its previous-gen counterpart, doubling its performance. Under these particular conditions, the new king of the hill, the RTX 2080 Ti, convincingly beats the previous generation's halo product, the Titan Xp, with a 41% performance lead - and so does the new RTX 2070, which sells at half the asking price of the original Titan Xp.
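The arithmetic behind such claims is simple relative-performance math; here is a minimal sketch, with hypothetical frame rates standing in for the real measurements:

```python
# Back-of-the-envelope math behind relative-performance claims.
# The FPS values below are hypothetical placeholders, not NVIDIA's data.

def percent_lead(new_fps: float, old_fps: float) -> float:
    """Percentage by which new_fps leads old_fps."""
    return (new_fps / old_fps - 1.0) * 100.0

# A card averaging 56.4 FPS vs. one averaging 40 FPS works out
# to a 41% lead, the kind of figure quoted above.
print(f"{percent_lead(56.4, 40.0):.0f}% lead")   # -> 41% lead

# "Doubling performance" is simply a 100% lead.
print(f"{percent_lead(80.0, 40.0):.0f}% lead")   # -> 100% lead
```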
Source: NVIDIA Blogs

43 Comments on NVIDIA Releases Comparison Benchmarks for DLSS-Accelerated 4K Rendering

#1
bug
Wth? You want a comparison, you compare the same card with DLSS on and off.
#2
Gungar
bug: Wth? You want a comparison, you compare the same card with DLSS on and off.
You do that when you are an honest business.
#3
SIGSEGV
Surely, I will stay on the 24.21.13.9731 driver after all of Huang's talk about benchmarks on his new cards. I ain't tempted to update the driver and don't give a damn about DLSS.
#4
SetsunaFZero
SIGSEGV: Surely, I will stay on the 24.21.13.9731 driver after all of Huang's talk about benchmarks on his new cards. I ain't tempted to update the driver and don't give a damn about DLSS.
All NV cards perform well in common benchmarks like Heaven, 3DMark etc., even after some years. Sneaky NV changed render paths for some games to perform worse, and boom, the new GFX gen performs much better. They also stop optimizing for older generations, so naturally the new gen will perform better.
#5
bug
SetsunaFZero: All NV cards perform well in common benchmarks like Heaven, 3DMark etc., even after some years. Sneaky NV changed render paths for some games to perform worse, and boom, the new GFX gen performs much better. They also stop optimizing for older generations, so naturally the new gen will perform better.
Yes, that's why the gap between AMD and Nvidia is widening with each generation: because of Nvidia's driver tricks :rolleyes:
#6
Camm
Wouldn't it be more apt to compare that with previous-gen cards running a lower rendering resolution with TAA on, since that's what DLSS is effectively doing?
#7
bug
Camm: Wouldn't it be more apt to compare that with previous-gen cards running a lower rendering resolution with TAA on, since that's what DLSS is effectively doing?
Don't worry too much about it, this will get benchmarked to death when the time comes. I was just saying, whatever Nvidia did is arguably the least relevant test setup of them all.
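A rough sketch of the pixel budgets behind Camm's point, assuming (as is commonly cited, though unconfirmed for this benchmark) that DLSS renders internally near 1440p before reconstructing a 4K output:

```python
# Pixel budgets for native 4K vs. an assumed ~1440p internal render.
# The 1440p figure is an assumption for illustration, not a number
# NVIDIA confirmed for these benchmarks.

def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)    # 8,294,400 px shaded per frame
internal  = pixels(2560, 1440)    # 3,686,400 px if DLSS renders near 1440p

print(f"Native 4K:        {native_4k:,} px")
print(f"Assumed internal: {internal:,} px ({internal / native_4k:.0%} of native)")
# Shading ~44% of the pixels is where most of the DLSS speedup would come
# from, which is exactly why Camm's lower-res + TAA baseline is relevant.
```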
#8
nemesis.ie
@Camm Possibly, but it really depends on what the final result looks like.

What would be interesting is to have a quality comparison of a capture of a very detailed scene - maybe something like a UHD or FUHD test pattern, one of the ones with lines and circles.

Of course, if DLSS were run on such patterns, it might just pull a full-quality copy out of the air and present it.

I have a bit of an issue with this "pre-trained" stuff. It means apps from e.g. indie devs may never see it - or will nvidia train anyone's app for free?
#9
metalslaw
Will DLSS only apply to new games? Or will they go over a catalogue of older DX12 games? (And DX11?)
#10
HTC
metalslaw: Will DLSS only apply to new games? Or will they go over a catalogue of older DX12 games? (And DX11?)
From my quite limited knowledge of this technology, only games specifically coded for it will be able to use it, which is why you're not seeing anyone using DLSS with current games.

DLSS requires many man-hours for game developers to put into their games, and because the cards that can use it (2070 and up: "jury's still out" on 2060 and below) are so expensive, only enthusiasts are expected to buy them. Although many here @ TPU are enthusiasts, the vast majority of gamers are not, which means a lot of effort for only a small percentage of the audience.

Personally, I seriously doubt ray tracing will get traction with this generation of GPUs. Don't get me wrong: it's undoubtedly the future ... but that's still in the future ... how close that future is remains to be seen.
#11
bug
HTC: From my quite limited knowledge of this technology, only games specifically coded for it will be able to use it, which is why you're not seeing anyone using DLSS with current games.

DLSS requires many man-hours for game developers to put into their games, and because the cards that can use it (2070 and up: "jury's still out" on 2060 and below) are so expensive, only enthusiasts are expected to buy them. Although many here @ TPU are enthusiasts, the vast majority of gamers are not, which means a lot of effort for only a small percentage of the audience.

Personally, I seriously doubt ray tracing will get traction with this generation of GPUs. Don't get me wrong: it's undoubtedly the future ... but that's still in the future ... how close that future is remains to be seen.
I don't think DLSS is that hard to implement. But it needs per-title training to know how to optimize things. Something like a game profile. So far, it is unclear whether Nvidia will take up this task and include these in their drivers or let each developer do their thing.
And yes, I agree everything won't turn to ray tracing all of a sudden. But this generation is the first that enables developers to at least start assessing ray tracing. Whether this stands on its own or goes the way of the dodo (like Mantle before it), making way for a better implementation, doesn't concern me very much.
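Bug's "game profile" framing can be pictured as a simple driver-side lookup; everything in this sketch (names, structure, fallback) is hypothetical, not NVIDIA's actual mechanism:

```python
# Hypothetical illustration of per-title DLSS "profiles" as described
# above. None of these names or structures reflect NVIDIA's real driver.

from dataclasses import dataclass
from typing import Optional

@dataclass
class DLSSProfile:
    title: str
    network_blob: bytes   # per-title trained network weights

# Imagine the driver bundling profiles only for titles that were trained.
PROFILES = {
    "Infiltrator": DLSSProfile("Infiltrator", network_blob=b"..."),
}

def lookup_profile(title: str) -> Optional[DLSSProfile]:
    """Return the trained profile for a title, or None if the title
    was never put through a training pipeline."""
    return PROFILES.get(title)

if lookup_profile("Some Indie Game") is None:
    # nemesis.ie's concern above: untrained titles simply can't
    # enable DLSS and would fall back to conventional AA.
    print("No DLSS profile; falling back to TAA.")
```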
#12
HTC
bug: I don't think DLSS is that hard to implement. But it needs per-title training to know how to optimize things. Something like a game profile. So far, it is unclear whether Nvidia will take up this task and include these in their drivers or let each developer do their thing.
And yes, I agree everything won't turn to ray tracing all of a sudden. But this generation is the first that enables developers to at least start assessing ray tracing. Whether this stands on its own or goes the way of the dodo (like Mantle before it), making way for a better implementation, doesn't concern me very much.
From my understanding, I'd say no, because this needs to be a game-developer thing rather than a driver thing; otherwise yes: you'd be correct.

Agreed. I'd say nVidia would have a far better chance of making this succeed if they enabled ray tracing on all 2000-series cards. I don't see that happening because even the 2080 Ti struggles with it currently, meaning 2060 and below will have no chance.
#13
bug
HTC: From my understanding, I'd say no, because this needs to be a game-developer thing rather than a driver thing; otherwise yes: you'd be correct.
The current game profiles also look like a game-developer thing. Yet it's Nvidia who does them. Or at least they package the results after working together with the developer. That's why I think this could go either way.
HTC: Agreed. I'd say nVidia would have a far better chance of making this succeed if they enabled ray tracing on all 2000-series cards. I don't see that happening because even the 2080 Ti struggles with it currently, meaning 2060 and below will have no chance.
For exploratory testing and proofs of concept, not much power is required. Not having to wait days or weeks to see the results is already progress enough. Though in the end, if lower-tier cards indeed don't get the RT hardware, I expect that to be more of a cost-control decision than an enablement decision.
#14
coolernoob
DLSS = same performance and IQ penalty as a simple resolution decrease, and that even on NVIDIA-picked benchmarks. Source
#15
HTC
bug: The current game profiles also look like a game-developer thing. Yet it's Nvidia who does them. Or at least they package the results after working together with the developer. That's why I think this could go either way.

For exploratory testing and proofs of concept, not much power is required. Not having to wait days or weeks to see the results is already progress enough. Though in the end, if lower-tier cards indeed don't get the RT hardware, I expect that to be more of a cost-control decision than an enablement decision.
This would make all the difference in the size of the target "audience" and could totally make most, if not all, developers "come on board" with it, which is why it's so important, IMO.

The way I see it, nVidia is too ambitious and the difference between RTX On / Off is far too great, thus making a very serious dent in performance that not even the 2080 Ti can really absorb. I see this as a move to leave AMD's cards "in the dust" more than a drive to give better quality @ higher resolutions. If the difference were much smaller, even lower-tier 2000-series cards could enable it, but AMD's cards potentially could too. To "move in for the kill", nVidia is trying to make sure AMD has no chance, which is why not even the 2080 Ti is enough.
#16
R0H1T
bug: Don't worry too much about it, this will get benchmarked to death when the time comes. I was just saying, whatever Nvidia did is arguably the least relevant test setup of them all.
PT (& Intel) say hola :)
#17
stimpy88
nGreedia are in the same dark hole that Intel like to live in...

I would not believe a word either of them say about anything.
#18
jmcosta
SetsunaFZero: All NV cards perform well in common benchmarks like Heaven, 3DMark etc., even after some years. Sneaky NV changed render paths for some games to perform worse, and boom, the new GFX gen performs much better. They also stop optimizing for older generations, so naturally the new gen will perform better.
That is quite normal; you can see that with AMD as well. Those old cards simply weren't built for modern render technologies, and when you compare driver to driver on the same old cards, you will see a small increase or very similar numbers.
They never roll out a bad driver.

When a new game launches (with or without Nvidia tech), older GPUs don't get the full optimization treatment the way that current ones would.
We have seen multiple scenarios where old GPUs with more raw power get beaten by newer ones with less.
#19
M2B
HTC: From my quite limited knowledge of this technology, only games specifically coded for it will be able to use it, which is why you're not seeing anyone using DLSS with current games.

DLSS requires many man-hours for game developers to put into their games, and because the cards that can use it (2070 and up: "jury's still out" on 2060 and below) are so expensive, only enthusiasts are expected to buy them. Although many here @ TPU are enthusiasts, the vast majority of gamers are not, which means a lot of effort for only a small percentage of the audience.

Personally, I seriously doubt ray tracing will get traction with this generation of GPUs. Don't get me wrong: it's undoubtedly the future ... but that's still in the future ... how close that future is remains to be seen.
DLSS needs an engine with temporal anti-aliasing support to be implemented.
Temporal Anti-Aliasing is the most popular AA method nowadays.
DLSS implementation is relatively easy and doesn't need that much effort.
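M2B's point about TAA is that a TAA-capable engine already produces the per-frame inputs a temporal upscaler needs. A hypothetical sketch of that shared plumbing (this interface is illustrative, not NVIDIA's SDK):

```python
# Sketch: TAA and a DLSS-style resolve consume the same per-frame inputs,
# which is why a TAA-capable engine is the natural host for DLSS.
# This interface is hypothetical, not NVIDIA's actual SDK.

from dataclasses import dataclass

@dataclass
class FrameInputs:
    jittered_color: object  # current frame rendered with sub-pixel jitter
    motion_vectors: object  # per-pixel motion, already produced for TAA
    depth: object           # depth buffer, used for disocclusion handling

def taa_resolve(inputs: FrameInputs) -> str:
    return "anti-aliased frame (stand-in)"

def dlss_resolve(inputs: FrameInputs) -> str:
    return "reconstructed, upscaled frame (stand-in)"

def resolve_frame(inputs: FrameInputs, use_dlss: bool) -> str:
    # Both paths plug into the same pipeline stage with the same inputs;
    # swapping one for the other is the "relatively easy" part M2B cites.
    return dlss_resolve(inputs) if use_dlss else taa_resolve(inputs)
```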
#20
mobiuus
Geez, DLSS - new-gen TAA - new-gen blurriness... Why didn't they compare with, let's say, 4x MSAA...? Or better, come to think of it, why didn't they create a new and true anti-aliasing like supersampling but with better performance??
Who needs washed-out, downscaled graphics, and that with ray tracing?? lol, this generation is a flawed and foolish investment
#21
bug
mobiuus: Geez, DLSS - new-gen TAA - new-gen blurriness... Why didn't they compare with, let's say, 4x MSAA...? Or better, come to think of it, why didn't they create a new and true anti-aliasing like supersampling but with better performance??
Who needs washed-out, downscaled graphics, and that with ray tracing?? lol, this generation is a flawed and foolish investment
That's exactly what they did.
If you want to fault them, a better question would be: why bother with AA when your card can push 4k?
#22
M2B
mobiuus: Geez, DLSS - new-gen TAA - new-gen blurriness... Why didn't they compare with, let's say, 4x MSAA...? Or better, come to think of it, why didn't they create a new and true anti-aliasing like supersampling but with better performance??
Who needs washed-out, downscaled graphics, and that with ray tracing?? lol, this generation is a flawed and foolish investment
The fuck are you talking about?
DLSS can only be implemented in games with TAA support; how the heck can you compare DLSS with MSAA when your specific game/application doesn't even support it?
You can't make a proper apples-to-apples comparison between DLSS and MSAA or any other AA method when your specific game/application doesn't support both methods at the same time.
#23
mobiuus
M2B: The fuck are you talking about?
DLSS can only be implemented in games with TAA support; how the heck can you compare DLSS with MSAA when your specific game/application doesn't even support it?
You can't make a proper apples-to-apples comparison between DLSS and MSAA or any other AA method when your specific game/application doesn't support both methods at the same time.
Why can't I compare 4x MSAA when pretty much all games and all GPU generations support that anti-aliasing??
#24
M2B
bug: That's exactly what they did.
If you want to fault them, a better question would be: why bother with AA when your card can push 4k?
Anti-Aliasing is needed even at 4K.
#25
HTC
M2B: DLSS needs an engine with temporal anti-aliasing support to be implemented.
Temporal Anti-Aliasing is the most popular AA method nowadays.
DLSS implementation is relatively easy and doesn't need that much effort.
If it were, current games would already be benefiting from it, and yet they're not.

Heck: Shadow of the Tomb Raider doesn't have it, and that game was one of the few showcased with it enabled when Turing cards were released, during that presentation by nVidia's CEO. How long ago was that, exactly? If it were easy, as you claim, it should already be enabled, no?