Wednesday, February 13th 2019

NVIDIA DLSS and its Surprising Resolution Limitations

TechPowerUp readers today were greeted with our PC port analysis of Metro Exodus, which also contained a dedicated section on NVIDIA RTX and DLSS technologies. The former brings real-time ray tracing support to an already graphically-intensive game, and the latter attempts to assuage the performance hit via NVIDIA's new proprietary alternative to more traditional anti-aliasing. There was definitely a bump in performance with DLSS enabled; however, we also noted some head-scratching limitations on when and how it can even be enabled, depending on the in-game resolution and the RTX GPU employed. We then set about testing DLSS in Battlefield V, where the feature also became available today, and it was then that we noticed a trend.

Take Metro Exodus first, with the relevant notes in the first image below. DLSS can only be turned on for specific combinations of resolution and RTX GPU, from the RTX 2060 to the RTX 2080 Ti, and NVIDIA appears to be limiting users to a class-based system. Users with the RTX 2060, for example, can't use DLSS at 4K at all and, more egregiously, owners of the RTX 2080 and 2080 Ti cannot enjoy RTX and DLSS simultaneously at the most popular in-game resolution of 1920x1080, which would be useful for reaching high frame rates on 144 Hz monitors. Battlefield V has a similar, yet even more restrictive, system wherein the gaming flagship RTX 2080 Ti cannot be used with RTX and DLSS at even 1440p, as seen in the second image below. This brought us back to Final Fantasy XV's own DLSS implementation last year, which was all or nothing at 4K resolution only. What could have prompted NVIDIA to do this? We speculate further past the break.
We contacted NVIDIA about this to get word straight from the green horse's mouth, hoping to be able to provide a satisfactory answer to you. Representatives for the company told us that DLSS is most effective when the GPU is at maximum workload, such that if a GPU is not being challenged enough, DLSS is not made available. Accordingly, this implementation encourages users to turn on RTX first, thereby increasing the GPU load, and then enable DLSS. It would thus be fair to extrapolate why the RTX 2080 Ti does not get to enjoy DLSS at lower resolutions, where it is perhaps not being taxed as hard.

We do not buy this explanation, however. Turning off VSync alone results in uncapped frame rates, which allows for a GPU load nearing 100%. NVIDIA has been championing high refresh rate displays for years now, and our own results show that we need the RTX 2080 and RTX 2080 Ti to get close to 144 FPS at 1080p, for that sweet 120+ Hz refresh rate display action. Why not let the end user decide what takes priority here, especially if DLSS aims to improve graphical fidelity as well? It was at this point that we went back to the NVIDIA whitepaper on their Turing microarchitecture, briefly discussed here for those interested.

DLSS, as it turns out, operates on a frame-by-frame basis. A Turing microarchitecture-based GPU has shader cores for rasterized gaming, tensor cores for large-scale compute/AI loads, and RT cores for real-time ray tracing. The load relevant to DLSS falls predominantly on the tensor cores, so a higher FPS in a game effectively means a higher load on them. The different GPUs in the NVIDIA GeForce RTX family have different numbers of tensor cores, which limits how many frames (or pixels) can be processed in a unit of time (say, one second). This variability in the number of tensor cores is likely the major reason for this implementation of DLSS. With their approach, it appears that NVIDIA wants to make sure that the tensor cores never become the bottleneck during gaming.
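To put rough numbers behind that reasoning, here is a minimal back-of-envelope sketch in Python of the idea that DLSS adds a roughly fixed per-frame cost on the tensor cores, and is only worth enabling while that cost still fits comfortably inside the frame-time budget. Every cost figure, GPU/resolution pairing, and the 25% threshold below is an assumption for illustration, not an NVIDIA specification.

```python
# Back-of-envelope sketch: treat DLSS as a fixed per-frame tensor-core cost
# and ask whether it still fits the frame-time budget at a given FPS target.
# Every figure below is an illustrative assumption, NOT an NVIDIA specification.

def dlss_fits(target_fps: float, dlss_cost_ms: float, max_share: float = 0.25) -> bool:
    """True if the per-frame DLSS cost stays below `max_share` of the
    frame-time budget, i.e. the tensor cores are unlikely to become the
    bottleneck at this frame rate."""
    frame_budget_ms = 1000.0 / target_fps
    return dlss_cost_ms <= max_share * frame_budget_ms

# Hypothetical per-frame costs: fewer tensor cores -> higher cost,
# higher resolution -> more pixels to reconstruct -> higher cost.
scenarios = [
    ("RTX 2080 Ti @ 1080p", 144, 2.5),  # short frame budget at a 144 FPS target
    ("RTX 2080 Ti @ 4K",     60, 3.5),
    ("RTX 2060   @ 4K",      60, 7.0),  # fewer tensor cores, four times the pixels
]

for name, fps, cost in scenarios:
    verdict = "fits" if dlss_fits(fps, cost) else "bottleneck"
    print(f"{name}: {cost:.1f} ms of tensor work in a {1000.0 / fps:.1f} ms frame -> {verdict}")
```

Under these assumed numbers, the combinations NVIDIA blocks (the 2080 Ti at 1080p, the 2060 at 4K in Metro Exodus) are exactly the ones where the fixed tensor-core cost would eat too large a share of the frame.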

Another possible reason comes via Futuremark's 3DMark Port Royal benchmark for ray tracing. It recently added support for DLSS, and it is a standard-bearer for how RTX and DLSS can work in conjunction to produce excellent results. Port Royal, however, is an extremely scripted benchmark using pre-determined scenes to make good use of the machine learning capabilities behind DLSS. Perhaps this initial round of DLSS in games follows a similar mechanism, wherein the neural network is trained on specific scenes at specific resolutions, rather than in a resolution-independent way.

Regardless of the underlying cause, all in-game DLSS implementations so far have come with some fine print attached, which sours what is ultimately a free bonus. DLSS appears to work well when it can be enabled, providing at least an additional dial for users to play with as they fine-tune their desired balance of visual fidelity and FPS.

102 Comments on NVIDIA DLSS and its Surprising Resolution Limitations

#1
npp
The different GPUs in the NVIDIA GeForce RTX family have a different number of tensor cores, and thus limit how many frames/pixels can be processed in a unit time (say, one second). This variability in the number of tensor cores is likely the major reason for said implementation of DLSS. With their approach it looks like NVIDIA wants to make sure that the tensor cores never become the bottleneck during gaming.
A lot of words to basically say: if the frame rate is too high, the tensor cores can't keep up.

This explains why high-end cards can't use DLSS at lower resolutions.

Lower-end cards, on the other hand, have so few tensor cores that they can't keep up with high resolutions to begin with. This doesn't explain why the 2060 can use DLSS at 4K in Battlefield, though.

In any case, RTX takes care to keep FPS down so the tensor cores can keep up.

Kind of makes sense to me. Introducing new paradigms and having all the necessary resources on-die to support them don't come in the same generation.
#2
Crap Daddy
Ray Tracy and the Deep Learning S**t. Whoever came up with the idea, if it wasn't Jensen himself, needs to be fired. I hope people who bought the RTX cards did it because they are faster than what they had and not because of the mumbo dxrdlss jumbo. I know I did.
#3
Steevo
So, they have a known bottleneck of tensor cores at high frame rates and the latency cannot be masked.

That was quite a load of fertilizer about needing more GPU load; more likely, they need frame times to be long enough that end users don't notice.
#4
Ubersonic
Lol, that sucks. I'm glad I'm not interested in DLSS anyway because of the blurriness, but it's laughable that Nvidia won't let RTX 2080 owners use DLSS at 1080p xD
#5
mouacyk
They also won't support FreeSync over HDMI.
#7
OneMoar
There is Always Moar
mouacyk: They also won't support FreeSync over HDMI.
Nobody from NVIDIA has ever stated that. Check your facts.


Alt theory: DLSS requires a certain level of training, and training is resolution-dependent.
#8
EarthDog
Is it a TC limitation or just implementing it where it's actually needed? Will those resolutions be supported later?

How can one test that it's actually a TC limitation?
#9
TheoneandonlyMrK
Just another demonstration of "The way it's meant to be played", or so they say.
#10
15th Warlock
Interesting theory; it would explain the arbitrary restrictions that really don't make much sense...

It's still a developing technology, only time will tell if this is a permanent limitation or something that will be addressed in future drivers and/or patches for supporting games.

Either way, I'm very happy with the results. I'm not a competitive gamer, so I don't really care for frame rates beyond 100 FPS; I'm just enjoying the fact that I can now really play BFV at 4K with everything maxed out and a smooth frame rate.

I listened to the advice of many people on the forum and now I make sure to leave my PC on from time to time so all games and drivers stay up to date, so to me it was a really nice bonus to be able to enable this feature before I came to work today, the cherry on top you could say lol.

Both DLSS and RTX are still in their infancy; I have a feeling competitive players would disable either function to get the max performance out of any given game.
#11
Steevo
EarthDog: Is it a TC limitation or just implementing it where it's actually needed? Will those resolutions be supported later?

How can one test that it's actually a TC limitation?
Logic. Every step through a pipe takes X ns, and adding more steps = more ns. Since the tensor core logic requires primitives for both of these functions, and the tensor cores sit outside the main graphics pipeline, they incur latency. The primitive lookup for RTX is probably the basis for DLSS, since it can identify sharp (acute) angles relative to the player with a simple filter and sort, then act on those with another algorithm to smooth them, but each operation takes nanoseconds, and that increases frame times, which directly reduces FPS.
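To put rough, purely illustrative numbers on that frame-time arithmetic (the 2 ms overhead and the base frame rates below are assumptions, not measurements), a fixed per-frame cost that cannot be hidden costs proportionally more FPS when frame times are already short:

```python
# Illustrative only: a fixed per-frame latency hit, added serially (i.e. not
# masked by overlapping work), shrinks FPS far more at high frame rates than
# at low ones. The 2 ms overhead and base frame rates are assumed figures.

def fps_after_overhead(base_fps: float, overhead_ms: float) -> float:
    """Frame rate once a fixed per-frame cost (in milliseconds) is added."""
    return 1000.0 / (1000.0 / base_fps + overhead_ms)

for base_fps in (144.0, 60.0):  # e.g. a 1080p vs a 4K starting point
    new_fps = fps_after_overhead(base_fps, overhead_ms=2.0)
    loss_pct = 100.0 * (1.0 - new_fps / base_fps)
    print(f"{base_fps:.0f} FPS + 2 ms/frame -> {new_fps:.0f} FPS ({loss_pct:.0f}% lower)")
```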
#12
SystemMechanic
npp: A lot of words to basically say: if the frame rate is too high, the tensor cores can't keep up.

This explains why high-end cards can't use DLSS at lower resolutions.

Lower-end cards, on the other hand, have so few tensor cores that they can't keep up with high resolutions to begin with. This doesn't explain why the 2060 can use DLSS at 4K in Battlefield, though.

In any case, RTX takes care to keep FPS down so the tensor cores can keep up.

Kind of makes sense to me. Introducing new paradigms and having all the necessary resources on-die to support them don't come in the same generation.
So in short, DLSS can't do 4K 120 FPS? Such a waste of 530 tensor cores when even having half of them as raster cores would give us so much raw performance, lol.
#13
15th Warlock
SystemMechanic: So in short, DLSS can't do 4K 120 FPS? Such a waste of 530 tensor cores when even having half of them as raster cores would give us so much raw performance, lol.
No, it's 1080p that's the issue, not 4K. I don't think any card can push 120 FPS at 4K and ultra settings in modern games anyway, so why is that such a big issue to you? :confused:
#14
SystemMechanic
15th Warlock: No, it's 1080p that's the issue, not 4K. I don't think any card can push 120 FPS at 4K and ultra settings in modern games anyway, so why is that such a big issue to you? :confused:
I meant 4K 120 FPS with DLSS on and RT off.
1440p with DLSS would be nice too... so we can hit that 200 FPS mark.
#15
Tsukiyomi91
DLSS = Deep Learning Super Sampling. The term "deep learning" is a giveaway that this is AI-trained; of course it won't work if you don't let it "learn". Also, this is a very new tech & it takes time; it's not something you can use from the get-go. If you don't like it, keep it to yourselves. You can always opt for other solutions.
#16
the54thvoid
Super Intoxicated Moderator
What about other, non-standard resolutions like 2560x1600 or 1920x1200? How are these affected?
#17
Anymal
Tsukiyomi91: DLSS = Deep Learning Super Sampling. The term "deep learning" is a giveaway that this is AI-trained; of course it won't work if you don't let it "learn". Also, this is a very new tech & it takes time; it's not something you can use from the get-go. If you don't like it, keep it to yourselves. You can always opt for other solutions.
Exactly, NVIDIA is pushing tech further; they can do this now since AMD is 2 years behind and has only 30% market share. The next gen of GeForces on 7 nm will bring the performance we desire, and the tech will advance even further, so we will again desire more. AMD and their consoles are stagnant.
#18
bug
I wonder why the restriction for the 2060 at 4K? DLSS is just the thing that could push some titles into playable territory at that resolution.
#19
Prima.Vera
the54thvoid: What about other, non-standard resolutions like 2560x1600 or 1920x1200? How are these affected?
or 3440x1440 ??
#20
Dexiefy
Wow... I did not care one bit about RTX. It will be cool once a mainstream card can provide 60 FPS with RTX on at 1080p/1440p, but until then it's a gimmick. DLSS, however, had my attention up until this point.
Thought at least one of ngreedia's new techs was worthwhile; seems I was wrong.
#21
Rahnak
Wow, talk about a marketing sham. If I had bought an RTX card based on Nvidia's proposition that DLSS would give me more frames I would be pretty pissed right now. "It'll give you more frames.. sorta, maybe." Jeez.
#22
Breit
the54thvoid: What about other, non-standard resolutions like 2560x1600 or 1920x1200? How are these affected?
I'd bet these won't work at all, because these are resolutions mainly used for professional displays rather than gaming displays and Nvidia doesn't care anyways.
#23
Tsukiyomi91
1080p/1440p with RTX can already yield 60 FPS or higher with optimizations & patches. What more do you people want? Can't we just enjoy what we have & leave DLSS alone until it too has optimizations added?
#24
bug
Tsukiyomi91: 1080p/1440p with RTX can already yield 60 FPS or higher with optimizations & patches. What more do you people want? Can't we just enjoy what we have & leave DLSS alone until it too has optimizations added?
That would mean admitting NVIDIA has yet another working feature that AMD is currently missing. So no.
Is it perfect? Is it what we'll be using in 5 years? Hell no. It's just the first iteration of a new tech. Support it if you want, ignore it if you think it's too expensive. Whining about it and repeatedly pointing out flaws is childish, though.
#25
B-Real
Now the milked RTX owners need to do the maths on which monitors they need to buy if they want to use DLSS. :D