Wednesday, February 13th 2019

NVIDIA DLSS and its Surprising Resolution Limitations

TechPowerUp readers were today greeted with our PC port analysis of Metro Exodus, which also contained a dedicated section on NVIDIA RTX and DLSS technologies. The former brings real-time ray tracing support to an already graphically intensive game, while the latter attempts to assuage the performance hit via NVIDIA's new proprietary alternative to more traditional anti-aliasing. There was definitely a bump in performance with DLSS enabled; however, we also noted some head-scratching limitations on when and how it can even be enabled, depending on the in-game resolution and the RTX GPU employed. We then set about testing DLSS on Battlefield V, where it also became available today, and it was then that we noticed a trend.

Take Metro Exodus first, with the relevant notes in the first image below. DLSS can only be turned on for specific combinations of RTX GPUs, ranging from the RTX 2060 to the RTX 2080 Ti, and in-game resolutions, with NVIDIA appearing to limit users to a class-based system. Users with the RTX 2060, for example, cannot use DLSS at 4K at all and, more egregiously, owners of the RTX 2080 and RTX 2080 Ti cannot enjoy RTX and DLSS simultaneously at the most popular in-game resolution of 1920x1080, which would be useful for reaching high frame rates on 144 Hz monitors. Battlefield V has a similar, yet even more divided, system wherein the gaming flagship RTX 2080 Ti cannot be used with RTX and DLSS at even 1440p, as seen in the second image below. This brought us back to Final Fantasy XV's own DLSS implementation last year, which was all-or-nothing at 4K resolution only. What could have prompted NVIDIA to do this? We speculate further past the break.
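
To make the class-based gating concrete, here is a minimal sketch of how such a per-GPU, per-resolution whitelist behaves. The entries and names below are our own illustration, built only from the combinations called out in this article; NVIDIA's actual per-game tables are more extensive and may differ.

# Illustrative sketch only: the class-based DLSS gating described above, expressed
# as a blocklist of (GPU, resolution) pairs. Entries reflect just the combinations
# reported in this article; the real per-game tables are larger and may differ.

METRO_EXODUS_DLSS_BLOCKED = {
    ("RTX 2060", "3840x2160"),     # RTX 2060 cannot enable DLSS at 4K
    ("RTX 2080", "1920x1080"),     # RTX 2080 cannot combine RTX + DLSS at 1080p
    ("RTX 2080 Ti", "1920x1080"),  # RTX 2080 Ti cannot combine RTX + DLSS at 1080p
}

BATTLEFIELD_V_DLSS_BLOCKED = {
    ("RTX 2080 Ti", "2560x1440"),  # flagship locked out of RTX + DLSS even at 1440p
}

def dlss_toggle_exposed(blocklist: set, gpu: str, resolution: str) -> bool:
    """Return True if the DLSS option would be selectable for this GPU/resolution."""
    return (gpu, resolution) not in blocklist

# The case called out above: an RTX 2080 Ti owner at 1080p in Metro Exodus.
print(dlss_toggle_exposed(METRO_EXODUS_DLSS_BLOCKED, "RTX 2080 Ti", "1920x1080"))  # False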
We contacted NVIDIA about this to get word straight from the green horse's mouth, hoping to be able to provide a satisfactory answer. Representatives for the company told us that DLSS is most effective when the GPU is at maximum workload, such that if a GPU is not being challenged enough, DLSS is not going to be made available. Accordingly, this implementation encourages users to turn on RTX first, thus increasing the GPU load, in order to then enable DLSS. It would thus be fair to extrapolate why the RTX 2080 Ti does not get to enjoy DLSS at lower resolutions, where perhaps it is not being taxed as hard.

We do not buy this explanation, however. Turning off VSync alone results in uncapped frame rates, which allow for a GPU load nearing 100%. NVIDIA has been championing high refresh rate displays for years now, and our own results show that we need the RTX 2080 and RTX 2080 Ti to get close to 144 FPS at 1080p, for that sweet 120+ Hz refresh rate display action. Why not let the end user decide what takes priority here, especially if DLSS aims to improve graphical fidelity as well? It was at this point that we went back to the NVIDIA whitepaper on the Turing microarchitecture, briefly discussed here for those interested.

DLSS, as it turns out, operates on a frame-by-frame basis. A Turing microarchitecture-based GPU has shader cores for gaming, tensor cores for large-scale compute/AI workloads, and RT cores for real-time ray tracing. The load relevant to DLSS falls predominantly on the tensor cores, so effectively, a higher FPS in a game means a higher load on the tensor cores. The different GPUs in the NVIDIA GeForce RTX family have different numbers of tensor cores, which limits how many frames/pixels can be processed in a unit of time (say, one second). This variability in tensor core counts is likely the major reason for this implementation of DLSS: it appears NVIDIA wants to make sure the tensor cores never become the bottleneck during gaming.
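
To make that budget argument concrete, here is a rough back-of-the-envelope sketch. The per-frame DLSS cost figure below is a made-up placeholder for illustration only, not a measured or NVIDIA-published number; the point is simply that DLSS work is paid once per rendered frame, so tensor-core load scales with FPS.

# Back-of-the-envelope sketch of the tensor-core budget argument above.
# ASSUMPTION: dlss_ms_per_frame is an illustrative placeholder, not a real figure.

def tensor_core_utilization(fps: float, dlss_ms_per_frame: float) -> float:
    """Fraction of each second the tensor cores spend on DLSS at a given frame rate."""
    return fps * dlss_ms_per_frame / 1000.0

# Suppose, hypothetically, that upscaling one frame costs 1.5 ms of tensor-core time
# on some card. At 60 FPS that is ~9% of each second; at 144 FPS it is ~22%; and the
# hard ceiling imposed by the tensor cores alone would be 1000 / 1.5 ≈ 667 FPS.
for fps in (60, 144):
    print(f"{fps} FPS -> {tensor_core_utilization(fps, 1.5):.0%} of tensor-core time")

# A card with fewer tensor cores would see a higher per-frame cost and thus a lower
# ceiling, which is consistent with NVIDIA wanting the tensor cores never to become
# the bottleneck at high frame rates.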

Another possible reason comes via Futuremark's 3DMark Port Royal ray tracing benchmark. It recently added support for DLSS, and it is a standard-bearer for how RTX and DLSS can work in conjunction to produce excellent results. Port Royal, however, is an extremely scripted benchmark that uses pre-determined scenes to make good use of the machine learning underpinning DLSS. Perhaps this initial round of DLSS in games follows a similar mechanism, wherein DLSS is trained for specific scenes at specific resolutions rather than in a resolution-independent way.

Regardless of the underlying cause, all in-game DLSS implementations so far have come with some fine print attached, which sours what is ultimately a free bonus. When it can be enabled, DLSS appears to work well, providing at least one more dial for users to play with to fine-tune their desired balance of visual quality and frame rate.

102 Comments on NVIDIA DLSS and its Surprising Resolution Limitations

#26
Anymal
Eh, pretentious nonsense.
#27
FordGT90Concept
"I go fast!1!11!1!"
@VSG are you able to determine if it is the driver imposing the limitation, the RTX API, or the game itself? If the game itself has all of this extra code baked in, that is very concerning. For example, what happens 20 years from now with new cards on old games? It breaks the many decades old paradigm of putting the options in the hands of the players. That doesn't sit right with me.
#28
bug
FordGT90Concept@VSG are you able to determine if it is the driver imposing the limitation, the RTX API, or the game itself? If the game itself has all of this extra code baked in, that is very concerning. For example, what happens 20 years from now with new cards on old games? It breaks the many decades old paradigm of putting the options in the hands of the players. That doesn't sit right with me.
Considering this is different between Metro and BFV, it's either in the game or in the game profile in the driver.
#29
MAXLD
such that if a GPU is not being challenged enough, DLSS is not going to be made available [...]
It would thus be fair to extrapolate why the RTX 2080 Ti does not get to enjoy DLSS at lower resolutions, where perhaps it is not being taxed as hard.
The concept of buying the most powerful card money can buy (@ $1200+) and finding out it can't run games with the proudly advertised proprietary premium flagship features (that even a 2060 can), because... it's just too powerful for that (?!)... is just mind-boggling...

Typical buyer (at least inside the mind of nVidia):
- "This new tech sounds awesome! No way I gonna miss out on this bandwagon. Just to be safe I'll buy a more expensive model, so I can enjoy this new stuff and not have to worry about performance hits, resolutions or whatever. Unlimited power!"
Yep, bamboozled.
#30
EarthDog
One thing I just thought of... FFXV supports DLSS across all resolutions and gets high fps with a 2080 Ti at 1080p... so... is it really an fps limitation?? Can't say I buy that, considering...
#31
Vayra86
EarthDogOne thing I just thought of... FFXV supports DLSS across all resolutions and gets high fps with a 2080 Ti at 1080p... so... is it really an fps limitation?? Can't say I buy that, considering...
No, it's a budget limitation. Remember, it's a deep learning process. Nvidia isn't putting whole farms on DLSS, just the bare minimum they need to get acceptable results. Why do you think it's a slow trickle of DLSS-enabled games? Why do you think it's completely inconsistent in its implementation? Nvidia is finding ways to push this feature without massively exceeding the budget available for it, while providing somewhat acceptable results, AND not showing too much of what is going on under the hood (@Steevo 's very plausible explanation of latency issues).

This is the core of my RTX-hate. Cost. DLSS has a very shaky business case, and RTX even more so - both for the industry and for Nvidia itself. It's a massive risk for everyone involved, and for all the work all this deep learning and brute forcing is supposed to 'save', other work is created that keeps Nvidia busy and developers struggling. And the net profit? I don't know... looking at those Metro screenshots I sure as hell don't prefer the RTX world they show, and the benefit of DLSS is far too situational.
#32
londiste
Vayra86looking at those Metro screenshots I sure as hell don't prefer the RTX world they show, and the benefit of DLSS is far too situational.
Remember that Metro was built with some other AO tech. They did not have RTX until rather late in the development process. What the lighting looks like at this point is not really a technical discussion but an artistic one - did level/art designers know what the level would look like with the RTX AO when they created it?

Physically and logically the screenshots with RTX do look more correct but that does not have a direct relevance to looking better or more playable. As you said, some of these screenshots - especially with closed off indoor areas - are too dark. If you think about it, it makes perfect sense. But it does not make the area good to go through in the game if you cannot see anything.
#33
Vayra86
londisteRemember that Metro was built with some other AO tech. They did not have RTX until rather late in the development process. How the lighting works at this point is not really a technical discussion but an art one - did level/art designers know what the level will look like with the RTX AO when they created it?

Physically and logically the screenshots with RTX do look more correct but that does not have a direct relevance to looking better or more playable. As you said, some of these screenshots - especially with closed off indoor areas - are too dark. If you think about it, it makes perfect sense. But it does not make the area good to go through in the game if you cannot see anything.
So what's next, an RT future with carefully placed 'soap opera' scenes you walk through, perfectly lit like a movie set ;) I can totally see it happening and being sold as 'realism' :D

The more I see of this technology and its effects, the more I become convinced it really serves a niche at the very best, if that. It's unusable in competitive gaming, it's not very practical in any dark (immersive?) gameplay setting, which happens to be the vast majority of them, and it doesn't play well with existing lighting systems either.

Dead end is dead...
#34
londiste
Vayra86So what's next, an RT future with carefully placed 'soap opera' scenes you walk through, perfectly lit like a movie set ;) I can totally see it happening and being sold as 'realism' :D
What do you mean? Game levels pretty much are a movie set. Set pieces are perfectly placed, lighting is tuned to look its best and the moment you meddle with something in there, it will look out of whack.
#35
Vayra86
londisteWhat do you mean? Game levels pretty much are a movie set. Set pieces are perfectly placed, lighting is tuned to look its best and the moment you meddle with something in there, it will look out of whack.
You said it. Lighting is tuned. RT depends not on tuning but on the actual light sources and their intensity. You can't tune much without tuning it everywhere in the same way.
#36
londiste
Vayra86You said it. Lighting is tuned. RT depends not on tuning but on the actual light sources and their intensity. You can't tune much without tuning it everywhere in the same way.
That is why I said it is a level design/art question. You simply add light sources where they are needed.
#37
bug
Vayra86You said it. Lighting is tuned. RT depends not on tuning but on the actual light sources and their intensity. You can't tune much without tuning it everywhere in the same way.
Actually, you can tune it. You can tune the number, intensity and placement of individual light sources.
There is one light source that can't have its placement changed and that's the Sun (global illumination). Incidentally, global illumination is exactly what Metro uses RTX for. As cards become more powerful, they'll be able to handle both global and point light sources. But even then you'll still be able to claim rasterization looks better to you because... well... there's no metric for that.
#38
Vayra86
bugActually, you can tune it. You can tune the number, intensity and placement of individual light sources.
There is one light source that can't have its placement changed and that's the Sun (global illumination). Incidentally, global illumination is exactly what Metro uses RTX for. As cards become more powerful, they'll be able to handle both global and point light sources. But even then you'll still be able to claim rasterization looks better to you because... well... there's no metric for that.
Seeing is believing - what I'm seeing today is nowhere even close to a benefit to image quality. Or do you think this is a leap forward? Either in BFV or Metro? And 'as cards become more powerful' - you're already looking at a >700mm² GPU for the current performance. Good luck with that, with one or two node shrinks to go.

Honestly, if it objectively looks better I'm converted, but so far, the only instance of that has been a tech demo. And only ONE of them at that, the rest wasn't impressive at all, using low poly models or simple shapes only to show it can be done in real time.
#39
londiste
Vayra86Seeing is believing - what I'm seeing today is nowhere even close to a benefit to image quality. Or do you think this is a leap forward? Either in BFV or Metro?
Have not seen Metro beyond screenshots. In BFV DXR reflections are a benefit to image quality. Situational and ill-placed in a multiplayer shooter due to performance hit but a definite benefit.
#40
EarthDog
So, if FF XV has the ability to run DLSS at lower res and high FPS... why not an AAA title like BF V? It makes no sense at all that this is a monetary or FPS limitation, considering what we see around us. As time goes on and we see more implementations (remember, the only titles scheduled to come out in 2018 were SOTR and BFV... we will see more as the year goes on, according to the lists published in 9-2018), we'll be able to see how this shakes out.

But yeah, interesting editorial...thought provoking. :)
#41
moproblems99
Vayra86Good luck with that, with one or two node shrinks to go.
I think next gen will tell us where RTX is heading. Someone, I think @notb, pointed out in a previous thread that we are averaging about 7 DX12 titles per year, and that number has been declining since 2016. Is it too complicated? I don't think so, as it was supposed to bring console-level development to PC. Is DX12 too expensive to develop for? Likely. Add in now having to reimagine your lighting environment and there will definitely be a learning curve. That's not to mention the cards aren't really powerful enough to take advantage of what RTX was meant for. Both of these should be mitigated with another year of working with it and maybe another gen of more powerful RTX hardware.

I keep harping on it, but all the virtues of DX12 we were told were coming to games have not materialized yet, so I don't hold my breath for these either. Of the new tech introduced, I think DLSS has the best chance of being successful as it gets tuned.
EarthDogSo, if FF XV has the ability to run DLSS at lower res and high FPS.. why not a AAA title like BF V?
I don't understand what the benefit would be, honestly. If you are already at high frames, what do you need it for? At this point, it lowers image quality to the point where you could tweak your settings to achieve the same effect.

Would it be simply to say that I run 'ultra' vs saying I run 'mostly high'?
#42
Vayra86
EarthDogSo, if FF XV has the ability to run DLSS at lower res and high FPS.. why not a AAA title like BF V? It makes no sense at all that this is a monetary or FPS limitation considering what we see around us. As time goes on and we see more implementations (remember, the only thing that was scheduled to come out in 2018 was SOTR and BFV... we will see more as the year goes on according to the lists published in 9-2018).

But yeah, interesting editorial...thought provoking. :)
FFXV was the first one to get the treatment, so they went all the way. BFV was an RTX 'launch' title, and Metro Exodus is neither of those and has the lowest degree of support thus far (ánd the longest dev cycle since RTX launch!). I don't see how this defeats the budget argument (yet - as you say, time will tell us more).

Another aspect I think we might overlook is the engine itself. I can totally understand that some engines are more rigid in their resolution scaling and support/options/features. Case in point with Metro Exodus: it can only run in fullscreen at specific resolutions locked to desktop res.
moproblems99I think next gen will tell us where RTX is heading. Someone pointed, I think @notb out in previous thread that we are averaging about 7 DX12 titles per year and it has been declining since 2016. Is it too complicated? I don't think so as it was supposed to being console level development to PC. Is DX12 too expensive to develop for? Likely. Add together now having to reimagine your lighting environment and there will definitely be a curve. That's not to mention the cards aren't really powerful enough to take advantage of what RTX was meant for. Both of these should be mitigated with another year of working with it and maybe another gen of more powerful RTX.

I keep harping on it but all the virtues of DX12 we were told were coming to have not happened yet so I don't hold my breath for these. Of the new tech introduced, I think DLSS has the best chance to be successful as it gets tuned.
Spot on. It's all about money in the end. That is also why I think there is a cost aspect to DLSS, and adoption rate ties into that in the most direct way for Nvidia. It also supports my belief that this tech won't take off under DXR. The lacking adoption of DX12 has everything to do with workforce and money, too, and not with complexity. If it's viable, complexity doesn't matter, because all that is, is additional time (and time is money). Coding games for two radically different render techniques concurrently is another major time sink.
#43
InVasMani
What happens when you initially cap the frame rate or force higher AA to keep the frame rate lower? Is the use of DLSS predetermined? Also, how does DSR play into any of this? It honestly just seems and feels as if there is more to this than Nvidia lets on. It also makes the marketing of DLSS feel like a serious case of snake oil or bait and switch. I can start to see why it's basically been not much more than a tech demo; if it were better and easier to use and implement in beneficial ways, we'd probably be seeing more widespread use of it by now rather than from 2 or 3 AAA studios. Sure it's new, but even among AAA studios it's scarcely been demonstrated to date, and they'd have earlier access than the public to its inner workings and how to go about implementing it.

This really lowers the appeal of it and limits its usage, especially for the weaker cards, though it's limiting for the stronger cards as well. It's like they are trying to tie it strictly to ultra-high resolutions and/or RTRT only. The thing is, I don't think the people who bought these cards with DLSS in mind bought them with that handicap in the back of their heads; this is almost akin to the 3.5 GB issue if it's as bad as it sounds.
#44
EarthDog
moproblems99I don't understand what the benefit would be, honestly. If you are already at high frames, what do you need it for? At this point, it lowers image quality to the point where you could tweak your settings to achieve the same effect.

Would it be simply to say that I run 'ultra' vs saying I run 'mostly high'?
It looks like the cutoff for BF V, give or take a few FPS, is around 60-70 FPS, no? I wouldn't call that high FPS. Some want 75/120/144/165 FPS/Hz. Many who do sacrifice IQ anyway to get there... so this is simply another option?
#45
londiste
moproblems99I keep harping on it but all the virtues of DX12 we were told were coming to have not happened yet so I don't hold my breath for these.
To get the benefits from DX12 (or Vulkan) you need damn good developers. And, due to the lower-level API, you are required to do a lot of legwork yourself, which includes writing (pieces of) rendering paths that are better for GPUs of one vendor or another, or alternatively writing several. This adds to the time and cost of developing a DX12 engine or game. There is really a limited number of examples out there where DX12 is a complete benefit.

Sniper Elite 4 and Shadow of the Tomb Raider. With some concessions, Rise of the Tomb Raider and Hitman with the latest patches and drivers.
There are some DX12-only games with good performance, like the Forzas or Gears of War 4, but we have no other APIs in them to compare against.
Metro Exodus seems to be one of the titles where DX12 at least does not hurt, which is a good thing (as weird as that may sound).
I am probably missing 1-2 games here but not really more than that.

As an AMD card user there is a bunch of games where you can get a benefit from using DX12 but not from all DX12 games. Division, Deus Ex: Mankind Divided, Battlefields.
As an Nvidia card user you can generally stick to DX11 for the best. With very few exceptions.
#47
moproblems99
EarthDogIt looks like the cutoff for BF V, give or take a few FPS, is around 60-70FPS, no? I wouldn't call that high FPS. Some want 75/120/144/165 FPS/MHz. Many who do sacrifice IQ anyway to get there... so this is simply another option?
Agreed, I think the restrictions are currently because NV is doing the learning where it will make the most impact.
londisteTo get the benefits from DX12 (or Vulkan) you need damn good developers. And, due to lower level API you are required to do a lot of legwork yourself which does include writing (pieces of) rendering paths that are better for GPUs of one vendor or another. Or alternatively - writing several. This adds to time and cost of developing a DX12 engine or game. There is really a limited amount of examples out there where DX12 is a complete benefit.

Sniper Elite 4, Shadow of Tomb Raider. With some concessions, Rise of Tomb Raider and Hitman with latest patches and drivers.
There are some DX12-only games with good performance like Forzas or Gears of War 4 but we have no other APIs in them to compare to.
Metro Exodus seems to be one of the titles where DX12 at least does not hurt which is a good thing (as weird as that may sound).
I am probably missing 1-2 games here but not really more than that.

As an AMD card user there is a bunch of games where you can get a benefit from using DX12 but not from all DX12 games. Division, Deus Ex: Mankind Divided, Battlefields.
As an Nvidia card user you can generally stick to DX11 for the best. With very few exceptions.
So considering everything you just wrote, what does that say for the likely implementation of RTX going forward? Your last sentence spells it out the best.
#48
notb
Vayra86Spot on. Its all about money in the end. That is also why I think there is a cost aspect to DLSS and adoption rate ties into that in the most direct way for Nvidia. And it also supports my belief that this tech won't take off under DXR. The lacking adoption of DX12 has everything to do with workforce and money, too, and not with complexity. If its viable, complexity doesn't matter, because all that is, is additional time (and time is money).
There's also another way to look at this problem.
DX12 hasn't really offered anything interesting for the consumer. A few fps more is not enough. And yes, it's harder to code and makes game development more expensive.
RTRT could be the thing that DX12 needs to become an actual next standard.
#49
bug
Vayra86Seeing is believing - what I'm seeing today is nowhere even close to a benefit to image quality. Or do you think this is a leap forward? Either in BFV or Metro? And 'as cards become more powerful' - you're already looking at a >700mm² GPU for the current performance. Good luck with that, with one or two node shrinks to go.
It's not only the fab node (though that is a big part of the equation). With the data collected from Turing, it's possible to fine tune the hardware resources (e.g. the ratio of CUDA cores:tensor cores:RT cores or something like that).
#50
Vayra86
bugIt's not only the fab node (though that is a big part of the equation). With the data collected from Turing, it's possible to fine tune the hardware resources (e.g. the ratio of CUDA cores:tensor cores:RT cores or something like that).
Possibly. Seeing is believing... so far Turing at the shader level was not much of a change despite taking up additional die space for cache. I think it's quite clear the CUDA part of it won't be getting much faster. And RT is already brute force with an efficiency pass (denoise), so there can't be that much left in the tank IMO. That is not to say Nvidia hasn't surprised us before, but that is exactly what I'm waiting for here. DLSS sure as hell isn't it, and it sure as hell was intended to be.

Call me the pessimist here... but the state of this new generation of graphics really doesn't look rosy. It's almost impossible to do a fair like-for-like comparison with all these proprietary, per-game techniques. Nvidia is creating tech that makes everything abstract, which is obviously a way to hide problems as much as it is a solution.