
What's your preferred DLSS config?

  • Native (DLSS disabled): 8,245 votes (53.3%)
  • Native + Frame Generation: 735 votes (4.7%)
  • DLAA: 1,223 votes (7.9%)
  • DLAA + Frame Generation: 1,023 votes (6.6%)
  • Upscaling: 2,014 votes (13.0%)
  • Upscaling + Frame Generation: 1,351 votes (8.7%)
  • FSR (on NVIDIA): 887 votes (5.7%)
  • Total voters: 15,478
  • Poll closed.
Considering the Nvidia card that's running in my TV Box right now is a 980ti, FSR has been a real godsend getting stuff like Hogwarts Legacy and Jedi Fallen Order running acceptably.

Kinda the ideal use case scenario for upscaling tech: keeping an old piece of hardware out of the landfill for a few more years.
 
I used FSR on performance settings to play Baldur's Gate 3 on an iGPU (it was entirely playable at 10-20fps I guess).
But with a GPU I would (and do) always choose no upscaling whatsoever if I can get a game running at 30+ fps because for me graphical quality is more important than very high fps.

But I'm someone who often plays with no antialiasing because of the minute blurring it can sometimes add, and I even have these settings tweaked for no compromise on quality in Radeon settings:
[screenshot: Radeon graphics settings]

And this is probably what less than 1% of gamers do since the defaults are 'standard' and 'enabled'.
 
Sorry, my take!

There's no need for any tricks that substitute for native rendering. If one needs a better framerate, culling is one of the best things invented for that purpose.
Upscaling should be a feature for old, weak cards, not new and powerful ones.

I personally can't say that games from the last five to seven or so years have made a significant jump in progress and eye candy. So what's the point of buying a 4080/4090 and then using DLSS, considering the game was developed for a console with an RX 6700-ish class GPU? I wouldn't want to hamper the image any further. The only acceptable addition to the rendering is some AA technique that actually improves image quality.

BTW, nobody would need DLSS/FSR, let alone frame generation, if the devs actually optimized their games. The GPU vendors advertise the game devs' incompetence. No AI can substitute for the buggy mess and filthy code noodles left unfixed for ages.

P.S.: All the DLSS apostles praise it. That might be good. But this is a feature for games that are a couple of years old at most. Does it work with old games (5-10 years old) that don't support said feature? Same for FSR. No offence, just curious.
 
I used FSR on performance settings to play Baldur's Gate 3 on an iGPU (it was entirely playable at 10-20fps I guess).

Using FSR on iGPUs is actually a bad idea. You get almost no performance increase, because the computational cost of FSR is too high.

Most games will actually run better at native 720p compared to 1080p with FSR Performance (so 540p).
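For reference, here's a rough sketch of the internal render resolutions those modes imply, using the per-axis scale factors AMD publishes for FSR 2 (purely illustrative, not any game's actual implementation):

# Rough sketch: internal render resolution for FSR 2 quality modes,
# using AMD's published per-axis scale factors (illustrative only).
FSR_MODES = {
    "Quality": 1.5,             # output / 1.5 per axis
    "Balanced": 1.7,
    "Performance": 2.0,         # 1080p output -> 960x540 internal
    "Ultra Performance": 3.0,
}

def internal_resolution(out_w, out_h, mode):
    factor = FSR_MODES[mode]
    return round(out_w / factor), round(out_h / factor)

print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
print(internal_resolution(1920, 1080, "Quality"))      # (1280, 720)

So FSR Performance at 1080p works from 960x540, barely more than half the pixels of native 720p, yet the upscale pass itself still costs GPU time, which is why an iGPU gains so little from it.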

BTW, nobody would need DLSS/FSR, let alone frame generation, if the devs actually optimized their games. The GPU vendors advertise the game devs' incompetence. No AI can substitute for the buggy mess and filthy code noodles left unfixed for ages.

I have one use case from my personal experience where you "need" upscaling - playing on a 4K TV. There are no 1440p TVs, everything is 4K (or 1080p in the low end). So if you don't want to buy a 4090, you have to upscale. And DLSS is fantastic in that scenario.

But I absolutely agree about the lack of optimization. It's a plague now. Consoles are running most "next-gen" games at ~720p60, which is laughable. And it translates to the PC versions, which usually have lots of other problems.
 
I used FSR on performance settings to play Baldur's Gate 3 on an iGPU (it was entirely playable at 10-20fps I guess).
But with a GPU I would (and do) always choose no upscaling whatsoever if I can get a game running at 30+ fps because for me graphical quality is more important than very high fps.

But I'm someone who often plays with no antialiasing because of the minute blurring it can sometimes add, and I even have these settings tweaked for no compromise on quality in Radeon settings:
[screenshot: Radeon graphics settings]
And this is probably what less than 1% of gamers do since the defaults are 'standard' and 'enabled'.
Texture filtering quality on high is a must, but honestly, I haven't seen surface format optimisation do much, if anything at all, so I just leave it on. Same with tessellation on AMD optimised vs default.

Other than that, I agree. As long as I have 30+ FPS, I'll always choose visual quality over more performance.
 
A major issue with "Native" nowadays is that its not actually native, it usually had a mandatory schmeer of TAA applied over the top. In some apps, the TAA implementation performs well while in others the native TAA is just garbage.

Unfortunately older AA approaches like MSAA and such generally aren't an option anymore, and neither is outright turning off TAA either.

In such a world, something like DLSS or even FSR can be considered superior to a native that never was.
 
A major issue with "Native" nowadays is that its not actually native, it usually had a mandatory schmeer of TAA applied over the top. In some apps, the TAA implementation performs well while in others the native TAA is just garbage.

Unfortunately older AA approaches like MSAA and such generally aren't an option anymore, and neither is outright turning off TAA either.

In such a world, something like DLSS or even FSR can be considered superior to a native that never was.
Yeah, what happened with MSAA and SSAA? Why don't we see them in games anymore? :(
 
Yeah, what happened with MSAA and SSAA? Why don't we see them in games anymore? :(
MSAA (if improperly implemented) can be rather demanding; look at Red Dead 2, it's a mess in that game. Something that may help explain it better than I ever could is this document from Nvidia, but in short, it's a bit of a pain in the ass to properly detect edges with deferred rendering compared to forward rendering.
And SSAA is just a resolution slider nowadays, and when that doesn't work you can usually find a way in drivers, be it DLDSR or VSR.
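To be clear about what that resolution slider is doing: it's essentially ordered-grid supersampling, i.e. render at some multiple of the output resolution and average blocks of samples back down. A toy sketch of that downsample step (a plain box filter here; DSR/DLDSR/VSR use fancier filtering, so this is just to show the idea):

# Toy sketch of supersampling via a resolution scale: render at
# (scale x) the output size, then box-average each scale-by-scale
# block of samples into one output pixel.
def downsample_box(image, scale):
    out_h, out_w = len(image) // scale, len(image[0]) // scale
    out = [[0.0] * out_w for _ in range(out_h)]
    for y in range(out_h):
        for x in range(out_w):
            block = [image[y * scale + j][x * scale + i]
                     for j in range(scale) for i in range(scale)]
            out[y][x] = sum(block) / len(block)
    return out

# e.g. a 4x4 "render" downsampled 2x to 2x2:
frame = [[1, 1, 3, 3],
         [1, 1, 3, 3],
         [5, 5, 7, 7],
         [5, 5, 7, 7]]
print(downsample_box(frame, 2))  # [[1.0, 3.0], [5.0, 7.0]]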

But I absolutely agree about the lack of optimization. It's a plague now. Consoles are running most "next-gen" games at ~720p60, which is laughable. And it translates to the PC versions, which usually have lots of other problems.
I keep thinking about that roundtable discussion about DLSS that happened on Digital Foundry, and how a CDPR dev said that DLSS is actually a great optimization tool and that they see it as a good way to keep pushing visual fidelity. They literally see it as a crutch for optimization; we are going backwards. I can excuse Alan Wake 2 because it actually has a lot of new tech and they are doing a lot to change the engine, so I expect growing pains, and it also has the most detail I've ever seen in a game; it actually feels next-gen. But anyone else? Considering how little the visual fidelity has improved, I'd honestly just say they are lazy, and things like what that CDPR dev said just confirm my belief.
 
And SSAA is just a resolution slider nowadays, and when that doesn't work you can usually find a way in drivers, be it DLDSR or VSR.

Downsampling is not quite the same as SSAA. I'm no expert, so I'm not gonna pretend to know how it works, but there's a substantial visual difference.

I especially remember early Euro Truck Simulator 2 days. It was a DX9 game back then, so you could force all kinds of AA in NV Inspector, but the game also had internal scaling. I was playing at 1080p, and using SGSSAAx2 looked sooo much cleaner than downscaling even from 4K. SGSSAAx4 looked absolutely perfect, and that's not even the full on SSAA.
There were also different compatibility flags that would change the visual output. Some were blurrier than others.

Another game with horrible aliasing is Destiny 2. Even if you play in 4K with 200% scaling (which means 8K), there are still many jaggies everywhere; it can never look perfect. I expect an actual SSAA implementation would do a much better job, but I'd rather just see DLAA for performance reasons.

Alien Isolation also has some crazy aliasing (specular or whatever), which you can never eliminate using downsampling. But I think there are some TAA mods for it, which clean it up nicely.

That's exactly why TAA was invented, and why all these new upscalers are based on it. It has drawbacks, but purely from an "eliminating jaggies" perspective, it's the superior method, and it has an extremely low performance cost. MSAA and SSAA stopped being viable a long time ago, as they both have a high cost and a weak end result in modern games.
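For anyone curious, the core trick behind TAA (and the temporal upscalers built on it) is cheap to express: jitter the sample positions a little each frame and blend the new frame into an accumulated history, which approximates supersampling spread over time. A toy sketch of just the accumulation step, assuming a static image (real TAA also reprojects the history with motion vectors and clamps it against the current frame, which is where the ghosting/blur complaints come from):

# Toy sketch of TAA's temporal accumulation: blend each new jittered
# frame into a running history buffer. Real implementations also
# reproject the history using motion vectors and clamp/clip it
# against the current frame to reduce ghosting.
def taa_accumulate(history, current, alpha=0.1):
    # history, current: 2D lists of pixel values; alpha = new-frame weight
    return [[(1 - alpha) * h + alpha * c
             for h, c in zip(hist_row, cur_row)]
            for hist_row, cur_row in zip(history, current)]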
 
Yeah, what happened with MSAA and SSAA? Why don't we see them in games anymore? :(
Both essentially need too much power, and MSAA also has a huge downside: it doesn't fix all the problems on the screen (in that regard, TAA is better). SSAA especially is rare, but you can always use VSR or DSR to raise your resolution to 2x+ if you really want "SSAA".
 
It's interesting that the OP by w1z specifically asks nvidia users:
NVIDIA users: we're curious... if a game fully supports all the available options, which ones would you choose?
Yet anyone and everyone can vote, so I'd say that these results aren't truly indicative of how only nvidia users would vote.

From what I see online from bona fide nvidia users, Native would have a significantly smaller representation, while DLAA / upscaling / upscaling + FG would have more.

I can appreciate there's no practical way to achieve this, and what I myself see through my relatively narrow lens is likely skewed, just thought it was an interesting point to make.
 
I generally prefer native even though it comes with some asterisks. DLSS is great tech but I've still run into a few games where it looks quite poor when compared with native. And in those games where it does look better then it's often because the native TAA implementation itself is poor.

I'm still in the MSAA camp overall and especially miss SGSSAA. The new Forza Motorsport looks genuinely dreadful after the switch from MSAA to whatever TAA they're using and DLSS is even worse than that in this game. TAA can look fantastic but many recent implementations haven't been that great, in my opinion.
 
OK I finally got a 2060 Super just so I could take the poll.

Kinda mostly maybe a little bit. As I was playing Hogwarts Legacy on it at 1440p and enjoying myself, I'll vote upscaling/DLSS though DLAA is very nice too.
 
Sharpening filters are subjective and introduce sharpening artifacts. They've also been around for decades and are not something specific to DLSS. You can inject sharpening pretty easily w/o DLSS.
Not DLAA. I have no artifacts here, in any game I've tested.
 
Depending on the performance: DLDSR -> DLAA -> DLSS Quality -> DLSS Balanced -> reduce other details.
Since there is no FG for Ampere, I cannot comment on it.
 
All of them are welcome tools, especially useful when your old GPU shows signs of fatigue in the fight against new games. But not only then.
For example, in the game below, after joining the two video recordings, I had to make a third recording because I wasn't sure which one had DLSS off and which had it on. The colour of the hero's helmet is the biggest difference; otherwise it is hard to distinguish them, and only by direct comparison.

When DLSS does its job well, activating it brings some benefits:
1. An FPS increase, if you need it.
2. A consistent decrease in GPU usage (and even VRAM usage), which reduces power consumption and temperatures when playing with vSync on.

Since activation is not required, I don't understand what the problem is for those who criticize these features. If you don't like it, don't use it.

 
So after all the testing, press, hardcore brand loyalist debates, code optimizations, etc., over half of Nvidia users don’t enable or even want super sampling/scaling/etc.

Native rendering FTW!
 
I love DLAA and Frame Gen; I use them whenever possible. I also love DLDSR when DLAA isn't an option. In older games, I play at 5120x2880 using DSR with no blur.

I can't stand TAA in most games today; it's just a blurry mess, and aliasing is still very present.
 
Since activation is not required, I don't understand what the problem is for those who criticize these features. If you don't like it, don't use it.
I agree, though activation is actually becoming a requirement in some games: even at native, you're still running the algorithm, which can be expensive, Alan Wake 2 being an example.

Other times, games are made with "upscalers in mind", as is the case with Remnant II and seemingly every UE5 game thus far. Criticism is becoming more prevalent because developers are treating upscaling as "optimization" (CDPR, for example, said as much in the roundtable discussion with DF): instead of optimizing for native rendering, they use upscalers as a way to hit the target performance, which people don't like, as it's not what we users thought it should be used for.
 
I am the owner of a 4080 and a 1440p 165 Hz monitor, and I prefer to play with DLAA + Frame Generation. This significantly improves picture quality compared to native, and the gameplay smoothness is much higher. However, before enabling Frame Generation, I adjust the graphics settings to ensure that the base (non-generated) FPS with DLAA is not less than 70 FPS. This way, enabling generation does not introduce any negative effects.
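The rough arithmetic behind that 70 FPS rule of thumb, assuming frame generation inserts one generated frame between each pair of rendered frames (so the presented rate roughly doubles while input latency still tracks the base frame time):

# Rough frame generation arithmetic (assumes ~2x frame doubling and a
# 165 Hz display cap; numbers are just the example from this post).
base_fps = 70                          # DLAA frame rate before FG
presented_fps = min(base_fps * 2, 165) # what the monitor actually shows
base_frametime_ms = 1000 / base_fps    # responsiveness still follows this
print(presented_fps, round(base_frametime_ms, 1))  # 140 14.3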
 
So after all the testing, press, hardcore brand loyalist debates, code optimizations, etc., over half of Nvidia users don’t enable or even want super sampling/scaling/etc.
The poll is flawed: it asks only Nvidia users to vote, but anyone can (and has) voted, so no, this doesn't accurately represent Nvidia users in any meaningful way.
 
And this is probably what less than 1% of gamers do since the defaults are 'standard' and 'enabled'.
I am in this <1%. Yet I hate it when it's lower than 55 FPS. I had too much sub-30 FPS gaming when I was a kid. I am now a big boy, and big boys need big framerate numbers. xD

My preferred DLSS config would be a resolution 4 times higher than native (i.e. 5120x2880 on a 2560x1440 display) + DLSS: Quality, if the framerate is right (55 or higher).
If that's too slow, I would resort to plain DLSS: Quality. Why "would"? I don't own an nVidia GPU. That's how I play with FSR, and I don't think the DLSS experience will be much different from that, apart from being less artifacted and generally more stable.
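A quick sketch of why that combo still looks good: 4x the pixel count is 2x per axis, and the Quality mode then renders at roughly 1/1.5 of that per axis, so the internal resolution stays above the display's native 1440p before the final downscale (the scale factor is the commonly cited Quality one; purely illustrative):

# Rough math for "4x resolution + Quality upscaling" on a 2560x1440 display.
# 4x the pixels = 2x per axis; Quality mode renders at ~1/1.5 per axis.
native_w, native_h = 2560, 1440
target_w, target_h = native_w * 2, native_h * 2          # 5120x2880
render_w, render_h = round(target_w / 1.5), round(target_h / 1.5)
print((render_w, render_h))  # (3413, 1920) -- still above native 1440p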
 
I think you'll get more AMD users who'll vote native, which skews the results.
Bang on.
I like playing games the way they were imagined by their creator.
Not sure I follow this logic. The creator creates games with standard resolution options, generally from 720p-ish or even lower all the way to 4K and beyond, sometimes (hopefully even) with ultrawide options and VR too. So which one of those is intended by the creator? Is playing 720p native more the way the creator imagined it than 1440p? What about 1440p upscaled to 4K? Or is 4K native what's intended, and anything lower isn't? I'm interested to hear from you how the end user's arbitrary resolution choice relates to the creator's vision.

Bonus question: if the developer lists upscaling in the system specs for all the target resolutions and FPS, as we've seen lately, is playing with upscaling the way it's imagined by the creator?

-----------------------------------------------------------------------------------------------------------------------------------------------------------------------

I'm continually puzzled by the antiquated notion that native rendering resolution will always be best, the goal, some pinnacle of image quality... the native resolution of any given person's monitor?

If rendering at native panel resolution is to be touted as the best, then why does super sampling exist?

Now I don't doubt for a second that each panel has an absolute quality limit, imposed by its physical resolution. However, when talking about rendering resolution, it's plainly evident that rendering at a given panel's native resolution isn't where IQ stops improving.
 