
What's your preferred DLSS config?


  • Native (DLSS disabled): 8,245 votes (53.3%)
  • Native + Frame Generation: 735 votes (4.7%)
  • DLAA: 1,223 votes (7.9%)
  • DLAA + Frame Generation: 1,023 votes (6.6%)
  • Upscaling: 2,014 votes (13.0%)
  • Upscaling + Frame Generation: 1,351 votes (8.7%)
  • FSR (on NVIDIA): 887 votes (5.7%)
  • Total voters: 15,478
  • Poll closed.
Can it be forced on in the driver like DLDSR, or does it need devs to implement it?
Afaik it's gotta be implemented per game, so unfortunately it's only in a subset of DLSS games. It seems to be in any new game with DLSS now, but uptake was slow at first.
 
Afaik it's gotta be implemented per game, so unfortunately it's only in a subset of DLSS games. It seems to be in any new game with DLSS now, but uptake was slow at first.
It's a shame, as sadly I have yet to find a game I want to play that supports DLSS, but hopefully they find a way to do it at the driver level.
 
(Back when I used NVIDIA) I usually used DLSS Quality, as my PC wasn't good enough to drive a 1440p panel, but I usually tried running games at native resolution. I always liked how sharp native looked compared to upscaling; even though I liked upscaling plenty, I still sought out native rendering, even more so when I used a 1080p display instead of 1440p.
 
On a 4K OLED, DLSS Quality would be my choice, but on my G8 OLED I prefer DLAA. I can't stand TAA in most games. I also don't mind Frame Generation in CP2077 and Witcher 3 NG as long as the base framerate is over 60 FPS (see the sketch after this post).

My favorite nuclear option is to run DLDSR at the highest setting with DLSS Quality; it looks much better than native, but even with a 4090 that isn't possible in every game.
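To put rough numbers on the 60 FPS rule of thumb above, here is a minimal sketch assuming frame generation roughly doubles the presented framerate while input response still follows the real rendered frames; the 2x factor and the figures are illustrative assumptions, not measurements.

```python
# Minimal sketch (assumed numbers): frame generation roughly doubles presented fps,
# but your inputs are still tied to the real, rendered frames underneath.

def with_frame_gen(base_fps, fg_factor=2.0):
    presented_fps = base_fps * fg_factor      # interpolated frames shown in between
    real_frametime_ms = 1000 / base_fps       # cadence your inputs actually land on
    return presented_fps, real_frametime_ms

for base in (30, 60, 90):
    shown, ms = with_frame_gen(base)
    print(f"base {base} fps -> ~{shown:.0f} fps shown, ~{ms:.1f} ms per real frame")
# base 30 fps -> ~60 fps shown, ~33.3 ms per real frame   (smooth-looking, still feels like 30)
# base 60 fps -> ~120 fps shown, ~16.7 ms per real frame
# base 90 fps -> ~180 fps shown, ~11.1 ms per real frame
```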
 
Genuinely surprised at most of the answers. I know most new features sound silly or gimmicky but some of them actually do wonders.
I guess most people didn't bother to check DLDSR + DLSS, it is an absolute game changer.
Playing at 5760x3240 with DLSS in performance or balanced mode is far better than regular 4K gaming.
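For anyone curious what that combination actually renders, here is a rough sketch of the resolution math. It assumes the commonly cited defaults (DLDSR factors of 1.78x/2.25x in total pixels, DLSS per-axis scales of roughly 66.7% for Quality, 58% for Balanced, 50% for Performance); none of these figures are quoted from NVIDIA in this thread.

```python
# Rough sketch of the DLDSR + DLSS resolution math (assumed default factors).
# DLDSR factors are total-pixel multipliers; DLSS presets scale each axis.

DLDSR_FACTORS = {"1.78x": 1.78, "2.25x": 2.25}
DLSS_AXIS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def dldsr_plus_dlss(display_w, display_h, dldsr, dlss):
    axis = DLDSR_FACTORS[dldsr] ** 0.5                     # per-axis factor from total pixels
    target_w, target_h = round(display_w * axis), round(display_h * axis)
    s = DLSS_AXIS_SCALE[dlss]
    render_w, render_h = round(target_w * s), round(target_h * s)
    return (target_w, target_h), (render_w, render_h)

# 4K display with DLDSR 2.25x: the game "sees" 5760x3240, DLSS then renders lower.
print(dldsr_plus_dlss(3840, 2160, "2.25x", "Performance"))  # ((5760, 3240), (2880, 1620))
print(dldsr_plus_dlss(3840, 2160, "2.25x", "Balanced"))     # ((5760, 3240), (3341, 1879))
```

So with DLDSR 2.25x on a 4K display, the game targets 5760x3240 but DLSS Performance actually renders 2880x1620, between 1440p and 4K in pixel count, which is roughly where the quality-for-cost claim comes from.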
 
Genuinely surprised at most of the answers. I know most new features sound silly or gimmicky but some of them actually do wonders.
I guess most people didn't bother to check DLDSR + DLSS, it is an absolute game changer.
Playing at 5760x3240 with DLSS in performance or balanced mode is far better than regular 4K gaming.
I'm shocked at the number of people voting for Native over DLAA.
Most people don't like to fiddle with driver level settings, I suppose.

I'm kind of a purist myself. I like playing games the way they were imagined by their creator. I only enable AMD Anti-Lag or Nvidia Reflex for some extra smoothness, but that's it.
 
Genuinely surprised at most of the answers. I know most new features sound silly or gimmicky but some of them actually do wonders.
I guess most people didn't bother to check DLDSR + DLSS, it is an absolute game changer.
Playing at 5760x3240 with DLSS in performance or balanced mode is far better than regular 4K gaming.
I had never heard of this as a thing; it seems so counterintuitive to me, but it's definitely something I'll try after reading this thread.
 
Genuinely surprised at most of the answers. I know most new features sound silly or gimmicky but some of them actually do wonders.
I guess most people didn't bother to check DLDSR + DLSS, it is an absolute game changer.
Playing at 5760x3240 with DLSS in performance or balanced mode is far better than regular 4K gaming.

Is it better than DLAA, if a game supports it?


I once tried Destiny 2 at 1080p with 200% render scale (so downsampling from 4K), and then integer scaling to 4K. It looked softer, but it definitely helped with the extreme jaggies (which the game has even in native 4K). The UI was very pixelated, though.
I really like how games using TAA look in 1080p on a 4K screen with integer scaling. Unfortunately small text usually looks pixelated, but 3D rendering looks great.
 
Is it? There haven't been any significant improvements since DLSS 2.0. I think the fact that DLSS 3.5 is about ray tracing and 3.0 is about frame generation clearly signals that they have more or less squeezed most of what they can out of upscaling and have moved on to other things. After all, there are limits to what an AI can guess.
Dude, IQ has improved A LOT since 2.0.

In fact, DLAA is today the best AA method available. Not only does it remove aliasing, it also makes textures look better by increasing sharpness...
 
Dude, IQ has improved A LOT since 2.0.

In fact, DLAA is today the best AA method available. Not only does it remove aliasing, it also makes textures look better by increasing sharpness...

Sharpening filters are subjective and introduce sharpening artifacts. They've also been around for decades and aren't something specific to DLSS. You can inject sharpening pretty easily without DLSS.
 
Voted Native, but:
Back when I still used a 3070 on 1080p monitors, in cases where DLAA was fine, I used DLAA. In cases where DLAA didn't look good visually (e.g. Forza Motorsport 8: TAA ghosting), I used native with no AA.
99% of the time I play racing games, so TAA ghosting is annoying enough that I'm likely to prefer no AA.
 
Is it better than DLAA, if a game supports it?


I once tried Destiny 2 at 1080p with 200% render scale (so downsampling from 4K), and then integer scaling to 4K. It looked softer, but it definitely helped with the extreme jaggies (which the game has even in native 4K). The UI was very pixelated, though.
I really like how games using TAA look in 1080p on a 4K screen with integer scaling. Unfortunately small text usually looks pixelated, but 3D rendering looks great.
You can run DLDSR on any game, even older ones; that's the whole point. I like going back to older games and rediscovering them without aliasing.
Depends what you mean by better: image quality wise, yes, but it will be more taxing. I have a 4090, so it isn't really a concern. DLDSR + DLSS is also superior to DLAA or even native 4K. Check it out for yourself.
People shouldn't even bother with native if they have a DLSS-compatible GPU; it is that good.
 
Native? Really?

I can't believe that all those who voted for native have actually tried DLSS in their lives.

DLSS Quality is 95% as good as native.
It's extremely rare to find a case where it just doesn't work.

Anyway, I voted for upscaling. Although my GPU supports the frame generation tech, I don't like it. It's necessary in path-traced games, but seeing 100+ fps coming all the way up from 30 does not feel right to me.

That's the thing, guy: I don't think they have.

I don't know if it's 95% of the quality, but I do know most people do not and/or will not notice a large (if any) difference between 1440p and 4K. The performance cost of native 4K is definitely not worth it for almost anybody.

Hence, yes, I agree "upscaling" is correct. I don't have FG and don't see it as a feature I'd prefer or use as a crutch. DLAA is nice/preferred, but I don't think it's a deal-breaker versus certain native/upscaled resolutions. If someone can save some money by giving up that ~8% performance, I think that's fine.

At 4K: Quality (1440p) preferred, Balanced (1253p) upscaling acceptable. At 1440p: native preferred, Quality (960p) upscaling acceptable. At 1080p: native or bust. JMO.
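Those parenthesized internal resolutions line up with the commonly cited DLSS per-axis scale factors (roughly 66.7% for Quality, 58% for Balanced, 50% for Performance); a quick sketch, assuming those factors:

```python
# Quick check of the internal render resolutions behind the preset names
# (scale factors assumed from commonly cited DLSS defaults).

DLSS_AXIS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

def internal_res(display_w, display_h, preset):
    s = DLSS_AXIS_SCALE[preset]
    return round(display_w * s), round(display_h * s)

print(internal_res(3840, 2160, "Quality"))    # (2560, 1440) -> "Quality (1440p)" at 4K
print(internal_res(3840, 2160, "Balanced"))   # (2227, 1253) -> "Balanced (1253p)" at 4K
print(internal_res(2560, 1440, "Quality"))    # (1707, 960)  -> "Quality (960p)" at 1440p
```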
 
DLAA only in Rogue City, love it, fixed with the latest game update.
 
That's the thing, guy: I don't think they have.

I don't know if it's 95% of the quality, but I do know most people do not and/or will not notice a large (if any) difference between 1440p and 4K. The performance cost of native 4K is definitely not worth it for almost anybody.

Hence, yes, I agree "upscaling" is correct. I don't have FG and don't see it as a feature I'd prefer or use as a crutch. DLAA is nice/preferred, but I don't think it's a deal-breaker versus certain native/upscaled resolutions. If someone can save some money by giving up that ~8% performance, I think that's fine.

At 4K: Quality (1440p) preferred, Balanced (1253p) upscaling acceptable. At 1440p: native preferred, Quality (960p) upscaling acceptable. At 1080p: native or bust. JMO.

They don't need to try DLSS. There's enough content on the web showcasing how bad it looks. Each time a new version releases, I check out videos on it. In the videos that show different clips with DLSS on and off but don't say which ones have it until the end, it's obvious which clips have DLSS enabled, because they look like they were recorded at a lower bitrate. If it's that noticeable on YouTube, where minor differences shouldn't be visible because of YouTube's own compression, then it's going to be worse in person. There's no point in testing the latest release. I think the only place where it's acceptable is in movies and TV shows. There's a lot less movement, so the artifacts won't show up as often. Some shows weren't recorded in 4K, so you have to use some form of upscaling.
 
Native, and I want SSAA back.
 
With an RTX 3070 Ti I don't have access to Frame Generation, and FSR didn't impress me. You can check all the other options for me, depending on the game.
 
DLAA is the best, frame generation is very helpful to keep up frame rates at 4K.
 
I use DLSS upscaling and Frame Generation whenever possible since my monitor runs at a high refresh rate (175 Hz). A quality trick: instead of using DLAA, use DLDSR 2.25x with DLSS Performance plus Frame Generation (Balanced or Quality if you still have room to spare); it achieves better image quality with minimal fps loss.
I don't have frame generation with my RTX 2080 Tis in SLI. However, that's similar to what I do when I force AFR rendering and set DSR to 4x. While my monitor is 1080p, the render is 4K. I don't even use any type of AA or AF because it looks fine without it. It's really too bad frame generation only works on DX12; it would be nice to compare it to SLI on at least one game.
 
I often play older games with the maximum possible graphics *me takes a look at my pile of shame*.
Whenever the game supports it, I use multisample antialiasing with transparency AA, which means no shimmering on e.g. foliage. In other words, no need for image smoothing afterwards.

Later games have deferred renderers (DX10+ era), which means that MSAA is no longer supported except for a few nv-inspector tricks.

There were games without AA at all until post-processing AA was introduced: FXAA → SMAA.
Later, temporal refinement of the image was added with TAA (software) → DLSS/FSR/XeSS (hardware).

What still works, though, since it is applied to the whole image, is supersampling antialiasing. If MSAA is supported, then SGSSAA provides pretty much the best picture quality. If it is not supported, there is still the general route of setting a higher game resolution, which is then downsampled to the monitor resolution: DSR (NVIDIA), VSR (AMD); the basic idea is sketched after this post.
Due to different LOD systems in engines, this usually also increases the draw distance of clutter.

Built-in supersampling: some games, often Ubisoft titles (or World of Warcraft), have a slider to set the resolution to 200%, which makes things a lot easier.

Today's post-processing AA:
Allowing artifacts due to missing subsampling (no MSAA/TrAA) and then covering them up by interpolating intermediate frames and/or guessing images using machine learning (the object looked like this yesterday) is not really a solution for me. :kookoo:

→ I love native with downsampling.
Frame Generation is also fine; I don't need crazy low latencies.

PS:
DLAA: from a marketing perspective, the name suggests ordinary AA, although it offers better quality (input = output) than
DLSS, which is a great marketing name, as it suggests supersampling (downsampling), but sadly it is the opposite → upsampling (input < output).

Unfortunately, DLDSR only uses the original spatial DLSS 1.x, which is worse than the later-developed FSR 1.x.
In situations where I need it, I would prefer downsampling with in-game FSR or DLSS at the highest quality setting over DLDSR to calm the bushes (downsampling + DLAA, for example).

PPS: I am only writing my opinion here. The techniques of the last 15 years are described from memory and may not be chronologically correct.
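On the supersampling point in the post above: the core of SSAA/DSR-style downsampling is simply averaging several rendered samples into each display pixel. Below is a minimal sketch of that idea using a plain box filter on a 2x-rendered image; real DSR/DLDSR use smarter filters, this is only the principle.

```python
# Minimal sketch of what plain downsampling (SSAA/DSR-style) does to a 2x-rendered image:
# average each 2x2 block of the high-res render into one display pixel (simple box filter).

def box_downsample_2x(img):
    """img: list of rows of grayscale values at 2x the target resolution."""
    out = []
    for y in range(0, len(img), 2):
        row = []
        for x in range(0, len(img[y]), 2):
            block = img[y][x], img[y][x + 1], img[y + 1][x], img[y + 1][x + 1]
            row.append(sum(block) / 4)        # 4 samples per output pixel = smoother edges
        out.append(row)
    return out

# A hard black/white edge at 2x resolution turns into a softened edge after downsampling:
hi_res = [[0, 0, 255, 255],
          [0, 0, 255, 255],
          [0, 255, 255, 255],
          [0, 255, 255, 255]]
print(box_downsample_2x(hi_res))   # [[0.0, 255.0], [127.5, 255.0]]
```

The edge that alternated between black and white in the 2x render ends up as an in-between value at display resolution, which is the smoothing effect being described.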
 
I still use a 1080p monitor, hopefully not for long...
But right now, when I play RT or PT single-player games on my 3060, I use DLSS Quality to get playable frame rates (30-60 depending on the game), while in online shooters (mostly Squad, Hell Let Loose and Post Scriptum) I use DLDSR to render the game at 1440p without the nasty artifacts of non-integer downsampling. I find this setup usually gives better clarity than in-game resolution sliders at 125-150% + TAA!
If the online game has DLSS too, I'll use DLSS Quality as well to get even better frame rates, while still having better image stability than native + TAA.
 