
What's your preferred DLSS config?

  • Native (DLSS disabled)

    Votes: 8,245 53.3%
  • Native + Frame Generation

    Votes: 735 4.7%
  • DLAA

    Votes: 1,223 7.9%
  • DLAA + Frame Generation

    Votes: 1,023 6.6%
  • Upscaling

    Votes: 2,014 13.0%
  • Upscaling + Frame Generation

    Votes: 1,351 8.7%
  • FSR (on NVIDIA)

    Votes: 887 5.7%

  • Total voters
    15,478
  • Poll closed.
I prefer native, but upscaling is alright when it's done properly. Some games actually have higher graphics quality when upscaled, so that's a win-win! Frame generation can fuck right off. Sure, the FPS counter goes up, but the latency does too. The main reason I'm chasing higher performance in games is to get that latency down, so frame generation is counterproductive.
 
DLDSR: 5120x1920 up-scaled from 3430x1286 downscaled to 3840x1440
 
DLSS/FSR don't exist to me aside from writing this sentence.
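For the curious, the ratios in that DLDSR chain work out to standard presets. A quick back-of-the-envelope sketch (the preset names are my inference, not stated in the post):

```python
# Resolution chain from the post: internal render 3430x1286,
# upscaled to 5120x1920, shown on a 3840x1440 display.
internal = (3430, 1286)
render = (5120, 1920)
display = (3840, 1440)

# The upscaling step's per-axis ratio (~0.67, i.e. a Quality-style preset).
upscale_ratio = internal[0] / render[0]

# Render-vs-display factor in total pixels (~1.78, i.e. the DLDSR 1.78x preset).
dldsr_factor = (render[0] * render[1]) / (display[0] * display[1])

print(f"upscale ratio: {upscale_ratio:.2f} per axis")
print(f"DLDSR factor: {dldsr_factor:.2f}x pixels")
```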

I much prefer Native with no form of upscaling or framegen. That's why I bought AMD.
Upscaling and frame gen only come into the picture when a high refresh rate is preferable (immersive games) and the game just runs like absolute ass.
Definitely not by choice.

Edit: I just realized it's specifically targeted at Nvidia owners lol. @W1zzard, maybe add an "I have AMD" option or add AMD options. I think you'll get more AMD users voting native, which skews the results.
Oh yeah, I guess I also fell into this trap. But I did, and would do the same if I had Nvidia as well.
 
I prefer native, but upscaling is alright when it's done properly. Some games actually have higher graphics quality when upscaled, so that's a win-win! Frame generation can fuck right off. Sure, the FPS counter goes up, but the latency does too. The main reason I'm chasing higher performance in games is to get that latency down, so frame generation is counterproductive.
Frame gen is nice to have, but even the best result I've got from it, in RoboCop (130 fps up to 240 fps), still isn't enough baseline fps for low input lag. Once I get hold of a 360 Hz monitor or beyond, though, 200 fps should be a great starting point for frame gen to have no drawbacks while giving great smoothness.
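Rough numbers behind that: interpolated frame generation has to hold at least one real frame back before it can blend, so the added delay scales with the base frame time, not the output frame time. A sketch under that simplified one-frame-held-back model (not a measured figure):

```python
def base_frame_time_ms(base_fps: float) -> float:
    """Time between real rendered frames. Interpolated frame generation
    holds a real frame back, so its added delay is on this order."""
    return 1000.0 / base_fps

# RoboCop numbers from the post: 130 fps base, 240 fps with frame gen.
print(f"130 fps base: ~{base_frame_time_ms(130):.1f} ms between real frames")
# The hoped-for 360 Hz scenario: a 200 fps baseline holds frames for only ~5 ms.
print(f"200 fps base: ~{base_frame_time_ms(200):.1f} ms between real frames")
```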
 
Native 95% of the time.
If I'm just messing around in a game, I might enable it to save some power and cool things down a little, especially with summer coming, since I don't have air con in my room for the 30°C+ days that are starting to hit Cape Town.

There are too many visual glitches with DLSS when you look for them, so I avoid it as often as I can.
 
Even the lower DLSS presets look better than native (with TAA) in most games. I believe people voting for native do it because of the feeling of having the game render at that resolution, not because of the image quality at hand.
 
People, please, native + DLAA is clearly the right answer! I remember first trying DLAA on ESO & being amazed that it almost entirely fixes the game's aliasing issues at the cost of very slightly less crisp textures. Worse in stills, but WAY better when moving around. It's not that night & day in other games I've tried it in, but it's still usually my preference, since most games have some rubbish TAA applied which you can't disable, so sticking DLAA on hides the issues with a lot of games' "native" presentation.

Personally I've yet to play something where DLSS hasn't looked too blurry for my tastes. It's fantastic as an option if you can't get the frame rate you want, but I'd rather play a game at Medium settings and native res (+ DLAA if it's an option) than at Ultra settings + DLSS.

Edit: useful background info: I'm playing at 4K on a 42" OLED on a large desk, so it's perhaps easier to spot DLSS's negatives at that size & that close.
 
FSR Performance (720p) + DSR 1440p on a 1080p TV: much better than FSR Quality at 1080p...
 
Even the lower DLSS presets look better than native (with TAA) in most games. I believe people voting for native do it because of the feeling of having the game render at that resolution, not because of the image quality at hand.
Well, I can tell you that you're wrong. When I had my 6800 XT, I played Cyberpunk at 4K with the slider set to 85% of that res so it would stay at 60 fps.
When they did the update that removed the slider in favour of lowering the res and then trying to "improve" it, I was annoyed.
The game was much better with just -15% on the set res.

I have never seen any improvement in picture quality when using FSR or DLSS.
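For context on why a modest-sounding 85% slider helps so much: the GPU's pixel workload goes with the square of the per-axis scale. The arithmetic below is just that observation applied to the 4K figure above, not a benchmark:

```python
native = (3840, 2160)  # the 4K target from the post
scale = 0.85           # the 85% render-scale slider

# Scaled render resolution and its pixel cost relative to native.
scaled = (int(native[0] * scale), int(native[1] * scale))
pixel_fraction = scale ** 2  # pixel cost is the square of the axis scale

print(f"{scaled[0]}x{scaled[1]}: {pixel_fraction:.0%} of native pixels")
# A 15% per-axis cut shaves roughly 28% of the pixels to shade.
```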
 
Well, I can tell you that you're wrong. When I had my 6800 XT, I played Cyberpunk at 4K with the slider set to 85% of that res so it would stay at 60 fps.
When they did the update that removed the slider in favour of lowering the res and then trying to "improve" it, I was annoyed.
The game was much better with just -15% on the set res.

I have never seen any improvement in picture quality when using FSR or DLSS.
Your example is FSR, though; in most games DLSS is noticeably better looking (especially in motion) than FSR. Sad though that is, as I much prefer tech every vendor can use equally, it's the truth.
 
Your example is FSR, though; in most games DLSS is noticeably better looking (especially in motion) than FSR. Sad though that is, as I much prefer tech every vendor can use equally, it's the truth.
My current card is a 4090, and DLSS still doesn't look better than native, at least in Cyberpunk and a few indie games I've played.
Just the other day, while using DLSS in Cyberpunk, there was a scene where a moving object didn't keep its shape when moving through a small gap.
I tested the scene without DLSS and it kept its basic shape. It looked like DLSS couldn't work out what the object was in the small gap.

I would rather run a slightly lower res than use DLSS or FSR.
As a tech to lower GPU temps and power consumption when my room is reaching 30 degrees, though, it's great.
 
"Deep Learning", "AI Powered", "Anti-lag", "Fluid Frames", "Ray Reconstruction", "Reflex", "FidelityFX" are nothing more than marketing labels to me.

I play my games at native resolutions. I don't need to rely on fancy gimmicks to actually enjoy them. If performance is lacking, I experiment with the settings to find the optimal balance.

Talk to me in real numbers, not gobbledygook.
 
I know there are lots of naysayers to DLSS here, but IMO the technology is evolving so fast that soon it'll become very difficult to see the difference between native and upscaled.
 
All rendering is based on tricks and hacks that have been added over time; DLSS and frame generation are just another one. Why make the GPU work harder when it's unnecessary? Even at native 4K you may get shimmering artefacts or aliasing on some objects. If DLSS solves this while providing the same image quality as native res, if not better in some cases, all while making the scene easier for the GPU to render, which can mean lower power consumption and a quieter card, then it's just stupid not to use it.

Why work harder when you can work smarter? DLSS has come a long way since the vaseline-looking mess it was when it first came out, and it's getting better all the time. If we want visuals to take a big leap now, instead of waiting another 5 to 10 years to get the same performance at native when doing things like path tracing, then DLSS and frame generation are necessary.

DLSS and frame gen don't really get rid of all artifacts, and they introduce their own. Frame generation adds input lag, so no thank you, I'll not subscribe to your notion that these should be integral technologies used across the board. I'll gouge my eyes out before being forced to use frame gen.

I know there are lots of naysayers to DLSS here, but IMO the technology is evolving so fast that soon it'll become very difficult to see the difference between native and upscaled.

Is it? There haven't been any significant improvements since DLSS 2.0. I think the fact that DLSS 3.5 concerns ray tracing and 3.0 concerns frame generation clearly signals that they've more or less squeezed most of what they can out of upscaling and have moved on to other things. After all, there are limits to what an AI can guess.
 
Where is the "it depends on the game" option? There isn't one option that fits all.
 
I use DLSS upscaling and Frame Generation whenever possible since my monitor runs at a high refresh rate, 175 Hz. A quality trick: instead of using DLAA, use DLDSR 2.25x with DLSS Performance plus Frame Generation (Balanced or Quality if you still have room to spare); it achieves better image quality with minimal fps loss.
 
I use DLSS upscaling and Frame Generation whenever possible since my monitor runs at a high refresh rate, 175 Hz. A quality trick: instead of using DLAA, use DLDSR 2.25x with DLSS Performance plus Frame Generation (Balanced or Quality if you still have room to spare); it achieves better image quality with minimal fps loss.
This sounds so bizarre...I must try it :D
 
DLAA if the game isn't demanding (I've only used it in Forza Horizon 5).

DLSS Quality otherwise, as it looks better than regular TAA in every game I've played, at least in 4K. DLSS Performance also looks very good in 4K and I'd absolutely use it to enable RTGI or path tracing (I've used it in Metro Exodus and Dying Light 2).

Native 4K is just a waste of processing power (and actual power). It only makes sense if you don't have to lower any settings, and there's no DLSS available.
 
This sounds so bizarre...I must try it :D
You won't regret it, haha. The trick here is to upscale the graphics before applying DLSS. Since the output is higher than native and DLSS is able to preserve the detail, you get better-looking graphics. Based on my experience, DLDSR 2.25x plus DLSS Performance gives the same fps as native, while Balanced and Quality lower the fps but produce even better quality.
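To make the trick concrete, here's the resolution bookkeeping on an assumed 2560x1440 monitor, using the commonly cited nominal DLSS mode ratios (per-game ratios can differ, so treat these as illustrative):

```python
# Nominal per-axis DLSS render ratios (per-game values can differ slightly).
DLSS_RATIO = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

native = (2560, 1440)  # assuming a 1440p monitor for illustration
dldsr_axis = 1.5       # DLDSR 2.25x = 1.5x per axis

# Resolution the game "sees" under DLDSR, then DLSS's internal render res.
target = (int(native[0] * dldsr_axis), int(native[1] * dldsr_axis))
internal = tuple(int(v * DLSS_RATIO["Performance"]) for v in target)

# Pixel cost relative to DLAA, which renders at full native resolution.
cost = (internal[0] * internal[1]) / (native[0] * native[1])
print(f"game target {target}, internal render {internal}, "
      f"{cost:.0%} of DLAA's pixel cost")
```

So the game actually shades fewer pixels than DLAA would, while the image still passes through a 4K-sized reconstruction before being downsampled to the display, which is where the claimed quality comes from.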
 
I can see this thread being closed due to hostilities in the near future. :fear:

On topic: as an ex-Nvidia user (I still have Nvidia parts in my HTPCs and on the shelf, I just don't game on them anymore), I have to say native is always better than DLSS, unless one doesn't have the performance to run at native. Why this is even a question is beyond me.

As for DLAA, I have no experience.
 
People, please, native + DLAA is clearly the right answer! I remember first trying DLAA on ESO & being amazed that it almost entirely fixes the game's aliasing issues at the cost of very slightly less crisp textures. Worse in stills, but WAY better when moving around. It's not that night & day in other games I've tried it in, but it's still usually my preference, since most games have some rubbish TAA applied which you can't disable, so sticking DLAA on hides the issues with a lot of games' "native" presentation.

Personally I've yet to play something where DLSS hasn't looked too blurry for my tastes. It's fantastic as an option if you can't get the frame rate you want, but I'd rather play a game at Medium settings and native res (+ DLAA if it's an option) than at Ultra settings + DLSS.

Edit: useful background info: I'm playing at 4K on a 42" OLED on a large desk, so it's perhaps easier to spot DLSS's negatives at that size & that close.
So DLAA is a shimmering fix? (The only other AA I've seen do that in the past is SGSSAA.) That's amazing if true.

I'm not a big fan of TAA; it's crazy how AA has got worse over the years.
 
DLAA if the game isn't demanding (I've only used it in Forza Horizon 5).

DLSS Quality otherwise, as it looks better than regular TAA in every game I've played, at least in 4K. DLSS Performance also looks very good in 4K and I'd absolutely use it to enable RTGI or path tracing (I've used it in Metro Exodus and Dying Light 2).

Native 4K is just a waste of processing power (and actual power). It only makes sense if you don't have to lower any settings, and there's no DLSS available.

That's my take too. If the option is there, it's often a no-brainer to use, as it enables higher graphics settings whilst also bringing higher framerates with little or no image quality loss. For games like Death Stranding it's actually essential.

Nice option to have, frankly, either way.
 
So DLAA is a shimmering fix? (The only other AA I've seen do that in the past is SGSSAA.) That's amazing if true.

I'm not a big fan of TAA; it's crazy how AA has got worse over the years.
At least in ESO it is! Well, "complete fix" might be overselling it, but it's a big improvement. There must be some comparison videos lurking about. I'm fully with you on TAA generally looking like trash, & yes, I'm sure default AA techniques have actually got worse over the last few years!
 
At least in ESO it is! Well, "complete fix" might be overselling it, but it's a big improvement. There must be some comparison videos lurking about. I'm fully with you on TAA generally looking like trash, & yes, I'm sure default AA techniques have actually got worse over the last few years!
Can it be forced on in the driver like DLDSR, or does it need devs to implement it?
 