
What DLSS/FSR Upscaling Mode do you use?

  • Native

    Votes: 13,024 44.5%
  • Quality

    Votes: 11,341 38.8%
  • Balanced

    Votes: 2,593 8.9%
  • Performance

    Votes: 1,376 4.7%
  • Ultra Performance

    Votes: 930 3.2%

  • Total voters
    29,264
  • Poll closed.

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,960 (3.75/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
With upscalers becoming more and more common, we're wondering: are you using "Quality" most of the time, or do you prefer "Ultra Performance" for those extra FPS? Or maybe you really like native, without upscalers?

Obviously all games are different, but what setting do you typically use most of the time?
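For reference, each preset corresponds to a fixed per-axis render-scale factor (the Quality/Performance/Ultra Performance ratios below are the commonly documented DLSS values; FSR's presets use nearly identical ratios). A quick sketch of what each mode actually renders internally at a 4K output:

```python
# Per-axis render-scale factors for the common upscaler presets.
SCALE = {
    "Native": 1.0,
    "Quality": 2 / 3,          # ~0.667 (1.5x upscale)
    "Balanced": 0.58,          # ~1.7x upscale
    "Performance": 0.5,        # 2x upscale
    "Ultra Performance": 1 / 3 # 3x upscale
}

def internal_resolution(width, height, mode):
    """Internal render resolution for a given output resolution and preset."""
    s = SCALE[mode]
    return round(width * s), round(height * s)

for mode in SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode:>17}: {w}x{h}")
```

So "Quality" at 4K is rendering 2560x1440 internally, and "Ultra Performance" only 1280x720, which is why the artifacting gets so much worse down the list.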
 
I always run native unless I absolutely have to do otherwise. Being on AMD means I get access to FSR and XeSS, and I've found that the sacrifice in visual fidelity is too much for my liking. Even in STALKER 2, where I'd need it for anything above 60 FPS, I still run native. I did try FSR3 and frame-gen in STALKER 2, and the artifacting/ghosting was too much for me.
 
Voted Native.
I will drop other settings before resorting to upscaling, as I am allergic to artifacts at low-ish frame rates.
 
I'd reeeeally rather go native. But if I have to, then quality.
 
I miss an option:
*) whatever the game offers

Some games do not have upscalers. Some games only have upscalers.

I voted native. A few games only have upscalers, and even fewer have upscalers for AMD graphics cards.
 
Quality 99% of the time.
Balanced when the 4080 cannot perform well in heavy RT/PT games.
 
Must be a joke to use upscaling just to compensate for the FPS tanking caused by RT.

In other words: native, without RT.
 
Didn't use it at all until I got my 4K 240 Hz monitor... now I stick with Balanced.
 
Native or Quality -> if I can't see/notice the difference, I go Quality
 
DLAA, or FSR at 100% scaling - but only if the native AA/TAA is garbage, or if it's a game where frame-gen is appropriate (mandating DLAA or FSR).

Temporal blur that's done poorly is, IMO, far worse than no AA at all - and I know that no AA at all looks pretty damn horrible. There are plenty of long-form videos on YouTube from popular and obscure sources, whitepapers, reddit threads, yada yada - temporal AA is ruining games unless it's tuned very carefully to minimise inter-frame blur, and in an overwhelming majority of titles it's tuned poorly or not at all.
 
I'm afraid I'm going against the flow, but I actually vary what I use for both game and system-specific reasons.

I'll provide a few examples to get the idea of what I usually think.

In something like Call of Duty: Black Ops 6, I use DLAA for the campaign and DLSS Quality outside of it. There is some image degradation with DLSS, so you can think of it as "prioritize image quality" vs "prioritize performance while keeping it as close to native as possible". Given how relatively difficult the game is to run, even considering that I've dropped settings for non-campaign content (to reach a very high frame rate), every bit helps.

I also have quite a bunch of games that are graphically demanding when ray tracing is enabled (e.g. Cyberpunk 2077, Witcher 3). The 8 GB of VRAM is also fairly small all things considered, and I'm dealing with the increased pixel count of ultrawide 2560x1080 instead of the standard 1920x1080. I'd do anything to get the thing to be at least responsive and smooth. I've noticed that usually, once I've dialed in my desired settings (which, surprisingly, aren't too far off what NVIDIA App's auto-optimize function would suggest), it's entirely possible to reach an 80 FPS target or more with a bit of performance to spare if I enable both DLSS Quality and frame generation. In my experience, it's actually harder to reach 60 FPS without FG than it is to reach 80 with it. Funnily enough, even the "weaker" cards end up able to turn on a significant amount of RT effects this way.

There's one more game, Wuthering Waves, where I end up with DLSS Quality enabled at all times even though the hardware is otherwise capable of running at native with no performance implications. The normal TAA has edge-flicker issues, so it kind of ends up echoing what others have already said. DLAA isn't supported (yet). The game will receive an RT update in the near future, so there's also a bit more headroom to spend later on. I did watch the demonstration video NVIDIA posted and I think the general improvements are likely going to be worth it (although they described it as RT reflections, it looked like it also affected general lighting).

If a game isn't demanding enough to actually need DLSS Quality and DLAA is available, I'd use that. There's something else interesting, though. If you can't tell the difference, there's potentially a power-saving angle when the game is also being played with a frame rate limit, which I usually set in single-player games. You also shave off VRAM usage in games with resolution-dependent buffers; I've noticed that this helps somewhat in the newest Indy game, which has a high baseline VRAM usage, and extremely so in Stalker 2, which had a nasty memory leak when I last played it while checking out new stuff on Game Pass.
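That VRAM angle can be roughed out: buffers that scale with internal resolution shrink with the square of the per-axis render scale, so DLSS Quality (~2/3 per axis) cuts them to about 44% of their native size. A back-of-the-envelope sketch (the 60 bytes-per-pixel figure is a made-up round number, not from any real engine):

```python
def buffers_mib(width, height, bytes_per_pixel):
    """Total size of resolution-dependent render targets, in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

native  = buffers_mib(3840, 2160, 60)   # rendering at native 4K
quality = buffers_mib(2560, 1440, 60)   # DLSS Quality internal resolution
print(f"native 4K: {native:.0f} MiB, DLSS Quality: {quality:.0f} MiB "
      f"({quality / native:.0%} of native)")
```

In practice only part of a frame's allocations scale with internal resolution (the upscaler output, UI, and swapchain stay at display resolution), so the real-world saving is smaller than the ratio suggests, but it's still meaningful on an 8 GB card.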

I think that's about it for the games I've played recently.

For the "system-specific" part, I also sometimes play on a laptop with just a 2050 (mostly because it barely cost more than a very similar laptop with no dGPU and it's at the price tier where the CPUs won't have high-end iGPUs). Here, even after discounting games that will never fit within a 4GB buffer, DLSS Quality, or even more aggressive upscaling, becomes critical to get a decently smooth experience for the games that I'd play on it.

So in the end... I don't really have a set answer, to be honest, but if I had to pick one, it'd be DLSS Quality, plus FG where it's available and won't make me crash from running out of VRAM.
 
This is a massive "it depends". I try to target a certain performance level and see how far I can go before visuals start to take a massive nosedive.
I game at 4K.
Starfield is at 50% render scale; visually the difference isn't that large, because Starfield looks like shite and performs even worse.
But that's more of an outlier; in most other games I don't really want to go below a 1440p render resolution, because that's where I start to be able to tell visually.
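The 1440p floor mentioned above is easy to turn into a rule of thumb: at a 4K output, the internal height is just the render-scale fraction times 2160, so anything below roughly 67% scale drops under 1440p. A quick sketch (the scale values are arbitrary examples, not any game's presets):

```python
def internal_height(output_height, render_scale):
    """Internal render height for a per-axis render-scale fraction."""
    return round(output_height * render_scale)

for scale in (1.00, 0.67, 0.50):
    h = internal_height(2160, scale)
    verdict = "ok" if h >= 1440 else "below 1440p"
    print(f"{scale:.0%} of 4K -> {h}p ({verdict})")
```

By this yardstick, 50% render scale at 4K is a 1080p internal image, which matches why it's only tolerable in an outlier case.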
 
Native. Unless the game is unplayable, which is rare. I have run into a couple of games by now that don't have a native option; then I use DLSS Quality. Same if performance is not satisfactory at native.
 
DLAA, or FSR at 100% scaling - but only if the native AA/TAA is garbage, or if it's a game where frame-gen is appropriate (mandating DLAA or FSR).

Temporal blur that's done poorly is, IMO, far worse than no AA at all - and I know that no AA at all looks pretty damn horrible. There are plenty of long-form videos on YouTube from popular and obscure sources, whitepapers, reddit threads, yada yada - temporal AA is ruining games unless it's tuned very carefully to minimise inter-frame blur, and in an overwhelming majority of titles it's tuned poorly or not at all.
I completely agree with you. I'm using FSR Native in War Thunder and it works better than the other AA options, with better FPS. SSAA is better, but FPS take a serious hit.
I've seen videos about temporal anti-aliasing and it's baffling to me that they let those poor implementations go into the actual product.
 
Native, but if I need something and the game offers dynamic resolution, I might use that instead of upscaling. In BO6 at 4K, dynamic resolution is clearly better than the upscaling I tried.
 
There's only one right option: Native, no RT.
Miss me with fake-scaling, fake-frames, whatever bullsh`t they create to deceive res/fps.
 
I voted Performance, but I only do that on my x86 handhelds. On my main rig I do not use FSR. It's native or nothing for me.
 
In DLSS Quality mode you cannot tell the difference from native. FSR is a post-process upscaler, so it is a bit blurrier and has more artifacts. DLSS Frame Generation is only good if you're above 120 FPS; otherwise gameplay feels laggy.
 
Native when possible, Quality when necessary.

I game on low-end GPUs a bunch, and Quality DLSS/FSR is a savior for that hardware; however, I usually choose to play games that don't need it. Lotsa good games from 2020 and earlier I still gotta play.
 
In DLSS Quality mode you cannot tell the difference from native. FSR is a post-process upscaler, so it is a bit blurrier and has more artifacts. DLSS Frame Generation is only good if you're above 120 FPS; otherwise gameplay feels laggy.
FSR 2+ works like DLSS and can run before or after post-processing.


As for me, it really depends on the game. My monitor is a 4K 144 Hz OLED TV, and I can see upscaling artifacts fairly easily. Contrary to some people here, I tend not to use too-aggressive DLSS settings when ray tracing is on, as I find the image way too noisy for my taste. It's even more noticeable if you can't get the FPS high enough.

I try to use native/DLAA; if I have to upscale, I use the least aggressive setting. There is an exception: in summer I find that my computer generates too much heat, so I don't hesitate to use DLSS Performance + a frame cap to lower my system's power consumption. And that works.
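The frame-cap trick saves power because the GPU sits idle for the unused part of each frame's time budget. A minimal sketch of what a software frame limiter does with that budget (toy code, not tied to any real game or driver feature):

```python
import time

def run_capped(render_frame, fps_cap, frames):
    """Render `frames` frames, sleeping out whatever is left of each frame's
    time budget. During the sleep the renderer (stand-in for the GPU) is idle,
    which is where the power saving comes from. Returns total wall time."""
    budget = 1.0 / fps_cap
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        render_frame()                       # stand-in for rendering one frame
        spare = budget - (time.perf_counter() - t0)
        if spare > 0:
            time.sleep(spare)                # idle instead of racing ahead
    return time.perf_counter() - start

# Capped at 100 FPS, 10 trivial frames take at least ~0.1 s of wall time.
elapsed = run_capped(lambda: None, fps_cap=100, frames=10)
```

Pairing this with DLSS Performance stacks the effect: each frame is cheaper to render, and the cap stops the GPU from spending the savings on extra frames.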

But next summer I will install a mini-split for my basement, so I shouldn't have that issue anymore.
 
I voted Quality, but I should have selected "it depends": Cyberpunk 2077, no upscaling; "Stutter of Chernobyl", DLAA; Mirage, DLSS; and I always turn off TAA. If the hardware is enough for 30-40 FPS at 4K in CapFrameX: no upscaling.
 
I voted native and always will; if only temporal AA is available, I also turn on image sharpening.
In my opinion the other four options are a bit redundant; they're all the same scam, just with different names.
Say what you want about upscaling, or about one upscaler being superior to another; I've done my homework, and the "original" image still looks better. Funny enough, DLSS works best with IPS panels, and FSR with VA panels.
 