
NVIDIA RTX owners only - your opinion on DLSS Image quality

Only works in fullscreen exclusive mode, not windowed or borderless (might be ME's issue)
It's set to fullscreen. Though I've also got Vsync turned on, and I get a steady 120-240 fps. Something weird is going on with the ME:LE engine.

It's not weird, we only get the 2.25x - you're meant to get the fidelity of the 4x, with a lower performance loss
Ah, so nvidia is saying that 2.25x comes at a lesser performance loss than 4x! Who would've thought? :rolleyes: I guess we've got to test 2.25x DL vs 2.25x conventional. Otherwise, performance metrics mean nothing.
 
Ah, so nvidia is saying that 2.25x comes at a lesser performance loss than 4x! Who would've thought? :rolleyes: I guess we've got to test 2.25x DL vs 2.25x conventional. Otherwise, performance metrics mean nothing.
Yeah but they're saying the 2.25x is the same image quality as the original 4x, we should get more options later - it literally just launched
 
Yeah but they're saying the 2.25x is the same image quality as the original 4x, we should get more options later - it literally just launched
2.25x standard looks pretty crisp on my monitor already, so I have no way to test that claim. The only thing I can (and will) test is the performance of 2.25x DL vs 2.25x standard.
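For anyone wondering what those factors actually mean in pixels: the DSR/DLDSR factor multiplies the total pixel count, so each axis only scales by the square root of it. A quick sketch (the 1080p base is just an example):

```python
import math

# DSR/DLDSR factors multiply the *total* pixel count,
# so each axis scales by sqrt(factor). 1080p base is an example.
base = (1920, 1080)
for factor in (1.78, 2.25, 4.0):
    scale = math.sqrt(factor)
    w, h = round(base[0] * scale), round(base[1] * scale)
    print(f"{factor}x of {base[0]}x{base[1]} -> {w}x{h}")

# 1.78x -> ~2560x1440, 2.25x -> 2880x1620, 4x -> 3840x2160 (4K)
```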
 
This new feature has made me move my PC and spare desk to the living room, so that I can now use the 55" UHDTV to play games at 6K res
 
This new feature has made me move my PC and spare desk to the living room, so that I can now use the 55" UHDTV to play games at 6K res
No chance I'm gonna try that with a 2070. :roll:

Right, I've got some performance results in World of Tanks enCore RT:

1080p + 1.76x DSR (1440p), Ultra graphics, Ultra RT: 12954 points,
1080p + 1.76x DLDSR (1440p), same everything: 12665 points.

I'm not sure if points convert into FPS, but if they do, that's a 2.2% performance loss in DL mode. Though I have to add that the picture looks a tiny bit sharper, so it's a good trade-off, I guess. :ohwell:
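A quick sanity check on that figure, assuming points scale linearly with frame rate:

```python
# Relative performance drop, assuming benchmark points
# scale linearly with frame rate (an assumption, not a given).
dsr_points = 12954    # 1.76x DSR (1440p)
dldsr_points = 12665  # 1.76x DLDSR (1440p)

loss = (dsr_points - dldsr_points) / dsr_points
print(f"DLDSR performance cost: {loss:.1%}")  # ~2.2%
```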

Still, I don't think I'm gonna turn on any DSR, as the resolution switch messes up the size and position of my apps on my second screen. :(
 
5K DLDSR (1.78x) looks even sharper than 8K DSR (4x), and 8K DSR is slightly sharper than native 4K in PUBG

Looks to me like DLDSR is a more advanced sharpening filter than Nvidia's Sharpen/Sharpen+ and AMD's RIS. There are hardly any ringing artifacts with DLDSR, though there is some noticeable contrast modification similar to RIS.
 
5K DLDSR (1.78x) looks even sharper than 8K DSR (4x), and 8K DSR is slightly sharper than native 4K in PUBG

I was noticing that too: 1.78x looks better than 4x on my LG C1, but that dude's smoothness option is set way too high.
 
I noticed in 7D2D that if I had the smoothness setting above 33%, my white aiming reticle lines became half as thick and turned grey instead, making them hard AF to see
 
I'm about 10 hours into God of War and the visuals are just stunning (dad life means slim game time)

Experimented a lot with optimised this-and-that in-game settings, resolutions, etc., but the 3080 just powers through anything, so my current settings are basically 'give me all the IQ'.

Native res is 3440x1440; 2.25x DLDSR sets a resolution of 5120x2160, and using DLSS Quality mode gives a render res matching my native 3440x1440... DLAA on crack?
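Rough sketch of that round trip, assuming DLDSR scales each axis by the square root of its factor and DLSS Quality renders at about 2/3 per axis (the 5120 width is just what the driver exposes):

```python
import math

# DLDSR + DLSS Quality round trip, assuming DLDSR scales each axis
# by sqrt(factor) and DLSS Quality renders at ~2/3 per axis.
native = (3440, 1440)
per_axis = math.sqrt(2.25)                           # 1.5x per axis

output = tuple(round(n * per_axis) for n in native)  # (5160, 2160); driver exposes 5120x2160
render = tuple(round(n * 2 / 3) for n in output)     # back to roughly native

print(f"DLDSR output: {output[0]}x{output[1]}")
print(f"DLSS Quality render: {render[0]}x{render[1]} vs native {native[0]}x{native[1]}")
```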

Optimised settings are smart, but using the Ultra preset and the above settings I'm still getting 67-80 fps in typical gameplay, sometimes over 80 and not once below 60, and it feels exceptionally buttery and clear.

Yeah, pretty glad I never played this with the original checkerboarded settings on PS.

Helps that the game is fucking awesome of course.
 
From a normal sitting distance on a 4K TV, the checkerboard implementation is hard to notice... I've played it on both the PS4 Pro and PS5, and now on PC; the game wowed me more when it came out, running on the Pro's RX 570-ish level GPU, than it does now on my 3080 Ti. It seems to run 4K native, max settings, at around 60-70 fps, or 90-100 fps with DLSS Ultra Quality. I find the DLSS implementation to be pretty good. HDR in this game on my C1 OLED is pretty amazing though, and visually it makes a bigger difference than the lack of checkerboarding does. Although the Pro and the PS5 already did that well.

Even more so than AI upscaling, PC gamers deserve better monitors. Hopefully 2022 fixes that with both the QD-OLED and mini-LED displays coming out; hopefully they've fixed some of the issues from previously released ones, and hopefully they won't cost a kidney.
 
HDR in this game on my C1 OLED is pretty amazing
I've had my 3080 rig plugged into a mate's CX 65, and I must agree: not only does it do the 120 Hz/G-Sync thing, but the HDR is breathtaking.
Even more so than AI upscaling, PC gamers deserve better monitors
A C2 42 is on my shopping list this year, I can't wait.
 
A C2 42 is on my shopping list this year, I can't wait.

A very good plan... You definitely will not regret it, and from a normal sitting distance DLSS works even better than on a monitor, regardless of the screen size.
 
Just a heads up on the game: it just has really good SDR.
Seeing as I've played it on both platforms, they went back in and touched up the SDR to look better on the PC.
 
Picked up Deathloop recently, and I'm really enjoying DLSS in this title, especially as it's implemented with a dynamic mode with an FPS target; set and forget, basically. Interesting game concept too, let's see how long it can keep my interest.

Still waiting on this bloody LG C2 42" that I paid for what feels like 2 months ago... I can't wait to see a bit more of DLSS's capability explored where it has the best chance to shine.
 
Picked up Deathloop recently, and I'm really enjoying DLSS in this title, especially as it's implemented with a dynamic mode with an FPS target; set and forget, basically. Interesting game concept too, let's see how long it can keep my interest.
DLSS on streaming-asset games won't work correctly; Flight Sim and one COD tried it, and it's utter garbage. Rant done.
Yes, Deathloop is nice.
 
I've been playing FS2020 for a week or so now on my new LG 42 C2 with optimised settings, getting around 60-70 fps when GPU limited, which is not bad at all. I was trawling the web last night and saw I could opt into the beta and get the build 10 preview, which has DLSS, and I'm glad I did: 80-90 fps now when GPU limited. It was ever so slightly softer with standard sharpening, but increasing that somewhat has made it very comparable to native; I'm impressed. Will be keen to test FSR when it comes too, which I believe has been announced.
 
Using an RTX 3050, I try DLSS 2.0 without really looking at the image quality; I'm just looking for better FPS. I should check it for image quality.
 
Using an RTX 3050, I try DLSS 2.0 without really looking at the image quality; I'm just looking for better FPS. I should check it for image quality.
It can depend a lot on the resolution of your monitor: the higher it is, the better the visual result. So at 4K you get results very comparable to native; sometimes aspects are actually better. On the whole though, 1080p would almost certainly appear softer, but like you say, it's more about the FPS increase while minimising quality loss, which it's also excellent at.
 
HUB recently did some testing of DLSS vs FSR, as many of you will have seen. The best FSR could manage over 26 games was a tie, never once besting DLSS outright.

Off the back of that, HUB opted to test DLSS vs native, and since FSR never won a comparison vs DLSS, it wasn't worth including in a three-way shootout; just DLSS vs native.

Here's the table of results below. Tim goes on to say that yes indeed, DLSS can give better-than-native results, broadly on-par results, and of course lesser-looking results. However, on a balance of IQ and performance gained, DLSS is clearly the desirable one to enable when it's better than native, or anywhere between much better and tied; in his opinion it's still often desirable over native even when it loses, provided you aren't CPU limited or otherwise gaining no performance from DLSS. Of course, if native had better TAA it would look better, hardly a shocker there, but devs virtually never go back and update TAA - probably the best version of that argument is to use a mod to get DLAA (or native DLAA support), or FSR with the internal res ratcheted up to 0.9x-1x to use it purely as AA.

Tim also rightfully adds how easy it is to drop a new DLL in, which would have changed at least one result (RDR2), if not more, to favour DLSS or tie. He adds that developers should be updating games to ship the latest/best DLL, and the same goes for FSR versions - especially when the game is still actively being patched and updated.
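For anyone who hasn't tried the DLL swap: it's literally just replacing nvngx_dlss.dll next to the game's executable. A minimal sketch (the paths here are made-up examples, and you'd grab the newer DLL from a trusted source):

```python
import shutil
from pathlib import Path

# Swap in a newer DLSS DLL, keeping a backup of the shipped one.
# Paths are hypothetical examples - point them at your own install.
game_dir = Path(r"C:\Games\RDR2")
new_dll = Path(r"C:\Downloads\nvngx_dlss.dll")
target = game_dir / "nvngx_dlss.dll"

backup = target.with_name(target.name + ".bak")
if not backup.exists():               # back up the original only once
    shutil.copy2(target, backup)
shutil.copy2(new_dll, target)
print(f"Swapped in {new_dll}; original saved as {backup.name}")
```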

Interesting results, and it's great to see what I've been saying for quite some time now be thoroughly corroborated, which is: 'DLSS can look as good or better than native'. Naturally, I game at 4K, where this tech is at its most useful and thoroughly improves a given card's capability to drive that res.

[Attached image: HUB's DLSS vs native image quality results table]
 
I was impressed with how much DLSS has improved at 1440p vs native.
 
RT is not so interesting for me; DLSS, however, is very impressive. I use it in many games, even though I'm at 1080p, just for the boost in FPS while maintaining similar image quality.
 
RT is not so interesting for me; DLSS, however, is very impressive. I use it in many games, even though I'm at 1080p, just for the boost in FPS while maintaining similar image quality.

It's definitely gotten to a point where it just doesn't make sense not to use it unless you are already hitting your display's refresh rate, and even then it's sometimes worth using over the game's TAA.
 
Interesting results, and it's great to see what I've been saying for quite some time now be thoroughly corroborated, which is: 'DLSS can look as good or better than native'.
I still think it's a subjective experience, though.
 
It's definitely gotten to a point where it just doesn't make sense not to use it unless you are already hitting your display's refresh rate, and even then it's sometimes worth using over the game's TAA.
Without a doubt at 4K (which I game at now), but I already considered it a no-brainer on my old 34" 3440x1440 144 Hz VA monitor.
I still think it's a subjective experience, though.
Might I ask how long it's been since you tried it, and at what res? IIRC you rock a 6700/6750 XT and game at 1080p? Obviously the tech is less robust at 1080p, and furthermore I'd hazard a guess it's been a long time since you used your RTX card? It's not the simplest task to switch between AMD and Nvidia and get all the settings spot on again. I'd highly recommend trying your RTX card with the latest DLL and telling us how you feel it does at 1080p; otherwise I think your comments aren't really within the spirit of the thread and the request in post #1. I do respect your comments and opinion, I just want to ensure they're presented in line with the rather explicit theme of the thread.
 
Without a doubt at 4K (which I game at now), but I already considered it a no-brainer on my old 34" 3440x1440 144 Hz VA monitor.

For me it's gone from 'I guess I'll use it, the hit to image quality is small and the extra frames are nice' to thinking 'wow, this looks better than the in-game TAA solution in some scenarios'.

Setting up Witcher 3 next-gen for my wife at 1440p, I was pretty surprised how good it looked vs native, but figured that was a one-off.

And definitely, people who aren't swapping in the latest DLL files are doing it wrong; it takes like 5 seconds.
 