
NVIDIA RTX owners only - your opinion on DLSS Image quality

I also watched that HUB comparison video yesterday, and for the most part I agree with its conclusion.
I'm only gaming on a 29" 2560x1080 ultrawide monitor. 'pixel count sits between 1080p and 1440p but closer to 1080p'
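A quick back-of-the-envelope check of that pixel-count claim (nothing here is game- or vendor-specific):

```python
# Pixel counts for 16:9 1080p, the 2560x1080 ultrawide, and 16:9 1440p.
res_1080p = 1920 * 1080   # 2,073,600 px
res_uw = 2560 * 1080      # 2,764,800 px
res_1440p = 2560 * 1440   # 3,686,400 px

print(res_uw - res_1080p)   # 691200 px above 1080p
print(res_1440p - res_uw)   # 921600 px below 1440p -> closer to 1080p
```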

And even on this monitor I find DLSS Quality to be perfectly usable and arguably better than native TAA in some games.
If the default DLSS DLL feels off, I manually switch to the 2.5.1 version, which has given me decent results so far. 'I'm yet to try newer versions tho'

HUB also pointed out probably the most important part for me: image stability.
I'm not sure what's going on in some games, but I can notice so much texture (and other) flickering with native TAA that it just bothers me once I spot it. 'especially foliage flicker'

DLSS seems to fix that for the most part, so for that alone I prefer using DLSS on Quality; everything else, like the better performance and the better small details in the background, is just a bonus. 'I noticed this in Cyberpunk when comparing native vs DLSS'

I've noticed that some ppl complain about bad ghosting with DLSS, but to be honest, I've yet to notice it in the games I've played with DLSS. Or maybe those games use a worse DLSS version, idk.
 
HUB recently did some testing of DLSS vs FSR, which many of you will have seen. The best FSR could manage over 26 games was a tie; it never once bested DLSS outright.

Off the back of that, HUB opted to test DLSS vs native, and since FSR never won a comparison against DLSS, it wasn't worth including in a three-way shootout: just DLSS vs native.

Here's the table of results below. Tim goes on to say that yes, DLSS can indeed give better-than-native results, broadly on-par results, and of course worse-looking results. However, on a balance of IQ and performance gained, DLSS is clearly desirable to enable when it's anywhere between much better than native and tied, and in his opinion it's still often desirable even when it loses, provided you aren't CPU limited or otherwise gaining no performance from it. Of course, if native had better TAA it would look better (hardly a shocker), but devs virtually never go back and update TAA. Probably the best version of that argument is to use a mod to get DLAA (or native DLAA support), or to ratchet FSR's internal res up to 0.9x-1x and just use it as AA. Tim also rightly adds how easy it is to drop in a new DLL, which would have changed at least one result (RDR2), if not more, to favour DLSS or tie. He adds that developers should be updating games to the latest/best DLL, and the same for FSR versions, especially while a game is still being actively patched and updated.

Interesting results, and it's great to see what I've been saying for quite some time now thoroughly corroborated, which is: 'DLSS can look as good as or better than native'. Naturally, I game at 4K, where this tech is at its most useful and thoroughly improves a given card's capability to drive that res.

[Attachment 291888: HUB's DLSS vs native results table]
Up to 4K, where native reigns supreme again, he gave an evaluated conclusion so others don't have to.

That's from DLSS vs native: he concluded DLSS had little to give IQ-wise at 4K, but tbf it wasn't often worse than native. All quite subjective anyway.
 
Up to 4K, where native reigns supreme again, he gave an evaluated conclusion so others don't have to.

It's still impressive that in Quality mode it either tied or beat native in 14 of the 24 games tested at 4K, and in 15 at 1440p.

When DLSS 2.0 first launched, I can't remember a single game at 1440p where it would be considered tied with native, let alone better.

Let's not even talk about how bad DLSS 1 was....


I do find it funny that they've been called AMD-biased for a while, and everyone jumped their shit for doing an AMD/Nvidia comparison using FSR2. Gotta love the fanboys from both sides, though.

I'm glad they are dropping DLSS/FSR from their main reviews and only comparing it in dedicated videos like this.
 
I still think it's a subjective experience, though.

Agree, and not only that, it depends on what you're watching: still frames, moving frames, what frame rate, what areas of the game. I don't think these conclusions are scientific in the least.
 
Might I ask how long it's been since you tried it, and at what res? Iirc you rock a 6700/6750 XT and game at 1080p? Obviously the tech is less robust at 1080p, and furthermore I'd hazard a guess it's been a long time since you used your RTX card. It's not the simplest task to switch between AMD and Nvidia and get all the settings spot on again. I'd highly recommend trying your RTX card with the latest DLL and telling us how you feel it does at 1080p; otherwise I think your comments aren't really within the spirit of the thread and the request in post #1. I do respect your comments and opinion, I just want to ensure they're presented in line with the rather explicit theme of the thread.
I never said it's a bad experience - only a subjective one. ;) The same way motion blur is something that people either like, hate, or don't mind. You can't say turning everything on will make any game look better for everyone. Similarly, you can't say upscaling tech is objectively better and everyone should turn it on. If that's the case, then I say Nirvana is the greatest rock band of all times, and every living soul on the planet should listen to their Nevermind album all day and night.

As for my experiences... I tried DLSS 2.0 with Cyberpunk 2077 on my 2070. The "Quality" setting was an acceptable compromise between frame rate and looks. The image was slightly blurry, but the slightly higher framerate was a good compensation. I ended up finishing the game with that setting. I couldn't say the same about the "Balanced" and "Performance" settings which I didn't like at all.

Then, I tried FSR in God of War. The image quality didn't seem as bad at lower settings as it did in Cyberpunk, but I still wouldn't go lower than "Quality". Nothing beats native resolution, though, and since my 6750 XT did a stable 60 FPS anyway, I finished the game with FSR off.

My conclusion is that I'll enable DLSS or FSR at "Quality" setting if I have to, but then I should start thinking about a graphics card upgrade. Native will always look best. But like I said, it's a subjective experience. If you like it, by all means, enable it. I'll just never give a damn about what youtubers and other reviewers think. There's no science in what kind of image you prefer.

Agree, and not only that, it depends on what you're watching: still frames, moving frames, what frame rate, what areas of the game. I don't think these conclusions are scientific in the least.
That too. No one should form any opinion before trying it in a game.
 
It's still impressive that in Quality mode it either tied or beat native in 14 of the 24 games tested at 4K, and in 15 at 1440p.
It is.

I like it right up to that moment I do a quick 180° spin.

Some games, that doesn't matter, most it does.

So I wouldn't say I won't use DLSS 2, just not unless it's really necessary, and not in a fast-paced FPS for example.

I like how the OP starts off with some stuff about 'just DLSS 2, no bitching', yet just posted that joyous rebuttal of FSR.

I didn't mention it because of the OP content, but funny nonetheless.

HUB's analysis has been pretty spot on to be fair, but what was said by Tim has been given new meaning by some; off topic here though.
 
No one should form any opinion before trying it in a game.

This I 100% agree with. Sadly, nowadays ppl form opinions so damn fast, sometimes without even watching/reading the review, let alone having personal experience first.
Reading the comments under the HUB video makes this painfully obvious.

I did notice this topic earlier, but since I had no RTX card I didn't want to take part with no personal experience. Now that I do, I can share how I personally feel about it.
Sure, some ppl might argue that I'm wrong, but I don't think there's a right or wrong here, more like what my own eyes are telling me, and I will for sure play my games that way even if it's not 'generally' accepted.

For me DLSS is good and worthy tech, and it was actually a selling point when I was deciding between a new AMD or Nvidia card for the same exact price on the second-hand market.
I have nothing against owning yet another AMD card since I had no issues with my previous ones, but not having DLSS would make me think twice now. 'unless they up their game and come up with a better FSR 3.0 or something'
 
This I 100% agree with. Sadly, nowadays ppl form opinions so damn fast, sometimes without even watching/reading the review, let alone having personal experience first.
And yet, as someone who has tried both, your opinion should be tested by trying FSR2 Quality yourself. You won't find much wrong there either, but for the same things; fractionally better though DLSS 2 may be, you're getting excited without really knowing. Funny, because you start by saying:

"sadly nowadays ppl form opinions so damn fast sometimes not even watching /reading the review let alone have personal experience first.
Reading the comments under the HUB video makes this painfully obvious."

Then end by doing just that.

I'm out; my opinion's out there, that'll also do.
 
Sure some ppl might argue me that I'm wrong but I don't think there is right or wrong but more like what my own eyes are telling me and I will for sure play my games that way even if its not 'generally' accepted.
Well said! :)

My own eyes are telling me that DLSS/FSR "Quality" aren't bad on my monitor, but native resolution is always best. But each to their own. :)
 
And yet, as someone who has tried both, your opinion should be tested by trying FSR2 Quality yourself.

I never said that I didn't try FSR 2 Quality; I was using it on my GTX 1070 before I got an RTX card. :)
It was usable, but there I could definitely notice the difference, while with DLSS Quality I can't for the most part, and in most games I actually prefer it over TAA.

I mean, FSR 2 is not bad if there's nothing else to use, but I still think it needs some more work until I can consider them on an equal footing for my preferences. 'even nowadays I try all of the available upscaling options in a game before switching to DLSS'
 
And yet, as someone who has tried both, your opinion should be tested by trying FSR2 Quality yourself.

I'm pretty harsh on FSR2, and honestly I've tried it over and over, thinking I was crazy for seeing major issues that somehow others didn't. I even started to think that maybe it was just down to gaming on a large display making the pixels a little easier to pick out.

I think Tim hit the nail on the head though: it really just comes down to flickering of fine details bothering me more than the inherent downsides of DLSS. The thing is, the in-game TAA also exhibits this, and in the past I would just swap between FSR and DLSS without closely comparing native, being slightly harder on FSR than I should have been because of that.

Honestly, my real hope with those two videos is that AMD sees them and wants to do better. DLSS is a pretty big selling point and it will be going forward; I would love FSR to get to a point where that isn't the case.

At the same time, my wife is mostly oblivious to all this stuff. I've swapped her back and forth between native, DLSS, and FSR, and maybe she notices, but she's never told me anything; she just happily games on whatever settings the game is on when she launches it. Maybe that's scientifically the best way: just enjoy games regardless.
 
I'm pretty harsh on FSR2, and honestly I've tried it over and over, thinking I was crazy for seeing major issues that somehow others didn't.

Yeah, I'm on a similar if not the same view. I did play around with FSR and still try it out regardless of my RTX card.
Heck, I even tried out Intel's upscaling in some games, but I've heard/read that it works better on actual Intel cards anyway, so I can't draw a proper opinion for myself.

And yeah, I sometimes miss the days when I was completely oblivious to any of the tech mumbo jumbo and just played my games whatever way and still had all the fun.
I mean, I still have fun, but I sometimes spend too much time messing around with my settings and finding things to bother myself with instead of just playing. :laugh: 'currently playing Serious Sam 4 on Ultra but with 12 settings customized..'
 
Semi tangent, off topic, but most people that talk about or use RT think 'oh, it's shiny surfaces....'

Upscaling tech for games has been mass-used since around the 360 era and kept going. Back then, the really smart people guessed right that current hardware just couldn't do native anymore at said resolutions, and games even then were using sub-2K/4K assets, which massively saves on storage space/bandwidth/VRAM.
 
I think DLSS is a good thing to boost performance when using raytracing, because without DLSS, RT performance is just too bad, even on RTX 3000. Some games look really good with RT, so the decline in overall image quality from adding DLSS is acceptable.

On the other hand, I still think DLSS looks shitty, and since RTX 3000 is very fast without RT, there is no need to smear the image by turning on DLSS in these (non-RT) scenarios.

In the end, DLSS is just a gap filler until RT finally gets the performance it needs. It's nice to have in some scenarios, but whenever the framerate without it is high enough to satisfy the viewer's expectations, I would turn it off. Hopefully one day GPUs will be powerful enough to drop rasterisation completely and generate pictures by raytracing alone.
Unfortunately, any real RT is done on a render farm; we won't see it on a single PC for a decade. I'm waiting for cloud gaming to kick in; my low-ping AT&T fiber connection can handle it now.
 
This I 100% agree with, sadly nowadays ppl form opinions so damn fast sometimes not even watching /reading the review let alone have personal experience first.
It's unfortunate, and online is absolutely the worst for it: hearing people crap-talk something with zero real experience of it.
As for my experiences...
Nothing beats native resolution, though
Native will always look best.
I'll just never give a damn about what youtubers and other reviewers think. There's no science in what kind of image you prefer.
My own eyes are telling me that DLSS/FSR "Quality" aren't bad on my monitor, but native resolution is always best. But each to their own. :)
This testing, plus many others like what I quoted here, has clearly shown that native resolution isn't always best; even at 1440p, as shown here, upscaling can produce anywhere from a tie to significantly better results.

These tests are performed as objectively as possible, by people in the know, and there's quite a lot of evidence now to suggest native isn't always best, there absolutely are scientific ways to assess which techniques provide less or more shimmer, ghosting, disocclusion artefacts etc. You can certainly stand by your argument that it's all subjective and you are able to claim to prefer whatever suits your eyes and tastes best, that's everyone's personal truth.

As for your experiences, sounds like you tested Cyberpunk at 1080p with a 2070 18+ months ago, I suggest you revisit it or stop posting in this thread, as per my request in the opening post.
If that's the case, then I say Nirvana is the greatest rock band of all times, and every living soul on the planet should listen to their Nevermind album all day and night.
Please, don't use strawman arguments.

If this is what you want to discuss, native is always best, personal preference etc and aren't adding anything worthwhile or new to the discussion I started with quite specific rules, please, do it in another thread. Last time I ask.
 
This testing, plus many others, have clearly shown that native resolution isn't always best; even at 1440p, upscaling can produce anywhere from a tie to significantly better results.

I think @AusWolf games at 1080p. I wish Tim did a comparison at that resolution; I think in that scenario native probably would have won the majority of the time..... I don't personally have a 1080p monitor to see for myself.
 
DLSS is definitely better from what I've tried, I can't get too upset either way because with my RTX card I get the option of both.

Win win!
 
I think @AusWolf games at 1080p. I wish Tim did a comparison at that resolution; I think in that scenario native probably would have won the majority of the time..... I don't personally have a 1080p monitor to see for myself.
So do I. With the 2.5.1 DLL or newer, it's the lowest modes that got the biggest improvements. Personally I've found it broadly comparable to native on my RTX A2000 / 1080p 165 Hz IPS box, dropping the 2.5.1 DLL into Metro EE, Doom Eternal, High On Life and Spiderman MM, but I haven't done pixel-peeping comparisons. As per usual, the real gravy is in the extra performance, which is immediately felt, as opposed to quality comparisons, which can be far more nuanced than how 45 fps vs 70 fps feels and looks. What I should go back and test is using DLSSTweaks to render at maybe 75-90% render scale instead of the 66.6% Quality uses, so something like 1600x900 internal rather than 1280x720. I'd put money on being able to match or beat TAA in at least some titles while delivering the same or slightly better performance.
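For reference, the internal resolutions those scales work out to at 1080p (DLSS Quality's per-axis factor is 2/3; the 75-90% figures are the custom range I'd set with DLSSTweaks, so treat them as my assumption, not a preset):

```python
# Internal render resolution for a given output res and per-axis scale.
def internal_res(out_w: int, out_h: int, scale: float) -> tuple[int, int]:
    return round(out_w * scale), round(out_h * scale)

for label, scale in [("Quality (2/3)", 2 / 3), ("75%", 0.75), ("5/6", 5 / 6), ("90%", 0.9)]:
    w, h = internal_res(1920, 1080, scale)
    print(f"{label}: {w}x{h}")
# Quality (2/3): 1280x720, 75%: 1440x810, 5/6: 1600x900, 90%: 1728x972
```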
DLSS is definitely better from what I've tried, I can't get too upset either way because with my RTX card I get the option of both.

Win win!
I have definitely had a few FSR 2.x situations at 4K Quality mode come very close, but the difference in their approaches tends to fall short on the fine/distant detail DLSS nails: shimmering/stair-stepping on straight edges and foliage, and disocclusion artefacts. DLSS simply has the best AA in the business IMO, at every render scale including native/DLAA. That's delving into personal taste though; FSR tends to end up more sharpened/slightly less soft unless you apply additional sharpening to DLSS, and that might appeal to some. Definitely agreed on how good it is to have both (or even all three), but I'd also agree with Tim's comment that DLSS is enough better that it should be a factor in a buying decision. The scale to which that factor matters, and how much it's worth to the individual given each unique purchase decision and the options available at given prices, will of course be highly personal.
 
My hope now is they put the same amount of effort into improving DLSS3/Framegen because when it works well it's even more impressive than vanilla DLSS. The problem currently is it's only good in 2 out of the 7 games I've tried it in.
 
This testing, plus many others, have clearly shown that native resolution isn't always best; even at 1440p, upscaling can produce anywhere from a tie to significantly better results.
You asked what experiences I had, and I told you exactly that. Yet again, you use other people's tests and opinions to make me reconsider mine. Why? Who am I playing games for? Myself, or other people? If I think native resolution looks better on my monitor, and my PC still produces the steady 60 FPS that I want in basically all my games, and within half of my GPU's power limit, then why should I force myself to use a setting that only other people think is better?

This thread is about experiences with DLSS 2.0. I don't think I was off-topic and/or should be told to voice my opinion elsewhere just because it's not the same as yours.

I don't have Cyberpunk installed at the moment, but I'll probably revisit DLSS/FSR in The Witcher 3 Enhanced when I get there. I'm having too much fun in Kingdom Come: Deliverance right now. :ohwell:
 
I had, and I told you exactly that.
Yet again, you use other people's tests and opinions to make me reconsider mine. Why? Who am I playing games for? Myself, or other people? If I think native resolution looks better on my monitor, and my PC still produces the steady 60 FPS that I want in basically all my games, and within half of my GPU's power limit, then why should I force myself to use a setting that only other people think is better?
This thread is about experiences with DLSS 2.0. I don't think I was off-topic and/or should be told to voice my opinion elsewhere just because it's not the same as yours.
You already shared that opinion at the time, 18+ months ago in this very thread, and now you're here again doubling down on an opinion formed in one title all that time ago, being argumentative, attempting a strawman, and saying native is always best. It's not adding anything to this discussion.
This is just about image quality and performance from the people who have actually extensively played with it enabled.
I've clearly outlined the purpose of the discussion in this thread I created. So please, try it again with new DLLs, try other games, and you're more than welcome to share, as you'd at least be adding something new to the discussion, with up-to-date, relevant experience using an RTX card, which by all accounts it doesn't seem that you have.

Further replies being argumentative will be reported. Stay on topic or don't post. Easy.
 
DLSS is a great tool to achieve better gaming experience, no matter what GPU you have.

If some people are against using a great tool because of some personal bias, that's their loss :)

From my own pixel peeping, Tim's opinions are pretty spot on, though manually updating the DLSS DLL would make the conclusion more in favor of DLSS.

Also, people are unaware their IPS/VA monitors have a specific overdrive range; if they use VRR and the FPS is outside that range, there is ghosting/inverse ghosting, and this ghosting does not show up in recorded gameplay. That means DLSS + an FPS cap will ensure you get the best visual output from your monitor.
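To make the FPS-cap point concrete, here's a toy check with made-up numbers (the 80-165 Hz band is purely hypothetical; a real panel's well-tuned overdrive range comes from reviews of that specific monitor):

```python
# With VRR, refresh rate tracks frame rate, so overdrive behaves well
# only while fps stays inside the band the monitor was tuned for.
TUNED_RANGE_HZ = (80, 165)  # hypothetical panel

def overdrive_ok(fps: float, band: tuple[float, float] = TUNED_RANGE_HZ) -> bool:
    lo, hi = band
    return lo <= fps <= hi

print(overdrive_ok(55))   # False: native dipping to 55 fps leaves the band -> ghosting
print(overdrive_ok(158))  # True: DLSS + a cap just under max refresh stays inside
```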
 
From my own pixel peeping, Tim's opinions are pretty spot on, though manually updating the DLSS DLL would make the conclusion more in favor of DLSS.
Absolutely. Tim states it would have swung RDR2 to favour DLSS, and from personal experience I'd say the newer DLLs help DLSS along in FH5, FS2020, and both Spiderman games at the least.
 
You already shared that opinion at the time, 18+ months ago in this very thread, and now you're here again doubling down on that opinion formed in 1 title, all that time ago, being argumentative, attempting to use a strawman and saying native is always best - it's not adding anything to this discussion.
Yes, I did share my opinion, and I had no intention to do it again (all I said was that upscaling tech is a subjective experience), but you asked for it. And now it's my bad that I answered. It seems like anything I do here is wrong for you. If I answer, I'm being argumentative and will be reported. If I don't answer, then my personal opinion is irrelevant and wrong. Just make up your mind, will ya?

I've clearly outlined the purpose of the discussion in this thread I created, so please, try it again with new DLL's try other games and you're more than welcome to share, as you'd at least stand to be adding something new to the discussion, with up to date relevant experience using an RTX card which by all accounts it doesn't seem that you have.
No, you did not, because the purpose is to praise the quality of DLSS over native resolution, apparently. Even just suggesting that it may not always be better for everyone in all scenarios, just like not everyone likes motion blur, for example, is wrong here.

I don't know what you mean by "play extensively". Is doing a whole playthrough in Cyberpunk with DLSS "Quality" enabled not extensive enough?

Further replies being argumentative will be reported. Stay on topic or don't post. Easy.
Go ahead, report me for answering your question and defending my answer that you asked for - which is a personal opinion, by the way, and as such, no one forces you to agree with. I've done nothing against forum rules as far as I'm concerned, nor was I off-topic by talking about DLSS 2.0.
 
I'd kindly ask that people who aren't currently using up-to-date drivers on RTX cards capable of DLSS refrain from posting. Tech moves on, and we can all agree that the first implementations of DLSS weren't great, but it has improved, so arguing from older experiences is tantamount to trolling.

I'll also add that DLSS is game-dependent, which makes it harder to argue for as a universal thing. I ran the Dead Space remake at native because its DLSS was crap, frankly. I noticed it as soon as I started the game, and I've got old eyes: I chose some preset that enabled it by default, and at 1440p the clarity should have been better. Went into the menu, turned it off, and the game looked far better. Still got 70 fps, so I was happy. DLSS is not a magic bullet. It's a bullet, but not always hitting the target.
 