# NVIDIA RTX owners only - your opinion on DLSS 2.0 Image quality



## wolf (Apr 22, 2021)

*Much the same as **this** post, this is STRICTLY ADDRESSED TO NVIDIA RTX OWNERS AND THEIR PERSONAL EXPERIENCES WITH DLSS 2.0 - No, I'm not interested in your non-owner opinion from watching YouTube comparisons, nitpicking stills, regurgitating reviewer thoughts, how many games it features in, etc.*

In the spirit of the post I've mentioned and linked, I'll be reporting posts that do not follow the request of this thread. Don't like it? Start your own thread.

This is just about image quality and performance from the people who have actually extensively played with it enabled. I've seen my fair share of comments across the web from people who don't own an RTX card and have never actually played a DLSS 2.0 game and/or seen it with their own eyes; they perhaps dislike how it works, and/or formed their opinion after seeing some of the things I mentioned above. If that sounds like you, I don't want to hear from you.

Here are some super generalized statements you can feel free to base your response on, or just start from scratch and tell me everything you want.

- DLSS 2.0 generally improves the visual quality of the game
- I can't tell the difference between DLSS 2.0 on and off
- Some parts of the DLSS 2.0 image are as good or better, but some are as bad or worse
- DLSS 2.0 generally reduces the visual quality of the game

Some other things for owners to comment on, if you feel so inclined:

- Even if there are IQ limitations, would you enable it anyway for the performance gain? If so, which mode?
- Are you hopeful for ongoing adoption and support?
- Which is your preferred mode, and for what native output resolution?
- Do you also use image sharpening / adjust the negative LOD bias in conjunction with DLSS 2.0 to further improve results?
- If a game featured RTX and DLSS 2.0, would you use both, just one, or neither?
- Would you like to see it adopted in more games that do not feature ray tracing, purely for the increased performance?


----------



## ebivan (Apr 22, 2021)

I think DLSS is a good thing to boost performance when using ray tracing, because without DLSS, RT performance is just too bad even on RTX 3000. Some games look really good with RT, so the decline in overall image quality from adding DLSS is acceptable.

On the other hand, I still think DLSS looks shitty, and since RTX 3000 is very fast without RT, there is no need to smear the image by turning on DLSS in these (non-RT) scenarios.

In the end, DLSS is just a gap filler until RT finally gets the performance it needs. DLSS is nice to have in some scenarios, but whenever the framerate without it is high enough to satisfy the viewer's expectations, I would turn it off. Hopefully one day GPUs will be powerful enough to drop rasterisation completely and generate pictures by ray tracing alone.


----------



## Mussels (Apr 22, 2021)

DLSS Quality looked like some attempts at anti-aliasing over the years, softening things up.
I could run with it on and forget it was there, apart from the odd shimmering texture or visual glitch.

I would definitely choose DLSS over dropping the res, but I'm not sure I'd choose DLSS on over disabling other features (like RTX).
I see it as being super helpful on older cards as time goes by... especially in the laptop world.


----------



## nguyen (Apr 22, 2021)

I have tried DLSS 2.0 at 1080p, 1440p and 4K in at least 10 games, and my conclusion is that DLSS is a superior AA solution to TAA - maybe not as good as MSAA 2x or 4x, but that is to be expected.
If and only if a game supports MSAA 2x/4x and you are getting playable framerates with MSAA, then I would use MSAA instead of DLSS. For example Shadow of the Tomb Raider, where RT is useless and DLSS 1.0 sucks ass.

Otherwise DLSS would be the default AA option in supported games. If the performance is excessive for single-player games with DLSS Quality (like Death Stranding, Nioh 2), then I just cap the FPS to 120 and save on power consumption (100W less is possible) while getting better IQ than native + TAA.

And there is no need to use DLSS only with DXR: with competitive games like Fortnite and Warzone supporting DLSS, turning it on will give you a major advantage (the higher the fps, the better).

As for DLSS making the image blurry, just use Image Sharpening, it's not that hard.


----------



## wolf (Apr 22, 2021)

Mussels said:


> apart from the odd shimmering


Funny you mention it, I regularly play two DLSS 2.0 games, Control and CP2077 (slowly making my way through both, dad life...), and my finding is that DLSS reduces or even virtually eliminates shimmering, which is at the top of the list of negative visual artefacts for me, making DLSS even more appealing, as shimmering makes the whole image appear unstable in motion. In which games would you say shimmering is worsened by using DLSS?


nguyen said:


> As for DLSS making image blurry, just use Image Sharpening


I definitely agree here; on a per-game basis I have tweaked sharpening to offset image softness in every game I play. And this solution is 100% not confined to DLSS - virtually any TAA game stands to benefit, as well as many others. I'd highly encourage anyone, DLSS or not, to give sharpening a try and use the control panel or in-game settings to get that 'goldilocks' amount; it can make a drastic difference.


----------



## Mussels (Apr 22, 2021)

wolf said:


> Funny you mention it, I regularly play two DLSS 2.0 games, Control and CP2077 (slowly making my way through both, dad life...) and my finding is that DLSS reduces or even virtually eliminates shimmering, which is at the top of negative visual artefacts for me, making DLSS even more appealing, as shimmering makes the whole image appear unstable in motion. Which games would you say shimmering is worsened by using DLSS?
> 
> I definitely agree here, on a per game basis I have tweaked sharpening to offset image softness in every game I play. And, this solution is 100% not confined to DLSS, virtually any TAA game stands to benefit as well as many others, I'd highly encourage anyone DLSS or not to give sharpening a try and use the control panel or in game settings to get that 'goldilocks' amount, it can make a drastic difference.



different shimmer

The shimmering around, say, a fence would go away, but some other textures would go weird in exchange. Someone had a GIF of it in the other thread; it looked like the sort of thing a patch or driver would fix.


----------



## nguyen (Apr 22, 2021)

wolf said:


> I definitely agree here, on a per game basis I have tweaked sharpening to offset image softness in every game I play. And, this solution is 100% not confined to DLSS, virtually any TAA game stands to benefit as well as many others, I'd highly encourage anyone DLSS or not to give sharpening a try and use the control panel or in game settings to get that 'goldilocks' amount, it can make a drastic difference.



Yeah, with a 48in 4K TV hanging <1m from my face, every game needs at least 0.25 Image Sharpening before it can look good; with DLSS Balanced I just increase the sharpening to 0.5 and it's perfect. Nvidia Image Sharpening is such a versatile tool that not too many people know of.


----------



## toilet pepper (Apr 22, 2021)

The DLSS implementation is different from game to game. I'm currently using a 3440 x 1440 100Hz monitor and DLSS is really a must to get decent frame rates on heavy games.

Cyberpunk 2077 - With the Digital Foundry settings I get around 45-55fps in the city using DLSS Quality. For some reason, DLSS Quality looks better than native when standing still or moving slowly.

Avengers - DLSS Quality is better than native when standing still. Since this is a fast-paced game, there is a barely noticeable blur/smear/aberration on the edges of moving things.

Outriders - DLSS is better than native. Everything is vibrant and much more detailed. Blur is not as noticeable.



For some reason, games today have hidden TAA settings. Cyberpunk has it, and if you remove it, it breaks the game. I just changed the negative LOD setting in Nvidia Inspector for some games to make them sharper.

Here are some anecdotal observations of mine (not thoroughly tested): DLSS consumes more power.

I'm already capable of playing Outriders at max settings at the 95fps cap I had. I turn on DLSS Quality (coz it's better than native) and GPU board power increases by 10 Watts. Since you are rendering at a lower resolution, CPU power would also increase. Just my experience, and I'm not sure if you all noticed this as well.


----------



## wolf (Apr 23, 2021)

Mussels said:


> different shimmer
> 
> The shimmering around, say, a fence would go away, but some other textures would go weird in exchange. Someone had a GIF of it in the other thread; it looked like the sort of thing a patch or driver would fix.


Very interesting, would you be able to dig it up? I haven't noticed that at all, but I'd be keen to check it out. Hopefully it is the sort of thing that can be patched.


nguyen said:


> Nvidia Image Sharpening is such a versatile tool that not too many people know of.


Yeah, I do wonder how many people use it; in virtually all circumstances I've found it to boost IQ, to my eyes, when tweaked on a per-game basis.


----------



## Mussels (Apr 23, 2021)

wolf said:


> Very interesting, would you be able to dig it up because I haven't noticed that at all, but I'd be keen to check it out. Hopefully it is the sort of thing that can be patched.
> 
> Yeah, I do wonder how many people use it, in virtually all circumstances I've found it to boost IQ, to my eyes, when tweaked on a per-game basis.


Can't find it; it was in one of the many 2077 threads here on TPU, but not the 'OFFICIAL' thread.


----------



## tabascosauz (Apr 23, 2021)

I only play two games that have it, both DLSS 2.0:

War Thunder - I haven't tested it with tanks yet, but in Air RB DLSS _fucking_ sucks. Everything in sight is blurry as hell even on the highest quality setting possible when flying at >1000ft, and DLSS has a terrible time dealing with the distant texture of open water. The performance improvement is tangible, sure (+30ish fps @ 1440p), but the quality is so horrible I haven't used it since testing it for about a day. There are a lot of good, high-quality historical skins for the F-4C and F-4E and they straight up look like ass. Especially this livery that I love: WT Live // Camouflage by Danny74_ (warthunder.com). But then again, given Gaijin, it really wasn't a surprise - they're the type to burn water.

MW 2019 - added just yesterday I think? I don't play Warzone but it works in the rest of MW2019 too. The implementation is pretty damn good. Compared to 1440p near-maxed settings, it's a ~25-30fps uplift all the time with no noticeable loss in quality aside from some minor shimmering on some textures (usually weapon skin specific, the deag skin with the union jack on the side). No artifacts, no extra stuttering. Frankly I think it's better on image quality than DLSS off, because the AA modes in MW2019 are kinda jank - you get noticeable jaggies if you turn Filmic off, but if you use Filmic it really degrades the sharpness and visibility of certain parts of the image even if it looks cinematically "good". So DLSS definitely offers a good middle-ground AA mode, and honestly I think it's the way to go.

But performance might be influencing what I think is "acceptable" as well, War Thunder is an easy 120fps @ 1440p locked all the time without DLSS at near-max settings. MW2019 on the other hand hovers in the 90-110fps range @ 1440p without DLSS, but DLSS takes that up to a constant 120fps. But on pure performance gain alone, DLSS is a great thing.

I'm not sure if DLSS needs time to "settle in"? In any case, MW works a bit better today than day 1. Smoother, no occasional slowdowns where DLSS previously seemed uncertain of itself.


----------



## nguyen (Apr 23, 2021)

toilet pepper said:


> Here's some anecdotal observations of mine. (did not thoroughly test) DLSS consumes more power.
> 
> I'm already capable of playing Outriders at max settings at the 95fps cap I had. I turn on DLSS Quality (coz its better than native) and GPU board power increases by 10 Watts. Since you are rendering it at a lower resolution CPU power would also increase. Just my experience and I'm not sure if you all noticed this as well.



In NVCP, under Power Management Mode, did you put it to "Prefer Maximum Performance" in the Global tab? This mode will keep the clocks as high as possible even when you have a max FPS cap, leading to higher power consumption. Usually I only set it to Prefer Maximum Performance in online competitive game profiles in order to reduce input latency.

You can add an Outriders profile and set it to "Optimal Power"; then the clock speeds will vary accordingly to keep the 95FPS cap, which can drastically reduce power consumption. DLSS Balanced mode can reduce power consumption further if you can't distinguish any image quality degradation associated with it.


----------



## toilet pepper (Apr 23, 2021)

NVCP is at default except for G-Sync and V-Sync. It's just 10 watts and nothing to cry about. I think it was from 170 to 180 watts. The card is also deshrouded, so it can't be the fans. (Running at 0.818V at 1830MHz.)

I'll try to test it out over the weekend and check if there really is a difference in power consumption.


----------



## Mussels (Apr 23, 2021)

DLSS lowering the GPU load could allow higher FPS and increase CPU load.

It's going to mess with the wattage the system uses; it could be up or down depending on settings (especially V-Sync/FPS caps).


----------



## wolf (Apr 23, 2021)

Indeed in every single case where I've used DLSS, GPU power consumption as reported by MSI AB has either been equal or lower, so if the system is pulling 10w more from the wall using DLSS, I'd wager it's shifting some load elsewhere.


----------



## watzupken (Apr 23, 2021)

Based on my experience, you should never run RT without DLSS enabled. While I am not using the latest and greatest card from Nvidia, I expect even a top-end card like the RTX 3090/3080 to run at sub-100 FPS with RT enabled and DLSS disabled. I can stomach some blurriness, as long as it's not that bad. Generally, as long as I don't run DLSS 2.0 in Performance mode, it will look fine until you do a side-by-side comparison with trained eyes. You gain some but lose some in this case, i.e. you lose the jagged edges, but introduce some minimal level of visual downgrade.


----------



## londiste (Apr 23, 2021)

DLSS 2.0 generally reduces the visual quality of the game.

I still enable it in low-FPS scenarios - like with RT enabled - because the resulting performance boost is just too big to ignore.
Quality mode only; the rest have too visible an impact on image quality.

Adoption seems to be well on its way for a wide selection of games already. DLSS now being included in Unreal Engine as well as Unity will go a long way towards that goal.



toilet pepper said:


> Here's some anecdotal observations of mine. (did not thoroughly test) DLSS consumes more power.


The only case where that is true is if you are running with an FPS cap that does not fully utilize the GPU. A lower native resolution means a lower load on the GPU (the added load from DLSS is relatively small), and less load means less power.


----------



## watzupken (Apr 23, 2021)

wolf said:


> Indeed in every single case where I've used DLSS, GPU power consumption as reported by MSI AB has either been equal or lower, so if the system is pulling 10w more from the wall using DLSS, I'd wager it's shifting some load elsewhere.


As the load gets offloaded from the GPU, it will naturally shift towards the CPU. The GPU works less because it is pumping out far fewer pixels, i.e. 1440p instead of 2160p if you are using a 4K monitor. But it will quickly become bottlenecked by the CPU because of the higher CPU utilization on Nvidia's GPUs, which is why the gains from enabling DLSS at lower resolutions hit a limit pretty quickly. I suspect this is especially so if RT is not enabled.


----------



## lowrider_05 (Apr 23, 2021)

I just want to say the following about DLSS 2.0: in Watch Dogs 2 I can say it boosts performance like it should and looks better than resolution scaling alone, BUT on a 55" 4K OLED TV it looks like a 2.5-3K image in Quality mode when compared to native 4K.


----------



## watzupken (Apr 23, 2021)

lowrider_05 said:


> I just want to say the following about DLSS 2.0: in Watch Dogs 2 I can say it boosts performance like it should and looks better than resolution scaling alone, BUT on a 55" 4K OLED TV it looks like a 2.5-3K image in Quality mode when compared to native 4K.


I feel there is no way to mask the reduction in resolution when you use such a huge display and the image is stretched. Generally for conventional monitor users, i.e. up to 32 inches, the issue may not be as pronounced because the PPI is dense enough to mask the problem. If you think it looks like 2.5K, then that is exactly the resolution that Quality-mode DLSS 2.0 should be upscaling from for a 4K display.
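For anyone wanting to sanity-check that 2.5K figure: taking the per-axis render scales commonly reported for DLSS 2.0's modes (Quality 2/3, Balanced 0.58, Performance 1/2, Ultra Performance 1/3) as a working assumption, the internal resolutions are easy to compute. A quick sketch in Python:

```python
# Internal render resolutions for DLSS 2.0 modes, using the per-axis
# scale factors commonly reported for each mode (assumed here, not
# taken from any game's actual implementation).
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Return the (width, height) DLSS renders at before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    print(mode, internal_resolution(3840, 2160, mode))
```

Quality at 4K output comes out to 2560x1440, which lines up with the "2.5K" impression described above.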


----------



## blued (Apr 23, 2021)

DLSS 2 saved the day for me with Cyberpunk 2077 @ 4K.

BUT... only if you take the time to experiment with it to arrive at the best IQ. For me that involves DLSS Quality + image sharpening 0.8 + 16x AF. The resulting IQ was indistinguishable from 4K native, but with a massive 30-40% perf jump. Very impressed.


----------



## toilet pepper (Apr 24, 2021)

Okay, I did the test. I had it backwards: DLSS saves around 20 watts on GPU board power. I don't have a means of checking it from the plug, though.

DLSS off: (screenshot)

DLSS on: (screenshot)


----------



## tabascosauz (Apr 24, 2021)

toilet pepper said:


> Okay I did the test. I had it backwards. DLSS saves around 20 Watts on GPU board power. I don't have a means of checking it from the plug though.
> 
> DLSS off
> 
> DLSS on



My UPS has a data cable going to my computer so I can read a rough output wattage at all times in HWInfo. It's obviously not 100% precise and is subject to PSU efficiency, but it's good enough to corroborate the GPU Power and GPU Total Board Input Power readings. In MW2019 specifically, capped at 120fps, I save anywhere between 10-30W of power with DLSS on. About 380-390W total system power draw with DLSS off, about 360-380W total system power draw with DLSS on. With 120fps capped DLSS my 2060S still runs at pretty high (70-95%) utilization so it doesn't affect temps more than 2C.

Obviously it's a best case scenario as I was unable to hit even 110fps most of the time with DLSS off in that game, while DLSS on is a constant 120fps at all times. Also MW2019 always pretty much maxes out two CPU cores even when it's not CPU-bound, so there is no difference in CPU power draw between uncapped, 60fps capped or 120fps capped.


----------



## TheUn4seen (Apr 24, 2021)

I have a 3080, and I have the most experience, as far as RTX and DLSS go, in Cyberpunk 2077. I bought the card mostly because of DLSS and, to a lesser extent, the general performance improvement over the 1080 Ti. So, here goes:

As for image quality, I can see occasional artifacts such as ghosting on object edges in high-contrast situations, like a brightly colored car moving in front of a dark wall or around other fast-moving objects (not the pixel response's fault). I can see them only because I saw some static A/B comparisons in reviews. I can also see situations where DLSS actually improves fine detail on distant objects, but again, only because I know what to look for and only when I actively look for it. None of the artifacts are visible to me when playing the game.

I personally don't care for ray tracing at all - yes, some areas of the game look slightly nicer, but even though my playing style is mostly of a relaxed sightseeing type, I consider the minuscule improvement in image quality not worth the performance hit.

To be frank, even the 3080 can't run this game at native 3840x2160, so DLSS is a godsend. I tried playing CP2077 on the 1080 Ti, but the rasterized scaling from 2560x1440 looked awful to the point of actually being distracting; DLSS is a night-and-day improvement in quality. So yes, I hope this feature will be available in more games.
In short, I don't care for RTX and consider DLSS an infinitely more important feature. It's also technologically much more impressive if you look at it.


----------



## toilet pepper (Apr 24, 2021)

DLSS would work wonders in VR. I haven't tested a game yet that has it, though. The only thing I'm worried about with it in VR is whether the ghosting on edges is more visible.


----------



## nguyen (Apr 25, 2021)

tabascosauz said:


> My UPS has a data cable going to my computer so I can read a rough output wattage at all times in HWInfo. It's obviously not 100% precise and is subject to PSU efficiency, but it's good enough to corroborate the GPU Power and GPU Total Board Input Power readings. In MW2019 specifically, capped at 120fps, I save anywhere between 10-30W of power with DLSS on. About 380-390W total system power draw with DLSS off, about 360-380W total system power draw with DLSS on. With 120fps capped DLSS my 2060S still runs at pretty high (70-95%) utilization so it doesn't affect temps more than 2C.
> 
> Obviously it's a best case scenario as I was unable to hit even 110fps most of the time with DLSS off in that game, while DLSS on is a constant 120fps at all times. Also MW2019 always pretty much maxes out two CPU cores even when it's not CPU-bound, so there is no difference in CPU power draw between uncapped, 60fps capped or 120fps capped.



Sweet, seems like Nvidia took extra measures to ensure the best implementation of DLSS in Warzone; this game is huge, after all. I just hope there are fewer cheaters now than before so I can get back to it, or is that wishful thinking?


----------



## Mussels (Apr 25, 2021)

toilet pepper said:


> DLSS  would work wonders in VR. I havent tested a game yet that has it though. The only thing I'm worried about with it in VR is if the ghosting on edges are more visible.


I think it's automatically in there already - they run higher res (or just AA?) where you're looking and lower res around it.


----------



## nguyen (Apr 25, 2021)

Mussels said:


> I think it's automatically in there already - they run higher res (or just AA?) where you're looking and lower res around it.



Yeah, that's just MSAA in the middle of your vision while the periphery gets lower res.
I don't think DLSS works well with VR, though; the resolutions in VR are generally too low for DLSS to work its magic, and you don't need higher res when your eyes are 2cm from the screen.


----------



## tabascosauz (Apr 25, 2021)

nguyen said:


> Sweet, seems like Nvidia took extra measure to ensure the best implementation of DLSS in Warzone, this game is huge after all. Just hope that there were less cheaters now than before so I can get back to it, or is that wishful thinking?



I don't touch the Warzone area of the game, so I'm not sure. While the DLSS update was surprisingly huge and fixed a lot, Raven Software hasn't previously deserved much confidence after taking over IW's game. Maybe if you really want to play the new Cold War weapons?

Other than that, I'm not sure if DLSS has a negative impact on long range sniping which is more prevalent in Warzone. With DLSS on some surfaces like corrugated roofs can get a little blurrier when your character is moving, but it took me a long time to notice the difference.

Just in MW2019, there wasn't much "added" except DLSS. Everything else was to fix what Raven previously broke in the last month: the idiotic 680 bug that had your character's left arm bent backwards into your face at all times obstructing vision, the complete loss of all Sleight of Hand on any weapon, the loss of all of the Finn's RoF barrels, etc.

And then there was the Sykov... but at least it's nerfed now.


----------



## wolf (Apr 27, 2021)

Seems like with Warzone the boost follows the trend from other games, where the lower the starting FPS, the bigger the boost DLSS gives.

Like, a 3080 or 3090 already gets quite high FPS, so the boost is minimal, but a 2060 could see massive boosts.

DLSS really can be a tool to help an older/weaker card keep its head above water as it ages.


----------



## king of swag187 (Apr 27, 2021)

Games I've tested so far:
Warzone: Not noticing anything major graphics-wise with it; FPS improves drastically while I'm not noticing a huge loss in image quality.
Cold War: Looks a bit blurrier with Performance mode than Warzone does, but overall it doesn't bother me, albeit it is more noticeable.


----------



## nguyen (Apr 27, 2021)

Yeah, with Warzone and Fortnite supporting DLSS, it will be very advantageous for RTX owners. I mean, with a 4K 42in or bigger screen I could see everything on the other side of the map while getting 200+ fps.
Right now I'm interested in Naraka: Bladepoint, which also supports DLSS; it looks like another fun battle royale with a mix of Dark Souls fighting mechanics.


----------



## wolf (Apr 28, 2021)

Spent a bit of time last night messing around in the Bright Memory: Infinite benchmark, after playing some older games with DSR (Max Payne 3, 6880x2880) and getting amazing IQ results.

Messed around with setting various DSR resolutions and then using different DLSS modes and checking out IQ; that massive output resolution really makes for a clean image, and it made me wonder.

An idea for another DLSS setting: *'Ultra Quality'*, to be used purely as the name describes. I couldn't say for sure what the input and output factors would need to be, but even Quality mode has a large performance uptick over native in most situations. So: tune an 'Ultra Quality' mode to have roughly equal performance to native - maybe 0-5% more, rather than 25%+ - but with certainly even better visual results, for when performance is already excellent but you want another dial to turn up.
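A back-of-envelope sketch of where such a mode might sit (purely my own speculation, not anything Nvidia has published): the per-axis scale factor controls what fraction of native pixels gets rendered, so a hypothetical 'Ultra Quality' factor would land somewhere between Quality's 2/3 and 1.0:

```python
# Fraction of native pixels rendered at a given per-axis DLSS scale.
# Frame time does not actually scale purely with pixel count (and DLSS
# itself adds a fixed per-frame cost), so treat these as rough bounds.
def pixel_fraction(per_axis_scale):
    return per_axis_scale ** 2

print(f"Quality (2/3):         {pixel_fraction(2 / 3):.0%} of native pixels")
print(f"Hypothetical UQ (0.8): {pixel_fraction(0.8):.0%} of native pixels")
print(f"Native (1.0):          {pixel_fraction(1.0):.0%} of native pixels")
```

Quality renders only ~44% of native pixels yet still comfortably beats native performance in many games, so a hypothetical factor around 0.8 (~64% of native pixels) seems a plausible ballpark for a roughly performance-neutral mode - but that is guesswork until someone profiles it.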

Would that appeal to anyone?


----------



## Mussels (Apr 28, 2021)

wolf said:


> Spent a bit of time last night messing around in the Bright Memory: Infinite benchmark, after playing some older games with DSR (Max Payne 3, 6880x2880) and getting amazing IQ results.
> 
> Messed around with setting various DSR resolutions and then using different DLSS modes and checking out IQ, that massive output resolution really makes for a clean image, and it made me wonder.
> 
> ...


NV wont make it happen, because they're using DLSS as an RTX advertisement


----------



## wolf (Apr 28, 2021)

Mussels said:


> NV wont make it happen,


I wouldn't be so certain; we're only ~2.5 years into the tech and not much over a year into 2.0. It's got a long road ahead of it, especially if they really are banking on it staying around.

2.0 and the 4 modes we have will certainly not be where that road stops.

I'm not saying what I suggested will happen for sure, but I'm positive more modes and options are on the cards - say, an FPS-target mode where the output res and FPS are fixed, and the input resolution is dynamic to hit the targets.


----------



## Yttersta (May 2, 2021)

My best DLSS experience has been Control, and the best way I can describe my feelings about it is that I don't notice it. 

Imagine the settings randomly shifting about at times, as they often do in Control. If it was, say, anti-aliasing that turned off after a relaunch of the game, on a low-res screen such as 1080p, you would notice the AA was off instantly. Or say lighting, volumetric effects, reflections, etc. changed - again, one would notice.

With DLSS, that's the difference. It is at times not as sharp as native res. I have seen examples online of it being worse 90% of the time (nice round number, I know), and in the outlying 10% it can turn up very far away details sharper than native, as can be seen in detailed Cyberpunk reviews online, but I don't have first-hand experience with it so I cannot comment.

So if there's DLSS, it is not noticeable to the extent that unless you go looking for whether it is on, it won't matter, in return for better performance. So I think I'm positive about it overall, provided it is as well implemented as it is in Control. I very much look forward to next weekend for the Metro remaster on that front. That'll be the golden sample of a DLSS 1 vs 2 comparison.


----------



## oxrufiioxo (May 2, 2021)

At 4K it works really well, at 1440p it can be solid, but at 1080p it's not very good. When it's implemented properly it can be pretty amazing, especially when you consider what it's doing, but when it's not it can be a blur fest.

CP/Control do it really well, and the CoD games are all decent image-quality-wise with it on, at 1440p and above.

The thing that makes me optimistic about this tech is how much it's improved over the last couple of years... It was unusable imho when it first released, but thankfully Nvidia kept working at it.


----------



## wolf (May 5, 2021)

I'm really looking forward to trying out the new Metro update with DLSS 2.1 and the other enhancements, shaping up to possibly be the best RT and DLSS implementation yet.


----------



## birdie (May 5, 2021)

oxrufiioxo said:


> At 4K it works really well, at 1440p it can be solid, but at 1080p it's not very good. When it's implemented properly it can be pretty amazing, especially when you consider what it's doing, but when it's not it can be a blur fest.
> 
> CP/Control do it really well, and the CoD games are all decent image-quality-wise with it on, at 1440p and above.
> 
> The thing that makes me optimistic about this tech is how much it's improved over the last couple of years... It was unusable imho when it first released, but thankfully Nvidia kept working at it.


The lower the resolution the less data DLSS has to work with, so naturally it's getting better as the resolution increases.

When you're upscaling from 1440p to 4K, you've got 3.6M pixels to work with; when you're upscaling from 768p to 1080p, you only have 1M pixels.

For 8K, DLSS will shine - only I find it hard to believe there are actual human beings capable of seeing the fine details of 8K video. 8K is above our physical limits/acuity unless you're staring at a small part of the screen.
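Multiplying those frame sizes through confirms the pixel budgets (assuming the usual 16:9 resolutions, 2560x1440 for "1440p" and 1366x768 for "768p"):

```python
# Source-pixel budgets for the two upscaling cases above, assuming
# standard 16:9 frame sizes (an assumption; "768p" in particular
# varies between 1366x768 and 1360x768 panels).
def megapixels(w, h):
    return w * h / 1e6

print(f"1440p -> 4K source:   {megapixels(2560, 1440):.2f} MP")
print(f"768p -> 1080p source: {megapixels(1366, 768):.2f} MP")
print(f"Native 4K target:     {megapixels(3840, 2160):.2f} MP")
```

That's roughly 3.7 MP of source data versus 1.0 MP, so the 4K upscale starts with more than three times the information per output frame.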


----------



## nguyen (May 5, 2021)

wolf said:


> I'm really looking forward to trying out the new Metro update with DLSS 2.1 and the other enhancements, shaping up to possibly be the best RT and DLSS implementation yet.



Yeah DF shows that even 4K DLSS Performance (1080p internally) looks better than TAA Upsampling at 1500p





And TAA Upsampling is already superior to upscaling + CAS (FidelityFX CAS), which is why 4A Games didn't even bother integrating the FidelityFX toolkit in the Enhanced version.


----------



## wolf (May 5, 2021)

nguyen said:


> Yeah DF shows that even 4K DLSS Performance (1080p internally) looks better than TAA Upsampling at 1500p
> 
> And TAA Upsampling is already superior than Upscaling + CAS (FidelityFX CAS), that's why 4A Games didn't even bother integrating FidelityFX toolkit in the Enhanced version.


Yeah the portions they show off are excellent looking, I do look forward to making my own judgment on it, but given I generally favour Quality DLSS over native for the performance and IQ, I'm predicting very good results. Will be interesting to see how Balanced and Performance modes fare at 3440x1440 too.


----------



## dogwitch (May 6, 2021)

Here's a question: why don't they run it at native res, with no DLSS?


----------



## wolf (May 6, 2021)

dogwitch said:


> Here's a question: why don't they run it at native res, with no DLSS?


Can you elaborate on your question? Which game, what settings, what res?

If you're talking about Metro Exodus, I assume that performance without DLSS or TAA upscaling is considerably lower, and given the performance/IQ advantages and/or trade-offs, you'd be mad not to use one of the two, unless you have some crazy hardware combo like an RTX 3090 and a 1080p 60Hz monitor (for argument's sake).


----------



## oxrufiioxo (May 6, 2021)

dogwitch said:


> Here's a question: why don't they run it at native res, with no DLSS?



Because in certain games the image-quality difference between native and DLSS is negligible, but the performance uplift is 30-40%, especially at higher resolutions.


----------



## wolf (May 6, 2021)

In Control at 3440x1440 using DLSS Quality mode, the IQ is better than native to my eyes and the performance uplift is 75%; no-brainer. I am so keen to try Exodus tonight!
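As a side note on how these uplift figures are derived, here's a trivial sketch; the example frame rates are illustrative assumptions, not benchmarks from the thread.

```python
# Sketch: how a DLSS performance-uplift percentage is computed from
# frame rates. The example numbers are illustrative assumptions,
# not measured benchmarks.

def uplift_percent(fps_native, fps_dlss):
    """Percent FPS gain of DLSS over native rendering."""
    return (fps_dlss / fps_native - 1) * 100

# A 75% uplift means e.g. going from 60 fps native to 105 fps with DLSS:
print(uplift_percent(60, 105))  # → 75.0
```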


----------



## Mussels (May 6, 2021)

The higher the res, the better DLSS works.

I genuinely did not want a 4K monitor when I bought my current one due to low FPS, but knowing DLSS is spreading so much (it's becoming part of the Unity engine!) makes a 4K monitor more likely in the future.


----------



## dogwitch (May 6, 2021)

OK, after reading all three posts: so the GPUs can't do it natively at a decent performance rate then, with or without ray tracing on,
at higher frame rates and upped res, with a lock that cannot dip below 60 fps at 1080p or above?


----------



## tabascosauz (May 6, 2021)

dogwitch said:


> OK, after reading all three posts: so the GPUs can't do it natively at a decent performance rate then, with or without ray tracing on,
> at higher frame rates and upped res, with a lock that cannot dip below 60 fps at 1080p or above?



What point are you trying to make here?

The idea is to help struggling GPUs (e.g. 2060S @ 1080p CP2077), but it's not like you can't still get the extra 20-40% performance boost if you already push playable frames. If the game has good DLSS with negligible image quality loss, it makes zero sense not to use it just because "native is also playable".

Now, if the DLSS implementation sucks, that's a different story, but that's beside the point.


----------



## nguyen (May 6, 2021)

dogwitch said:


> OK, after reading all three posts: so the GPUs can't do it natively at a decent performance rate then, with or without ray tracing on,
> at higher frame rates and upped res, with a lock that cannot dip below 60 fps at 1080p or above?



There are several practical applications of DLSS:
_Higher IQ, because DLSS is very effective at anti-aliasing
_Higher performance at equivalent or slightly better IQ than native (DLSS Quality mode)
_Higher Fidelity if you so choose (DSR + DLSS) with no performance cost
_Lower Power Consumption (when you use FPS lock), something like this:

DLSS OFF





DLSS ON (FPS capped at 120)





That's 100W less power consumption for no IQ loss.


----------



## dogwitch (May 6, 2021)

So what it is now is both a performance boost and a lower wattage draw; otherwise the GPU would tank in performance and draw way more watts.
If I'm understanding this right, then.


----------



## Mussels (May 6, 2021)

dogwitch said:


> So what it is now is both a performance boost and a lower wattage draw; otherwise the GPU would tank in performance and draw way more watts.
> If I'm understanding this right, then.


It's just a performance boost, with an effect similar to AA.


FPS caps and lower settings always save power.


----------



## las (May 6, 2021)

DLSS 2.x is insanely good, when implemented right.

Can't wait to try Metro Exodus Enhanced Edition today or tomorrow with it. This title will be pretty much unplayable without DLSS (I demand 100+ fps).


----------



## dogwitch (May 6, 2021)

Mussels said:


> It's just a performance boost, with an effect similar to AA.
> 
> 
> FPS caps and lower settings always save power.


Ah. So are we at the design limit atm then?
Seeing as I notice a regression in both game AI and physics, which both used to run on the GPU.
But it seems fast FPS and pretty graphics are what has made some game design regress backwards then.


----------



## Mussels (May 6, 2021)

dogwitch said:


> Ah. So are we at the design limit atm then?
> Seeing as I notice a regression in both game AI and physics, which both used to run on the GPU.
> But it seems fast FPS and pretty graphics are what has made some game design regress backwards then.


no, nvidia did this purely because of RTX being too hard for current GPUs.

The negative effect for them, is that it may hurt future sales - if a 2060 can just run DLSS and last years longer, why upgrade?


----------



## oxrufiioxo (May 6, 2021)

dogwitch said:


> Ah. So are we at the design limit atm then?
> Seeing as I notice a regression in both game AI and physics, which both used to run on the GPU.
> But it seems fast FPS and pretty graphics are what has made some game design regress backwards then.



That's a topic for a different thread; you can make one if you'd like. It has nothing to do with DLSS. As a side note, the 8 weak Jaguar cores are probably mostly responsible for that in last-gen consoles.


----------



## dogwitch (May 6, 2021)

oxrufiioxo said:


> That's a topic for a different thread; you can make one if you'd like. It has nothing to do with DLSS. As a side note, the 8 weak Jaguar cores are probably mostly responsible for that in last-gen consoles.


Not really, due to how GPUs and CPUs are designed: some stuff is built in, and other stuff gets retired due to users not really using it anymore.
Remember MPEG-2 (v1) support? Not really a thing now on CPUs or GPUs; instead it was replaced with MPEG-H Part 2.


----------



## nguyen (May 6, 2021)

Mussels said:


> no, nvidia did this purely because of RTX being too hard for current GPUs.
> 
> The negative effect for them, is that it may hurt future sales - if a 2060 can just run DLSS and last years longer, why upgrade?



Nvidia just boosted 4K TV/screen sales with their DLSS tech; without DLSS, 4K 120 Hz+ is still a pipedream even for the 6900XT/3090 performance class.


----------



## las (May 6, 2021)

nguyen said:


> Nvidia just boosted 4K TV/screen sales with their DLSS tech; without DLSS, 4K 120 Hz+ is still a pipedream even for the 6900XT/3090 performance class.



It's not like 1440p looks much different from 2160p on a TV smaller than 55 inches though. I sit 3 meters away from my 65-inch OLED, and higher quality settings look way more obvious than a resolution bump. I play all games at 1440p or 2160p at 120 Hz using G-Sync; both look great.

I doubt many people buy "4K" TVs because of "DLSS". I only use mine because I upgraded to a 77-inch G1 in the living room, so the C9 went to my bedroom slash gaming room.



Mussels said:


> no, nvidia did this purely because of RTX being too hard for current GPUs.
> 
> The negative effect for them, is that it may hurt future sales - if a 2060 can just run DLSS and last years longer, why upgrade?



I would not say years though. Mostly DLSS is good, but sometimes it's not, and let's be honest: very few games have it, and not many have a top-notch implementation (visual artifacts, blur, etc.).

I will always prefer native, I think, but since I demand 100 fps minimum, it's a good option to have.


----------



## nguyen (May 6, 2021)

las said:


> It's not like 1440p looks much different from 2160p on a TV smaller than 55 inches though. I sit 3 meters away from my 65-inch OLED, and higher quality settings look way more obvious than a resolution bump. I play all games at 1440p or 2160p at 120 Hz using G-Sync; both look great.
> 
> I doubt many people buy "4K" TVs because of "DLSS". I only use mine because I upgraded to a 77-inch G1 in the living room, so the C9 went to my bedroom slash gaming room.



The main downfall of 4K was that 1440p 144 Hz provided a better gaming experience than 4K 60 Hz. That advantage is now gone, since Ampere is totally capable of 4K 120 Hz across the majority of games.
I have the LG CX 48in and 4K is barely sharp enough, so 1440p is a big no-no (I sit 70cm away from the TV).

DLSS just makes 4K 120 Hz more accessible than ever. I mean, 2 years ago the 2080 Ti was barely capable of 4K 60 Hz and DLSS 1.0 was crap, so who in their right mind would spend 2500 USD on a 4K 144 Hz screen back then? Also, the LG C9 and CX are dirt cheap now, making Ampere and LG OLED a match made in heaven.

And yeah I would choose 4K120hz over 1440p240hz screen every time.


----------



## FireFox (May 6, 2021)

I am late to the party.
I didn't read all the posts, just a few of them. If I understood right, DLSS is better than native resolution?


----------



## oxrufiioxo (May 6, 2021)

FireFox said:


> I am late to the party.
> I didn't read all the posts, just a few of them. If I understood right, DLSS is better than native resolution?


No, but in some games it resolves more detail.
Control/Metro EE


----------



## Mussels (May 6, 2021)

FireFox said:


> I am late to the party.
> I didn't read all the posts, just a few of them. If I understood right, DLSS is better than native resolution?


Most of the time no; in rare cases, yes.
Nvidia's goal is to make that an always-yes.


----------



## FireFox (May 6, 2021)

oxrufiioxo said:


> No, but in some games it resolves more detail.
> Control/Metro EE





Mussels said:


> Most of the time no; in rare cases, yes.
> Nvidia's goal is to make that an always-yes.


Then I need to see if there is any improvement in COD CW.


----------



## wolf (May 6, 2021)

las said:


> ...higher quality settings look way more obvious than a resolution bump...


This statement right here effectively encompasses my overall thoughts. I'd rather play with 'next gen' visuals and have it be slightly less sharp, than look pin sharp but also look 5 years old.


nguyen said:


> There are several practical applications of DLSS:
> _Higher IQ, because DLSS is very effective at anti-aliasing
> _Higher performance at equivalent or slightly better IQ than native (DLSS Quality mode)
> _Higher Fidelity if you so choose (DSR + DLSS) with no performance cost
> _Lower Power Consumption (when you use FPS lock)


Also
_Much higher performance in Balanced and Performance modes with less visual sacrifice than rendering in the respective DLSS input resolution 

I hear things like "well, I've been able to render at lower resolutions for years, what's new?". Well, DLSS (2.0 at least) _*is*_ different: the reconstruction does exceed the fidelity of simply running at that same lower internal resolution and is, when shining, easily the front-running reconstruction/resolution-enhancing technique.


----------



## oxrufiioxo (May 6, 2021)

FireFox said:


> Then I need to see if there is any improvement in COD CW.



It's fine in Cold war at 1440p and above. Although my buddy with a 1080p 360hz panel swears he doesn't notice the difference between quality dlss and native.


----------



## xkm1948 (May 6, 2021)

Love DLSS. Use it in all titles that have it implemented.


----------



## FireFox (May 6, 2021)

xkm1948 said:


> Love DLSS. Use it in all titles that have it implemented.


How does it work, or do I have to disable any setting to make it work?


----------



## dogwitch (May 6, 2021)

Wanted to say thank you to the people that answered what I asked about, without being jerks about it.


----------



## TheoneandonlyMrK (May 6, 2021)

I would say DLSS is quite good, though I don't like/choose to use it much or often, yet.
Given time it might become more useful. My RT-peasant 2060 isn't the greatest, but I can't complain; in the odd high-FPS game it works in, it's a joy. But I definitely prefer it off in purely IQ terms; sometimes FPS and Hz matter more, ish.


----------



## Mussels (May 6, 2021)

wolf said:


> This statement right here effectively encompasses my overall thoughts. I'd rather play with 'next gen' visuals and have it be slightly less sharp, than look pin sharp but also look 5 years old.


See, I'm backwards to that. I go play some older games (like Call of Duty 2) and find that while the textures are bland, the game looks amazingly clean and clear despite its age, and it also works fine at modern res and refresh rates.


----------



## wolf (May 7, 2021)

Mussels said:


> See, I'm backwards to that. I go play some older games (like Call of Duty 2) and find that while the textures are bland, the game looks amazingly clean and clear despite its age, and it also works fine at modern res and refresh rates.


I can 100% appreciate that too, and it's an aspect of getting a new GFX card that I really love: games that came out a few years or more before it, you can crank up to insane levels. If I'm going to play an older game, I love being able to over-render it, with DSR for example.

I guess my point was more the latest and greatest, an example would be I'd rather play control with all the RT effects and DLSS on, than with neither for roughly equal FPS.


----------



## Hachi_Roku256563 (May 7, 2021)

wolf said:


> This statement right here effectively encompasses my overall thoughts. I'd rather play with 'next gen' visuals and have it be slightly less sharp, than look pin sharp but also look 5 years old.


I disagree with this, imho.
I would much rather have an old game running at high res looking sharp
than a game having next-gen shine.


----------



## wolf (May 7, 2021)

Isaac` said:


> I disagree with this, imho.
> I would much rather have an old game running at high res looking sharp
> than a game having next-gen shine.


The beauty of personal preferences   

I mean you really get the best of both worlds right now with a decent RTX 30 series card, the performance they can crank out in raster is still nothing to sniff at, but then if you want to tick those next-gen toggles, you're covered for that experience too.

My lawd Metro Exodus EE is gorgeous.

The lighting is so much improved, much more realistic, dynamic and natural, a stark difference.

With everything set to the highest possible, RT reflections on too and DLSS Quality, I get higher FPS than I did in 1.0.0.7 maxed out with DLSS, and it looks FAR better: the vastly improved RTGI, the extra RT reflections, a much sharper and more detailed presentation... these guys have knocked it out of the park.

The jump from DLSS 1.0 to 2.1 is also massive: the Performance setting at 3440x1440 looks better than the previous on/off DLSS setting while performing hugely better, like 40-50%+, and Quality mode looks better than native while performing ~10% better than DLSS 1.0.


----------



## Hachi_Roku256563 (May 7, 2021)

wolf said:


> The beauty of personal preferences
> 
> I mean you really get the best of both worlds right now with a decent RTX 30 series card, the performance they can crank out in raster is still nothing to sniff at, but then if you want to tick those next-gen toggles, you're covered for that experience too.
> 
> ...


I mean, imho,
I think RollerCoaster Tycoon 3 running at 1080p
vs Planet Coaster maxed (720p),
I would take RCT3.


----------



## purecain (May 7, 2021)

Well, I started a Cyberpunk playthrough using my Titan V and I was disappointed with the way things looked without RTX. The difference had never really bothered me until then
(probably due to all the reflections and glass in the game).
I could apply RTX shadows, but it tanked performance to about 30-40 fps. This made streaming the game impossible at higher resolutions. If I could have used DLSS on the V,
which imo is basically setting 1080p with improved textures, I wouldn't have upgraded. I thought the 3090 would have been far more powerful than it was, as far as ray tracing is concerned.
You have to use DLSS, otherwise ray tracing still tanks performance to unplayable levels. I felt like I'd upgraded just so I could use DLSS.
I've since grown to appreciate it more, but I still think Nvidia could have given us more RT cores with the 30 series.
You know the advertising on the next gen will be better RTX performance. The V will be a beast for a few more years to come, but I do like the 3090... although prices now are insane, Ethereum mining or no.
This is Cyberpunk on the V in the first video, and with the 3090 using ray tracing and RTX settings on very high. Also, Resizable BAR has made a massive difference to load times and texture pop-in; everything streams far faster now. I wonder how this would work on the V.
Titan V
Titan V








RTX3090


----------



## wolf (May 7, 2021)

Isaac` said:


> i mean imho
> i think rollercoaster tycoon 3 running at 1080p
> vs planet coaster max (720)
> i would take rct3


Yeah I mean that's also just one example, but I can see what you mean. For me it's always case by case too, but I'd say generally for new release games I'd rather see their bells and whistles rendered @1440p than without @4k assuming FPS would be ~same. I can see how some people would be the opposite to that.


purecain said:


> If I could have used DLSS on the V,
> which imo is basically setting 1080p with improved textures, I wouldn't have upgraded.


I thought the Titan V had Tensor cores; do they not allow use of DLSS?


----------



## purecain (May 7, 2021)

There are a few settings only offered by the 30 series, resizable BAR for the memory subsystem and DLSS 2.0.


----------



## wolf (May 7, 2021)

purecain said:


> There are a few settings only offered by the 30 series, resizable BAR for the memory subsystem and DLSS 2.0.


Pretty sure you're spot on with BAR, but DLSS 2.0 works on 20 series too, after all it launched before the 30 series.


----------



## dogwitch (May 7, 2021)

wolf said:


> Pretty sure you're spot on with BAR, but DLSS 2.0 works on 20 series too, after all it launched before the 30 series.


Sadly, BAR hits some games pretty hard at the moment too, so that's a thing too.


----------



## purecain (May 7, 2021)

dogwitch said:


> Sadly, BAR hits some games pretty hard at the moment too, so that's a thing too.


ahh not for the Titan V though...


----------



## dogwitch (May 7, 2021)

purecain said:


> ahh not for the Titan V though...


Idk about that, seeing as the reference was BAR being tested with the 3000 series.
But I'll put it bluntly: it seems Nvidia really rushes everything out, far too early for the consumer.
AMD less so, though.
Is it fun to pay a $$$$ beta tax on hardware?


----------



## wolf (May 7, 2021)

dogwitch said:


> But I'll put it bluntly: it seems Nvidia really rushes everything out, far too early for the consumer.


I put it in a similar bucket to RTX, where they first get the hardware capability out there and then refine the software around it. Indeed they rushed to get BAR done because AMD had something they didn't, but from here on in I don't see it getting worse overall, at least in theory support and gains etc should only get better.

Paying to be a beta tester is one take; I like to have bleeding-edge technology/gadgets/graphics, even if it means it's not yet polished to 100% mind-blowing levels. For me, at least, there is enjoyment directly derived from riding that wave.


----------



## las (May 7, 2021)

nguyen said:


> The main downfall with 4K was that 1440p144hz provides better gaming experience than 4K60hz. That advantage is now gone when Ampere is totally capable of 4K120hz across the majority of games.
> I have the LG CX 48in and 4K is barely sharp enough, so 1440p is a big no-no (I sit 70cm away from the TV ).
> 
> DLSS just make 4K120hz more accessible than ever, I mean 2 years ago the 2080Ti was barely capable of 4K60hz and DLSS 1.0 was crap, so who in their right mind would spend 2500usd on 4K144hz screen back then? Also the LG C9 and CX are dirt cheap now, making Ampere and LG OLED a match made in heaven .
> ...



For competitive or serious fps gaming, I'd choose 1440p at 240 Hz myself

I mostly use the TV for single-player and slower-paced games; it can't match my Asus PG279Q in fast-paced shooters at all (the screen is too big anyway - can't focus - and input lag is higher on TVs; even the best ones have around 3 times the input lag of gaming monitors, and 120 Hz vs 165 Hz is very noticeable for me).

I'm looking to replace my PG279Q with a 1440p/240 Hz IPS when more models are out. For me, more fps is always better and I can easily spot (and feel) the difference between 100 and 200 (even on a 165 Hz monitor).

I love my OLED TVs, but mostly for watching movies, series, console, and slower-paced PC gaming. Never ever getting an LCD/LED TV again, but for PC monitors (laptops and gaming monitors) I probably still prefer IPS.

1440p/144Hz/IPS with VRR is the perfect sweet spot for most and very obtainable today; good 4K/UHD gaming monitors are still much more expensive than these.


----------



## nguyen (May 7, 2021)

las said:


> For competitive or serious fps gaming, I'd choose 1440p at 240 Hz myself
> 
> I mostly use the TV for single-player and slower-paced games; it can't match my Asus PG279Q in fast-paced shooters at all (the screen is too big anyway - can't focus - and input lag is higher on TVs; even the best ones have around 3 times the input lag of gaming monitors, and 120 Hz vs 165 Hz is very noticeable for me).
> 
> ...



Uh, you are supposed to change the input icon on the OLED TV to PC mode to reduce the input delay. After changing to PC mode, the input delay on the LG CX is 6.7 ms and only 5.3 ms on the C1; mind you, even the best 1440p 240 Hz IPS screens have 6.6 ms input delay.






Also, the pixel response time on OLED screens is only 2 ms at worst, while for IPS screens it's 4.5 ms.




So yeah, unless you are playing competitive CSGO, OLED TVs are quite suitable for competitive battle royale titles like Fortnite and Warzone.


----------



## las (May 7, 2021)

nguyen said:


> Uh, you are supposed to change the input icon on the OLED TV to PC mode to reduce the input delay. After changing to PC mode, the input delay on the LG CX is 6.7 ms and only 5.3 ms on the C1; mind you, even the best 1440p 240 Hz IPS screens have 6.6 ms input delay.
> 
> View attachment 199547
> 
> ...



I'm using it in game mode with all interpolation disabled and it feels slower than my PC gaming monitor. Also, a TV is way too big for these kinds of games if you ask me. When you have to turn your head, you've already lost; that's why most serious FPS gamers are NOT going ultrawide or using big screens - you won't stand a chance vs good players. None of them are using TVs, for sure. Most console players that play competitively (or are serious) even use PC monitors instead of TVs to get an edge. Focus and reaction are much better on a smaller screen that's 100% in your vision at all times.

I don't like my TV for fast-paced shooters and nothing will change this fact; it simply feels off. For slower-paced third-person shooters it works very well. But yeah, I mostly use it for single-player games tbh.


----------



## wolf (May 7, 2021)

It's interesting you say this @las; I've spent about 10-12 hours gaming on an LG CX65 and found input response in shooters (admittedly talking about various games in Halo MCC) to be very responsive. Albeit I am far from a competitive gamer; I just play SP/co-op shooters on very hard difficulties.


----------



## las (May 7, 2021)

wolf said:


> It's interesting you say this @las; I've spent about 10-12 hours gaming on an LG CX65 and found input response in shooters (admittedly talking about various games in Halo MCC) to be very responsive. Albeit I am far from a competitive gamer; I just play SP/co-op shooters on very hard difficulties.


It's not bad, pretty much best in class for TVs, but I just feel that my Asus PG279Q is faster and more responsive than both my C9 and G1 OLEDs; keep in mind that I use 165 Hz.


----------



## MagnyCours (May 7, 2021)

Combined with sharpening, image quality is pretty dang good I'd say. DLSS 1.0 was terrible, though, glad they've improved the implementation leaps and bounds with 2.0++.


----------



## las (May 7, 2021)

MagnyCours said:


> Combined with sharpening, image quality is pretty dang good I'd say. DLSS 1.0 was terrible, though, glad they've improved the implementation leaps and bounds with 2.0++.



Yeah, sadly many people still spread misinformation based on what they read about DLSS 1.0 back in the day.


----------



## nguyen (May 7, 2021)

las said:


> It's not bad, pretty much best in class for TVs, but I just feel that my Asus PG279Q is faster and more responsive than both my C9 and G1 OLEDs; keep in mind that I use 165 Hz.



The Asus PG279Q feels more responsive because you are getting higher FPS at 1440p than at 4K; once you get the same FPS at 4K as you do at 1440p, the responsiveness will be the same (or better on the LG OLED).

I have been playing PUBG on my LG CX 48in and the advantages of having a massive and responsive screen are undeniable. Before, I had the LG 34GN850-B, which is 3440x1440 160 Hz, and the LG CX is just superior in every way. And yes, I play PUBG with low settings, which gives me the same FPS at 4K/1440p because of a CPU bottleneck.
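The frame-time side of that argument can be made concrete with a rough latency sketch. The model (one frame of render time plus the panel's input delay) is a simplifying assumption, and the panel figures are the ones quoted earlier in this thread, not measurements of mine.

```python
# Rough sketch of perceived display latency: one frame of render time
# plus the panel's input delay. Panel delays are the figures quoted in
# this thread (assumptions for illustration, not my own measurements).

def display_latency_ms(fps, panel_delay_ms):
    """Approximate latency in ms: frame time at a given FPS plus panel delay."""
    return 1000.0 / fps + panel_delay_ms

# At the same 120 fps, the CX in PC mode (6.7 ms) and a 240 Hz IPS
# (6.6 ms) end up within ~0.1 ms; the ~8.3 ms frame time dominates both.
print(display_latency_ms(120, 6.7))  # LG CX in PC mode
print(display_latency_ms(120, 6.6))  # 1440p 240 Hz IPS at the same fps

# Running the smaller screen at a higher FPS is what actually cuts latency:
print(display_latency_ms(200, 6.6))
```

So if both screens hold the same frame rate, the panels are effectively tied, which is the point being made above.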



las said:


> Yeah, sadly many people still spread misinformation based on what they read about DLSS 1.0 back in the day.



If anyone thinks DLSS is the same as upscaling, show them this:


----------



## phanbuey (May 7, 2021)

nguyen said:


> uh, you are supposed to change the input icon on the OLED TV to PC mode to reduce the input delay. After changing to PC mode, the input delay on LG CX is 6.7ms and only 5.3ms on the C1, mind you even the best 1440p240hz IPS screen have 6.6ms input delay
> 
> View attachment 199547
> 
> ...




^THIS

I have the Samsung G7 27" sitting in my closet and I'm gaming on my 48" TV instead... the lag difference is perceptible but very, very small, whereas the difference in visuals and gaming experience on a giant 4K 120 Hz screen vs a tiny, pixelated 27" at 240 Hz - no contest, the TV is better.

Just wish I knew that within the return period :/

As far as DLSS 2.0 Image quality - it is incredible.  I've used it primarily in Cyberpunk to game at high FPS at 4k and the differences are imperceptible, but the framerate difference is absolutely massive.  DLSS is an absolute must for me at 4K, not negotiable, I wouldn't be able to have this amazing gaming experience without it.


----------



## nguyen (May 9, 2021)

Well, getting to play Sam's Story Enhanced for a couple of hours, the performance of DLSS is superb (+70% with 4K DLSS Balanced), but some weapons produce trailing artifacts that are noticeable in dark areas.
Lighting looks flawless in the Enhanced edition; there is no fake point light to be seen.


----------



## wolf (May 10, 2021)

nguyen said:


> Lighting looks flawless in the Enhanced edition


It's blowing me away honestly; this game truly looks a generation apart from most others now, and my framerates are fantastic with DLSS Quality at 3440x1440.


----------



## robot zombie (May 10, 2021)

I'm just using it for the first time on Control. It definitely improved a lot. Very sharp and not smudgy. The dark haloing is still evident, but it's so much cleaner looking, I'd put it down on the level of artifacting you expect from any moderate or heavy AA. Edges are distinctly darker. No ghosting/flickering or anything like that, though. Things like film grain and motion blur still goof it up, you'll start to see the trails in screen space stuff and 'film grain ray confusion' going on. No biggie. This game already has so much temporal aliasing and volumetric light clumping artifacts, film grain has always been the last thing it needed.

I'm impressed. The performance at 1080p is amazing. My 2060 has struggled to hold 60 below native in certain areas, with any RTX effects on. Old DLSS would let me enable a few, but had a lot of fringing, moire, vanishing lines, so on. It  just didn't look the best. It could be okay, but you have too many '...ugh' moments. I'm seeing... well I haven't noticed any yet! I'm sure it's there and I haven't seen the right thing. But before, I would try to turn it on and go back to running at 768p and letting the MSAA pick up the slack. It ran better and looked at least more consistent. That no longer looks as good. I swear, it's pulling a little extra detail out of some images over say, native 1080p with no RTX enabled and maxed settings. Certain textures and details in the edges just pop better. And the performance and appearance of traditionally-processed native res is now surpassed by the DLSS 2.0 in some ways. It's going to be a give and take. I prefer it with the DLSS. Especially with the RT in the mix. That makes it totally worth it. Man.

I mean, really... the main visual benefit is that I now run ALL of the RTX goodness with my settings almost maxed. And my FPS doesn't go below 70 for the most part. I'm running a 165hz monitor at half sync. So I cap out at 82/3 FPS, which is where it tends to stay outside of the absolute heavy zones. The mailroom area is one... with RTX effects it just munches frames ruthlessly. Used to be a 40-50 FPS type of spot for me. Now it's more like 80! What? It _will_ dip as low as 70 at times. But I'm not concerned about that because I have an inkling that frametimes are massively better overall. 1% lows have to be looking good, because even as I'm watching the frame rates fluctuate on the counter, the images remain VERY smooth. Used to be there would be stutter I could see and even feel, even holding flat numbers. But with DLSS 2.0 I'm staying over the waterline and it seems to be keeping the loads consistent enough to avoid stutter, even with 3-7FPS jumps happening.

So yeah... this is pretty great. It's amazing how far this stuff has come. For all of Nvidia's crap, this 2060 keeps on chuggin in my 1080p setup. I'm pleased with this. Metro should be interesting. Apparently this is kind of a big deal lol.


----------



## wolf (May 10, 2021)

robot zombie said:


> I swear, it's pulling a little extra detail out of some images


Appreciate your extended thoughts!, for the part I've quoted specifically, this is one part that blows my mind when I see it, some parts of the image in Control and Metro EE are wildly sharp and detailed with DLSS quality, switch back to Native and that 'wow-factor' or 'pop' disappears.

I agree too there are some possible shortfalls, as you mentioned, but nothing really that you don't already see with other AA methods, or that isn't just an artefact of the engine/rendering anyway. DLSS cops a lot of scrutiny, and rightly so, but it's not as if images were pixel-perfect or free of drawbacks with other AA methods anyway. The list of extra positives, like performance and fine details, more than outweighs any negatives imo.


----------



## robot zombie (May 10, 2021)

wolf said:


> Appreciate your extended thoughts!, for the part I've quoted specifically, this is one part that blows my mind when I see it, some parts of the image in Control and Metro EE are wildly sharp and detailed with DLSS quality, switch back to Native and that 'wow-factor' or 'pop' disappears.
> 
> I agree too there are some possible shortfalls, as you mentioned, but nothing really that you don't already see with other AA methods, or that isn't just an artefact of the engine/rendering anyway. DLSS cops a lot of scrutiny, and rightly so, but it's not as if images were pixel-perfect or free of drawbacks with other AA methods anyway. The list of extra positives, like performance and fine details, more than outweighs any negatives imo.


I first got it in my head that DLSS could 'recover' details lost to render methods in the game's base pipeline when I saw what it did to CP2077. I can't even run that game without DLSS. And I wouldn't want to, because it actually doesn't look as good. 4K is where people say it has the most noticeable effect, but I see it at 1080p too. There are losses there as well. But it essentially functions as performance-giving AA, so what do you want? The game looks really good with it!

Everything comes with compromises. This is absolutely a net improvement, though. I ticked-off half sync and I can pass 100fps in some spots if I drop down below 720p. For the most part, it's hardly different. 540p starts to get noticeable with things in the distance. And you will see the more contrasted edges "marching" as you move the camera. It's subtle, as though for a frame or two it's still resolving the lower-res source frame. I almost missed it. Not unlike TAA, though with places where you have a lot of fine details partially blocking lights the matrix effect comes through. Most times, you have to look for it - it's not quite like TAA ghosting. More like little jitters.

Honestly, it still looks impressively good at 540p! You'd think it would be completely unusable but outside of a few situations it is as good as the rest for another 20FPS. To me, anyway.

This stuff is pretty cool. This might really be the future, though I'd really like for it to kinda be an industry standard that is available to everyone. It would change _everything_ if only that were true! You can get a lot of things happening with the amount of grunt that frees up. And then as cards get faster and this hardware matures... yeah. If these methods were to get absorbed into everything, we'd be looking at clearing some serious graphical hurdles. I am all for making that power available if it means future games can try previously nonviable graphical things. Less between you and the visual art. More the visual artists can bring to you.

I also really appreciate the value this is bringing to my 2-year-old midrange card. I'm actually getting markedly better performance out of it than when I first got it on some big games! How often does that really happen on such a dramatic level? When I bought it, it wouldn't have been able to run CP2077. Control was pushing it to the absolute limit of 1080/60, with heavier compromises setting in.


----------



## wolf (May 10, 2021)

robot zombie said:


> Honestly, it still looks impressively good at 540p! You'd think it would be completely unusable but outside of a few situations it is as good as the rest for another 20FPS. To me, anyway.


I am legitimately blown away by how good "ultra-performance" mode looks, which on my monitor is an input resolution of 1148x480. It's remarkably sharp given the input, and side by side against that as a custom resolution there really is no comparison at all; the DLSS version looks phenomenal. The only reason I find it unusable in Metro EE is because at that resolution, no RT setting has a high enough sample count to make shadow noise acceptable - it's a crawling, dancing mess. But even the fine detail is simply astonishing to my eyes given what is happening under the hood.

I also agree on the value it has added to an older / slower card, if adoption stays strong or improves drastically, it can really make weaker cards (relative to age, the game etc) keep their head above water for much longer. Especially because, as we see almost universally with DLSS, the lower the starting framerate, the bigger the benefit you get. Like if you start at 100fps on average @ native res, you might see something like quality 120fps, balanced 135fps, performance 150fps. But if you start from 30fps, you might see something more like quality 50fps, balanced 70fps, performance 90fps. So as you can see the % uplift is massively more when the fps starts lower, which in my eyes is all the more incentive to crank up the details.
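The scaling described above can be sketched in a few lines. These are the hypothetical framerates from this post, purely illustrative, not benchmarks:

```python
def uplift_pct(native_fps, dlss_fps):
    """Percentage FPS gain of a DLSS mode over native rendering."""
    return (dlss_fps / native_fps - 1) * 100

# High starting framerate (100 fps native): modest relative gains.
for mode, fps in [("quality", 120), ("balanced", 135), ("performance", 150)]:
    print(mode, f"{uplift_pct(100, fps):.0f}%")   # 20%, 35%, 50%

# Low starting framerate (30 fps native): much larger relative gains.
for mode, fps in [("quality", 50), ("balanced", 70), ("performance", 90)]:
    print(mode, f"{uplift_pct(30, fps):.0f}%")    # 67%, 133%, 200%
```

Same absolute-ish numbers, wildly different percentages - which is the incentive to crank the details up.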


----------



## Mussels (May 10, 2021)

shoulda put a poll, but i guess the consensus is "the new DLSS is pretty good"


----------



## wolf (May 10, 2021)

Mussels said:


> shoulda put a poll, but i guess the consensus is "the new DLSS is pretty good"


Couuuuld have put a poll, but I find the answers a bit restrictive, which is why I gave some suggestions to base answers around, and most people already just dived into more extended thoughts. There's also a considerably higher chance that someone who doesn't fit the request of the thread could just vote and leave, if I've understood the way the polls work.

I'm quite enjoying reading people's thoughts on it. I won't sit here and pretend DLSS 2.0 is perfect, but I do think this thread has shielded us from needless ragging on it from users who are perhaps against NVIDIA, the technological foundations/limitations _on paper_, proprietary technologies, adoption etc. The comments here are indeed quite positive overall, but we're also exploring the nuances of performance and image quality more so than just DLSS = good/bad.


----------



## Mussels (May 10, 2021)

i'm just excited for the fact low end hardware (and LAPTOP hardware) is going to be useful many years later, thanks to DLSS


----------



## robot zombie (May 10, 2021)

I'm really underselling how good it can look. I didn't wanna say too much... I'm collecting screenshots. But I was really awestruck. I'd stop, look around, and just chuckle at how amazing it looked. The sights are really something with all of the RTX enabled.

I kinda thought I'd check it out... see how it really looks and runs. Now, I feel like I have to play it through. Far and away the best experience I've had with the game. It looks great and runs smooth. Combat never looked or felt so good. Stoked to get home and play.


----------



## toilet pepper (May 10, 2021)

I just tried Metro EE yesterday and DAYUMN it looks good and runs smooth. I wish Cyberpunk ran as well as this. Those are the 2 games I've seen where my 3080 could stretch its legs.


----------



## wolf (May 11, 2021)

toilet pepper said:


> I just tried Metro EE yesterday and DAYUMN it looks good and runs smooth.


What monitor/res are you running? Was it you who has an LG CX OLED? Keen to hear what DLSS setting you use if you game on a 4k panel.

For my 3440x1440 output resolution, the "Quality" setting has almost the exact same total input pixel count as "Performance" does for a 4k output (2280x960 = 2.19MP, 1920x1080 = 2.07MP), so I would imagine this is why DF found that performance already looked so damn good.
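The comparison above can be sketched from the commonly cited DLSS render-scale factors (Quality ~2/3, Balanced ~0.58, Performance 1/2, Ultra Performance 1/3 of each output axis). Exact rounding varies slightly per game, which is why the post's 2280x960 differs by a few pixels from the straight arithmetic:

```python
# Approximate per-axis render scale for each DLSS 2.0 mode (per-game rounding varies).
SCALE = {"quality": 2/3, "balanced": 0.58, "performance": 1/2, "ultra_performance": 1/3}

def input_res(out_w, out_h, mode):
    """Approximate internal render resolution for a given output resolution and mode."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

def megapixels(w, h):
    return w * h / 1e6

uw = input_res(3440, 1440, "quality")       # ~2293x960 for 3440x1440 Quality
uhd = input_res(3840, 2160, "performance")  # 1920x1080 for 4K Performance

print(uw, round(megapixels(*uw), 2))    # ~2.2 MP
print(uhd, round(megapixels(*uhd), 2))  # ~2.07 MP
```

So ultrawide Quality and 4K Performance feed DLSS almost the same number of input pixels, which lines up with DF's observation.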


----------



## toilet pepper (May 11, 2021)

wolf said:


> What monitor/res are you running? Was it you who has an LG CX OLED? Keen to hear what DLSS setting you use if you game on a 4k panel.
> 
> For my 3440x1440 output resolution, the "Quality" setting has almost the exact same total input pixel count as "Performance" does for a 4k output (2280x960 = 2.19MP, 1920x1080 = 2.07MP), so I would imagine this is why DF found that performance already looked so damn good.


I'm at 3440x1440 @100hz as well, bud. I haven't checked if my eyes can tell the difference between the DLSS settings though.


----------



## wolf (May 11, 2021)

toilet pepper said:


> I'm at 3440x1440 @100hz as well, bud. I haven't checked if my eyes can tell the difference between the DLSS settings though.


Cheers. I've found each setting below Quality to become very subtly softer, yet they still manage to produce fine detail excellently. My only real issue with Ultra Performance mode was that shadows become quite noisy because of the RT sample count at that low an input resolution. I also looked for the weapon ghosting mentioned in this thread and I could see it, but I had to look for it; my eyes aren't usually focused on the weapon itself.


----------



## robot zombie (May 11, 2021)

Alright, some screenshots from Control. RTX High (all effects) with medium settings. Can't remember which are 540p and which are 720p, but it all scales up to 1080p. If these end up causing problems for some people, let me know and I'll see what I can do. I wanted to preserve as much information as possible, and these JPEGs are proving really bad at holding onto the details in dark images. I still wind up losing color information. These are full-size, as well. Zoom in to see the full detail. Just be wary that some areas will be fuzzy/splotchy due to JPEG compression and then whatever your browser does to the colors. A lot of the gradients are visibly squished by that.



Spoiler










There is a definite smoothing effect that can be seen here. The background looks very natural to me -- painterly, almost. This is a good concession... closer to what I wish AA would actually do. There are no jaggies here, and yet it's not exactly blurry. Soft, yet defined. Very easy to look at. It may not be "crisp," but all of those edges are _clean_ and it is just really striking to me. Something about this quality has a way of pulling back another layer of sheen. The details that count are emphasized, in a way. The image ends up looking more distilled. There's something more 'sophisticated' about it. If you look at the top of the door frame in the first one, or the sign in the second, you'll see those comb-shaped artifacts that are essentially the hallmark of DLSS. What's striking is how minor these are. Almost nothing in the image brings them out. In practice you only see a slight jitter instead of a full jaggy. These are the 'jitters' I was describing before.







Having a hard time on those bars. Notice how clean all of the other most visible edges are. It's only as they drop almost completely into the shadows that they begin to soften. The soft background can be seen past the bars, as well. It's the best treatment a LOD could ask for, almost mimicking a few of the things that come into play with optics out at those distances. There's a hazy diffraction effect... not quite a fog or gaussian blur. Again, a tactically selected simplification of the information - fewer distracting bits and more identifying cues.




Notice the comb-shaped distortion on some of the fine edges. A couple of low-contrast edges on the concrete are smudgy, though it's not so apparent because of the low contrast. Meanwhile the dark edges of the concrete contrasting against the harder light coming down are perfect. And yes, that gold-tone trim and those card slots do jitter a little bit.







Just classic shots, fairly sharp and clean all the way back. Full magic of RT effects here. Both of these have some very slight walking edges. I didn't even see them moving the camera around. Can you spot them? These are worth pointing out, as these are the very low-contrast gray transition edges that the RT diffuse light and contact shadows muddle up in obvious, gritty ways. Now instead of grit, there's a much subtler detail loss that's way tougher to spot, just warming up the image a little. I'm looking right at it as I type this. I wonder if anyone else sees it. You might notice it in a broader sense, in the way the light plays off of the materials.


It's subtle, but as you play you really see how much smoother the images are. You can also start to see where it struggles to get far off details accurately. It tends to be minor, unless we're talking about grates, which I'm not convinced the engine can do well enough to ever hold together at any distance or angle. DLSS 2.0 has the most trouble with fine lines at steep angles in this game. Those will begin to 'march' as the camera moves. Sometimes you will see minor comb artifacts. But all in all it just looks more visceral... less... video-gamey. Listen... RTX is still pretty inaccurate right now and the effects have artifacts... this etch and harshness that makes things look exaggerated. I think it so happens that DLSS 2.0 gently goes in and destroys only that crud... like laser surgery. The only major consistent downside I am seeing is color distortion and even slight banding. But it's so minor... the compression on these screenshots exaggerates it.

Screenshots really can't do it justice. It's hard to explain the experience. There's a fluidity to everything, like everything in the image just congeals. It's kind of spooky.


Spoiler: few more















Am I crazy for thinking there's some kind of magic here? If you really look on a still image, you can definitely see the distortion. But it stays in non-critical areas and you don't have the luxury of acquiring it in-game because it is temporal. Also worth noting, turning on all of the RT effects like this causes considerable detail loss even without DLSS... just piling on inaccuracies. I used to avoid the diffuse light (global illumination?) for that very reason... a tendency to etch out surfaces and transitions and at times, introduce horrible shimmer. Those losses appear significantly reduced, everything punches through the RT better and I don't experience that distracting harshness.

I'll have to try comparing in areas where the detail loss with all of the RT is heaviest. But the best way I can describe DLSS 2.0 in this game is that it 'completes' its RTX suite. The effects look better with it on -- far less dirty. The whole game was designed to have RTX effects play off of almost every material in a variety of ways. It's a part of the entire visual design concept with the base stuff being traditional stand-ins. And the architecture in this game is high art -- the things they do with the presentation of materials are amazing. But it was always a bit held back by the huge performance hit and the significant image quality loss introduced by heavy RT processing. Even if you could run it without DLSS, you had to make that call as to whether you wanted a crisp, but plainer image or a noisy/splotchy, but much more dramatic and impactful image.

Now, I feel like if you have an RT card... this is the way to run it. DLSS on with as many RT effects as possible. Can probably turn down regular settings. They'll make less of a visual difference at such a low render resolution. And I mean... almost any RTX card can run it. My 2060 will do it all in 1080p at a steady 80+ fps. The majority of compatible cards are better. I run DLSS 2.0 at the middle resolution (600-odd pixels), turn everything down to medium except for texture quality (keep on high) and texture filtering (leave on low), turn SSAO on, turn grain/motion-blur off, and turn on all of the RT effects.


I'm really appreciating what a work of art Control is now. It's really incredible, what they managed to do with these environments. When you get the RT going in these places that were built to feature it and cut out the distractions with the image quality, it just beams into your head how stunning these combinations of colors, materials, and mischievous geometry really are. I feel like I'm finally seeing it as its creators wanted it to be seen. A bit late, but it really is something to experience for yourself! My eyes are popping out of my head half of the time, just trying to take it all in.


----------



## las (May 11, 2021)

nguyen said:


> You are feeling more responsive with the Asus PG279Q because you are getting higher FPS at 1440p than 4K, once you are getting the same FPS at 4K as you do on 1440p, the responsiveness will be the same (or better on the LG OLED).
> 
> I have been playing PUBG on my LG CX 48in and the advantages of having a massive and responsive screen are undeniable. Before I had the LG 34GN850-B screen which is 3440x1440 160hz and the LG CX is just superior in every way. And yes I play PUBG with low settings, which give me the same FPS at 4K/1440p because of CPU bottleneck.
> 
> ...



A massive screen is not what you want for shooters; pretty much all pros go as small as possible to keep focus high. If a huge TV was better, they would be using that. I perform way better on my 27 inch in shooters, it's not even close. Using a big screen for fast paced shooters feels clunky and you have to turn your head at times, which is NOT OPTIMAL at all. For casual gaming it's fine, which is what I use my OLED for (single player, 3rd person and simulators mostly).

Besides, 120 Hz is nothing these days; most serious players use 240-360 Hz now, especially in shooters.

Using an OLED for shooters with a HUD for hours and hours will result in burn-in if you don't limit brightness / OLED LIGHT. When you do this, it is a lot less suited for daytime usage and will suck in shooters as a result. Enemies can hide in dark spots with ease; you won't see them.

I don't have burn-in on my C9, even after 1000+ hours of gaming, because I vary games and content and use a screensaver when it's hooked up to my PC. When you are using an OLED with a PC, you need to take precautions or you will see burn-in. If you play the same game for hours and hours every day, I'm pretty sure you will see burn-in pretty fast.

Burn-in is STILL a problem if you don't change up your content. It's better on 2018+ panels (mostly because of software tho, and ABL / Auto Brightness Limiter).
ABL can be VERY distracting when used with a PC tho.

I love OLED but it's not a perfect tech and never will be. When MicroLED is ready for the masses, OLED is going to die out pretty quickly, unless production cost has been lowered a LOT by then.
OLED probably has 5-10 more good years before MLED takes over.


----------



## wolf (May 12, 2021)

robot zombie said:


> Alright, some screenshots from Control.


Man those look fantastic, and all running well on basically the slowest RTX card.

Makes me want to go back to Control again for another playthrough, will have to wait till Metro is finished!


----------



## wolf (May 19, 2021)

DLSS 2.1 coming to No Mans Sky


----------



## dogwitch (May 19, 2021)

wolf said:


> DLSS 2.1 coming to No Mans Sky


and as people have played it: missing trees etc.


----------



## wolf (May 19, 2021)

dogwitch said:


> and as people have played it: missing trees etc.


Yeah, it's another example where the developer's implementation doesn't account for needing to change the LOD bias, which totally fixes the issue caused by rendering at a lower internal res. It's a quick fix for the user, but it should be baked into the DLSS setting; hopefully it gets addressed.
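For context on why the LOD bias fix works: texture mip levels are normally chosen for the internal render resolution, so an upscaler needs a negative bias to keep full-res texture detail. A commonly cited rule of thumb from NVIDIA's DLSS integration guidance is roughly log2 of the ratio between render and display width (a sketch, not the exact formula any given game uses):

```python
import math

def mip_bias(render_width, display_width):
    """Approximate negative texture LOD bias for an upscaled render resolution."""
    return math.log2(render_width / display_width)

# e.g. Performance mode at 4K output renders at 1920 wide:
print(mip_bias(1920, 3840))  # -1.0, i.e. sample one mip level sharper
```

When a game skips this, distant detail like foliage can drop out entirely, which matches the missing-trees report above.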

Also Metro Exodus EE is patched and fixes HDR+DLSS


----------



## dogwitch (May 19, 2021)

wolf said:


> Yeah, it's another example where the developer's implementation doesn't account for needing to change the LOD bias, which totally fixes the issue caused by rendering at a lower internal res. It's a quick fix for the user, but it should be baked into the DLSS setting; hopefully it gets addressed.
> 
> Also Metro Exodus EE is patched and fixes HDR+DLSS


It's a shame though. The games use awful HDR... but at the same time, you'd need $15k total for a calibrated TV, or half that on top of the cost of a 4k monitor that follows the HDR standard.


----------



## looniam (May 19, 2021)

Got a 3060 and a cheap 4k TV, and from my perspective Cyberpunk is a mess but Metro EE is a showcase. I bought the TV to watch 4k videos, and while most games (DLSS or not) are barely playable (IMO), ~35 fps is rather surprising - esp. considering past generations. Not a big fan of Control's artwork, though.

Totally looking to grab a 3070 Ti; I expect that to get 4k/60fps in (mostly) all games, DLSS or not.


----------



## dogwitch (May 19, 2021)

looniam said:


> Got a 3060 and a cheap 4k TV, and from my perspective Cyberpunk is a mess but Metro EE is a showcase. I bought the TV to watch 4k videos, and while most games (DLSS or not) are barely playable (IMO), ~35 fps is rather surprising - esp. considering past generations. Not a big fan of Control's artwork, though.
> 
> Totally looking to grab a 3070 Ti; I expect that to get 4k/60fps in (mostly) all games, DLSS or not.


Going to need an 80-class card or higher.


----------



## Mussels (May 19, 2021)

looniam said:


> Got a 3060 and a cheap 4k TV, and from my perspective Cyberpunk is a mess but Metro EE is a showcase. I bought the TV to watch 4k videos, and while most games (DLSS or not) are barely playable (IMO), ~35 fps is rather surprising - esp. considering past generations. Not a big fan of Control's artwork, though.
> 
> Totally looking to grab a 3070 Ti; I expect that to get 4k/60fps in (mostly) all games, DLSS or not.


Not with RTX on and ultra settings, it won't.

Keep in mind that a 3090 is pretty much required if you want 4K60 without compromising (DLSS counts there, too) - and even they can't do RTX on and maintain that.


----------



## wolf (May 20, 2021)

A 3060 really isn't a great match for 4k resolution above about 30fps, at least not in new and very demanding titles; most pre-2017 games you can probably crank up, though.

If you really want a great experience on that TV I'd also say a 3080 or higher should be at least where you start, and even then keep expectations in check.


----------



## Hachi_Roku256563 (May 20, 2021)

I do wonder if one day this technology can be used on lower end cards,
say the future equivalent of a GT 1030 that performs like a 1060,
'cause if the 1060 had DLSS it would legit be amazing.


----------



## looniam (May 20, 2021)

Mussels said:


> *not with RTX on and ultra settings, it wont*
> 
> Keep in mind that a 3090 is pretty much required if you want 4K60 without compromising (DLSS counts there, too) - and even they cant do RTX on and maintain that


Yeah, it does.

Look, I'm not exaggerating, the little guy is very surprising.


----------



## Mussels (May 20, 2021)

looniam said:


> Yeah, it does.
> 
> Look, I'm not exaggerating, the little guy is very surprising.
> View attachment 200988


that... backs up what I said? The average is still below 60.


----------



## wolf (May 20, 2021)

DLSS Ultra performance might get you there... what DLSS setting was used for those ~50's FPS runs?

50+ could be OK if the TV has HDMI 2.1 and variable refresh (gsync/freesync)


----------



## looniam (May 20, 2021)

Mussels said:


> that... backs up what I said? The average is still below 60 fps.


I never said it didn't fall below the magical 60fps.



wolf said:


> DLSS Ultra performance might get you there... what DLSS setting was used for those ~50's FPS runs?


Uber or ultra... it looks better in Metro than Cyberpunk, imo.

E: let's be clear here; I'm not suggesting a 3060 is a 4K card, just that it's surprising.


----------



## nguyen (May 20, 2021)

Just tried out Outriders today and the DLSS implementation in this game is extremely good; the game looks very sharp, so I don't need to use the sharpen filter, and there is no visible DLSS artifact anywhere to be seen.
Seems like games made with Unreal Engine have motion vectors tied into everything, so DLSS just works outta the box, which is just awesome.





Left is Native 4K and Right is 4K DLSS Balanced; the character's hair looks even better with DLSS, with higher detail and better anti-aliasing.


----------



## dogwitch (May 20, 2021)

nguyen said:


> Just tried out Outriders today and the DLSS implementation in this game is extremely good; the game looks very sharp, so I don't need to use the sharpen filter, and there is no visible DLSS artifact anywhere to be seen.
> Seems like games made with Unreal Engine have motion vectors tied into everything, so DLSS just works outta the box, which is just awesome.
> 
> View attachment 200991
> ...


Those are sub-2k assets right there on the left. I can even tell in the pic.


----------



## nguyen (May 20, 2021)

dogwitch said:


> Those are sub-2k assets right there on the left. I can even tell in the pic.


Really? Well, I can't upload the full size 4K image because it exceeds the TPU size limit (12MB for a single image),
so here is half of the Native 4K image



Converting to lossy JPEG would have blurred out the image.


----------



## wolf (May 20, 2021)

nguyen said:


> Left is Native 4K and Right is 4K DLSS Balanced, the character's hair look even better on the DLSS with higher details and better anti-aliasing


Another very strong showing for DLSS, perhaps the best yet even? Really nice for the balanced setting too, actually.


----------



## dogwitch (May 20, 2021)

nguyen said:


> Really? Well, I can't upload the full size 4K image because it exceeds the TPU size limit (12MB for a single image),
> so here is half of the Native 4K image
> View attachment 201007
> Converting to lossy JPEG would have blurred out the image.


Interesting. The light bouncing off a lot of that is in the wrong direction, if you look at the green area to the left of the player, and the top right too. But AI can only do so much.


----------



## X71200 (May 20, 2021)

In my experience, stuff gets more blurry in Warzone, but performance skyrockets compared to other methods of AA. As such, I've been using it since it was implemented.


----------



## wolf (May 21, 2021)

dogwitch said:


> the light bouncing off a lot of that is in the wrong direction


Not sure any of that would be the fault of DLSS?


----------



## dogwitch (May 21, 2021)

wolf said:


> Not sure any of that would be the fault of DLSS?


Partly due to it having to fake data in the scene for a higher res and fps.


----------



## wolf (May 21, 2021)

dogwitch said:


> Partly due to it having to fake data in the scene for a higher res and fps.


Well, the 'fake' data is just an upscale of the lower res image, so in theory, if the lighting is wrong at 4k native, it would also be wrong at 1080p native and at any DLSS setting; there should be no effect on lighting direction in a scene. Do you mean it's correct in the native image and not correct in the DLSS image? If that's what you mean, I'm not seeing it.


----------



## Space Lynx (May 21, 2021)

Mussels said:


> The higher the res, the better DLSS works.
> 
> I genuinely did not want a 4K monitor when i bought my current one due to low FPS, but knowing DLSS is spreading so much (its becoming part of the unity engine!) it makes a 4K monitor more likely in the future



A 4k monitor makes sense too, since so many games are capped at 30 or 60 fps, and modern GPUs can play older games at native 4k 30/60 fairly easily. I for one am waiting to replay Final Fantasy X when I get a 4k screen... so far no 4k screens impress me. I want a 32" 4k IPS or OLED (never doing VA again, never ever). I'm leaning towards just getting a 4k OLED TV in a year or two... hopefully they have decent 40" OLED 4k panels from LG in like two years... that's my main hope. I wouldn't mind sitting back a little further at my desk for that.


----------



## nguyen (May 21, 2021)

lynx29 said:


> A 4k monitor makes sense too, since so many games are capped at 30 or 60 fps, and modern GPUs can play older games at native 4k 30/60 fairly easily. I for one am waiting to replay Final Fantasy X when I get a 4k screen... so far no 4k screens impress me. I want a 32" 4k IPS or OLED (never doing VA again, never ever). I'm leaning towards just getting a 4k OLED TV in a year or two... hopefully they have decent 40" OLED 4k panels from LG in like two years... that's my main hope. I wouldn't mind sitting back a little further at my desk for that.



How about trying out the 48in CX? They are pretty cheap now, around $1000, and the gaming experience they offer is far ahead of any 1440p IPS screen.


----------



## wolf (May 21, 2021)

I feel like I read somewhere there's going to be a 42" 4k LG OLED this year with all the juicy specs we want. Could be even better suited to gaming? I'd 100% rock the 48" already though if I was in the market, but I haven't even had my 34" 3440x1440 144hz Ultrawide for 12 months yet, which also feels like a great fit for a 3080 class GPU. DLSS where applicable though can really make or break the 4k experience. I have a HDMI 2.0 4k60 TV now, but it's just that much harder without HDMI 2.1 and VRR to tune the experience and cap 60fps and NEVER drop below to have it be seamless, VRR does so much heavy lifting.


----------



## dogwitch (May 21, 2021)

wolf said:


> Well, the 'fake' data is just an upscale of the lower res image, so in theory, if the lighting is wrong at 4k native, it would also be wrong at 1080p native and at any DLSS setting; there should be no effect on lighting direction in a scene. Do you mean it's correct in the native image and not correct in the DLSS image? If that's what you mean, I'm not seeing it.


The lighting is just wrong in the image.


----------



## wolf (May 21, 2021)

dogwitch said:


> The lighting is just wrong in the image.


Fair enough, so it's completely decoupled from DLSS image quality then, as the aberration is present no matter the resolution.

It does look unnatural to me I'll admit, but I've been playing quite a few RTX games lately, especially Metro, which is so natural that at select times it's almost drab... like a cloudy day outside actually is.


----------



## nguyen (May 21, 2021)

dogwitch said:


> The lighting is just wrong in the image.



Well Outriders doesn't support RayTracing, only DLSS.



wolf said:


> I feel like I read somewhere there's going to be a 42" 4k LG OLED this year with all the juicy specs we want. Could be even better suited to gaming? I'd 100% rock the 48" already though if I was in the market, but I haven't even had my 34" 3440x1440 144hz Ultrawide for 12 months yet, which also feels like a great fit for a 3080 class GPU. DLSS where applicable though can really make or break the 4k experience. I have a HDMI 2.0 4k60 TV now, but it's just that much harder without HDMI 2.1 and VRR to tune the experience and cap 60fps and NEVER drop below to have it be seamless, VRR does so much heavy lifting.



4K 60Hz LED TVs have horrible input delay, so your UW 1440p screen should be a better option.
Yeah, I hope LG starts making 42in OLED TVs soon.


----------



## X71200 (May 21, 2021)

Somebody I know picked up the 55" version of that LG TV; he has a 3090 attached to it and he says it's just amazing. He doesn't even use DLSS - everything maxed and still around 120 FPS in Warzone. You do need to stay at least a good meter away from those TVs to make the experience worthwhile, though. To me, however, it's still too massive. 42" is generally reserved for lower end TVs, so I wouldn't expect OLED to come there too soon. They did make a 32 inch OLED monitor, but it's overly expensive and aimed at design purposes.


----------



## dogwitch (May 22, 2021)

wolf said:


> Fair enough, so it's completely decoupled from DLSS image quality then, as the aberration is present no matter the resolution.
> 
> It does look unnatural to me I'll admit, but I've been playing quite a few RTX games lately, especially Metro, which is so natural that at select times it's almost drab... like a cloudy day outside actually is.


It's just the way my eyes see; I spot odd or out-of-place lighting.
The best use of pro-level global illumination was the endgame suits - fully tricked my eyes.


----------



## robot zombie (May 23, 2021)

I'm going in blind and have barely gotten to the meat of the game, so I hope I'm not being premature... but the things I'm seeing in Metro Enhanced... am I crazy for calling it a transformation? I gotta see what went into this. Not only is the performance absurdly better (I laughed out loud when the game started and I saw 134FPS outta my 2060 as the camera panned down on the Moscow wastes), but visually it looks totally different. You almost don't notice at first, but after a few minutes of looking at stuff, you realize EVERYTHING is different. And much cleaner. Very cool stuff. Really liking it so far.

Spoke too soon. CTD's are back in the same fashion as in the past. This one seems to have eaten my save, because it forgot 2 hours of progress. It's either that or the save switcheroo phenomenon is back. I don't feel like sifting.

It does look absolutely fantastic though, heh. ALL of the RTX effects are pretty significantly expanded and higher in quality. Full GI on the lighting now, and it makes a huge difference, especially for interiors. It does make, say, that one light in a dark space appear brighter, but it's not distracting because the colors are now more natural, even in the shadows cast around it. All in all the visual quality shoots up by pretty much a full generation. This looks like a different era. I always found Metro visually impressive, with some nice tricks. But graphically I thought it was just well done... not standout, but good. Of its time, still. Now, it really looks like the future. I feel comfortable saying that. If they can just get it stable, that would be nice!


----------



## Hardcore Games (May 23, 2021)

I do not have a lot of games with DLSS support, but that will change over time. Sadly the rip-off bitcoin saga is still stiffing corporations galore.


----------



## wolf (May 26, 2021)

Woop woop, got my 5900x today; it should go in tonight. There are areas of Metro Exodus EE that seem very CPU limited where I drop below 60fps (as GPU usage plummets to 40-50%). Any ideas what sort of improvement I can expect?


----------



## nguyen (May 26, 2021)

wolf said:


> woop woop, got my 5900X today, should go in tonight. There are areas of Metro Exodus EE that seem very CPU limited where I drop below 60fps (as GPU usage plummets to 40-50%); any ideas what sort of improvement I can expect?



I would expect about 15% higher on the 1% low FPS, which is more important than AVG FPS anyway. 

Also I just tried out Amid Evil; 4K DLSS Balanced brings about a 100% perf improvement (~50fps with maxed RT and 100fps with DLSS Balanced). The game looks quite modern with the inclusion of ray tracing.


----------



## Taraquin (May 26, 2021)

I use a 1080p display, and there is a slight reduction even in Quality mode in Cyberpunk vs no DLSS, but the performance bump makes it worth it. I think DLSS 1.0 is garbage; BFV and SOTTR were a disgrace. But the semi-2.0 mode in Control and full 2.0 in Cyberpunk are impressive, though at 1440p or above it makes more sense than at 1080p.


----------



## nguyen (May 26, 2021)

Taraquin said:


> I use a 1080p display, and there is a slight reduction even in Quality mode in Cyberpunk vs no DLSS, but the performance bump makes it worth it. I think DLSS 1.0 is garbage; BFV and SOTTR were a disgrace. But the semi-2.0 mode in Control and full 2.0 in Cyberpunk are impressive, though at 1440p or above it makes more sense than at 1080p.



You can try Image Sharpening in NVCP to make the image sharper, with or without DLSS. With a 24in 1080p screen I would use around 30% sharpening without DLSS and 50% with DLSS in CP2077. 
This is due to the nature of the TAA implementation in CP2077, which makes moving characters look soft even without DLSS.


----------



## Taraquin (May 26, 2021)

nguyen said:


> You can try Image Sharpening in NVCP to make the image sharper, with or without DLSS. With a 24in 1080p screen I would use around 30% sharpening without DLSS and 50% with DLSS in CP2077.
> This is due to the nature of the TAA implementation in CP2077, which makes moving characters look soft even without DLSS.


Yeah, I do that, but I still feel it looks better without DLSS even at Quality. The difference is small while the performance bump is large, though, so it's still worth it.


----------



## Mussels (May 26, 2021)

After more farting about, i've decided on my final opinion on DLSS, which is law and must be obeyed at all times or i feed you to my fancy crab with his cute hat

Run the settings you want. Happy with FPS? leave it.
Unhappy with FPS? Try turning DLSS on, before lowering other settings. You may like it.



Oh phew, that was hard.


----------



## nguyen (May 26, 2021)

Mussels said:


> After more farting about, i've decided on my final opinion on DLSS, which is law and must be obeyed at all times or i feed you to my fancy crab with his cute hat
> 
> Run the settings you want. Happy with FPS? leave it.
> Unhappy with FPS? Try turning DLSS on, before lowering other settings. You may like it.
> ...



Hell nawh, screw your Law   . I just set everything to Ultra + DLSS Balanced and game.


----------



## Mussels (May 26, 2021)

nguyen said:


> Hell nawh, screw your Law   . I just set everything to Ultra + DLSS Balanced and game.


*banned*


----------



## wolf (May 31, 2021)

It's interesting how much the 3700X to 5900X upgrade has increased my FPS across the board, but _especially_ in DLSS games. I guess the lower input resolution puts you more into CPU limited territory. Every single game I play atm plays better with this CPU, it's a much better match for a current-gen high-end card.


----------



## robot zombie (May 31, 2021)

wolf said:


> It's interesting how much the 3700X to 5900X upgrade has increased my FPS across the board, but _especially_ in DLSS games. I guess the lower input resolution puts you more into CPU limited territory. Every single game I play atm plays better with this CPU, it's a much better match for a current-gen high-end card.


Interesting. I could see that. 2600 to 3900X was quite an uplift playing at 1080p, especially for high refresh. There was a real bottleneck with CPU utilization. I haven't watched it much running these new DLSS titles, but there are some minor bottlenecks that will knock 30% of the frames off. Usually areas with a lot of actors or places you'd expect to have a lot of scripts running, or central elevation points. I can probably count them on one hand. Both Metro and Control had many more of them before all of this, but a few lesser stragglers remain.

However, I do generally get a GPU bottleneck coming out on top. Of course, this is a 2060 we are talking about in my case. Less straight raster cojones, less RT moxie. Any frame rate limitations I hit can be alleviated by lowering the base resolution, and utilization points to that as well. So we're probably both just in our own sweet spots. Of course, utilization doesn't tell the whole story. I could have a 'secondary' bottleneck with CPU utilization itself. There could still be more gains with a faster CPU.

Though come to think of it... if I shut off RT and just leave Control running only 720p DLSS, it is redlining at 165fps. Tried that down in the quarry. Other areas may not fare as well. Not to mention, the 3900X was a higher-bin, generally higher-boost chip IIRC. Might just be enough. Probably big things I'm missing there.

Honestly though, it's hard for me to gauge those things with this card at this point... namely because it only has 6gb of vram.


----------



## wolf (May 31, 2021)

robot zombie said:


> So we're probably both just in our own sweet spots


It does sound that way, I'd wager you're far more in GPU limited territory which is where you want to be ideally IMO. Truly amazing how a 2060 is performing with DLSS on.


----------



## dogwitch (May 31, 2021)

wolf said:


> It does sound that way, I'd wager you're far more in GPU limited territory which is where you want to be ideally IMO. Truly amazing how a 2060 is performing with DLSS on.


I mean, it lowers the res and such. That's why.


----------



## robot zombie (May 31, 2021)

dogwitch said:


> I mean, it lowers the res and such. That's why.


Oh, sure. But on this same rig, I was running as low as 540p with a little LumaSharpen just to get the 2 main effects running at a choppy 60fps. With DLSS I run at 720p with all of the RT enabled, and the lowest 'idles' I see are 70, with 80 being typical and a good chunk of areas going as much as 20fps higher than that. Bone stock with the old DLSS or regular scaling, I might get similar performance with no RT.

So definitely not nothing. Huge difference in consistency and the visuals are just on another level. No game has ever performed remotely this well on this card, save for ME Enhanced. It's really not even close, in terms of the settings you can run. For the most part, I'd usually consider this a 60fps card with some concessions. It generally won't do max settings at 60fps on these kinds of games.


----------



## Mussels (May 31, 2021)

wolf said:


> It's interesting how much the 3700X to 5900X upgrade has increased my FPS across the board, but _especially_ in DLSS games. I guess the lower input resolution puts you more into CPU limited territory. Every single game I play atm plays better with this CPU, it's a much better match for a current-gen high-end card.


The uplift from Zen 2 to Zen 3 is MASSIVE, and it's really hard to convince people of how big it is without going through it personally


----------



## nguyen (May 31, 2021)

Mussels said:


> The uplift from Zen 2 to Zen 3 is MASSIVE, and it's really hard to convince people of how big it is without going through it personally



Well, compared to the non-existent IPC and clock speed uplift from the Intel camp, the Zen 2 to Zen 3 uplift is considered a miracle   .


----------



## wolf (May 31, 2021)

Mussels said:


> The uplift from Zen 2 to Zen 3 is MASSIVE, and it's really hard to convince people of how big it is without going through it personally


It really is, and I can see that example being true too; a mate of mine doesn't believe that the CPU upgrade has given such a big boost... It's like, man, you gotta come see it in action. 

I'd wager even if I went from a 3700x to a 5600x I'd be seeing 95%+ the same thing as I have with the 5900x, but I couldn't bring myself to drop cores, or pay $699 AUD for the 5800x, 5900x at $859 gave me the right feeling.


----------



## robot zombie (May 31, 2021)

Off-topic... I'd love to get a 5900X but I am so hellbent on squeezing this rig because of its history. I've been running this Strix X370-F since basically the dawn of Ryzen. It saw Zen, Zen+, and Zen 2. I paid nothing for what it had on it (or what it would do). By luck the B350 I planned on using had problems and I wasn't willing to wait even though I sent it out. Grabbed a better board for less than the B350, which I later recouped in a build for a client (by then it was going for more, saved them some, and they were fine with the warranty situation). It has consistently gotten max performance, and even beyond what was commonly seen, on fairly inexpensive but versatile chips. The 3900X just sort of completed it, for what... $400 bucks? All other areas are covered well; I didn't skimp on the b-die or anything. I feel like it's reached a good apex, and that's something I think is worth appreciating for all that it's worth.

I'd sooner replace this card... but I really do think I got one of the best of the bunch in the Strix 2060. It clocks higher than anything I've seen in benchmarks but the odd outlier and quietly holds 55C at max utilization. And now games are suddenly starting to play better on it. I get to experience some RT for real, for real.

It just feels like the universe is telling me something there, you know? It runs more than well enough that I can continue to put funds elsewhere. Things like this just add to what comes out of what went in. Good usage is value added and all that. Still feels like a shame to upgrade. Especially right now


----------



## dogwitch (Jun 1, 2021)

I mean, at the moment the latest modern game I've got for the rig (the one in my specs profile) is DOOM Eternal.


----------



## robot zombie (Jun 1, 2021)

dogwitch said:


> I mean, at the moment the latest modern game I've got for the rig (the one in my specs profile) is DOOM Eternal.


You've got a really interesting rig there, too. Quite a few uses outside of doom eternal 

I can bite, though. The newest games are rarely on my list... I kinda know what I'm looking for and rarely pick them up. When I do, I tend to get a lot out of it because even putting in the time is a calculated decision. I'm totally in it by then. I'm a lifer with my games. More of my gaming time is spent playing somewhat older games that I just happen to love enough to play for years. Every now and again, I grab a new game that goes into that rotation. Another factor in upgrading. Easier decision now, two of the newest additions to my roto are Metro Exodus and Control. I knew on the first playthroughs... at launch. And I have never regretted the time. Love em both - will play until I attain encyclopedic knowledge on them. And would you look at that, they just got the pass to run better on my machine.

It's like a little ray of luck shining down. I really hope this propagates out, and that the process of training can eventually be sped up enough to get this shipping out on launch day. That's the ultimate.


----------



## dogwitch (Jun 1, 2021)

robot zombie said:


> You've got a really interesting rig there, too. Quite a few uses outside of doom eternal
> 
> I can bite, though. The newest games are rarely on my list... I kinda know what I'm looking for and rarely pick them up. When I do, I tend to get a lot out of it because even putting in the time is a calculated decision. I'm totally in it by then. I'm a lifer with my games. More of my gaming time is spent playing somewhat older games that I just happen to love enough to play for years. Every now and again, I grab a new game that goes into that rotation. Another factor in upgrading. Easier decision now, two of the newest additions to my roto are Metro Exodus and Control. I knew on the first playthroughs... at launch. And I have never regretted the time. Love em both - will play until I attain encyclopedic knowledge on them. And would you look at that, they just got the pass to run better on my machine.
> 
> It's like a little ray of luck shining down. I really hope this propagates out, and that the process of training can eventually be sped up enough to get this shipping out on launch day. That's the ultimate.


lol true.
tbh I'm trying to get through a multi-console backlog of games, PC too. Also really trying to thin out my bookmarked shows on Netflix a bit... I've maxed said bookmarks like 3 times now... so unless it's one hell of a modern game, it takes me a while.


----------



## wolf (Jun 1, 2021)

In the executive keynote at Computex, they just said DLSS is coming to RDR2.

DOOM Eternal also to receive RTX and DLSS


----------



## nguyen (Jun 1, 2021)

wolf said:


> In the executive keynote at Computex, they just said DLSS is coming to RDR2.
> 
> DOOM Eternal also to receive RTX and DLSS



I have been putting off DOOM Eternal for the RTX; well, that took a long-ass time.
RDR2 is still very popular, so DLSS is making big strides into ubiquity this year.


----------



## welly321 (Jun 5, 2021)

Honestly I've found it really depends on the game. I've used DLSS in Cyberpunk and Monster Hunter World on my ZOTAC 3060 Ti and it was awesome. I couldn't even notice the difference between DLSS-enabled 1440p and regular 1440p. However, in Final Fantasy XV, DLSS made it very blurry, especially distant objects.


----------



## dogwitch (Jun 6, 2021)

Going back to an LTT video: Linus mentioned they loaded up a custom DOOM DLSS profile for DOOM at 8K
that Nvidia gave them.
So I find it really odd it took them this long to release it.


----------



## nguyen (Jun 6, 2021)

dogwitch said:


> Going back to an LTT video: Linus mentioned they loaded up a custom DOOM DLSS profile for DOOM at 8K
> that Nvidia gave them.
> So I find it really odd it took them this long to release it.



DOOM Eternal runs on a potato already, so there was no point in DLSS haha.
They were probably waiting for RTX so that DLSS makes sense for DOOM Eternal


----------



## dogwitch (Jun 6, 2021)

nguyen said:


> DOOM Eternal runs on a potato already, so there was no point in DLSS haha.
> They were probably waiting for RTX so that DLSS makes sense for DOOM Eternal


With the profile it had RTX on too.


----------



## wolf (Jun 15, 2021)

Looks like June 29th for DOOM Eternal update.

I am quite keen; I pick up a map here and there quite often. Just this weekend I was back at my mate's on his LG CX and was able to lock 2160p@120Hz with barely a fluctuation in sight, even on the toughest map (Super Gore Nest imo).

My hope is that RTX+DLSS effectively gives me performance equal to what I had before, and I am really hopeful for great DLSS IQ on id Tech 7


----------



## nguyen (Jun 15, 2021)

wolf said:


> Looks like June 29th for DOOM Eternal update.
> 
> I am quite keen; I pick up a map here and there quite often. Just this weekend I was back at my mate's on his LG CX and was able to lock 2160p@120Hz with barely a fluctuation in sight, even on the toughest map (Super Gore Nest imo).
> 
> My hope is that RTX+DLSS effectively gives me performance equal to what I had before, and I am really hopeful for great DLSS IQ on id Tech 7



Wolfenstein Youngblood, which uses id Tech 6, already has a very solid RT/DLSS implementation, so I would imagine RT/DLSS works well with DOOM Eternal too.


----------



## dogwitch (Jun 15, 2021)

They've been playing with it since the tail end of last year and early this year.


----------



## mouacyk (Jun 15, 2021)

wolf said:


> Looks like June 29th for DOOM Eternal update.
> 
> I am quite keen; I pick up a map here and there quite often. Just this weekend I was back at my mate's on his LG CX and was able to lock 2160p@120Hz with barely a fluctuation in sight, even on the toughest map (Super Gore Nest imo).
> 
> My hope is that RTX+DLSS effectively gives me performance equal to what I had before, and I am really hopeful for great DLSS IQ on id Tech 7


Finally, a reason for me to complete the game.  The Icon of Sin has just been released; no idea how much further there is, but it feels close.


----------



## robot zombie (Jun 19, 2021)

Man... on a whim I jumped into the middle of my last Metro Exodus playthrough... just finishing The Caspian Sea out. It's not even right, how goddamned good it looks. When you put it down, go play some other games, and pick it back up, it's really striking. There's a little voice in my head constantly going *lookatit* *justlookatit* *lookatit-lookatit-lookatit* as I'm looking around. My eyes just wind up glued to what they're seeing, poring over everything.

Even in the blander scenes, it is so photorealistic, with nice clean edges, that I just stop and I stare. It's got this evil magic about it, like it really shouldn't be this convincing. And then the drab, subdued early morning tones give way to a glorious sunset shooting through the canyon and it is just so impactful. It looks natural, intuitive. I wish I had taken screenshots but I was really enjoying the moment. Really pretty incredible, just immediately in awe of it again.


----------



## robot zombie (Jun 21, 2021)

Since I missed my last Caspian screenshot opportunities, I thought I'd make up for it with taiga. It really doesn't waste any time giving you things to see and talk about.










The colors are overall more subdued... but somehow they wind up looking more alive, more essential. When you're looking around, the light is quite dynamic, depending a lot on elevation and angle. These scenes are ever-evolving as you move through them and it's such a treat to see. It's all about the transitions. Not just light to shadow, but the unfurling of light through air and across surfaces. The bleed and directionality. I'm gonna pick at it later, but you don't even notice these things directly. It's a bit confusing at first. I've played this game a good dozen times at least and I felt like I was getting lost, seeing these places this way. It felt like things got moved around. But at the same time, it is just so intuitive. Jamais vu. Spooky stuff. That's how you know, though. I've had several Stephen King moments, where I was convinced I didn't know where I was when I did. The entire gestalt gets flipped, it has so much impact.

Agh, it's so hard for me to portray what makes it such a magical experience. If you can get it running at around 80fps and manage to just explore and look around, you can't tell me it's not taking you places. The DLSS is a big part of the impact, too. I would describe it as being fairly lossy, in a different sense. The way it alters information in the image brings out some smaller details while smoothing and defining stark edges. It contributes a lot to this extra suspension of disbelief. Like I've said before, it's a concession for sure, but incidentally a very worthwhile one. Even at 1080p.

That's the big takeaway. What this stuff all does best is peel back a good chunk of your burden to your own suspension of disbelief. The imagery will take you there. It becomes apparent immediately. Like, it really just hits you. Kind of slaps you across the face with a very inviting illusion.

There's so much more to show. I have a lot of screenshots to get through first, though.


----------



## wolf (Jun 22, 2021)

@robot zombie that looks epic, I am enjoying my slow play through too and I really enjoy reading your insights. It really all comes together in this title and is again amazing to show what a 'lowly' RTX 2060 can achieve IQ-wise. Astonishingly brilliant.

Has anyone tried to retrofit the DLSS 2.2 DLL file to other games? There's a fair bit of chatter about it drastically reducing ghosting in some titles and being a nice iterative step in the right direction IQ-wise. I'm reading things like DLSS 2.2 Performance mode being a lot closer to DLSS 2.0/2.1 Quality mode in terms of IQ, which is massive really, as it in effect boosts performance yet further at a given IQ level.

That's Nvidia's play now with DLSS: refinement and, above all, adoption.


----------



## nguyen (Jun 22, 2021)

wolf said:


> @robot zombie that looks epic, I am enjoying my slow play through too and I really enjoy reading your insights. It really all comes together in this title and is again amazing to show what a 'lowly' RTX 2060 can achieve IQ-wise. Astonishingly brilliant.
> 
> Has anyone tried to retrofit the DLSS 2.2 DLL file to other games? There's a fair bit of chatter about it drastically reducing ghosting in some titles and being a nice iterative step in the right direction IQ-wise. I'm reading things like DLSS 2.2 Performance mode being a lot closer to DLSS 2.0/2.1 Quality mode in terms of IQ, which is massive really, as it in effect boosts performance yet further at a given IQ level.
> 
> That's Nvidia's play now with DLSS: refinement and, above all, adoption.



Wow you are right; I just tried DLSS 2.2 in CP2077, Death Stranding, and Metro Exodus EE by copying the new nvngx_dlss.dll to their respective game folders, and the particle trailing artifacts that are associated with DLSS are gone in those 3 games.  
More info here


----------



## nguyen (Jun 23, 2021)

Oh boy, it seems like Nvidia just one-upped AMD again by porting FSR code (or something very similar) into their new Sharpen+ filter in the 471.11 driver (only via the GeForce Experience overlay; NVCP does not have this option yet).
The result is quite mind-blowing; here is an example of CP2077 4K Native vs 4K DLSS Performance with Sharpen+





This makes the 4K DLSS Performance look 99% like 4K Native; people would be hard pressed to tell these 2 apart in a blind test.
I have also tried 1080p, and DLSS Quality with Sharpen+ looks remarkably better than Native due to DLSS having better quality AA, kinda insane really.

Note that you can only take screenshots with Sharpen+ applied with Shift+Print Screen; I was scratching my head when Afterburner wouldn't capture screenshots with Sharpen+ applied.


----------



## dogwitch (Jun 23, 2021)

nguyen said:


> Oh boy, it seems like Nvidia just one-upped AMD again by porting FSR code (or something very similar) into their new Sharpen+ filter in the 471.11 driver (only via the GeForce Experience overlay; NVCP does not have this option yet).
> The result is quite mind-blowing; here is an example of CP2077 4K Native vs 4K DLSS Performance with Sharpen+
> This makes the 4K DLSS Performance look 99% like 4K Native; people would be hard pressed to tell these 2 apart in a blind test.
> ...


Blow it up past 80 inches, then you will notice it.


----------



## wolf (Jun 23, 2021)

nguyen said:


> Oh boy, it seems like Nvidia just one-upped AMD again by porting FSR code (or something very similar) into their new Sharpen+ filter in the 471.11 driver (only via the GeForce Experience overlay; NVCP does not have this option yet).
> The result is quite mind-blowing


I suspected they might make a move like this, and it's still very early days, but clearly nothing is stopping them. If the ease of implementation really is true (an example was given of a dev taking only ~2 hours to implement it into their game), Nvidia could've had a source get them the code in the last couple of weeks to toy with it. DLSS remains, and they have their own FSR called Sharpen+, either for non-DLSS-capable cards or available to stack.

Said it before already today, but RTX users are the biggest winners here and will get the best of both worlds in any given game. People act like you buy an RTX card *just* to get RT or DLSS and "pay the Nvidia tax"; the reality is now their entire lineup will be capable. It's just another value add or talking point, considering at given price points cards from both camps are generally within spitting distance of each other, or not far from it.


----------



## londiste (Jun 23, 2021)

nguyen said:


> Oh boy, it seems like Nvidia just one-upped AMD again by porting FSR code (or something very similar) into their new Sharpen+ filter in the 471.11 driver (only via the GeForce Experience overlay; NVCP does not have this option yet).
> The result is quite mind-blowing; here is an example of CP2077 4K Native vs 4K DLSS Performance with Sharpen+


It is a matter of perspective or perception to quite a large degree. To my eyes Sharpen+ (unsurprisingly) has a lot of sharpening artifacts.


----------



## wolf (Jun 23, 2021)

dogwitch said:


> Blow it up past 80 inches, then you will notice it.


Seems like an extremely unrealistic situation just to nitpick it? Much more so than usual, at least.


----------



## nguyen (Jun 23, 2021)

londiste said:


> It is a matter of perspective or perception to quite a large degree. To my eyes Sharpen+ (unsurprisingly) has a lot of sharpening artifacts.



The beauty of Nvidia Freestyle Sharpen+ is that you can adjust Sharpen/Texture as you like; it takes a bit of effort to fine-tune the visuals to be as close to 4K Native as possible. However I believe the previous comparison was already too close to tell without pixel peeping.
How about I up the difficulty this time 








For >100% more FPS with IQ this close, DLSS Balanced/Performance mode and Sharpen+ are a killer combo


----------



## dogwitch (Jun 23, 2021)

wolf said:


> Seems like an extremely unrealistic situation just to nitpick it? Much more so than usual,


4K is really for big screens.
Go to a movie theater; that's what they are using.

On smaller screens you want pixels per inch (PPI).
They're 2 different things.


----------



## wolf (Jun 24, 2021)

dogwitch said:


> 4K is really for big screens.
> Go to a movie theater; that's what they are using.


I mean, I get your point, it's going to make pixel peeping easier, but most people gaming at 4K either use a 48" or smaller (usually much smaller), or if you're gaming on a 65"+ chances are you're sitting a bit further back.


----------



## dogwitch (Jun 24, 2021)

wolf said:


> I mean, I get your point, it's going to make pixel peeping easier, but most people gaming at 4K either use a 48" or smaller (usually much smaller), or if you're gaming on a 65"+ chances are you're sitting a bit further back.


Yeah, and that's when 4K is not really worth it.


----------



## tabascosauz (Jun 24, 2021)

I don't really play Siege (tbh I used to play exclusively PvE and spent a lot of time exploiting map quirks and bot behaviour to "max" out the highest possible score in THunt lol) but installed it because they added DLSS 2.2 to the game. The DLSS implementation itself I didn't find overly impressive (I can generally push 165 or close to it at all times at VHigh/Ultra native 1440p), but I wanted the .dll. Between the 3 DLSS 2.0 implementations:

The Siege .dll is the newest, 2.2.6.0.
The MW2019 .dll is 2.1.58.0.
The War Thunder .dll is 2.1.40.0. It's also about 3MB smaller than the others.

It doesn't look like they go back and update the DLSS component at all. What you get at launch appears to be what you get.
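If you're collecting DLLs across installs like this, one small gotcha: version strings such as "2.2.6.0" have to be compared numerically per component, not as plain text (lexically, "2.10.x" would sort before "2.2.x"). A quick sketch using the three versions listed above:

```python
# Version strings like "2.2.6.0" must be compared component by component,
# as integers, not as plain text.
def version_key(version: str) -> tuple:
    return tuple(int(part) for part in version.split("."))

# DLSS DLL versions from the three installs mentioned above.
dlls = {
    "Siege": "2.2.6.0",
    "MW2019": "2.1.58.0",
    "War Thunder": "2.1.40.0",
}

# Pick the game shipping the newest DLL.
newest = max(dlls, key=lambda game: version_key(dlls[game]))
print(newest)  # Siege
```

The same key function works for sorting any stash of archived nvngx_dlss.dll versions before deciding which one to carry between games.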



The 2.2 .dll is a direct replacement for MW2019; so far there don't seem to be any issues, but I don't play the slightly more demanding Warzone part of the game. Half of me wants to say that there's a slight IQ improvement when the character is in motion (almost like they did a bit of sharpening specifically for when the char is in motion); the other half of me says there's no difference. No real improvement to performance.

The 2.2 .dll doesn't work for War Thunder. The game still runs all the same, but the graphics menu no longer has a dropdown menu, just a box that shows the "Off" setting. Not a surprise, of course, due to Gaijin being a dumpster fire, the game engine being a dumpster fire, the game always being a buggy dumpster fire, and the DLSS 2.0 implementation being a dumpster fire. Not a loss for me, since I always just run native @ 120fps capped.

I could also test No Man's Sky now with DLSS, but I long since soured on that game since the much-lauded-by-everyone (Origins? I think) update unilaterally broke half my bases by changing the world biomes from temperate to perpetual biohazardous storms.


----------



## nguyen (Jun 29, 2021)

Yup, just tried DOOM Eternal with RTX/DLSS and I'm a little disappointed with the RT. The performance penalty with RT is not big, but the quality is not good. The reflections shimmer a lot, probably caused by low ray counts, and there is no option for higher quality ray tracing, just ON/OFF.

DLSS on the other hand is excellent, with DLSS Quality providing a smidgen higher IQ than Native 4K. The included DLL is version 2.1, so I replaced it with the ver 2.2 DLL for the heck of it.

DLSS Quality with Sharpen+, however, is way above Native 4K IQ-wise, running at the same performance. (Left is Native, right is DLSS Quality Sharpen+)


----------



## phanbuey (Jun 29, 2021)

That looks awesome on the right


----------



## wolf (Jun 30, 2021)

nguyen said:


> Yup, just tried DOOM Eternal with RTX/DLSS and I'm a little disappointed with the RT. The performance penalty with RT is not big, but the quality is not good. The reflections shimmer a lot, probably caused by low ray counts, and there is no option for higher quality ray tracing, just ON/OFF.


I tend to agree; it looks like a low ray count. There's that bit of shimmer and some noise about the image, like you'd see in Metro with RT set to Normal, and like Metro it also appears to worsen the lower the resolution. Some reflections have had a nice visual impact, but it would seem for probably ~80% of gameplay, the level design won't really show it off in a big way.


nguyen said:


> DLSS on the other hand is excellent, with DLSS Quality providing a smidgen higher IQ than Native 4K. The included DLL is version 2.1, so I replaced it with the ver 2.2 DLL for the heck of it.


Yeah, agreed here too; even with the standard DLL and no modding, DLSS Quality looks stunning. I also added a dash of sharpening in NVCP as I played the game like that before anyway. This should really benefit the lower tiers of cards that want to pump up their FPS / increase output res. For myself @ 3440x1440 144Hz, I got my wish: framecapped at 140fps just like before with no RT/DLSS.


----------



## HiVoltageJL (Jun 30, 2021)

Since I got my 3080 I've only been playing Battlefield V, and for some reason, although it shows 
a DLSS tab, it's always greyed out no matter what my options are set to; it can't be accessed.


----------



## AusWolf (Jun 30, 2021)

My 2070 is still quite new (for me), and I've only tried DLSS in Cyberpunk 2077 so far.

My opinion:

The game is borderline unplayable with RT on Psycho and DLSS off.
By only setting DLSS to Quality, I get 35-40 FPS average which I'm happy with.
I can see the quality difference between DLSS off and Quality when I shift between the two settings being stood in one place, but I'm enjoying a full gameplay experience moving around and doing stuff in the game.
DLSS 'Performance' looks quite bad, and the performance improvement over 'Quality' isn't so great either.
All in all, DLSS 'Quality' is great, but 'Performance' is quite awful.


----------



## kane nas (Jun 30, 2021)

AusWolf said:


> My 2070 is still quite new (for me), and I've only tried DLSS in Cyberpunk 2077 so far.
> 
> My opinion:
> 
> ...


I also have a 2070. Try, my friend, locking the frames via MSI Afterburner and through your screen settings to 30 or 45 frames. My experience with RTX and DLSS Quality in Cyberpunk is the best so far; the only bad thing, as everyone can guess, is that the game is a sh@tstorm...


----------



## wolf (Jun 30, 2021)

AusWolf said:


> My 2070 is still quite new (for me), and I've only tried DLSS in Cyberpunk 2077 so far.
> 
> My opinion:
> 
> ...


RTX Psycho is one hell of a heavy tickbox to enable; this setting seems geared more for RTX 40 series / RDNA3 and beyond, so playable at all is a feat.

If your specs are correct, you're on a 1080p display, which means Quality (plus a sharpen if it interests you) would be the most viable setting - anything less and the internal render res is just too low to salvage good image quality from. I mean, Performance mode is rendering internally at 540p, and that's just a bit slight to get enough base information from at a 1080p / 24" output.
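For anyone wanting to sanity-check those numbers, the internal render resolution just falls out of a fixed per-axis scale factor for each mode. A minimal sketch, assuming the commonly documented DLSS 2.x ratios (the exact factors are Nvidia's and can vary by title):

```python
# Rough per-axis scale factors for DLSS 2.x modes (commonly documented values;
# treat them as an approximation, not an official spec quoted in this thread).
DLSS_SCALE = {
    "Quality": 1 / 1.5,           # ~0.667x per axis
    "Balanced": 1 / 1.724,        # ~0.58x per axis
    "Performance": 1 / 2.0,       # 0.5x per axis
    "Ultra Performance": 1 / 3.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Return the internal render resolution DLSS reconstructs from."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

# 1080p output in Performance mode renders internally at 960x540 ("540p"):
print(internal_resolution(1920, 1080, "Performance"))  # (960, 540)
print(internal_resolution(3440, 1440, "Quality"))      # (2293, 960)
```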


----------



## nguyen (Jun 30, 2021)

AusWolf said:


> My 2070 is still quite new (for me), and I've only tried DLSS in Cyberpunk 2077 so far.
> 
> My opinion:
> 
> ...



You can download the DLSS ver 2.2 .dll from WccfTech and copy it to the CP2077 installation folder; that will improve the visuals of the DLSS Performance mode. I tried 1080p DLSS Performance on my laptop and, when combined with the new Sharpen+ filter, the visuals are quite comparable to native 1080p but with 2x the FPS.
Also, like @wolf mentioned, the Psycho RTX option is not meant for the current gen of GPUs.


----------



## AusWolf (Jun 30, 2021)

kane nas said:


> I also have a 2070. Try locking the frames to 30 or 45 via MSI Afterburner and through your screen settings, my friend. My experience with RTX and DLSS Quality in Cyberpunk is the best so far; the only bad thing, as everyone can guess, is that the game is a sh@t storm...


The game is not a shitstorm. Don't read launch reviews, they're massively outdated. It still has a few bugs, but if you can see past them, it's a great experience. I wouldn't be playing it otherwise. 

As for locking the fps: why would I do that?



wolf said:


> RTX Psycho is one hell of a heavy tickbox to enable, this setting seems almost more geared for RTX 40 series / RDNA3 and beyond, so playable at all is a feat.
> 
> If your specs are correct, you're on a 1080p display, which means that quality (plus a sharpen if it interests you) would be the most viable setting, much less and the internal render res is just too low to salvage good image quality from. I mean performance mode is rendering internally at 540p and it's just a bit slight to get enough base information from at 1080p / 24" output


Well, RTX Psycho + DLSS Quality is playable, so I can't complain.  My spec info is accurate - I don't yet feel the need to upgrade from 1080p.



nguyen said:


> You can download the DLSS ver 2.2 .dll from WccfTech and copy it to the CP2077 installation folder; that will improve the visuals of the DLSS Performance mode. I tried 1080p DLSS Performance on my laptop and, when combined with the new Sharpen+ filter, the visuals are quite comparable to native 1080p but with 2x the FPS.
> Also, like @wolf mentioned, the Psycho RTX option is not meant for the current gen of GPUs.


That sounds interesting. I wonder why it's not included in the game by default.


----------



## wolf (Jun 30, 2021)

Excellent points @nguyen , @AusWolf if you're getting into a lengthy play of CP2077, it could be well worth the minimal time investment to change up that DLL and experiment with sharpening to get DLSS Performance mode looking better (or even DLSS Balanced if Performance doesn't quite get there for you), as IQ can be highly comparable when done right.

For the time being I'm sticking with Sharpen in the NVCP settings as it carries no performance impact; Sharpen+ appears demonstrably better visually, but through the GFE overlay it carries a 10-15% performance impact on its own. So there is certainly room to tweak and match the visuals and performance against your personal tastes, and squeeze a significant amount more from it all.


----------



## londiste (Jul 1, 2021)

AusWolf said:


> That sounds interesting. I wonder why it's not included in the game by default.


It is about the release cycle. A slightly older version was implemented in the game because that is what was available when it was being worked on. I'm pretty sure there are compatibility and testing considerations for the developer when switching versions that we don't care about when simply swapping the dll. Maybe they will revisit the DLSS version in some patch.


----------



## nguyen (Jul 29, 2021)

Just tried out The Ascent and both RT and DLSS surprised me 

on one hand, the cost of RT is debilitating: from 97FPS down to 27FPS (a ~72% perf drop-off)
https://imgsli.com/NjMxNTI

on the other hand, DLSS Performance looks to be indistinguishable from 4K Native RT ON (without any sharpen filter) while bringing 2.5x the perf


			The Ascent 4K Ultra RT ON - Imgsli
		


I will explore further into the game and see if DLSS makes any difference to RT quality

Edit: 4K DLSS Performance actually looks better than 4K Native; DLSS has better AA resolve.


			The Ascent 4K Ultra RT ON - Imgsli


----------



## Cheese_On_tsaot (Aug 6, 2021)

I only have a 2060 but yeah, it's good - all you need to know.



nguyen said:


> uh, you are supposed to change the input icon on the OLED TV to PC mode to reduce the input delay. After changing to PC mode, the input delay on LG CX is 6.7ms and only 5.3ms on the C1, mind you even the best 1440p240hz IPS screen have 6.6ms input delay
> 
> View attachment 199547
> 
> ...


My Gigabyte M27Q is just 0.2 ms slower on response time than the 175Hz ASUS ROG unit selling at 777 GBP on Amazon right now; I paid 350 for mine from box.co.uk. And it beats the ROG on overshoot, RIP.









Gigabyte M27Q Review
The Gigabyte M27Q is an excellent 1440p gaming monitor suitable for a wide variety of uses. It has a large 27 inch screen that provides an immersive gaming exper...
www.rtings.com


----------



## dogwitch (Aug 7, 2021)

nguyen said:


> Just tried out The Ascent and both RT and DLSS surprised me
> 
> on one hand, the cost of RT is debilitating: from 97FPS down to 27FPS (a ~72% perf drop-off)
> https://imgsli.com/NjMxNTI
> ...


What's funny with that is it's only basic RT, too.


----------



## nguyen (Aug 7, 2021)

dogwitch said:


> What's funny with that is it's only basic RT, too.


The Ascent actually went nuts with RT Reflection


----------



## wolf (Aug 10, 2021)

DLSS looks downright excellent in Back 4 Blood; there is literally only one situation in the whole game where I can tell the difference. On the character selection screen there seems to be some ghosting trailing off the character model - perhaps a newer DLL fixes it? In any case, it's not present at any point during gameplay, which is as good as indistinguishable from native.


----------



## nguyen (Aug 12, 2021)

Just tried Naraka Bladepoint (which looks awesome, btw) and 4K DLSS Balanced looks sharper than 4K Native TAA



			Naraka Bladepoint 4K Native vs 4K DLSS Balanced - Imgsli


----------



## Cheese_On_tsaot (Aug 19, 2021)

First time playing DOOM Eternal - it runs OK on an RTX 2060 at 1440p with DLSS.
Without DLSS and RT, at native it averages 140fps.
Also have DTS:X audio; hearing things is faster than seeing them.

Quality wise, it was hard to discern any difference between DLSS and native - edges are not quite as sharp, but detail is actually increased.


----------



## wolf (Aug 23, 2021)

Played a BUNCH of DOOM Eternal on that LG CX 65 at my mate's place over the weekend. It's bloody hard to tell the difference between Quality/Balanced/Performance modes given the speed of this game, but I can essentially lock 4K120 with DLSS Quality + RT + the command to run reflections at full resolution instead of half resolution, given this implementation can leverage dynamic resolution scaling (albeit limited to a minimum of 75% on both axes at 4K / DLSS Quality, iirc). I just set max FPS in NVCP to 120 but have the game target a few fps higher, about 125 or so, and it's 120FPS hard locked - looks and plays absolutely amazing. No console can touch that experience imo, not even close, but perhaps a 6800XT/6900XT with dynamic res scaling would be a fairly similar experience; it would just drop res more often, which luckily isn't too noticeable when ripping and tearing, and it likely can only handle the standard "Ultra Nightmare" half-res reflections setting. Blown away by how good 1080p input looks when DLSS'd.


----------



## nguyen (Aug 23, 2021)

wolf said:


> Played a BUNCH of DOOM Eternal on that LG CX 65 at my mate's place over the weekend. It's bloody hard to tell the difference between Quality/Balanced/Performance modes given the speed of this game, but I can essentially lock 4K120 with DLSS Quality + RT + the command to run reflections at full resolution instead of half resolution...
> 
> ...



Yeah, I finished the main campaign on Nightmare difficulty after 20h (4K 120FPS locked, full-res RT, DLSS Balanced, 1st playthrough). It's pretty funny when you hear some techtubers say they prefer to play Doom Eternal at 200-300FPS without RT - highly likely those techtubers can't aim, or aren't thinking  .


----------



## wolf (Aug 24, 2021)

nguyen said:


> Yeah, I finished the main campaign on Nightmare difficulty after 20h (4K 120FPS locked, full-res RT, DLSS Balanced, 1st playthrough). It's pretty funny when you hear some techtubers say they prefer to play Doom Eternal at 200-300FPS without RT - highly likely those techtubers can't aim, or aren't thinking  .


I guess I've not tried DOOM on, say, a 240Hz monitor, so I don't know what I don't know. But IMO 120Hz is spot on already, even with the crazy pace of the game, considering it's single-player. Still, there might be something to that, but you'd also be limited to a 1080p or maybe 1440p monitor, rather than the ridiculous-fidelity experience on the OLED 4K @ 120. That plus RT strikes me as the ultimate DOOM Eternal experience, at least in the here and now.


----------



## nguyen (Aug 24, 2021)

wolf said:


> I guess I've not tried DOOM on, say, a 240Hz monitor, so I don't know what I don't know. But IMO 120Hz is spot on already, even with the crazy pace of the game, considering it's single-player. Still, there might be something to that, but you'd also be limited to a 1080p or maybe 1440p monitor, rather than the ridiculous-fidelity experience on the OLED 4K @ 120. That plus RT strikes me as the ultimate DOOM Eternal experience, at least in the here and now.



Have you seen how reviewers play Doom Eternal? I bet they are blaming it on low FPS


----------



## Mussels (Aug 24, 2021)

I like how he's not wearing pants and appears to have a shirt covered in cum stains?


----------



## Bomby569 (Sep 3, 2021)

I was a non-believer until I got my RTX card, and oh man does it work. I'm not one to go and make screenshots and compare the little bird's tiny details to the non-DLSS image, but for normal gameplay it's amazing, and a help if you turn RTX on.


----------



## AusWolf (Sep 3, 2021)

I finished Cyberpunk 2077 not long ago. 1080p with an RTX 2070, Ultra graphics, Psycho RT. Thoughts:

DLSS "quality" looks great, and gives me good FPS (around 40-50 with the settings above). I can see the difference between on and off, but it's not something you look at while playing the game as a normal person. Great experience overall.
DLSS "fast" (or whatever the lowest setting is called) looks like crap, sort of like playing at 720p on my 1080p screen. Blurry as hell, and the performance uplift from "quality" isn't big enough anyway. I would highly recommend against it.
I haven't really experimented with the intermediate settings, but I would imagine the experience is in between the two.


----------



## Ibotibo01 (Sep 20, 2021)

When Control released, I did not like DLSS 1.x, even when playing at DLSS Quality. After the DLSS 2.x launch, I enjoyed Control a lot with DLSS Quality. I used an RTX 2060 for 1.5 years, then changed to a GTX 1660. The GTX 1660 is enough for all games, but I have felt the lack of DLSS. Quality mode gives a minimum 30% FPS boost, and if you enable DLSS together with RT, FPS can go up 80-90% versus native RT. I bought an RTX 3060 at a reasonable price, then sold my 1660. I played Cyberpunk with DLSS Quality and it is the same as native; I also compared CAS/Reshade to native on the GTX 1660. All in all, DLSS 2.x is a lifesaver for ray tracing while also providing a free performance uplift in Rainbow Six Siege, Fortnite and other competitive games. I think it is the best way to improve performance.

Ultra Settings at 1080p No FidelityFX No CAS/Reshade:





Ultra Settings FidelityFX CAS 85% with CAS/Reshade ON:




Ultra Settings FidelityFX CAS 90% with CAS/Reshade ON:




Ultra Settings FidelityFX CAS 90% with CAS/Reshade ON and FXAA/Reshade on




My system at that time | Now
i7 4790 | Same, OC'ed to 4GHz
GTX 1660 Ventus XS OC | Gainward RTX 3060 Ghost
MSI H81M P33 | MSI B85M Gaming
Kingston Hyper-X 2X8GB 1600 MHZ DDR3 RAM | Same
Game is installed in Sandisk 480GB SSD | Same

I used DLSS 2.2.18

1080p RT Ultra DLSS Quality:




1080p RT Ultra DLSS OFF:




1080p Ultra RT OFF DLSS OFF:




1080p Ultra RT: OFF, DLSS Quality:


----------



## Mussels (Sep 21, 2021)

God damnit i want DLSS in their control panel where i can force it in any game already


----------



## londiste (Sep 21, 2021)

Mussels said:


> God damnit i want DLSS in their control panel where i can force it in any game already


DLSS runs in the pipeline before the UI and post-process effects, and it requires motion vectors from the engine. So a generic solution is not that simple, for purely technical reasons.
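To illustrate the ordering problem, here's a toy frame flow - every name in it is invented for the sketch, it's not any engine's real API. DLSS needs the jittered low-res colour, depth and motion-vector buffers from the scene pass, and its output has to feed the post/UI stages, which is why a driver-level toggle over the finished frame can't do it:

```python
# Illustrative frame flow only - every function name here is invented for the sketch.
def render_frame(dlss_enabled: bool) -> list[str]:
    steps = []
    # 1. The engine renders the scene at reduced resolution with sub-pixel jitter,
    #    producing colour, depth and per-pixel motion-vector buffers.
    steps.append("scene_pass(low_res, jitter, motion_vectors)")
    if dlss_enabled:
        # 2. DLSS consumes those buffers to reconstruct a full-resolution image.
        steps.append("dlss_upscale(color, depth, motion_vectors)")
    # 3. Post effects and the UI are composited at output resolution afterwards -
    #    a driver toggle only ever sees the finished frame, far too late.
    steps.append("post_effects(bloom, dof, film_grain)")
    steps.append("draw_ui(hud)")
    return steps

for step in render_frame(dlss_enabled=True):
    print(step)
```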


----------



## Mussels (Sep 21, 2021)

londiste said:


> DLSS runs in the pipeline before the UI and post-process effects, and it requires motion vectors from the engine. So a generic solution is not that simple, for purely technical reasons.


I know, but i'm having fun playing the rainbow 6 games with no DLSS support right now, so i wish i could just magically force it


----------



## nguyen (Oct 3, 2021)

DLSS version 2.3.0 is out
Image quality is further improved vs 2.2.6
Watch Dogs Legion (300% zoom)


			Imgsli


----------



## GerKNG (Oct 3, 2021)

all they need to do is get rid of the ghosting/smearing on dark objects against a bright background (like a bird with a black trail)


----------



## Mussels (Oct 3, 2021)

GerKNG said:


> all they need to do is get rid of the ghosting/smearing on dark objects against a bright background (like a bird with a black trail)


Zoomed right in I didn't see much change, but zoomed one setting out it's night and day (and I'd heard it fixed the ghosting, too)


----------



## wolf (Oct 4, 2021)

I just started Marvel's Avengers because it dropped on XBGP, and the DLSS implementation seems excellent - very hard to distinguish between Quality and native at 3440x1440. I wonder if the 2.3 DLL improves that even more?


----------



## ThaiTaffy (Oct 4, 2021)

Out of curiosity, has anyone tried to force-upgrade any DLSS files? I saw a YouTube video on it that Anthony (Linus) did, and it seemed interesting, but as my RTX card is so far away I can only dream.


----------



## nguyen (Oct 4, 2021)

ThaiTaffy said:


> Out of curiosity, has anyone tried to force-upgrade any DLSS files? I saw a YouTube video on it that Anthony (Linus) did, and it seemed interesting, but as my RTX card is so far away I can only dream.



Yeah, we can just manually swap the newer DLL into any DLSS 2.x supported game and it works most of the time (improved image quality and reduced ghosting).
The DLSS Swapper tool only works with the Steam game library atm.
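For anyone doing the swap by hand, it's just a file copy with a backup. A minimal sketch, assuming the library keeps its usual nvngx_dlss.dll name (the exact folder varies per game, and the paths below are placeholders):

```python
# Hypothetical helper - the DLL's location varies per game; always keep the backup.
import shutil
from pathlib import Path

def swap_dlss_dll(game_dir: str, new_dll: str) -> Path:
    """Back up the game's nvngx_dlss.dll (if present), then drop in the new one."""
    target = Path(game_dir) / "nvngx_dlss.dll"
    if target.exists():
        # Keep the original so the swap is reversible if the game misbehaves.
        shutil.copy2(target, target.with_name(target.name + ".bak"))
    shutil.copy2(new_dll, target)
    return target

# e.g. swap_dlss_dll(r"C:\Games\Cyberpunk 2077\bin\x64", r"C:\Downloads\nvngx_dlss.dll")
```

Restoring is just copying the .bak back over; tools like DLSS Swapper automate essentially this.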


----------



## wolf (Oct 13, 2021)

I probably spend too much time on Reddit in PC-related subs... but man, I just don't understand what compels some people to be so thoroughly negative about DLSS image quality when they don't use it themselves or own RTX cards.

When the right people - like yourselves in this thread, because I've specifically opted to curate the discussion to avoid the other sort - have constructive feedback and discussion, I always find that more than welcome. The technology obviously isn't perfect and has a lot of room to grow and improve, but I am bugged (more than I guess I should be) by people who just appear to have an agenda or axe to grind by crapping all over it.

I can understand and engage with other gripes or points of discussion - time and cost to implement, how many games feature it, the evolution of the technology, etc. - but people regurgitating and exaggerating any negative IQ aspect found in tech sites' / techtubers' various analyses to fuel their own agenda, I just can't take seriously. Yet this vocal minority seems to have the most to say about it.


----------



## ThaiTaffy (Oct 13, 2021)

Radeon fanboys?


----------



## oxrufiioxo (Oct 13, 2021)

wolf said:


> I probably spend too much time on Reddit in PC-related subs... but man, I just don't understand what compels some people to be so thoroughly negative about DLSS image quality when they don't use it themselves or own RTX cards.
> 
> ...



To be fair, people had similar issues with TAA, and FXAA before that. I think part of it is it being limited to RTX cards - gamers in general aren't too welcoming of proprietary features on GPUs. I also think that the majority of gamers are still rocking 1080p panels, where upscaling techniques are at their weakest, regardless of whether it's reconstruction like DLSS or upscaling like FSR.

I'm hoping in the future we get more stuff like UE4/5 temporal upscaling that works well on all GPUs - even some of the techniques used on consoles are pretty well done these days.

You see the same sort of negativity towards RTRT, even though it's obviously the future and what will make the biggest difference visually going forward, imo... They have to start somewhere, and they can't really start developing for it unless the hardware supports it.

Part of it is just blind fanboyism as well, but that goes both ways.


----------



## nguyen (Oct 13, 2021)

Well, I'm disappointed with DLSS ver 2.3.x so far; 2.2.11 still seems like the best version, IMO.

Maybe 2.3.x has some additional parameters that only work properly with 2.3.x games - for example, if I put the 2.3.1 DLL into CP2077:



Looks pretty funky


----------



## LifeOnMars (Oct 13, 2021)

Subbed - as a 3060 Ti user, this puppy allows 4K gaming even with some newer titles. Now if I could just bypass the 250W power limit, I would be screaming


----------



## dogwitch (Oct 13, 2021)

oxrufiioxo said:


> To be fair, people had similar issues with TAA, and FXAA before that. I think part of it is it being limited to RTX cards - gamers in general aren't too welcoming of proprietary features on GPUs. I also think that the majority of gamers are still rocking 1080p panels, where upscaling techniques are at their weakest, regardless of whether it's reconstruction like DLSS or upscaling like FSR.
> 
> I'm hoping in the future we get more stuff like UE4/5 temporal upscaling that works well on all GPUs - even some of the techniques used on consoles are pretty well done these days.
> 
> ...


Also, oddly, the term "native" - they don't know the meaning of the word itself...
Hell, people think HDR is brand new...
It's like, do these people even Google anything for research, or is it just whatever's on the box?


----------



## Mussels (Oct 13, 2021)

ThaiTaffy said:


> Radeon fanboys?


The ultra purists who think they themselves are superior for using X product.
They exist for every brand, topic and technology.


----------



## londiste (Oct 13, 2021)

oxrufiioxo said:


> To be fair, people had similar issues with TAA, and FXAA before that.


I still have issues with those. The visual difference between them and better methods is just too visible. I begrudgingly accept them because the performance hit and issues with MSAA are getting quite noticeable these days.

Love that DLSS exists, but as a personal preference I won't go lower than DLSS Quality - even good implementations already show minor visible artifacts there, and I notice them quite readily in the lower quality modes. However, I cannot deny the performance increase relative to the quality lost.

Admittedly, I am probably a very bad example, with a prominent all-sliders-to-the-right syndrome.


----------



## oxrufiioxo (Oct 13, 2021)

londiste said:


> I still have issues with those. The visual difference between them and better methods is just too visible. I begrudgingly accept them because the performance hit and issues with MSAA are getting quite noticeable these days.
> 
> Love that DLSS exists, but as a personal preference I won't go lower than DLSS Quality - even good implementations already show minor visible artifacts there, and I notice them quite readily in the lower quality modes. However, I cannot deny the performance increase relative to the quality lost.
> 
> Admittedly, I am probably a very bad example, with a prominent all-sliders-to-the-right syndrome.



I use it probably differently than most: I set the internal resolution to 4K, then turn it on - in almost all cases it looks much better than native 1440p and performs similarly.

Native 1440p panel.


----------



## AusWolf (Oct 13, 2021)

oxrufiioxo said:


> To be fair, people had similar issues with TAA, and FXAA before that. I think part of it is it being limited to RTX cards - gamers in general aren't too welcoming of proprietary features on GPUs. I also think that the majority of gamers are still rocking 1080p panels, where upscaling techniques are at their weakest, regardless of whether it's reconstruction like DLSS or upscaling like FSR.
> 
> I'm hoping in the future we get more stuff like UE4/5 temporal upscaling that works well on all GPUs - even some of the techniques used on consoles are pretty well done these days.
> 
> ...


I'm speaking from the 1080p gamer perspective with a 2070... though my opinion might be in the minority if what you said is true. 

I think DLSS _Quality_ is free FPS even at 1080p. I'd be stupid not to turn it on in every game that supports it. I wouldn't want to use lower settings, though. Same with RT. If the game has it, I turn it on because I can.

As for how welcoming I am to these proprietary features... well, I don't really have to do anything to "welcome" or adopt them. I install the GPU drivers like every other person. The fact that GTX or Radeon card owners don't have the same options in some games as I do is not my concern. Options that come integrated into my GPU and games don't really feel proprietary at all. If I had to buy a separate RT card (the way PhysX started with Ageia before nvidia bought them), that would be a different story.


----------



## phanbuey (Oct 13, 2021)

oxrufiioxo said:


> I use it probably differently than most: I set the internal resolution to 4K, then turn it on - in almost all cases it looks much better than native 1440p and performs similarly.
> 
> Native 1440p panel.


That's genius... going to try on my 1440P panel.


----------



## oxrufiioxo (Oct 13, 2021)

phanbuey said:


> That's genius... going to try on my 1440P panel.



I've been doing this since FFXV, which wasn't even DLSS 2.0, and because the TAA was trash in that game, it still ended up looking better than vanilla 1440p with TAA.


----------



## imrazor (Oct 13, 2021)

In many scenarios I get artifacts with DLSS, particularly when it's used with depth of field - I'll occasionally see white 'sparkles' on screen. Performance without DLSS when using ray tracing can verge on unplayable, though, so I continue to use it in some games (Control, Cyberpunk), particularly on a 2070 Max-Q. Sometimes I can get away without DLSS on my desktop card (2070 Super).


----------



## nguyen (Oct 13, 2021)

imrazor said:


> In many scenarios I get artifacts with DLSS particularly when used with depth of field. I’ll occasionally see white ‘sparkles’ on screen. Performance without DLSS when using raytracing can verge on unplayable though. So I continue to use it in some games (Control, Cyberpunk) particularly on a 2070 Max-Q. Sometimes I can get away without DLSS on my desktop card (2070 Super.)



The light sparkles in CP2077 are related to Bloom when DLSS is in use, and Bloom can't be turned off from the menu.
You can disable it by going into the Steam game folder, then \engine\config\platform\pc.

Create a .ini file (any name is fine) with Notepad and copy/paste the following:

[Developer/FeatureToggles]
Bloom = false

I don't like the Bloom in CP2077 anyway (it changes the brightness when you enter/exit dark areas), so disabling it is preferable - and no more DLSS sparkles either.


----------



## dogwitch (Oct 13, 2021)

It shouldn't be this hard to fix some basic stuff in DLSS.
It seems users are having to go and mod .ini files...


----------



## imrazor (Oct 13, 2021)

nguyen said:


> The light sparkles in CP2077 are related to Bloom when DLSS is in use, and Bloom can't be turned off from the menu.
> You can disable it by going into the Steam game folder, then \engine\config\platform\pc.
> 
> Create a .ini file (any name is fine) with notepad and copy/paste the following:
> ...


Actually, it's in Control that I've noticed the sparkles the most. I'll look for a bloom option in the settings menu.


----------



## Mussels (Oct 14, 2021)

dogwitch said:


> it should be not this hard to fix some basic stuff in dlss.
> seems users are going it mod ini... files...


That's CP2077 at fault, not DLSS.


----------



## wolf (Oct 14, 2021)

Well, it's gonna be a B4B weekend this weekend - looking forward to playing with the DLSS modes and, apparently, a built-in sharpness slider now; a surprise, but a welcome one.

Also keen to see how Crysis 2/3 Remastered look and play with RT and DLSS. I didn't pick up the original remaster, mostly due to it still being so darn single-threaded.



oxrufiioxo said:


> I set the internal resolution to 4k


I assume you mean by using DSR? I've played with that a bit too, with very nicely supersampled-looking results.


----------



## oxrufiioxo (Oct 14, 2021)

wolf said:


> I assume you mean by using DSR? I've played with that a bit too, with very nicely supersampled-looking results.



I prefer using the game's built-in internal resolution slider if available - typically set to 150%, like in Warzone or CoD Cold War - but if not, I use DSR as an alternative.


----------



## nguyen (Oct 14, 2021)

New DLSS development

Deathloop just added DLSS 2.3.0, and with it there are new features:
- Adaptive Resolution (switches between DLSS modes automatically to maintain a target FPS)
- DLAA (with Adaptive Resolution, set the target FPS to a low figure)

Deathloop 4K Native vs 4K DLAA (74FPS vs 71FPS)


			Imgsli
		


Edit: Seems like DLSS 2.3.0 is not needed for Adaptive Resolution - I replaced it with ver 2.2.11 and Deathloop works just fine, and 2.2.11 has less smearing than 2.3.0

Baldur's Gate 3: 4K Native vs 4K DLSS Quality, 69FPS vs 90FPS. There is a sharpness slider that can be used for Native, DLSS and FSR.


			Baldur's Gate 3 - Imgsli
		

Somehow DLSS is making the character's behind look better IMO
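A toy sketch of what the Adaptive Resolution behaviour described above presumably amounts to - stepping the DLSS mode up or down to hold a target FPS. The mode names are real, but the control logic and margin are my guesses, not Deathloop's actual implementation:

```python
# DLSS modes ordered from best image quality to fastest. The stepping logic and
# 5% hysteresis margin are hypothetical, purely to illustrate the idea.
MODES = ["Quality", "Balanced", "Performance", "Ultra Performance"]

def next_mode(current: str, measured_fps: float, target_fps: float,
              margin: float = 0.05) -> str:
    """Pick the DLSS mode for the next frame based on measured vs target FPS."""
    i = MODES.index(current)
    if measured_fps < target_fps * (1 - margin) and i < len(MODES) - 1:
        return MODES[i + 1]   # too slow: drop more internal resolution
    if measured_fps > target_fps * (1 + margin) and i > 0:
        return MODES[i - 1]   # comfortably fast: claw back image quality
    return current            # within the margin: hold steady

print(next_mode("Quality", 48.0, 60.0))   # Balanced
print(next_mode("Balanced", 75.0, 60.0))  # Quality
```

Setting the target FPS very low, as nguyen suggests for DLAA, just means the controller never needs to drop below the top of the ladder.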


----------



## wolf (Oct 21, 2021)

I have quite the hankering to play Crysis 2 Remastered and crank up all the bells and whistles, but EGS... I only go there for my free games 

In related awesome news, God of War is confirmed for PC w/ DLSS and Reflex and... 21:9 support!  Pretty glad I never got around to borrowing a PS for it.


----------



## AusWolf (Oct 21, 2021)

I've started playing Deliver Us The Moon again, just to see how RT and DLSS are implemented (the last time I played it, I had a GTX 1650). Funnily enough, I can't really see a big difference among the different options, unlike in CP77, where "Balanced" was quite bad at 1080p. DUTM would be playable even at "Ultra Performance", but I'm keeping it on "Quality" for the best performance-to-image-quality ratio.


----------



## looniam (Oct 21, 2021)




----------



## dogwitch (Oct 21, 2021)

looniam said:


> View attachment 221694


Yes, but we want more dog with a yellow hat!


----------



## arni-gx (Oct 27, 2021)

I love this DLSS 2.0 very much... now I can play with RT all ultra with ease on WD Legion and CP2077, with DLSS Performance... the quality of the eye candy is not bad...


----------



## Bomby569 (Oct 29, 2021)

Personally, I think it's the best new graphics card technology I've seen in a long time - it will make upgrading something you can avoid for a lot longer. It works great, and you can hardly see any visible difference at the higher settings.


----------



## wolf (Nov 4, 2021)

Bomby569 said:


> and you can hardly see any visible difference in the higher settings.


The kicker, on top of the performance boost, is that quite often now, the DLSS image is the preferable one to me.


----------



## dogwitch (Nov 5, 2021)

If anyone wants a good read on GI, here's a link:








The Essential 3D Motion Design Glossary
Get ready for a full-blown knowledge bomb of 3D definitions ranging from Alembic to ZBrush.
www.schoolofmotion.com


----------



## nguyen (Nov 17, 2021)

DLAA + NIS 85% should be the new DLSS UQ guys










DLAA + NIS 85% has noticeably less shimmering on the fence, and possibly less ghosting too. I was hoping new DLSS-supported games would include DLAA, but it's still missing :/


----------



## Mussels (Nov 17, 2021)

NIS at 85% genuinely seems like a good option for 4K on my Sony UHDTV - I can't see a visible difference, and I get ~10FPS more in the games I've tried


----------



## mrpaco (Nov 20, 2021)

oxrufiioxo said:


> I use it probably different than most. I set the internal resolution to 4k then turn it on in almost all cases it look much better than native 1440p and performs similarly.
> 
> Native 1440p panel.


How do you do that? Do you mean using DSR to run the game at 4K and then using DLSS?


----------



## Mussels (Nov 20, 2021)

mrpaco said:


> How do you do that? Do you mean using DSR to run the game at 4K and then using DLSS?


His monitor might be like mine, where it 'overclocks' and accepts 4K inputs


----------



## mrpaco (Nov 20, 2021)

But the scaling from monitors is usually pretty bad and makes a mess of the image quality, ghosting, etc.

Anyway, I just want to know how he does it to get that much better quality, so I can try it myself. If it's using the monitor, I think I could do it with the 27GL850-B.

So, should I be using DSR in some specific games, or a custom resolution for the whole system (can't see why you would do that)? His words make me wonder what specific option he's using, because it's not clear.


----------



## oxrufiioxo (Nov 20, 2021)

mrpaco said:


> How do you do that? Do you mean using DSR to run the game at 4K and then using DLSS?



Correct - but now I game on an LG C1, so it's natively 4K.


----------



## Mussels (Nov 20, 2021)

mrpaco said:


> But the scaling from monitors is usually pretty bad and makes a mess of the image quality, ghosting, etc.
> 
> Anyway, I just want to know how he does it to get that much better quality, so I can try it myself. If it's using the monitor, I think I could do it with the 27GL850-B.
> 
> So, should I be using DSR in some specific games, or a custom resolution for the whole system (can't see why you would do that)? His words make me wonder what specific option he's using, because it's not clear.


Normally yes, but in the case of my two screens, no. It genuinely looks and behaves like native 4K. Quite a few high-refresh 1440p displays seem to also handle 4K60 'natively' but have the option hidden.

In my case I just plug my Chromecast in (the new one with a remote), and if it defaults to 4K 60Hz I know the monitor can take it, then fart about with custom resolutions.


----------



## wolf (Jan 16, 2022)

OK so hear me out.

Enable DLDSR 2.25x in NVCP using the latest 511.23 driver.

Set your DLSS supported game to that resolution, for me using a native 3440x1440 monitor, this is 5120x2160.

Now enable DLSS in Performance mode, for me this is now an internal render res of 2560x1080.

Now tell me that doesn't look better than native, and for me, it performs better than native still too.

Nvidia, you've done it.
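To make the chain above concrete, here's a minimal sketch of the arithmetic, assuming NVIDIA's published per-axis DLSS factors (Quality 2/3, Balanced 0.58, Performance 0.5) and DLDSR 2.25x being 1.5x per axis; the driver rounds the ultrawide result, which is why 5160x2160 shows up as 5120x2160.

```python
# Per-axis render-scale factors for DLSS modes (NVIDIA's published values).
DLSS_SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def dldsr_target(native_w: int, native_h: int, factor: float = 2.25):
    """DLDSR factors are total-pixel ratios; per axis it's the square root."""
    s = factor ** 0.5  # 2.25x total -> 1.5x per axis
    return round(native_w * s), round(native_h * s)

def dlss_render(target_w: int, target_h: int, mode: str):
    """Internal render resolution DLSS upscales from for a given output."""
    s = DLSS_SCALE[mode]
    return round(target_w * s), round(target_h * s)

print(dldsr_target(3440, 1440))                # (5160, 2160), exposed as 5120x2160
print(dlss_render(5120, 2160, "performance"))  # (2560, 1080)
```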


----------



## Space Lynx (Jan 16, 2022)

wolf said:


> OK so hear me out.
> 
> Enable DLDSR 2.25x in NVCP using the latest 511.23 driver.
> 
> ...



I'm glad it works, but I'm still confused as ****


----------



## Mussels (Jan 16, 2022)

I posted in the other thread, but I ran the new 2.25x thingy with my 1440p monitor overclocked to 4K 80Hz, and played StarCraft in 6K 80Hz.

It was beautiful, and a little laggy, but I have no idea what the FPS was, because the counter was so small it was in the subpixel range.


----------



## wolf (Jan 16, 2022)

I'm playing God of War with the settings I mentioned above at DF's optimised settings, and it's 100-120fps. I can't believe how clean and detailed it looks.


----------



## Mussels (Jan 16, 2022)

I just took screenshots of some of the main menu screens for SC2 (using GFE's screenshot method).

The files are too big to upload (~8MB each!)... this is a 12-year-old DX9 game with terrible antialiasing, so it's really visible in person when the shimmering edges are gone.


Edit: need to find another way to host...

Ugh, how about a zoomed-in crop. You were never meant to see this close up.


----------



## dogwitch (Jan 16, 2022)

Side note: I think it's funny as hell that God of War, an SDR checkerboard game... needs DLSS to hit above 60fps...


----------



## wolf (Jan 16, 2022)

dogwitch said:


> Side note: I think it's funny as hell that God of War, an SDR checkerboard game... needs DLSS to hit above 60fps


I don't follow? I can run ultra everything and be well above 60fps at 3440x1440 without DLSS, even 4k is doable for 60 target. DLSS is the icing on the cake here, and optimised settings are the smarter choice over original or ultra everything. All things considered I'd say the game generally runs pretty well.


----------



## dogwitch (Jan 16, 2022)

wolf said:


> I don't follow? I can run ultra everything and be well above 60fps at 3440x1440 without DLSS, even 4k is doable for 60 target. DLSS is the icing on the cake here, and optimised settings are the smarter choice over original or ultra everything. All things considered I'd say the game generally runs pretty well.


I mean, the game assets look nice but are vastly underwhelming, due to when the game was being developed. They learned early on that checkerboarding would be the only way to play the game at a decent frame rate on PS4.
Are you doing 4K (seeing as it uses checkerboarding) at 60fps?

It was clever design from a game-design point of view, due to limited hardware. Even going forward into PS5 it's limited by the hardware.

I do miss the OG style of combat, but current hardware simply can't do that, even on PC using some form of DLSS.
Physics and particle effects nuke performance, and that's even before basic GI is used too.

It's a chicken/egg problem.

Also, a side note, two things:

1. That's the real reason there's been no new Red Faction game. We hit a point where it's all "ooh, shiny" and not real-world physics or anything near it.

2. It's a profit-driven thing too, where the best of the best software went to the VFX area, due to ROI and better margins. If you can, watch Corridor Digital talk about certain software and such, plus the need for god-spec hardware, above even a gaming PC, to render before the final render happens. Very interesting info on how current game dev/design is limited.


----------



## wolf (Jan 16, 2022)

dogwitch said:


> Are you doing 4K (seeing as it uses checkerboarding) at 60fps?


The PC version doesn't use checkerboarding: it's either full native res with TAA, or DLSS; FSR is there too, which comes after a native+TAA downsample.

DLDSR + DLSS is amazing


----------



## Mussels (Jan 16, 2022)

Mussels said:


> I just took screenshots of some of the main menu screens for SC2 (using GFE's screenshot method).
> 
> The files are too big to upload (~8MB each!)... this is a 12-year-old DX9 game with terrible antialiasing, so it's really visible in person when the shimmering edges are gone.
> 
> ...


Oh, I thought this might help:

1080p WITH AA for comparison - the edges of the hair and armour look like they're from totally different game engines.

Monitor "overclock" disabled so that the new DSR gives me a virtual 4K 165Hz, vs the "real" 4K 80Hz.

At one point she looks to the side; at the higher res you can make out that her eyes are bloodshot, a totally hidden detail at lower res.


----------



## Ibizadr (Jan 17, 2022)

I use DLDSR in COD Warzone and the final IQ is better than native with DLDSR (1440p) + DLSS Quality, maintaining roughly the same fps.


----------



## wolf (Jan 17, 2022)

@Mussels it's kind of crazy how much GPU horsepower these 3080/90s have; the higher res you run just plays into their strength and they deliver. It's crazy how many games I can already run at 5120x2160 above 60fps, and there's still room to wiggle downward with either DLSS, FSR or just a lowered render scale to boost performance while keeping the majority of that supersampled crisp look. It's breathing new life into a lot of games for me.


----------



## Mussels (Jan 17, 2022)

wolf said:


> @Mussels it's kind of crazy how much GPU horsepower these 3080/90s have; the higher res you run just plays into their strength and they deliver. It's crazy how many games I can already run at 5120x2160 above 60fps, and there's still room to wiggle downward with either DLSS, FSR or just a lowered render scale to boost performance while keeping the majority of that supersampled crisp look. It's breathing new life into a lot of games for me.


I mean, just turn off AA to compensate and you break even in a lot of games


----------



## wolf (Jan 17, 2022)

Mussels said:


> I mean, just turn off AA to compensate and you break even in a lot of games


You're not wrong. Like you I think playing with this is going to eat up a lot of my time, it's fascinating.


----------



## oxrufiioxo (Jan 17, 2022)

It doesn't have DLSS, but it's pretty impressive how much the 2.25x setting helps Witcher 3 image-quality-wise running at 4K. Pretty blown away; it's a game changer for older games with shit AA implementations.


----------



## phanbuey (Jan 17, 2022)

I can't seem to make this work -- my games don't seem to have the 1.78x and 2.25x resolutions available...

Edit: nvm, saw it in the NV panel... too bad the game can't pick it up automatically yet.


----------



## dogwitch (Jan 17, 2022)

wolf said:


> The PC version doesn't use checkerboarding: it's either full native res with TAA, or DLSS; FSR is there too, which comes after a native+TAA downsample.
> 
> DLDSR + DLSS is amazing
> 
> View attachment 232603


The game engine does. My guess is they simply turned it off for the PC version.


----------



## wolf (Jan 17, 2022)

oxrufiioxo said:


> It doesn't have DLSS, but it's pretty impressive how much the 2.25x setting helps Witcher 3 image-quality-wise running at 4K. Pretty blown away; it's a game changer for older games with shit AA implementations.


It's going to be an amazing tool for older games, games with bad AA, or anywhere you have some performance overhead; DLSS in combination is the icing on the cake. I'm just loading up title after title experimenting.


----------



## phanbuey (Jan 17, 2022)

Yeah actually I started downloading a bunch of old games haha...

Might be time for another Oblivion & Fallout: New Vegas runthrough.  

Oblivion Engine: "We cannot do "HDR" and Anti-Aliasing at the same time...."
DLDSR: "Hold my beer"


----------



## AusWolf (Jan 17, 2022)

I've just tried it in Mass Effect 3 LE. Whenever I set a higher resolution, the game switches back to 1080p on its own for some reason. Maybe I have to restart the game, or maybe the engine just forgets to ask for it, I don't know.

Also, it's weird that I have 2.25x (1440p) DL available, but 4x (4K) is only there conventionally.


----------



## Mussels (Jan 17, 2022)

AusWolf said:


> I've just tried it in Mass Effect 3 LE. Whenever I set a higher resolution, the game switches back to 1080p on its own for some reason. Maybe I have to restart the game, or maybe the engine just forgets to ask for it, I don't know.
> 
> Also, it's weird that I have 2.25x (1440p) DL available, but 4x (4K) is only there conventionally.


Only works in fullscreen exclusive mode, not windowed or borderless (might be ME's issue)

It's not weird, we only get the 2.25x - you're meant to get the fidelity of the 4x, with a lower performance loss


----------



## AusWolf (Jan 17, 2022)

Mussels said:


> Only works in fullscreen exclusive mode, not windowed or borderless (might be ME's issue)


It's set to fullscreen. Though I've also got Vsync turned on, and I get between 120-240 fps constant. Something weird is going on with the ME:LE engine.



Mussels said:


> It's not weird, we only get the 2.25x - you're meant to get the fidelity of the 4x, with a lower performance loss


Ah, so nvidia is saying that 2.25x comes at a lesser performance loss than 4x! Who would've thought?  I guess we've got to test 2.25x DL vs 2.25x conventional. Otherwise, performance metrics mean nothing.


----------



## Mussels (Jan 17, 2022)

AusWolf said:


> It's set to fullscreen. Though I've also got Vsync turned on, and I get between 120-240 fps constant. Something weird is going on with the ME:LE engine.
> 
> 
> Ah, so nvidia is saying that 2.25x comes at a lesser performance loss than 4x! Who would've thought?  I guess we've got to test 2.25x DL vs 2.25x conventional. Otherwise, performance metrics mean nothing.


Yeah but they're saying the 2.25x is the same image quality as the original 4x, we should get more options later - it literally just launched


----------



## AusWolf (Jan 17, 2022)

Mussels said:


> Yeah but they're saying the 2.25x is the same image quality as the original 4x, we should get more options later - it literally just launched


2.25x standard looks pretty crisp on my monitor already, so I have no way to test that claim. The only thing I can (and will) test is the performance of 2.25x DL vs 2.25x standard.


----------



## Mussels (Jan 17, 2022)

This new feature has made me move my PC and spare desk to the living room, so that i can now use the 55" UHDTV to play games in 6K res


----------



## AusWolf (Jan 17, 2022)

Mussels said:


> This new feature has made me move my PC and spare desk to the living room, so that i can now use the 55" UHDTV to play games in 6K res


No chance I'm gonna try that with a 2070. 

Right, I've got some performance results in World of Tanks enCore RT:

1080p + 1.78x DSR (1440p), Ultra graphics, Ultra RT: 12954 points,
1080p + 1.78x *DL*DSR (1440p), same everything: 12665 points.

I'm not sure if points convert into FPS, but if they do, that's a 2.2% performance loss in DL mode. Though I have to add that the picture looks a tiny bit sharper, so it's a good trade-off, I guess. 

Although, I still don't think I'm gonna turn on any DSR, as the resolution switch messes up the size and position of my apps on my second screen.


----------



## nguyen (Jan 18, 2022)

5K DLDSR (1.78x) looks even sharper than 8K DSR (4x), 8K DSR is slightly sharper than 4K Native in PUBG 


PUBG 4K vs 8K DSR vs 5K DLDSR - Imgsli
		


Looks to me like DLDSR is a more advanced sharpening filter than Nvidia's Sharpen/Sharpen+ and AMD's RIS. There are hardly any ringing artifacts with DLDSR, though there is some noticeable contrast modification similar to RIS.
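For reference on how those labels map to resolutions: the DSR/DLDSR factors are total-pixel ratios, so the per-axis scale is the square root ("1.78x" works out to 4/3 per axis, 4x to 2x per axis). A quick sketch of the 4K case, with an illustrative helper name:

```python
# DSR/DLDSR factor labels are total-pixel ratios; each axis is scaled
# by the square root of the factor (2.25x -> 1.5, 4x -> 2, 1.78x -> 4/3).
def dsr_res(w: int, h: int, per_axis: float) -> tuple[int, int]:
    return round(w * per_axis), round(h * per_axis)

print(dsr_res(3840, 2160, 2))      # 4x    -> (7680, 4320), "8K"
print(dsr_res(3840, 2160, 4 / 3))  # 1.78x -> (5120, 2880), "5K"
```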


----------



## oxrufiioxo (Jan 18, 2022)

nguyen said:


> 5K DLDSR (1.78x) looks even sharper than 8K DSR (4x), 8K DSR is slightly sharper than 4K Native in PUBG
> 
> 
> PUBG 4K vs 8K DSR vs 5K DLDSR - Imgsli
> ...



I was noticing that too: 1.78x looks better than 4x on my LG C1, but that dude's smoothness option is set way too high.


----------



## Mussels (Jan 19, 2022)

I noticed in 7D2D that if I had the smoothness setting above 33%, my white aiming reticle lines became half as thick and turned grey instead, making them hard AF to see.


----------



## wolf (Jan 25, 2022)

I'm about ~10 hours into God Of War and the visuals are just stunning (dad life means slim game time)

Experimented a lot with optimised this and that, in-game settings, resolutions etc, but the 3080 just *powers* through anything, so my current settings are basically "give me all the IQ".

Native res is 3440x1440, 2.25x DLDSR sets a resolution of 5120x2160, using DLSS quality mode for a render res matching my native 3440x1440.... DLAA on crack?
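The "DLAA on crack" arithmetic checks out, assuming NVIDIA's published per-axis factors: DLDSR 2.25x is 1.5x per axis and DLSS Quality renders at 2/3 per axis, so the internal render resolution lands right back on native.

```python
# 1.5 (DLDSR 2.25x per axis) * 2/3 (DLSS Quality per axis) = 1.0,
# so the GPU renders exactly the native pixel count.
native = (3440, 1440)
render = tuple(round(n * 1.5 * (2 / 3)) for n in native)
print(render)  # (3440, 1440)
```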

Optimised settings are smart, but using the Ultra preset, and the above settings, I'm still getting 67-80fps in typical gameplay, sometimes over 80, not once below 60 and it feels exceptionally buttery and clear.

Yeah, pretty glad I never played this checkerboarded at original settings on PS.

Helps that the game is fucking awesome of course.


----------



## oxrufiioxo (Jan 25, 2022)

From a normal sitting distance on a 4K TV, the checkerboard implementation is hard to notice. I played it on both the PS4 Pro and PS5, and now on PC; the game wowed me more when it came out, running on the Pro's RX 570-ish level GPU, than it does now on my 3080 Ti. It seems to run around 60-70fps at native 4K max settings, or 90-100fps with DLSS ultra quality. I find the DLSS implementation to be pretty good. HDR in this game on my C1 OLED is pretty amazing, though, and visually makes a bigger difference than it not being checkerboard, although the Pro and the PS5 already did that well.

Even more so than AI upscaling, PC gamers deserve better monitors. Hopefully 2022 fixes that with both the QD-OLED and mini-LED displays coming out; hopefully they've fixed some of the issues from previously released ones, and hopefully they won't cost a kidney.


----------



## wolf (Jan 25, 2022)

oxrufiioxo said:


> HDR in this game on my C1 OLED is pretty amazing


I've had my 3080 rig plugged into a mate's CX 65, and I must agree: not only does it do the 120Hz/G-Sync thing, but the HDR is breathtaking.


oxrufiioxo said:


> Even more so than AI upscaling, PC gamers deserve better monitors


A C2 42 is on my shopping list this year, I can't wait.


----------



## oxrufiioxo (Jan 25, 2022)

wolf said:


> A C2 42 is on my shopping list this year, I can't wait.



A very good plan..... You definitely won't regret it, and from a normal sitting distance DLSS works even better than on a monitor, regardless of the screen size.


----------



## dogwitch (Jan 26, 2022)

Just a heads up on the game: it's just really good SDR.
I've played it on both platforms, and they went back in and touched up the SDR to look better on PC.


----------



## wolf (Jun 1, 2022)

Picked up Deathloop recently; really enjoying DLSS in this title, especially as it's implemented with a dynamic mode with an FPS target, set-and-forget basically. Interesting game concept too, let's see how long it can keep my interest.

Still waiting on this bloody LG C2 42" that I paid for what feels like two months ago... I can't wait to see a bit more of DLSS's capability explored where it has the best chance to shine.


----------



## dogwitch (Jul 8, 2022)

wolf said:


> Picked up Deathloop recently; really enjoying DLSS in this title, especially as it's implemented with a dynamic mode with an FPS target, set-and-forget basically. Interesting game concept too, let's see how long it can keep my interest.
> 
> Still waiting on this bloody LG C2 42" that I paid for what feels like two months ago... I can't wait to see a bit more of DLSS's capability explored where it has the best chance to shine.


DLSS on streaming-asset games won't work correctly. Flight Sim and one COD tried it.
It's utter garbage. Rant done.
Yes, Deathloop is nice.


----------



## wolf (Jul 13, 2022)

I've been playing FS2020 for a week or so now on my new LG 42 C2 with optimised settings, getting around 60-70 fps when GPU-limited, which is not bad at all. I was trawling the web last night, saw I could opt into the beta and get the build 10 preview which has DLSS, and I'm glad I did: 80-90 FPS now when GPU-limited. It was ever so slightly softer with standard sharpening, but increasing that somewhat has made it very comparable to native; I'm impressed. Will be keen to test FSR when it comes too, which I believe has been announced.


----------



## anfazi54 (Jul 13, 2022)

Using an RTX 3050, I tried DLSS 2.0 without looking at the image quality; I was just looking for better fps. I should check it for image quality.


----------



## wolf (Jul 13, 2022)

anfazi54 said:


> Using an RTX 3050, I tried DLSS 2.0 without looking at the image quality; I was just looking for better fps. I should check it for image quality.


It can depend a lot on the resolution of your monitor: the higher it is, the better the visual result. So at 4K you get results very comparable to native, and sometimes aspects are actually better. On the whole though, 1080p would almost certainly appear softer. But like you say, it's more about the FPS increase while minimising quality loss, which it's also excellent at.


----------

