
NVIDIA DLSS 2.5.1

Yeah, I agree the Witcher 3 looks better, but it's game dependent. It's still nothing like a 4K image, though. When I played CP2077 the first time around, I think DLSS was a super early implementation. I never really noticed it that much, but I also had RT on lower settings (or it did that anyway because of DLSS?). Anyway, I had bought a G-Sync monitor, so playing a single-player FPS didn't need high frames; 40-45 FPS was smooth enough for me.

DLSS has evolved to the point where a newer version like 2.5.1 will provide the best IQ across all DLSS 2 games, rather than the stock DLL that shipped with the game. CP2077 originally shipped with version 2.1, and it didn't even apply a negative mipmap bias (which NVIDIA advises should be used with DLSS).
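
For anyone wondering what that negative mipmap (texture LOD) bias actually is: it's tied to the ratio between the internal render resolution and the output resolution. A minimal sketch of the commonly cited ratio-based formula follows; the function name is mine, and NVIDIA's own guide may add a further offset on top of this:

import math

def dlss_mip_bias(render_width: int, display_width: int) -> float:
    # Commonly cited texture LOD bias for temporal upscalers:
    # log2(render_width / display_width), which goes negative whenever
    # the internal render resolution is below the output resolution,
    # so textures get sampled at a sharper mip level.
    # (Approximation only; NVIDIA's DLSS guide may apply an extra offset.)
    return math.log2(render_width / display_width)

# Example: 4K output with DLSS Quality (~66.6% scale, 2560-wide internal render)
print(dlss_mip_bias(2560, 3840))  # ~ -0.585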

With the latest DLSS 2.5.1, you can try a notch lower DLSS setting (e.g. Quality --> Balanced) and the resulting IQ is comparable to the older version at the higher setting. Sure, you can play the game at 40-45 FPS, but getting 10-15% higher FPS without any noticeable IQ loss would be nice.

I just spent about 10 hours playing Witcher 3 on my old rig with a 2080 Ti at 3440x1440 with DLSS Performance, and it looks good with RT and plays nicely enough at 40-45 FPS.
 
WOW... This new version seems pretty sh*t :( And I am not even comparing it to the native resolution, just the 2.4.3 version...

Way to go NVIDIA!
 
Ultra performance IS the 720p option

It's the lowest of the low, so at least they've improved upon it.


When your 3050 laptop GPU is 5 years old, you'll be grateful it has DLSS Ultra performance

In newer games, using DLSS or a similar technology is already required just to conserve VRAM, since it only has 4 GB or so... Otherwise I really enjoy the 3050 mobile; rather solid little GPU if you ask me.
 
GPU power consumption in Witcher 3 RT is significantly lower than in CP2077. On my GPU, Witcher 3 RT uses 320-350 watts, whereas CP2077 uses 420-450 watts.

Even in these videos you can clearly see the difference in GPU power consumption: CP2077 consuming 230-250 watts, and Witcher 3 RT consuming 180-200 watts.

The lower performance in Witcher 3 RT doesn't come from it being "more GPU demanding", but from it being a badly optimized mess that doesn't fully utilize the GPU and then runs horribly because of it.

This often happens when a game is not built around DX12/Vulkan from the beginning. The Witcher ends up being CPU-bound in DX12, which is the exact opposite of what that API is supposed to do. And it is not because of ray tracing: even with all RT features off, the game runs much slower than it does in DX11.
 
DLSS Ultra Finewine :D
Isn't it just a little bit amusing? Over the past two-ish years, Nvidia RTX cards have certainly gotten 'fine wine' drivers and additional software improvements, like the DLSS updates. Not only that, but instead of feeling like something was left on the table at launch, like buggy AMD cards that don't live up to the hype, we got all the juicy good stuff and performance at launch, and then it got even better.
Also it's ridiculous that there are people who think 1080p Native looks anywhere close to 4K DLSS Ultra Perf
Without a doubt in my mind. DLSS 2.5.1 Ultra Performance mode at 4K looks closer to 4K than it does to native 720p. Don't believe me? Go try it; 720p is mud compared to DLSS UP mode at 4K. It will be funny, invariably in some other thread of course, to hear how that can't be true and must be worse because someone saw some screenshots somewhere, fresh from the mouth of someone who doesn't own RTX hardware. You may think that sounds too daft to be possible, but it happens.
 
DLSS Ultra Finewine :D

Also it's ridiculous that there are people who think 1080p Native looks anywhere close to 4K DLSS Ultra Perf

W3 4K DLSS UP vs 1080p TAA


Quite sad that so many games don't have FSR 2.0/DLSS 2.x.

If it were in every game by default, a 4K monitor would be the way to go, and you would just pick whatever upscaling level you need to hit your target performance. You would get better image quality out of 1080p-class hardware than you do with a 1080p monitor.

But that's far from being the case, sadly. Plenty of games still get released without those features, so it's not going to happen anytime soon.
 
I hope you all don't mind me jumping into this thread to ask about DLSS (and related technologies) and how it works. From my understanding, the way it works is:
- You set your game's rendering resolution to lower than the native for your monitor.
- You enable DLSS.
- DLSS then upscales the game on the fly to your native resolution at the driver level (using a mixture of things I don't have a hope of understanding).

The idea being that if you can't run a game at, say, 4K native with decent performance DLSS can make it look native (to its best effort, hence things like ghosting) while maintaining the performance. Have I got that right?
 
Please use PNG or RAW instead of JPG for comparison shots, if possible. The images are overly compressed. :toast:
 
I'm impressed! Even at 1080p, Ultra Performance is now actually usable, versus DLSS 2.4 where it was unusable. Good news for those sitting on 3050 or 2060 cards :)
 
Quite sad that so many games don't have FSR 2.0/DLSS 2.x.

If it were in every game by default, a 4K monitor would be the way to go, and you would just pick whatever upscaling level you need to hit your target performance. You would get better image quality out of 1080p-class hardware than you do with a 1080p monitor.

But that's far from being the case, sadly. Plenty of games still get released without those features, so it's not going to happen anytime soon.
An image or a game on a native 1080p monitor is way sharper and clearer than 1080p scaled up on a native 4K screen.
 
I hope you all don't mind me jumping into this thread to ask about DLSS (and related technologies) and how it works. From my understanding, the way it works is:
- You set your game's rendering resolution to lower than the native for your monitor.
- You enable DLSS.
- DLSS then upscales the game on the fly to your native resolution at the driver level (using a mixture of things I don't have a hope of understanding).

The idea being that if you can't run a game at, say, 4K native with decent performance DLSS can make it look native (to its best effort, hence things like ghosting) while maintaining the performance. Have I got that right?
Not quite. You don't set the game rendering resolution - the DLSS preset will do that for you.

DLSS presets line up with a render scale percentage of your native resolution setting.

Quality = 66.6%
Balanced = 58.0%
Performance = 50.0%
Ultra Performance = 33.3%

For example, if your in-game resolution is set to 1440p, enabling DLSS Quality will have the game render internally at 960p and then smart-upscale back to the native resolution using data fed in by the game. And yes, it allows for things like 4K rendering at roughly the performance of 1080p (which is what DLSS Performance mode renders at internally).
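
If it helps to see the arithmetic behind those presets, here's a quick sketch. The scale factors are the percentages listed above; rounding to whole pixels is my assumption, not necessarily what DLSS does internally:

# Per-axis scale factors matching the preset percentages above
PRESETS = {
    "Quality": 2 / 3,            # 66.6%
    "Balanced": 0.58,            # 58.0%
    "Performance": 0.5,          # 50.0%
    "Ultra Performance": 1 / 3,  # 33.3%
}

def internal_resolution(out_w: int, out_h: int, preset: str):
    # DLSS renders internally at a fraction of the output resolution,
    # then reconstructs back up to the output resolution.
    scale = PRESETS[preset]
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(2560, 1440, "Quality"))            # (1707, 960)  -> the 960p render mentioned above
print(internal_resolution(3840, 2160, "Performance"))        # (1920, 1080) -> 4K output from a 1080p render
print(internal_resolution(3840, 2160, "Ultra Performance"))  # (1280, 720)  -> 4K output from a 720p render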
 
I hope you all don't mind me jumping into this thread to ask about DLSS (and related technologies) and how it works. From my understanding, the way it works is:
- You set your game's rendering resolution to lower than the native for your monitor.
- You enable DLSS.
- DLSS then upscales the game on the fly to your native resolution at the driver level (using a mixture of things I don't have a hope of understanding).
Don't change the resolution to anything lower than native; DLSS does the rest for you once you choose a preset.
The idea being that if you can't run a game at, say, 4K native with decent performance DLSS can make it look native (to its best effort, hence things like ghosting) while maintaining the performance. Have I got that right?
That's about the gist of it.

Use your monitor's native res, and choose a DLSS setting that has the right balance of visual fidelity and frames per second for your preference.
 
I thought you might be trolling, so I checked out your statement. My goodness, you're not wrong.

I don't play many games anymore, so I'm a bit 'meh' about these things. But what I'm seeing called DLSS is basically a pretty version of 1080p. It looks like what I see when my wife watches Murder She Wrote at 720p on my 4k OLED.

That's nothing like 4K. So why not just play at 1080p instead?

(attachments 280165 and 280168: comparison screenshots)


Because Ultra Performance at 4K looks much better than 1080p. It has less jaggedness and shimmering, and smoother lines.

You only focused on the lights... But look at the jagged edges on the window and how blurry the image is. I'd say Ultra Performance looks better in most aspects; 1080p on a 4K display looks blurry.

Also, Ultra Performance at 4K gives higher FPS than 1080p.

Personally, I would not recommend going below Performance mode, but in some rare cases Ultra Performance can be useful (for example, if you have an RTX 2060 and a 4K display).
 
Because Ultra Performance at 4K looks much better than 1080p. It has less jaggedness and shimmering, and smoother lines.

You only focused on the lights... But look at the jagged edges on the window and how blurry the image is. I'd say Ultra Performance looks better in most aspects; 1080p on a 4K display looks blurry.

Also, Ultra Performance at 4K gives higher FPS than 1080p.

Personally, I would not recommend going below Performance mode, but in some rare cases Ultra Performance can be useful (for example, if you have an RTX 2060 and a 4K display).

At 4K it targets an enhanced 720p image. I'd still say that to achieve 1080p-level fidelity you will want to use Performance, if not Balanced mode. Ultra Performance is intended to be used at 8K resolution, or at least it was back when it was announced alongside the RTX 3090.
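
The numbers line up: at the 33.3% per-axis scale, a 3840x2160 output renders internally at 1280x720, while an 8K output (7680x4320) renders internally at 2560x1440, which is why Ultra Performance was pitched as the 8K mode.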

I still remember the marketing fluff:

(attached image: marketing slide)

Fast forward two years, and it's barely fast enough to get Forspoken playable on ultra high while targeting a smooth and consistent 1080p 60 Hz without the blasted DLSS :nutkick:
 
Fast forward two years, and it's barely fast enough to get Forspoken playable on ultra high while targeting a smooth and consistent 1080p 60 Hz
Turns out that's pretty much Forspoken's fault, not DLSS's fault.

4K using DLSS Ultra Performance looks better than 1080p scaled to a 4K panel, not a doubt in my mind, mostly because over the last few days I've dropped the 2.5.1 DLL into the folder of every DLSS game I can, and the results are fantastic.

The best way I can think to describe it: using DLSS Ultra Performance mode on my 42" 4K OLED looks strikingly similar in sharpness/softness (depending which way you slice it) to 1080p native on a 27" display, or 2560x1080 at 34/35", while providing a larger, richer image overall. It's not the pinnacle of image quality, neither are, but it certainly looks closer to 4K than it does to 720p.
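
For anyone curious how the DLL drop-in works mechanically: it's just replacing each game's bundled nvngx_dlss.dll with the newer 2.5.1 file. A rough sketch below, with hypothetical paths you'd adjust to your own setup; it backs up the stock DLL first so you can roll back (not an official tool, just the manual copy automated):

import shutil
from pathlib import Path

# Hypothetical paths -- adjust to wherever you extracted the new DLL
# and wherever your games actually live.
NEW_DLL = Path(r"C:\Downloads\dlss_2.5.1\nvngx_dlss.dll")
GAME_LIBRARY = Path(r"C:\Games")

for old_dll in GAME_LIBRARY.rglob("nvngx_dlss.dll"):
    backup = old_dll.with_name(old_dll.name + ".bak")
    if not backup.exists():
        shutil.copy2(old_dll, backup)   # keep the stock DLL so you can roll back
    shutil.copy2(NEW_DLL, old_dll)      # drop in the newer DLSS DLL
    print(f"Replaced {old_dll}")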
 
DLSS 2.5 just destroyed that lightboard on the first Cyberpunk image. Whilst you can tell it clears up thin line stuff pretty well, it looks too aggressive.
The lightboard looks worse, but several other things look better, including the concrete below the lightboard.

DLSS can make some stuff look worse, and other stuff look better, than native.

I have seen that in numerous games: improvements compared to native, sharper and better textures, fewer jaggy lines, etc.
 
This often happens when a game is not built around DX12/Vulkan from the beginning. The Witcher ends up being CPU-bound in DX12, which is the exact opposite of what that API is supposed to do. And it is not because of ray tracing: even with all RT features off, the game runs much slower than it does in DX11.
Precisely. Which is why I wanted to comment on that "heavy game" statement that TPU made. Completely unfounded.
 
An image or a game on a native 1080p monitor is way sharper and clearer than 1080p scaled up on a native 4K screen.
I beg to differ. Native 1080p on my 1080p monitor in absolutely no way looks better than DLSS Performance mode on my 4K monitor. Now, the monitors themselves differ, as do the upscaling methods, but there is no doubt in my mind which image is preferable.
 
I beg to differ. Native 1080p on my 1080p monitor in absolutely no way looks better than DLSS Performance mode on my 4K monitor. Now, the monitors themselves differ, as do the upscaling methods, but there is no doubt in my mind which image is preferable.

I think they meant regular upscaling from 1080p to 4K. That definitely looks blurry, and it will look better on a native 1080p screen of the same size.

My friend used to have a 55" 1080p TV and it looked great from a normal viewing distance.

With my LG OLED, I actually prefer it when the TV scales from 1080p rather than when consoles or the PC GPU (regular upscaling) do it. It looks sharper.

On PC you can use integer scaling for 1080p, which looks insanely sharp. Text looks a bit pixelated, but actual games look awesome, especially with TAA. When I had a 2070 Super, I played The Outer Worlds at 1080p with integer scaling and it actually looked better than 1440p upscaled to 4K by the GPU (which still looks good on a TV).
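
On the integer scaling point: it stays razor sharp because 1080p divides evenly into 4K, so every source pixel becomes an exact 2x2 block with no filtering. A minimal sketch of that divisibility check (hypothetical helper, not any driver API):

def integer_scale_factor(src_w: int, src_h: int, dst_w: int, dst_h: int):
    # Returns the whole-number scale factor if the target resolution is an
    # exact multiple of the source on both axes, else None (non-integer
    # ratios need filtering, which is where the blur comes from).
    if dst_w % src_w == 0 and dst_h % src_h == 0 and dst_w // src_w == dst_h // src_h:
        return dst_w // src_w
    return None

print(integer_scale_factor(1920, 1080, 3840, 2160))  # 2    -> each pixel maps to a crisp 2x2 block
print(integer_scale_factor(2560, 1440, 3840, 2160))  # None -> 1.5x scaling has to blend pixels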

But 4K with DLSS Performance is definitely the most detailed image you can get in terms of upscaling. There are some drawbacks in certain games, but overall the technology is amazing.
 
CD Projekt almost owned by Nvidia at this point. :)
 
I beg to differ. Native 1080p on my 1080p monitor in absolutely no way looks better than DLSS Performance mode on my 4K monitor. Now, the monitors themselves differ, as do the upscaling methods, but there is no doubt in my mind which image is preferable.
I didn't mean DLSS, just normal upscaling done by the monitor or the GPU.
 
I didn't mean DLSS, just normal upscaling done by the monitor or the GPU.
Been a long time since I tested that; you're likely correct. FSR or DLSS, however, is a very different story, especially given how much they can clean up compared to a native 4K render with meh TAA.
 