Wednesday, January 12th 2022

The Power of AI Arrives in Upcoming NVIDIA Game-Ready Driver Release with Deep Learning Dynamic Super Resolution (DLDSR)

Among the broad range of new game titles getting support, we are in for a surprise. NVIDIA yesterday announced the feature list of its upcoming game-ready GeForce driver, scheduled for public release on January 14th. According to a new blog post on NVIDIA's website, the forthcoming game-ready driver release will feature an AI-enhanced version of Dynamic Super Resolution (DSR), a technique that has been available in GeForce drivers for a while. The company calls the new AI-powered tech Deep Learning Dynamic Super Resolution, or DLDSR for short. It uses a neural network that requires fewer input pixels while producing stunning image quality on your monitor.
NVIDIA: "Our January 14th Game Ready Driver updates the NVIDIA DSR feature with AI. DLDSR (Deep Learning Dynamic Super Resolution) renders a game at higher, more detailed resolution before intelligently shrinking the result back down to the resolution of your monitor. This downsampling method improves image quality by enhancing detail, smoothing edges, and reducing shimmering.

DLDSR improves upon DSR by adding an AI network that requires fewer input pixels, making the image quality of DLDSR 2.25X comparable to that of DSR 4X, but with higher performance. DLDSR works in most games on GeForce RTX GPUs, thanks to their Tensor Cores."
NVIDIA Deep Learning Dynamic Super Resolution
This suggests that the new DLDSR technology will be exclusive to GeForce RTX graphics cards, as they are the only ones with Tensor Cores. As for GeForce GTX users, they are left with the non-AI DSR scaling technique. We have to wonder what the performance looks like in actual games, so be sure to stay tuned for more information and possible benchmarks from our review team.
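To put the scaling factors in perspective: DSR factors refer to total pixel count, so each axis scales by the square root of the factor. On a 1080p display, DSR 4X renders internally at 3840x2160 (2x per axis), while DLDSR 2.25X renders at 2880x1620 (1.5x per axis), which NVIDIA claims can look comparable thanks to the AI filter. Below is a minimal, hypothetical sketch of the plain (non-AI) "render high, filter down" idea, assuming a naive box filter in NumPy; the function names and resolutions are illustrative, and this is not NVIDIA's implementation, which replaces the simple filter with a trained network running on Tensor Cores.

```python
# Illustrative sketch of DSR-style "render high, filter down" scaling,
# assuming a naive box filter. NOT NVIDIA's implementation: DLDSR swaps
# the filter below for a trained neural network on Tensor Cores.
import numpy as np

def render_resolution(native_w: int, native_h: int, dsr_factor: float):
    # DSR factors count total pixels, so each axis scales by the square
    # root of the factor: 2.25X -> 1.5x per axis, 4X -> 2x per axis.
    scale = dsr_factor ** 0.5
    return round(native_w * scale), round(native_h * scale)

def box_downsample(frame: np.ndarray, native_w: int, native_h: int) -> np.ndarray:
    # Average-pooling downsample; handles exact integer ratios only.
    # Real scalers use fractional filtering (and DLDSR, an AI model).
    h, w = frame.shape[:2]
    fy, fx = h // native_h, w // native_w
    assert fy * native_h == h and fx * native_w == w, "integer ratio only"
    return frame.reshape(native_h, fy, native_w, fx, -1).mean(axis=(1, 3))

print(render_resolution(1920, 1080, 4.0))    # DSR 4X      -> (3840, 2160)
print(render_resolution(1920, 1080, 2.25))   # DLDSR 2.25X -> (2880, 1620)

hi_res = np.random.rand(2160, 3840, 3)       # simulated 4X render target
native = box_downsample(hi_res, 1920, 1080)  # filtered back to 1080p
print(native.shape)                          # (1080, 1920, 3)
```

The per-axis math also explains the performance claim: DLDSR 2.25X shades roughly 56% of the pixels that DSR 4X does, so any quality gap has to be closed by the filter rather than by raw sample count.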
Source: NVIDIA

16 Comments on The Power of AI Arrives in Upcoming NVIDIA Game-Ready Driver Release with Deep Learning Dynamic Super Resolution (DLDSR)

#1
PilleniusMC
It sounds like an interesting concept for non-ray-traced games, so that may be cool.
The performance is going to be interesting.
#2
Ferrum Master
This is the only and ultimate way of getting proper AA without ill side effects like shimmering, vaseline smearing, etc., especially if the game doesn't support any of it.
#3
AusWolf
It would have been interesting to see a 1:1 comparison instead of the 4X:2.25X we see here. I bet the differences wouldn't be so obvious then.

Also, rendering at above-native resolution is nice, but what if I want to play at native with a lower render resolution and use DLDSR to gain some performance? Would that be possible?
#5
Verpal
Going to try this with FFXIV. The AA solution in FFXIV is absolute garbage; maybe we can finally utilize the Tensor Cores in a meaningful manner.
#6
Ibizadr
Sometimes I use DSR resolutions in single-player games when my FPS is above my monitor's refresh rate, and the gains are questionable. Let's see how they introduce this.
#7
stimpy88
All this A.I. and Deep Learning stuff... 99% marketing BS.
#8
Nephilim666
stimpy88: All this A.I. and Deep Learning stuff... 99% marketing BS.
And 1% unbeatable upscaling in the case of DLSS. I'm no fanboy of NVIDIA, but even I will hand it to them: they have the best solution currently available. To say otherwise is a bit naive.
#9
Dragokar
There is still nothing comparable to raw GPU power and high resolutions with good SGSSAA, but instead we get 450 W monsters and some crappy AI up/downscaling. Sometimes it can look better, but there are far too many times when it does not.
#10
stimpy88
Nephilim666: And 1% unbeatable upscaling in the case of DLSS. I'm no fanboy of NVIDIA, but even I will hand it to them: they have the best solution currently available. To say otherwise is a bit naive.
I do think some of what NVIDIA says about how this stuff works is genuine. But it's everybody else and their grandmother jumping on the A.I. bandwagon that influenced my comment.

DLSS is cool tech that has taken three years to get to "unbeatable". It was crap in the beginning, lest we forget, and has only gotten good relatively recently, unless you want to ignore how blurry and full of artefacts it was for 75% of its life, and how, for a very long time, it wasn't available in more than a handful of games. Oh, and give the others a chance, BTW; NVIDIA had years to get to "unbeatable".
#11
qubit
Overclocked quantum bit
I've found that the best use of DSR for me is to benchmark my graphics card at 4K when not connected to a 4K monitor, so I can find out how it will perform at that resolution. I like the sheer novelty of it, too. And yes, you can also use it when connected to a 4K monitor to find out what it will look like on an 8K monitor. The desktop icons are so small, they start to disappear! And the desktop real estate is vast, really vast.

For regular gaming, it's better to use AA at 1080p to smooth things out. Much lower performance hit.
#12
Kohl Baas
Smells fishy. The right image looks crisper and sharper than the middle one. No upscaled image should look better than the natively rendered one. The only thing you should be able to gain is performance. Whenever this isn't true, I suspect the material is rigged.
#13
bhappy
Kohl Baas: Smells fishy. The right image looks crisper and sharper than the middle one. No upscaled image should look better than the natively rendered one. The only thing you should be able to gain is performance. Whenever this isn't true, I suspect the material is rigged.
The reason it looks better than the image in the middle is that it uses deep learning as well as downsampling, which is the whole idea behind DLDSR. It looks better than simply downsampling from a higher resolution.
#14
Vayra86
Still trying to sell those overpriced Tensor Cores :) NVIDIA is first, at least. Let's see how this really works if there is no per-game support required. That on its own is pretty neat; I had to read the article twice, but it really does say it works in most games.

Because for DLSS, so far NVIDIA is still shipping game-ready drivers left and right, wherever it suits them.
#15
KainXS
I'm really hoping it has VR support of some kind; it could be very useful for that, since you might be able to supersample with a much lower performance hit. On the flip side, it is possible to use FSR in some VR titles via the OpenVR FSR plugin, but then you have to deal with FSR's shimmering.

If it works, I will use it. DLSS 2.0 is pretty useful in some titles at high resolutions, so I welcome it, and I'm also hoping it might support windowed mode.
#16
Lycanwolfen
stimpy88: All this A.I. and Deep Learning stuff... 99% marketing BS.
Ya, amen. I remember when NVIDIA was going to make cards with less power draw and better graphics output. Well, that went out the window fast. Now it's back to big, power-hungry single-GPU cards. I have already seen a prototype dual-GPU card from NVIDIA coming soon. History repeats; 3dfx did the same thing: a single card with more GPUs.

It's sad, really; my twin 1080 Tis in SLI use less power than a single 3080 Ti and can still push 100 FPS in 4K gaming.