Monday, January 6th 2025

NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

With the GeForce RTX 50-series "Blackwell" generation, NVIDIA is introducing the new DLSS 4 technology. The most groundbreaking feature being introduced with DLSS 4 is multi-frame generation. The technology relies on generative AI to predict up to three frames ahead of a conventionally rendered frame, which in and of itself could be a result of super resolution. Since DLSS SR can effectively upscale 1 pixel into 4 (i.e. turn a 1080p render into 4K output), and DLSS 4 generates the following three frames, DLSS 4 effectively has a pixel generation factor of 1:15 (15 in every 16 pixels are generated outside the rendering pipeline). When it launches alongside the GeForce RTX 50-series later this month, over 75 game titles will be ready for DLSS 4. Multi-frame generation is a feature exclusive to "Blackwell."
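For the curious, the pixel arithmetic works out as follows. A minimal Python sketch using only the figures stated above (the variable names are illustrative):

```python
# Pixel accounting for DLSS 4, per NVIDIA's stated figures:
# - Super Resolution upscales 1 rendered pixel into 4 output pixels
#   (e.g. 1080p -> 4K), so 3 of every 4 output pixels in a frame are generated.
# - Multi-frame generation adds 3 AI-generated frames per rendered frame.

upscale_factor = 4      # output pixels per rendered pixel (1080p -> 4K)
generated_frames = 3    # extra frames per conventionally rendered frame

total_frames = 1 + generated_frames              # 4 displayed frames
output_pixels = upscale_factor * total_frames    # 16 output pixels per rendered pixel
rendered_pixels = 1
generated_pixels = output_pixels - rendered_pixels

print(f"{generated_pixels} of every {output_pixels} pixels are generated")
# -> 15 of every 16 pixels are generated (a 1:15 pixel generation factor)
```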

133 Comments on NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

#26
EsliteMoby
It will be the same frame interpolation trick that's used in FSR frame gen anyway.
#27
londiste
Aretak: No man, there are very serious and complex reasons why they couldn't generate the extra frames on older cards. You know, despite the fact that Lossless Scaling introduced 4x frame generation six months ago and it works on everything.

Not saying that there is a hard requirement for the RTX 5000 series, but there is very likely a soft one.

Why would you think there isn't? DLSS relies on specific hardware bits. Nvidia primarily points to the optical flow accelerator, plus whatever new features (or performance) the new generation has that enable the more advanced stuff. It is entirely plausible that the previous-generation cards would have problems with it. There is a lot of annoying space between "works" and "does not work". For example, read the posts in this thread - latency is a real concern here, and that is just the first one that comes to everyone's mind.

And as @Dr. Dro pointed out - looking back at previous DLSS updates, Nvidia has brought some new DLSS features to older generations as well.

Want a more extreme example from the other camp? Remember when RTX came out, AMD went pshaw and said they'd bring DXR support to their cards running on existing hardware, shaders basically... until they didn't. Nvidia had the same problem with the 1000 series but actually did roll out DXR support. Turns out it worked fine - just very slowly - and was thus useless.
#28
Od1sseas
Hecate91: I don't consider it progress having to spend $1000+ on a new GPU to have the newest features, only to lower the resolution while adding AI-generated frames. Also, wait for real benchmarks; we have yet to see how it looks or how the latency is.
Think of it this way: I’m running at a lower res but I still get a higher quality image than native.
I’m running FG, but the latency is the same as not using DLSS 4.
240FPS in 4K in CP2077 with PATH TRACING.
Yes, that’s progress.
Zubasa: So DLSS will look better than DLAA, which is native resolution, got it.

Yes. DLSS 4 will look better than current DLAA 3.
#29
Dr. Dro
londiste: And as @Dr. Dro pointed out - looking back at previous DLSS updates, Nvidia has brought some new DLSS features to older generations as well.

Want a more extreme example from the other camp? Remember when RTX came out, AMD went pshaw and said they'd bring DXR support to their cards running on existing hardware, shaders basically... until they didn't. Nvidia had the same problem with the 1000 series but actually did roll out DXR support. Turns out it worked fine - just very slowly - and was thus useless.
www.3dmark.com/sb/26155

Decent enough to get 3DMark Solar Bay to 60-ish fps on a 1070; you can see my 1070 Ti there doing 55-80 fps, so even in full software emulation you could still get phone-level RT on midrange Pascal cards at 1080p. My biggest argument for software DXR (and I asked AMD multiple times back then to reevaluate and ship it) is that it allowed developers to acclimate and experiment with ray tracing, even if the performance was not good enough to ship an RT AAA title on pretty much anything below a Titan Xp.

Yet people still act shocked that NV managed to dominate this segment from day one. RT has effectively become RTX because, in the earliest days of DX raytracing, AMD just decided not to bother with it. It's a self-inflicted wound which originates from a dismissive, arrogant mindset.
usiname: Huang won't give you an extra discount

That's OK, I am proud to tell you I fully paid off all of my bills and am starting 2025 with no debt whatsoever :D

I only wish it were with this shill money some posters said I take. I wish I did - imagine whaling for C6R5 Mavuika on Jensen Huang's dime! You thought I'd buy a 5090 with that money? :eek::laugh:
#30
Pepamami
But if it has the same latency, what's the point of having 400 fps instead of 100?
Od1sseas: Before you guys make stupid comments about latency, the latency is actually the same (Nvidia has a video comparing DLSS FG vs Multi-Frame DLSS FG). They also just released Reflex 2, which further reduces latency by 50%. So more generated frames but LOWER latency at the same time.

It's not lower; it's more generated frames with the SAME latency.
#31
Od1sseas
Pepamami: But if it has the same latency, what's the point of having 400 fps instead of 100? It's not lower; it's more generated frames with the SAME latency.

Yes, that's what I said. Same latency.
The lower-latency claim is for Reflex 2.
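As a rough illustration of the distinction being argued here - a minimal sketch in which the held-back-frame model and the 100 fps figure are assumptions for illustration, not NVIDIA's numbers:

```python
# Illustrative only. Interpolation-based frame generation holds back one
# rendered frame so it can blend between "previous" and "next"; the added
# delay is therefore roughly one rendered-frame time, regardless of how
# many frames are generated in between. That is why 4x multi-frame
# generation can show the same latency as 2x FG while displaying far
# more frames.

def frame_gen(render_fps: float, generated_per_rendered: int):
    displayed_fps = render_fps * (1 + generated_per_rendered)
    held_frame_delay_ms = 1000.0 / render_fps  # one rendered frame held back
    return displayed_fps, held_frame_delay_ms

for n in (1, 3):  # 2x FG vs 4x multi-frame generation
    fps, delay = frame_gen(render_fps=100, generated_per_rendered=n)
    print(f"{n} generated per rendered: {fps:.0f} fps shown, ~{delay:.0f} ms held-back delay")
# Both cases add ~10 ms: more generated frames, the SAME latency.
```

On those assumptions, the displayed framerate quadruples while the interpolation delay stays constant; only the underlying render rate (and things like Reflex) moves the latency.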
#32
Zubasa
Od1sseas: Yes. DLSS 4 will look better than current DLAA 3.

So upscaling is not better than native res when the same process is applied to both.
#33
londiste
Pepamami: But if it has the same latency, what's the point of having 400 fps instead of 100?

Smoother picture/movement?
#34
Capitan Harlock
GPUs pulling console-level performance, using tricks to get more fps in games left unoptimized by people who don't know how to optimize. Fricking amazing :kookoo:
#35
TheToi
usiname: By the way, am I the only one who has the feeling that the 5090 is ultra-slow garbage that is barely doing 30 fps in 4K when DLSS is disabled? It's not me, it's Nvidia's own claim

Cyberpunk 2077 runs at 15-18 fps on a 4090 in 4K with max settings, native (using path tracing, as mentioned in the Nvidia picture).
On this chart the 5090 runs it at a bit over 30 fps, which is close to 2x.
So I wouldn't call 2x perf slow garbage.
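A quick check of that ratio, using the chart values as quoted (the midpoint of the 15-18 fps range; illustrative only):

```python
# Rough check of the ratio eyeballed above: 15-18 fps on the 4090
# versus a bit over 30 fps on the 5090, values as read off NVIDIA's chart.
fps_4090 = (15 + 18) / 2   # midpoint of the quoted range
fps_5090 = 30
print(f"~{fps_5090 / fps_4090:.1f}x")   # -> ~1.8x, i.e. close to 2x
```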
#36
Legacy-ZA
Pepamami: But if it has the same latency, what's the point of having 400 fps instead of 100? It's not lower; it's more generated frames with the SAME latency.

Not every scene will run at 400 FPS, but if you lock the FPS to your monitor's maximum refresh rate with G-Sync enabled, you will ensure that you get the smoothest frame-locked gaming experience possible.

I am so glad this is finally possible.

A lot of competitive gamers like to say "yeah, I get 500 FPS", "muh 600Hz" - but it's not just the high FPS that counts, it's the CONSISTENCY. When you move, aim and lean at a constant framerate, you will always aim the same way, no matter the scene rendered and, in most cases, across different games, so you as the gamer will remain consistent across the board according to your abilities.

Fluctuations cause inconsistency; inconsistency causes bad gameplay and could cost you the match - aiming half a millimeter too short or too far, just because of a framerate fluctuation.
#37
Bagerklestyne
Od1sseas: If you knew anything about computer graphics, you would know that raster as a whole is 100% fake. They are just TERRIBLE approximations of how real lighting works.

Progress is running things at a lower resolution while still looking better than traditional native resolution. DLSS 4 SR has done it.
So you're saying the raster is terrible, but interpolating it with upscaling and frame generation isn't?

Or did I misread what you're saying?
#38
Od1sseas
Zubasa: So upscaling is not better than native res when the same process is applied to both.

Upscaling is better than native when the native comparison uses previous AA technologies like older DLSS and especially TAA.
Bagerklestyne: So you're saying the raster is terrible, but interpolating it with upscaling and frame generation isn't? Or did I misread what you're saying?

DLSS 4 looks better than native = progress.
DLSS FG doubles performance for the same latency (vs native with no DLSS SR and Reflex) = progress.
DLSS MFG triples performance for the same latency = progress.

This is “fake” progress only in the eyes of AMD fanboys or low IQ individuals
#39
Pepamami
Legacy-ZA: Not every scene will run at 400 FPS, but if you lock the FPS to your monitor's maximum refresh rate with G-Sync enabled, you will ensure that you get the smoothest frame-locked gaming experience possible. So glad this is finally possible.

A lot of competitive gamers like to say "yeah, I get 500 FPS", "muh 600Hz" - but it's not just the high FPS that counts, it's the CONSISTENCY. When you move, aim and lean at a constant framerate, you will always aim the same way, no matter the scene rendered and, in most cases, across different games, so you as the gamer will remain consistent across the board according to your abilities.
If I have the same input time/latency, the game will feel the same to me regardless of the fps number. If I have 20 ms at 60 fps and the same 20 ms with ultra-cool generated 200 fps, it will feel the same, because 20 ms is 20 ms. It's probably cool for goofy OLED displays that go brrrrr with flickering at low refresh rates.
#40
Legacy-ZA
Pepamami: If I have the same input time/latency, the game will feel the same to me regardless of the fps number. If I have 20 ms at 60 fps and the same 20 ms with ultra-cool generated 200 fps, it will feel the same, because 20 ms is 20 ms. It's probably cool for goofy OLED displays that go brrrrr with flickering at low refresh rates.

Correct, so get a 165Hz monitor, lock it to that Hz, and enjoy 6 ms latency. :D

This is why I can never get used to older systems anymore; the high refresh rates ruined me for life, and I NEED a better GPU to keep near that 165Hz in AAA titles. These technologies from NVIDIA will ensure it, perhaps at a little graphical loss, but from what I saw, it doesn't seem that way anymore.

Anyway, we will see from W1zzard's testing. :)
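For reference, the "6 ms" above is the frame interval of a 165Hz display (1000/165 ≈ 6.06 ms), not total input-to-photon latency, which also includes input sampling, the render queue, and display processing. A quick sanity check:

```python
# Frame time per refresh rate; note this is the display's frame interval,
# not total input-to-photon latency.
for hz in (60, 120, 165, 240, 500):
    print(f"{hz:>3} Hz -> {1000 / hz:.2f} ms per frame")
# 165 Hz -> 6.06 ms per frame (the "6 ms" above)
```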
#41
JustBenching
If this keeps latency the same as normal FG (which I guess it should, since it's still using a frame before and after), then it's super cool. If it increases latency, meeeh.
#42
Pepamami
Od1sseas: This is “fake” progress only in the eyes of AMD fanboys or low IQ individuals.

OK, if we're going down this road: I think the 5070 has an unacceptable amount of RAM.

Legacy-ZA: Correct, so get a 165Hz monitor, lock it to that Hz, and enjoy 6 ms latency. :D

I wish it worked like that xd
#43
Od1sseas
Pepamami: OK, if we're going down this road: I think the 5070 has an unacceptable amount of RAM. I wish it worked like that xd

12GB is indeed low.
But it seems like Nvidia is trying to “force” devs to use Neural Textures.
Same or even better quality textures for up to 7x less VRAM usage.
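For a rough sense of scale on that "up to 7x" claim, here is a back-of-the-envelope sketch; the 4K texture size, BC7 baseline, and mip overhead are illustrative assumptions, and 7x is simply NVIDIA's "up to" figure applied to them:

```python
# Illustrative VRAM math for a single 4096x4096 texture. BC7 block
# compression stores 8 bits (1 byte) per pixel; a full mip chain adds
# roughly one third. The 7x figure is the "up to" claim, not a measurement.

pixels = 4096 * 4096
bc7_bytes = pixels * 1                # BC7: 1 byte per pixel
with_mips = bc7_bytes * 4 // 3        # full mip chain ~ +33%
neural = with_mips / 7                # hypothetical 7x reduction

mib = lambda b: b / 2**20
print(f"BC7 + mips: {mib(with_mips):.1f} MiB -> neural: {mib(neural):.1f} MiB")
# -> BC7 + mips: 21.3 MiB -> neural: 3.0 MiB per 4K texture
```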
#44
mb194dc
Probably, but image quality is going to be dire and there'll be other trade-offs, e.g. input lag, screen tearing, artifacting, shimmering and the rest.
#45
LittleBro
Od1sseas: Before you guys make stupid comments about latency, the latency is actually the same (Nvidia has a video comparing DLSS FG vs Multi-Frame DLSS FG). They also just released Reflex 2, which further reduces latency by 50%. So more generated frames but LOWER latency at the same time.

You get so many frames at such low latency that you barely see what the games were meant to look like. 3 generated frames per 1 real frame - rendering accuracy, my ass.
#46
JustBenching
mb194dc: Image quality is going to be dire and there'll be other trade-offs, e.g. input lag, screen tearing, artifacting, shimmering and the rest.

That reminds me of native TAA: smearing, ghosting, input lag (without Reflex, input lag is horrible), etc. :roll:
#47
Vya Domus
Yep, called it many months ago: a gazillion interpolated frames. Screw it, just modify the driver so that it always reports 99999999 FPS - why keep doing this? That's the endgame anyway.
#48
Knight47
Od1sseas: 12GB is indeed low. But it seems like Nvidia is trying to “force” devs to use Neural Textures. Same or even better quality textures for up to 7x less VRAM usage.

So does the game need to support Neural Rendering? If so, I'm SOL; in most of the games I play the devs are lazy, and most of the time there's only FSR1 or DLSS 2.0 (which does almost nothing due to crap dev implementation).
#49
boomheadshot8
Niceumemu: I don't care about fake frames or fake resolution, show me pure raster performance

Yep, more blurriness for unoptimized games? Sounds great.

usiname: By the way, am I the only one who has the feeling that the 5090 is ultra-slow garbage that is barely doing 30 fps in 4K when DLSS is disabled? It's not me, it's Nvidia's own claim

Alan Wake: UE5, under 50 fps at 4K.
YAY :rockout:
#50
Od1sseas
boomheadshot8: Yep, more blurriness for unoptimized games? Sounds great. Alan Wake: UE5, under 50 fps at 4K. YAY :rockout:

Alan Wake is not Unreal Engine; it's the Northlight Engine.