Monday, January 6th 2025

NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

With the GeForce RTX 50-series "Blackwell" generation, NVIDIA is introducing its new DLSS 4 technology. The most groundbreaking feature introduced with DLSS 4 is multi-frame generation. The technology relies on generative AI to predict up to three frames ahead of a conventionally rendered frame, which may itself be the product of super resolution. Since DLSS SR can effectively upscale 1 pixel into 4 (i.e., turn a 1080p render into 4K output), and DLSS 4 generates the following three frames, DLSS 4 effectively has a pixel generation factor of 1:15 (15 in every 16 pixels are generated outside the rendering pipeline). When it launches alongside the GeForce RTX 50-series later this month, over 75 game titles will be ready for DLSS 4. Multi-frame generation is a feature exclusive to "Blackwell."
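
(For reference, the 1:15 figure is straightforward arithmetic. Below is a minimal sketch of that calculation in Python, assuming the 4x super resolution factor (1080p to 4K) and the three generated frames described above; the variable names are illustrative, not NVIDIA's.)

# Minimal sketch of the pixel-generation arithmetic described above.
# Assumptions: 4x super resolution (1080p -> 4K) and 3 generated frames
# per rendered frame; all names are illustrative only.
upscale_factor = 4      # 1920x1080 -> 3840x2160 quadruples the pixel count
generated_frames = 3    # multi-frame generation: 3 AI frames per rendered frame

frames_per_rendered = 1 + generated_frames                   # 4 displayed frames
pixels_per_rendered = upscale_factor * frames_per_rendered   # 16 output pixels per rendered pixel
print(f"{pixels_per_rendered - 1} of every {pixels_per_rendered} pixels are generated")
# -> 15 of every 16 pixels are generated, i.e. a 1:15 pixel generation factor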

133 Comments on NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

#51
Prima.Vera
btarunr: Multi-frame generation is a feature exclusive to "Blackwell."
Scumbag nGreedia is scumbag.
Actually, this was to be expected for a callous greedy corporation such as them. Nothing surprising here.
#52
Chomiq
Just what we need, more fake frames.
#53
W3RN3R
Using fake frames as a marketing tool to present your new products... Okay then.
#54
Markosz
Hahaha, of course they have another series-exclusive "feature"... Doesn't matter that Lossless Scaling has had multi-frame generation for ANY GPU for quite a while now as a downloadable app.
There is basically 0 progress in hardware between the generations, just brute force: more power, more cores, more expensive.
#55
Chomiq
W3RN3R: Using fake frames as a marketing tool to present your new products... Okay then.
Nvidia wants to sell you their AI tech, not raster performance
#56
dyonoctis
Vya Domus: Yep, called it many months ago, a gazillion interpolated frames. Screw it, just modify the driver so that it always reports 99999999 FPS. Why keep doing this? That's the end game anyway.
At this point, I think that expecting a massive uplift in rasterization means not coming to terms with the reality of the GPU market, haha. There's literally not a single GPU maker that seems able to pull it off; it seems that we are entering a three-year cycle, but with a lower or equal raw performance uplift compared to what we got a few decades ago when it was a yearly cycle.

People somehow expected AMD to be the savior, the champion of true gamers, with their MCM expertise, but that doesn't seem to be happening. And with UDNA they will stop having a gaming/compute split, and they are spending a lot of time developing software tricks to increase performance (and becoming more software-oriented generally: FSR 4 wasn't needed according to some people, but it's coming anyway and seems to be exclusive to RDNA 4).

Every actor is doing the same thing: software tricks are their current battlefield, and nobody wants to be caught sleeping, even if forum dwellers seem to say that raw raster is where the real money is.
#57
LittleBro
Dr. Dro: I'm not sure the cynicism works, chief. Note that NV did not disable any of the new features on Ada hardware, and they also allowed the new enhanced DLSS super resolution and ray reconstruction features to run on hardware as old as Turing. MFG obviously requires faster hardware only present in Blackwell, and the one thing people hyperfocused on the 4x frame generation model haven't figured out is that with this ample optical flow performance, lowering the generation factor to 3 or back to 2 will probably result in a more accurate image with frame generation on than on Ada.
Since the RTX 4090 has AI TOPS similar to the RTX 5070 (Nvidia's own claim), forgive me, but I am somehow unable to see why at least the RTX 4090 is not capable of handling the newest FG technology. Nvidia once again screwed its own customers, as it did back with the RTX 3000 series and DLSS 3. In other words: in your face, RTX <5000 owners. I understand Nvidia, I mean Jensen. It's all about the money. Create something not needed, persuade others of its necessity, then sit back and earn money. Next year, release a new generation, make a subtle improvement, and don't forget to cut older cards off from access to it. It's greedy as hell, but people don't give a f*, so it works for Nvidia.

Did you like his new jacket? It must have been designed by the so-called AI LJM (Large Jacket Model) itself.
Dr. Dro: I can't wait to see the tests, but nobody with an existing GeForce RTX card is being shafted out of features, and this is a first for NV in a very long time. People who bought an RTX 2080 almost 7 years ago are getting new features, while the 5700 XT can barely run games since it's down-level hardware below the 12_2 base feature level and the driver support is terrible.
The RX 5700 XT is a lower-tier performer than the RTX 2080 (Ti), with a proportionally lower launch price. AMD's current best upscaling technology is also supported on RX 5000 and higher SKUs, and on Nvidia's and Intel's GPUs too. The only things the RX 5000 series lacks are RT and FG support. You can't blame AMD and Intel for not supporting DLSS, as it's a locked, proprietary technology of another manufacturer. AMD and Intel developed their own technologies and open-sourced them.

And once again, what is wrong with driver support? I've had no problems with AMD drivers for years. My only problems were with the newest titles when I forgot or was too lazy to update drivers. When talking about driver support, please also mention how Nvidia's drivers (more than once) literally allowed their own GPUs to get destroyed just by playing games. It was only weeks ago that Nvidia f*cked up the overlay in the new Nvidia app, which caused significant performance loss (up to 15%) in games. AFAIK, it has not been fixed YET. The only solution Nvidia was able to come up with was to disable overlays by default.

If you say A, please also say B. AMD drivers were garbage a long time ago.
Od1sseas: Think of it this way: I'm running at a lower res but I still get a higher quality image than native.
I’m running FG, but the latency is the same as not using DLSS 4.
240FPS in 4K in CP2077 with PATH TRACING.
Yes, that’s progress.
It's impossible to think about it the way you do because it denies physics and logic. Lower resolution means less data rendered, less data at hand. You cannot make higher quality out of something by artificially upscaling it and extrapolating/interpolating data between, before, or after. Data is data, and guessing is still guessing. Guessed data ALWAYS carries a portion of uncertainty because nothing can really predict the future.
Pepamami: But if it has the same latency, what's the point of having 400 fps instead of 100?

It's not lower, it's more generated frames with the SAME latency.
Exactly. Input latency simply cannot improve with FG technology.
Od1sseas: Upscaling is better than native vs previous AA technologies like older DLSS and especially TAA.


DLSS 4 looks better than native = Progress
DLSS FG double performance for the same latency (vs native with no DLSS SR and Reflex) = Progress
DLSS MFG Triples performance for the same latency = Progress.

This is “fake” progress only in the eyes of AMD fanboys or low IQ individuals
Look, people have opinions and they have a right to have them.
Calling others fanboys... okay... I get it, but telling people they are dumb because they have a different opinion? Not nice.

Do you actually understand how FG works? FG interpolates between rendered frames and basically multiplies these interpolations with the help of an accelerated neural network (using motion vectors calculated from the real rasterized frames surrounding the interpolated ones). It improves framerate, that's true, but what you see is mostly not a rendered, rasterized image but a guessed one.

In no way can this approach generate lower latency than native rendering because hardware needs additional time to process FG.
This latency penalty can be mitigated by additional hardware resources, which is what Nvidia did.

The reason some people have doubts about this being progress is that instead of improving rasterized performance, Nvidia brought artificiality into the game and created enormous, power-hungry GPUs that do more guessing than rasterizing. Rasterization performance is thus progressing very slowly.
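
(A toy timing model of the latency argument above, as a minimal sketch in Python. Assumptions: generated frames are interpolated between two rendered frames, so the pipeline must hold a rendered frame back before it can display anything, and input is still sampled once per rendered frame; the function name, the 0.5 ms generation overhead, and the example numbers are illustrative, not measured values.)

def frame_gen_timing(native_fps, generated_per_rendered, overhead_ms=0.5):
    # Toy model: displayed framerate scales with the number of generated
    # frames, but input is still sampled once per *rendered* frame.
    native_frame_ms = 1000.0 / native_fps
    displayed_fps = native_fps * (1 + generated_per_rendered)
    input_sample_rate = native_fps
    # Interpolation needs the next rendered frame before any in-between
    # frames can be shown, so roughly one native frame time is added,
    # plus the (assumed) cost of the generation pass itself.
    added_latency_ms = native_frame_ms + overhead_ms
    return displayed_fps, input_sample_rate, added_latency_ms

print(frame_gen_timing(60.0, 3))
# -> (240.0, 60.0, ~17.2) under these illustrative assumptions: 4x the
# displayed frames, the same input sampling rate, slightly higher latency.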
#58
b1k3rdude
Od1sseas: Progress denier. I see.
It isn't progress when a) corps are ripping their customers off, b) they are using AI to mask true perf, and c) the visual quality of this 'masking' has been demonstrably shown to be glitchy. It's got to the point where some game devs are using frame-gen as a crutch to not optimise their games.
#59
boomheadshot8
FSR 3 creates blur when you move; I don't understand why people use it. It also adds a lot of latency, which for an FPS is unplayable (Stalker 2).
b1k3rdude: It isn't progress when a) corps are ripping their customers off, b) they are using AI to mask true perf, and c) the visual quality of this 'masking' has been demonstrably shown to be glitchy. It's got to the point where some game devs are using frame-gen as a crutch to not optimise their games.
Indeed, also games are looking way worse than in the ~2014-2018 era, despite big GPUs and lots of VRAM.
#60
b1k3rdude
usiname: By the way, am I the only one who has the feeling that the 5090 is ultra-slow garbage that is barely doing 30 fps in 4K when DLSS is disabled? It's not me, it's Nvidia's own claim.
I am getting the popcorn ready for the HardwareUnboxed review.
#61
Whitestar
Maybe a bit off-topic, but can someone point me in the direction of a good deep dive into frame generation with regards to input lag and vsync?

I'm running BFI on my monitor, you see (Viewsonic XG2431), and that requires fps > Hz. It also requires vsync to be on. Currently I try to run all my games at a minimum of 75 Hz/fps (no ray tracing). So I'm basically wondering how much input lag frame generation will introduce in that scenario.
#62
adilazimdegilx
So, exactly the same thing that is available even through a 3rd-party tool like Lossless Scaling for both old and new AMD, Nvidia and Intel GPUs is exclusive only to the 5000 series...
Very nvidiaish indeed.
#63
boomheadshot8
Whitestar: Maybe a bit off-topic, but can someone point me in the direction of a good deep dive into frame generation with regards to input lag and vsync?

I'm running BFI on my monitor, you see (Viewsonic XG2431), and that requires fps > Hz. It also requires vsync to be on. Currently I try to run all my games at a minimum of 75 Hz/fps (no ray tracing). So I'm basically wondering how much input lag frame generation will introduce in that scenario.
ThreatInteractive on YouTube talks about frame generation and UE5 problems; maybe it can help you.
#65
Lycanwolfen
WOW, more software AI gimmicks. Seems that everything now is 1080p scaled to 4K or 8K with AI. Guess they cannot do pure 4K or 8K rendering. Lame.

So Cyberpunk runs at 27 FPS without software gimmicks? Um, my 1080 Ti can run the game better than that.
#66
BSim500
boomheadshot8: Yep, more blurriness for unoptimised games?? Sounds great.
They need to relearn how to optimise them better, then. The gaslighting here, though, is unreal, where "native" can only possibly mean the worst examples of Unreal TAA blur-fest (which was never necessary to force on in the first place), whilst DLSS is constantly declared "better than native" like a new unquestionable religious dogma. Go back and compare half the games released this year, though, and even compared to TAA, DLSS often contains very obvious visual aberrations, from weirdly greyed-out telephone wires that look worse than those in Half-Life 2 to irritating-as-hell jittery HUDs / UIs, etc., which we're all supposed to "pretend" not to see...

At best, DLSS is a tool in a toolbox that should make a game run better at almost the same visual quality (and sometimes does). At worst, it can (and sometimes does) look worse than the native resolution it's trying to approximate, and it has reduced the incentive for developers to do any optimisation at all. Imagine DLSS, FSR, etc., weren't invented; developers would then have no choice but to put in more effort (as they did in the past during multiple periods of hardware stagnation). That should have been the healthy "baseline norm" optimisation we should be enjoying for post-2020 AAAs, with DLSS gains on top (aka "a rising tide lifts all boats"), not the "now we can churn out sh*tty performance turds with less effort and more excuses than ever before!" bait and switch we got instead, which is exactly how half of last year's games ended up. Half the unhealthy gushing hype here, though, every time nVidia announces a new version involves cheering it on as a crutch rather than an enhancement, and that's what people are (rightfully) calling out with "more fake frame BS" comments, i.e., it's "progress" that has also noticeably regressed the AAA gaming industry in other areas...
#67
Francoporto
Perfect for devs to optimize their games even less.
#68
AGlezB
usiname:
if (gpu.modelNumber > 5000)
{ DLSS4 = true }
else
{ DLSS4 = false }
DLSS4 = gpu.modelNumber > 5000

Sorry. Couldn't resist. :p
#69
Daven
I guess the summary is that Nvidia has moved on from traditional raster rendering to AI/RT/DLSS rendering. If you buy a 5000-series card, expect to use this new rendering method or fail to get your money's worth.

For the rest of us who just want to game old school on an old school budget, Intel and AMD will do just fine.
#70
NoneRain
Od1sseas: Progress denier. I see.
Nvidia loves to compare a game struggling at 30 FPS to it running at 200 FPS with DLSS and sh1t. Sure, that's impressive. But what would actually be impressive is if the game could run natively at 150 FPS.
Also, I couldn't care less about RT. Devs are abusing fake frames, fake res, temporal AA and sh1t to push out poorly optimized games, relying heavily on GPU generative techs instead of proper optimization. None of this is necessary to make a great-looking game.

Now, to play AAA games, you need all of Nvidia's fancy tech just to end up with worse graphics, more artifacts and blur, when a properly optimized game would look and run better natively.

I’d happily trade all this progress for the neanderthal approach of running games at high specs without AI-driven trickery.
#71
droopyRO
Niceumemu: I don't consider faking things progress, especially when it comes with noticeable impacts on quality and latency.
If you like it and you can't notice the impacts yourself, I won't stop you, but what I'm interested in is raster.
I play Warhammer: Darktide with DLSS on Quality and don't notice any input lag. With frame gen on, there is a bit of input lag, but nothing to ruin my gameplay or aim.
DLSS/FSR/XeSS and "fake frames" are a godsend for people with weaker hardware, but also for game devs that don't try as hard to optimize their games, so a win-win?
#72
JustBenching
Daven: I guess the summary is that Nvidia has moved on from traditional raster rendering to AI/RT/DLSS rendering. If you buy a 5000-series card, expect to use this new rendering method or fail to get your money's worth.

For the rest of us who just want to game old school on an old school budget, Intel and AMD will do just fine.
AMD's emphasis was on RT this time around, so yeah, nope.
#73
LittleBro
BSim500: At best, DLSS is a tool in a toolbox that should make a game run better at almost the same visual quality (and sometimes does). At worst, it can (and sometimes does) look worse than the native resolution it's trying to approximate, and it has reduced the incentive for developers to do any optimisation at all. Imagine DLSS, FSR, etc., weren't invented; developers would then have no choice but to put in more effort (as they did in the past during multiple periods of hardware stagnation). That should have been the healthy "baseline norm" optimisation we should be enjoying for post-2020 AAAs, with DLSS gains on top (aka "a rising tide lifts all boats"), not the "now we can churn out sh*tty performance turds with less effort and more excuses than ever before!" bait and switch we got instead, which is exactly how half of last year's games ended up. Half the unhealthy gushing hype here, though, every time nVidia announces a new version involves cheering it on as a crutch rather than an enhancement, and that's what people are (rightfully) calling out with "more fake frame BS" comments, i.e., it's "progress" that has also noticeably regressed the AAA gaming industry in other areas...
NoneRain: Nvidia loves to compare a game struggling at 30 FPS to it running at 200 FPS with DLSS and sh1t. Sure, that's impressive. But what would actually be impressive is if the game could run natively at 150 FPS.
Also, I couldn't care less about RT. Devs are abusing fake frames, fake res, temporal AA and sh1t to push out poorly optimized games, relying heavily on GPU generative techs instead of proper optimization. None of this is necessary to make a great-looking game.

Now, to play AAA games, you need all of Nvidia's fancy tech just to end up with worse graphics, more artifacts and blur, when a properly optimized game would look and run better natively.

I'd happily trade all this progress for the neanderthal approach of running games at high specs without AI-driven trickery.

Exactly my point of view. Finally, some people understand how DLSS and FG help game devs neglect proper optimization.

One would think that with the current capabilities of GPUs, we would be using supersampling (rendering at higher than native res and shrinking it down to native res) instead of rendering at low res and upscaling to higher (see the sketch at the end of this post).
droopyRO: I play Warhammer: Darktide with DLSS on Quality and don't notice any input lag. With frame gen on, there is a bit of input lag, but nothing to ruin my gameplay or aim.
DLSS/FSR/XeSS and "fake frames" are a godsend for people with weaker hardware, but also for game devs that don't try as hard to optimize their games, so a win-win?
Exactly. They help achieve a reasonable framerate with lower-tier hardware at QHD or 4K resolutions. It's a win for game devs, but definitely not for gamers in terms of poor game optimization. Look at Wukong, that's such a badly optimized game. Even the mighty RTX 4090 struggles badly in Wukong. Check out how the game looks and compare it to how it runs on most modern cards. Have you seen better graphics at much better performance? I have, without any doubt.

What comes next? Devs will be even lazier. Textures and other graphics assets generated by AI, no more designers needed. Same for the game script, soundtrack, etc. Give us $79.99 for this shiny new title, just ramp up the upscaling and you'll be fine. Is this what you pay for when you purchase a game?
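
(To put the supersampling point above in numbers, a quick pixel-count comparison; simple arithmetic only, assuming a 4K output, 2x2 supersampling, and the 4x upscaling factor from the article.)

# Illustrative pixel counts: supersampling vs. upscaling, both targeting 4K output.
# Assumptions: 2x2 supersampling and a 4x upscaling factor (1080p -> 4K).
native_4k = 3840 * 2160      # pixels in the 4K output frame
ssaa_2x2 = native_4k * 4     # render 7680x4320, then shrink to 4K
upscaled = native_4k // 4    # render 1920x1080, then upscale to 4K
print(f"2x2 supersampling renders {ssaa_2x2 // upscaled}x the pixels of 4x upscaling")
# -> 16x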
#74
remekra
TBH, once FSR 4 is out with improved image quality, alongside Anti-Lag 2, it seems like AMD can be on par at least when it comes to upscaling and FG. Achieving the same 3 generated frames per real one should not be hard for them, considering you can do that now if you enable FSR 3.1 in-game and AFMF in the driver. Yeah, it looks bad, but that's mostly due to FSR image quality, which should be fixed with FSR 4, and also due to the fact that AFMF doesn't use motion vectors.
It's not like I'm excited, but it's better than last time around, when they were missing FG for a year.