Monday, January 6th 2025

NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

With the GeForce RTX 50-series "Blackwell" generation, NVIDIA is introducing its new DLSS 4 technology. The most groundbreaking feature of DLSS 4 is multi-frame generation. The technology relies on generative AI to predict up to three frames ahead of each conventionally rendered frame, which may itself be the output of super resolution. Since DLSS SR can effectively upscale 1 pixel into 4 (i.e. turn a 1080p render into 4K output), and DLSS 4 generates the following three frames, DLSS 4 effectively has a pixel generation factor of 1:15 (15 in every 16 pixels are generated outside the rendering pipeline). When it launches alongside the GeForce RTX 50-series later this month, over 75 game titles will be ready for DLSS 4. Multi-frame generation is a feature exclusive to "Blackwell."
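The arithmetic behind that factor is easy to check. A minimal sketch in Python, assuming the figures above (a 1080p internal render, 4K output, three generated frames per rendered frame):

rendered_w, rendered_h = 1920, 1080   # internal render resolution (example)
output_w, output_h = 3840, 2160       # 4K output after DLSS Super Resolution
generated_frames = 3                  # extra frames from multi-frame generation

rendered_pixels = rendered_w * rendered_h                        # pixels actually shaded
displayed_pixels = output_w * output_h * (1 + generated_frames)  # pixels shown on screen
ratio = displayed_pixels // rendered_pixels                      # -> 16

print(f"1 rendered pixel per {ratio} displayed, "
      f"i.e. {ratio - 1} in every {ratio} generated")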

133 Comments on NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

#76
dyonoctis
DavenI guess the summary is that Nvidia has moved on from traditional gen ras rendering to AI/RT/DLSS rendering. If you buy a 5000 series card, expect to implement this new render method or fail to get your money’s worth.

For the rest of us who just want to game old school on an old school budget, Intel and AMD will do just fine.
Inb4 Intel/AMD introduce a similar tech. Even in the wake of the RX 9000 slides, FSR 4, and XeSS 2, people still seem to watch the GPU market through half rose-tinted glasses and really believe that AMD/Intel won't follow in Nvidia's footsteps. RDNA4 is going to be the last gaming-focused arch from AMD; UDNA will merge the HPC side with the consumer side, with ever-better performance in AI tasks even if gamers don't care. Same pattern as Nvidia.

Battlemage also has its own set of issues: if you don't use it with a recent, fast CPU, the driver overhead is massive and can diminish its price/performance ratio in some games.

All I'm seeing are companies offering a slightly better price-to-performance ratio because they struggle to take the crown. Even with their focus on software trickery, Nvidia is somehow still selling the fastest GPU in rasterisation (when the others are supposedly the rasterisation specialists).
Posted on Reply
#77
zigzag
The difference in detail between the new transformer-based models and the old CNN-based models seems huge.

Old vs New: [screenshot comparison]

At 4:26 he says that the new models require four times more compute during inference. Inference takes only a small part of the whole frame time, so the final performance impact won't be nearly so dramatic.
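To put rough numbers on that (both figures below are hypothetical; the real split between inference and the rest of the frame isn't public):

frame_ms = 10.0          # hypothetical total frame time (100 FPS)
old_inference_ms = 0.5   # hypothetical share spent running the old CNN model
scale = 4.0              # claimed compute increase for the transformer model

new_frame_ms = (frame_ms - old_inference_ms) + old_inference_ms * scale
print(f"{frame_ms:.1f} ms -> {new_frame_ms:.1f} ms "
      f"({new_frame_ms / frame_ms - 1:.0%} slower, nowhere near 4x)")

With these assumed numbers the frame time grows by 15%, not 300%; the bigger the non-inference share, the smaller the hit.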

Interested to see a review of how the quality/performance of the new models compares to the old models.
Posted on Reply
#78
JustBenching
dyonoctis. Even with their focus on software trickery, Nvidia is somehow still selling the fastest GPU in rasterisation (when the others are supposedly the rasterisation specialists).
That's crazy, right? People complaining about generated frames and whatnot, when the company that has those features also has the fastest cards in both RT and raster
Posted on Reply
#79
Blueberries
usinameBy the way, am I the only one who has the feeling that the 5090 is ultra-slow garbage that barely does 30 fps at 4K when DLSS is disabled? It's not me, it's Nvidia's own claim
Looks like the same performance as my 4090!
Posted on Reply
#80
Whitestar
As usual, a good and sober take from HUB: [embedded video]
Posted on Reply
#81
JustBenching
usinameBy the way, am I the only one who has the feeling that the 5090 is ultra-slow garbage that barely does 30 fps at 4K when DLSS is disabled? It's not me, it's Nvidia's own claim
If the 5090 is doing 30, just imagine what every other card does.
Posted on Reply
#82
RedelZaVedno
Something's not right. Not a single slide showing pure rasterization performance of the new GPUs vs. the 40 series. I wonder why? :rolleyes: Is it because it sucks, as predicted from the shader counts and memory bus widths (with the exception of the 5090)?
And this 4:1 frame generation: if it's anything like Nvidia's "optical flow" used with OpenXR in VR for reprojection (FG), where extrapolating 30 Hz to 90 Hz mostly sucks, it's just a big gimmick to mask poor rasterization advances in the Blackwell series.
Posted on Reply
#83
Whitestar
JustBenchingThat's crazy, right? People complaining about generated frames and whatnot, when the company that has those features also has the fastest cards in both RT and raster
I don't see what those two things have to do with one another. Complainers want GPU manufacturers to focus less on frame insertion and more on actual raw performance gains from one generation to the next.
It's a completely valid complaint.
I mean, look at those "performance" comparisons from the 5000-series marketing. Comparing the 4000 series with a card that inserts 3x more frames is absurd at best.
Posted on Reply
#84
redeye
AGlezB
DLSS4 = gpu.modelNumber > 5000

Sorry. Couldn't resist. :p
I like it, I like it a lot… but,
where's the declaration that DLSS4 is a boolean? And the declaration that gpu.modelNumber is an int? LOL

/possibly-wrong-have-not-programmed-in-20-years… so I am not sure what happens when variables are not declared in the latest "fancy" programming languages… maybe AI compilers will clean that up… (lol)
but in any case, I like it when people can refactor code into something shorter… it means they can debug code, which is more important than writing code, IMO.
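For what it's worth, in a dynamically typed language no declarations are needed at all. Here's the joke rendered as runnable Python (names adapted to snake_case; the GPU class is made up for illustration), where the comparison simply evaluates to a bool:

class GPU:
    def __init__(self, model_number: int):
        self.model_number = model_number

gpu = GPU(5090)
dlss4 = gpu.model_number > 5000      # no declarations; > yields a bool
print(type(dlss4).__name__, dlss4)   # prints: bool True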
Posted on Reply
#85
SOAREVERSOR
NiceumemuI don't consider faking things progress, especially when it comes with noticeable impacts on quality and latency.
If you like it, and you can't notice the impacts yourself, I won't stop you, but what I'm interested in is raster
Everything about rendering games is fake. You do know that right? It's just different methods of fakery.
Posted on Reply
#86
RedelZaVedno
SOAREVERSOREverything about rendering games is fake. You do know that right? It's just different methods of fakery.
I don't care if it's fake or not; all I care about is the image quality I see on screen and performance. All I know is that native rasterization at high resolutions (I mainly game at 3160x3160 per eye in VR) looks much better than DLSS, FSR, XeSS, Valve's reprojection method, or Nvidia's "optical flow" reprojection. DLSS frame generation not being available in VR says a lot about the quality of inserted frames. You can fool your brain when looking at a monitor, but it's hard to do the same on 10-megapixel panels inches away from your eyes.
Posted on Reply
#87
SOAREVERSOR
RedelZaVednoSomething's not right. Not a single slide showing pure rasterization performance of the new GPUs vs. the 40 series. I wonder why? :rolleyes: Is it because it sucks, as predicted from the shader counts and memory bus widths (with the exception of the 5090)?
And this 4:1 frame generation: if it's anything like Nvidia's "optical flow" used with OpenXR in VR for reprojection (FG), where extrapolating 30 Hz to 90 Hz mostly sucks, it's just a big gimmick to mask poor rasterization advances in the Blackwell series.
Rasterization is dead. You just don't know it yet.

Rasterization is just a way of painting something. That's it. There is more than one way. With CUDA, Nvidia turned into an AI company, so these are AI cards, period. AI is just another way to paint a game if you want to. As Nvidia is the market leader, they are going to drag the entire industry to this. In the future, AI and all these tricks will be how you render a game, and rasterization will be as dead as the horse and buggy. You don't have a choice in this; it's already happening. Rasterization is on the way out and will be gone.

Once that's done, even the engine and other things are going to move to AI. You just don't realize it yet. Everything is moving to AI and the cloud, and PC gamers still have their heads in the sand about what's been going on for years now, even though the companies involved have been talking about it openly.
Posted on Reply
#88
RedelZaVedno
SOAREVERSORRasterization is dead. You just don't know it yet.

Rasterization is just a way of painting something. That's it. There is more than one way. With CUDA, Nvidia turned into an AI company, so these are AI cards, period. AI is just another way to paint a game if you want to. As Nvidia is the market leader, they are going to drag the entire industry to this. In the future, AI and all these tricks will be how you render a game, and rasterization will be as dead as the horse and buggy. You don't have a choice in this; it's already happening. Rasterization is on the way out and will be gone.

Once that's done, even the engine and other things are going to move to AI. You just don't realize it yet. Everything is moving to AI and the cloud, and PC gamers still have their heads in the sand about what's been going on for years now, even though the companies involved have been talking about it openly.
I'll believe it when I see it. Sure, maybe 10 or 15 years down the road rasterization will be dead; until then, I'm still going to buy GPUs on the basis of how fast a GPU can rasterize. Software companies don't move as fast as hardware devs want them to, if at all. It took 10 years for Nvidia to push CUDA into general software. We've had hyper-threading in hardware for ages, and it's only today that parallel computing is really being implemented to some extent, and in most cases still poorly, I may add.
Posted on Reply
#89
SOAREVERSOR
RedelZaVednoI'll believe it when I see it. Sure, maybe 10 or 15 years down the road rasterization will be dead; until then, I'm still going to buy GPUs on the basis of how fast a GPU can rasterize. Software companies don't move as fast as hardware devs want them to, if at all. It took 10 years for Nvidia to push CUDA into general software. We've had hyper-threading in hardware for ages, and it's only today that parallel computing is really being implemented to some extent, and in most cases still poorly, I may add.
Then you'll get run over. There's a reason all the performance is TOPS now. You don't have a choice in this. You're seeing it now. Clinging to rasterization is like humping a dead pig now. Sure you can do it, but it doesn't mean you aren't humping a dead pig.
Posted on Reply
#90
RedelZaVedno
SOAREVERSORThen you'll get run over. There's a reason all the performance is TOPS now. You don't have a choice in this. You're seeing it now. Clinging to rasterization is like humping a dead pig now. Sure you can do it, but it doesn't mean you aren't humping a dead pig.
I've been in this hobby since the Amiga/Commodore 64 days and I'm still here. New tech comes and goes, lots of hype, some changes from time to time once the dust settles, and the game goes on. No need to rush.
Posted on Reply
#91
SOAREVERSOR
RedelZaVednoI've been in this hobby since the Amiga/Commodore 64 days and I'm still here. New tech comes and goes, lots of hype, some changes from time to time once the dust settles, and the game goes on. No need to rush.
Same, and I remember when people threw fits about Steam and digital distribution, and here we are. Nvidia is an AI company; they keep saying it. These are AI cards that are moving more and more of painting the game over to AI, and the industry is following them. You may not like it, but rasterization is dead. The move is already in process, and gamers do not get a vote in it. The only vote is to stop gaming on the PC. Do you game on PC? Then you're voting for AI, no rasterization, and cloud gaming.
Posted on Reply
#92
evernessince
Od1sseasBefore you guys make stupid comments about latency, the latency is actually the same (Nvidia has a video comparing DLSS FG vs Multi-Frame DLSS FG). They also just released Reflex 2 which further reduces latency by 50%.
So more generated frames but LOWER latency at the same time.
Bold choice to take Nvidia's marketing at face value given it's common knowledge that their numbers are always fanciful.

Wait for reviews, period.
Posted on Reply
#93
AGlezB
redeyeI like it, I like it a lot… but,
where's the declaration that DLSS4 is a boolean? And the declaration that gpu.modelNumber is an int? LOL

/possibly-wrong-have-not-programmed-in-20-years… so I am not sure what happens when variables are not declared in the latest "fancy" programming languages… maybe AI compilers will clean that up… (lol)
but in any case, I like it when people can refactor code into something shorter… it means they can debug code, which is more important than writing code, IMO.
You'll have to direct those questions to the OP, because the only thing I did was a very little (and off-topic) refactoring based on the assumption that the original code was written in a sane language. By sane I mean one where true is true and not something like #define true whatever. You see, as a regular reader of The Daily WTF, I have Opinions(TM) regarding the best ways to write code. :laugh:
Posted on Reply
#94
RedelZaVedno
SOAREVERSORSame, and I remember when people threw fits about Steam and digital distribution, and here we are. Nvidia is an AI company; they keep saying it. These are AI cards that are moving more and more of painting the game over to AI, and the industry is following them. You may not like it, but rasterization is dead. The move is already in process, and gamers do not get a vote in it. The only vote is to stop gaming on the PC. Do you game on PC? Then you're voting for AI, no rasterization, and cloud gaming.
Cloud gaming's gonna be a thing when latency, bandwidth, and stability aren't problems anymore. I can see most PC and console gamers moving to the cloud in 10 or 20 years' time, but not just yet. The proof is MSFS 2024, a nearly 100% cloud-based game. If a flight simulator on Intel servers can't stream fast and stably enough, we're still far away from achieving satisfactory results for the spoiled PC gaming crowd. I believe open-world games will use a combo of streaming and locally stored data for some time before everything moves into the cloud.
Posted on Reply
#95
evernessince
Vya DomusYep, called it many months ago: a gazillion interpolated frames. Screw it, just modify the driver so that it always reports 99999999 FPS. Why keep doing this? That's the end game anyway.
Might as well just have the AI completely simulate the game at that point, no need to buy or install it. According to Nvidia you can have your AI simulated game upscaled with AI enabled DLSS with AI FG, AI textures, AI Animations, AI compression, AI, AI, and more AI. All the while the pasta you are eating was designed by AI, manufactured by AI, the grain grown and picked by AI, and even the factory operation optimized by AI. AI can be used to teach other AI and AI can be used to check the quality of work done by AI. That's the future Nvidia is pushing. I have to wonder where the humans come into the equation. What they are describing are AIs to replace humans, not supplement them. The highly specialized agents are designed to replace professionals. All this tech costs money of course and seeing as the rich are not the generous type I have a pretty good idea of who the primary beneficiaries are.
Posted on Reply
#96
rv8000
zigzagThe difference in detail between the new transformer-based models and the old CNN-based models seems huge.

Old vs New: [screenshot comparison]

At 4:26 he says that the new models require four times more compute during inference. Inference takes only a small part of the whole frame time, so the final performance impact won't be nearly so dramatic.

Interested to see a review of how the quality/performance of the new models compares to the old models.
If the second image is the upscaled version, it's very over-sharpened, aliasing is far worse, and "details" that don't exist are being added. Easily a worse end result.
Posted on Reply
#97
SOAREVERSOR
RedelZaVednoCloud gaming's gonna be a thing when latency, bandwidth, and stability aren't problems anymore. I can see most PC and console gamers moving to the cloud in 10 or 20 years' time, but not just yet. The proof is MSFS 2024, a nearly 100% cloud-based game. If a flight simulator on Intel servers can't stream fast and stably enough, we're still far away from achieving satisfactory results for the spoiled PC gaming crowd. I believe open-world games will use a combo of streaming and locally stored data for some time before everything moves into the cloud.
The catch is what the PC gaming crowd wants doesn't matter one damn bit. PC crowd wants raster cards and free nvidia. The 5090 is an AI, ML, DL, NN card you can run games on and priced as such because gamers do not matter. All the other cards are now AI cards as well because gamers do not matter. Rasterization is being tossed out now for AI and other ways to render because gamers do not matter.

Gamers can toddler stomp footsies in the corner all they want and it doesn't change squat.
RedelZaVednoSomething's not right. Not a single slide showing pure rasterization performance of the new GPUs vs. the 40 series. I wonder why? :rolleyes: Is it because it sucks, as predicted from the shader counts and memory bus widths (with the exception of the 5090)?
And this 4:1 frame generation: if it's anything like Nvidia's "optical flow" used with OpenXR in VR for reprojection (FG), where extrapolating 30 Hz to 90 Hz mostly sucks, it's just a big gimmick to mask poor rasterization advances in the Blackwell series.
Let me put it this way. You yourself see it happening here. And then you spin around and say it won't happen and rasterization will stay. You see it with your own eyes and talk about it and then deny reality because you don't want it to be true. But that's the issue. It is true. And if you want to game on your PC you have to eat it now. And if you don't want to eat it you have to get off the PC. In the end, nvidia won and did exactly what they have been telling you they would do, were doing, and now did do!
Posted on Reply
#98
rv8000
SOAREVERSORThe catch is what the PC gaming crowd wants doesn't matter one damn bit. PC crowd wants raster cards and free nvidia. The 5090 is an AI, ML, DL, NN card you can run games on and priced as such because gamers do not matter. All the other cards are now AI cards as well because gamers do not matter. Rasterization is being tossed out now for AI and other ways to render because gamers do not matter.

Gamers can toddler stomp footsies in the corner all they want and it doesn't change squat.


Let me put it this way. You yourself see it happening here. And then you spin around and say it won't happen and rasterization will stay. You see it with your own eyes and talk about it and then deny reality because you don't want it to be true. But that's the issue. It is true. And if you want to game on your PC you have to eat it now. And if you don't want to eat it you have to get off the PC. In the end, nvidia won and did exactly what they have been telling you they would do, were doing, and now did do!
This is an interesting way to state that you don't understand that rasterization or ray tracing has to exist for AI, frame gen, and upscalers to do what they do. A game has to be rendered via rasterization or ray/path tracing in order to interpolate additional frames. And seeing as no card is truly capable of ray/path tracing in real time without the assistance of denoisers and upscalers, rasterization is going nowhere.
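A deliberately crude sketch of that dependency: a plain linear blend between two already-rendered frames (emphatically not Nvidia's motion-vector-guided method), just to show that every generated frame is a function of frames the GPU had to render first:

import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, n: int) -> list:
    """Return n in-between frames; both endpoints must already be rendered."""
    return [(1 - t) * frame_a + t * frame_b
            for t in np.linspace(0.0, 1.0, n + 2)[1:-1]]

frame_n = np.zeros((1080, 1920, 3), dtype=np.float32)   # rendered frame N
frame_n1 = np.ones((1080, 1920, 3), dtype=np.float32)   # rendered frame N+1
generated = interpolate(frame_n, frame_n1, n=3)          # the 3 "free" frames
print(len(generated), generated[0].mean())               # 3 0.25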
Posted on Reply
#99
SOAREVERSOR
evernessinceMight as well just have the AI completely simulate the game at that point, no need to buy or install it. According to Nvidia you can have your AI simulated game upscaled with AI enabled DLSS with AI FG, AI textures, AI Animations, AI compression, AI, AI, and more AI. All the while the pasta you are eating was designed by AI, manufactured by AI, the grain grown and picked by AI, and even the factory operation optimized by AI. AI can be used to teach other AI and AI can be used to check the quality of work done by AI. That's the future Nvidia is pushing. I have to wonder where the humans come into the equation. What they are describing are AIs to replace humans, not supplement them. The highly specialized agents are designed to replace professionals. All this tech costs money of course and seeing as the rich are not the generous type I have a pretty good idea of who the primary beneficiaries are.
The end game is games designed by prompt, then created in AI by an engine done by AI, then rendered by AI and served from the cloud. The cloud part is up there. Games are already being rendered more and more by AI. Parts of games are already being designed by AI. This isn't distant-future stuff; parts of it are already in place, and it has been speeding up. People just refuse to admit it because they think they are special snowflakes because of their gaming PCs, and so it won't happen, despite the fact that PC gaming is what's leading the way to this future, dragging everything else with it.
Posted on Reply
#100
zigzag
rv8000If the second image is the upscaled version, it's very over-sharpened, aliasing is far worse, and "details" that don't exist are being added. Easily a worse end result.
Both are upscaled (left: old model, right: new model). Note that these two images are from a screenshot of a YouTube video, so there are too many video-compression and resizing artefacts for in-depth quality analysis.

1) I agree that the new one looks over-sharpened. Comparisons from more games are needed to see whether it's caused by the new model, is game-specific, or is due to resizing and video compression.

2) Aliasing might seem worse to you because of the added detail (sharper images always make aliasing more visible), but if you look at the video, you will see that there is more aliasing in the old one (even though it has a softer image, which should hide aliasing).

3) These details may exist in previous frames. Each frame, a different sub-pixel shift is added, which makes a real resolution increase possible when data from multiple frames is merged together (see the sketch below). But even if all these micro-details are completely made up, if they are fitting and make sense, then they make CG graphics better. You are not getting 100% of what the artist envisioned anyway (production time constraints and tool limitations, texture compression, real-time 3D rendering limitations, game size constraints, etc.).
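For reference, the sub-pixel shift in point 3 is typically driven by a low-discrepancy sequence. A sketch of the Halton(2,3) jitter commonly used by TAA-style temporal upscalers (the general technique, not necessarily Nvidia's exact implementation):

def halton(index: int, base: int) -> float:
    """Halton low-discrepancy value in [0, 1) for a given index and base."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# Per-frame camera offsets in pixels, centered around 0 (range -0.5..0.5);
# successive frames sample different spots inside each output pixel.
jitter = [(halton(i, 2) - 0.5, halton(i, 3) - 0.5) for i in range(1, 9)]
for frame, (jx, jy) in enumerate(jitter, 1):
    print(f"frame {frame}: offset ({jx:+.3f}, {jy:+.3f}) px")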
Posted on Reply