
NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

Joined
Jul 10, 2008
Messages
341 (0.06/day)
Location
Wasteland
System Name Cast Lead™
Processor Intel(R) Core(TM) i5-7600 CPU @ 3.50GHz
Motherboard Asus ROG STRIX H270F GAMING
Cooling Cooler Master Hyper 212 LED
Memory G.SKILL Ripjaws V Series 8GB (2 x 4GB) DDR4 2400
Video Card(s) MSI Radeon RX 480 GAMING X 8G
Storage Seagate 1TB 7200 rpm
Display(s) LG 24MP59G
Case Green Z4
Audio Device(s) Realtek High Definition Audio
Power Supply Green 650 UK Gold
Software Windows 10™ Pro 64 bit
my body is ready
 
Joined
Oct 28, 2012
Messages
1,204 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD SN550 1TB / WD SATA SSD 1TB / WD Black SN750 1TB / Seagate 2TB / WD Book 4TB backup
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
I guess the summary is that Nvidia has moved on from traditional raster rendering to AI/RT/DLSS rendering. If you buy a 5000-series card, expect to adopt this new render method or fail to get your money's worth.

For the rest of us who just want to game old school on an old school budget, Intel and AMD will do just fine.
Inb4 Intel/AMD introduce similar tech. Even in the wake of the RX 9000 slide, FSR4 and XeSS 2, people still seem to watch the GPU market with half rose-tinted glasses and really believe that AMD/Intel won't follow in Nvidia's footsteps. RDNA4 is going to be the last gaming-focused arch from AMD; UDNA will merge the HPC side with the consumer side, with ever better performance in AI tasks, even if gamers don't care. Same pattern as Nvidia.

Battlemage also has its own set of issues if you don't use it with a recent, fast CPU: the driver overhead is massive, and it can diminish its price/performance ratio in some games.


All I'm seeing are companies offering a slightly better price-to-performance ratio because they struggle to take the crown. Even with their focus on software trickery, Nvidia is somehow still selling the fastest GPU in rasterisation (when the others are supposedly the rasterisation specialists).
 
Joined
Sep 21, 2023
Messages
37 (0.08/day)
The difference in detail between the new transformer-based models and the old CNN-based models seems huge.

Old vs New:


At 4:26 he says that the new models require four times more compute during inference. Inference takes only a small part of the whole frame time, so the final performance impact won't be nearly as dramatic.
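For a rough sense of scale, here's a back-of-envelope sketch; the frame time and inference share are made-up numbers purely for illustration, not anything Nvidia has published:

Code:
#include <stdio.h>

int main(void) {
    /* Hypothetical numbers for illustration only -- not measured values. */
    const double frame_ms = 10.0;               /* total frame time (~100 fps)  */
    const double infer_ms = 1.0;                /* assumed DLSS inference share */
    const double new_infer_ms = 4.0 * infer_ms; /* the "4x more compute" claim  */

    const double new_frame_ms = frame_ms - infer_ms + new_infer_ms;
    printf("old: %.1f ms (%.0f fps)  new: %.1f ms (%.0f fps)\n",
           frame_ms, 1000.0 / frame_ms, new_frame_ms, 1000.0 / new_frame_ms);
    /* prints: old: 10.0 ms (100 fps)  new: 13.0 ms (77 fps) */
    return 0;
}

So even if inference really costs 4x as much, the whole frame only gets modestly slower; how modest depends entirely on how large the inference slice actually is.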

Interested to see a review of how the quality/performance of the new models compares to the old models.
 
Joined
Jun 14, 2020
Messages
3,754 (2.25/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Even with their focus on software trickery, Nvidia is somehow still selling the fastest GPU in rasterisation (when the others are supposedly the rasterisation specialists).
That's crazy, right? People complaining about generated frames and whatnot, when the company that has those features also has the fastest cards in both RT and raster.
 
Joined
Feb 24, 2013
Messages
193 (0.04/day)
System Name Vaksdal Venom
Processor Ryzen 7 7800X3D
Motherboard Gigabyte B650 Aorus Elite AX V2
Cooling Thermalright Peerless Assassin 120 SE
Memory G.Skill Flare X5 DDR5-6000 32GB (2x16GB) CL30-38-38-96
Video Card(s) MSI GeForce RTX 3080 SUPRIM X
Storage WD Black SN750 500GB, Samsung 840 EVO 500GB, Samsung 850 EVO 500GB, Samsung 860 EVO 1TB
Display(s) Viewsonic XG2431
Case Lian Li O11D Evo XL (aka Dynamic Evo XL) White
Audio Device(s) ASUS Xonar Essence STX
Power Supply Corsair HX1000 PSU - 1000 W
Mouse Logitech G703 Hero
Software Windows 10 Home 64-bit
As usual a good and sober take from HUB:

 
Joined
Jun 14, 2020
Messages
3,754 (2.25/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
By the way, am I the only one who has the feeling that the 5090 is ultra-slow garbage that barely does 30 fps at 4K when DLSS is disabled? It's not me, it's Nvidia's own claim.
If the 5090 is doing 30, just imagine what every other card does.
 
Joined
Apr 10, 2020
Messages
549 (0.32/day)
Something's not right. Not a single slide showing pure rasterization performance of the new GPUs vs. the 40 series. I wonder why? :rolleyes: Is it because it sucks, as predicted based on shader counts and memory bus widths (with the exception of the 5090)?
And this 4:1 frame generation: if it's anything like Nvidia's "optical flow" reprojection used with OpenXR in VR (frame generation), where extrapolating 30 Hz to 90 Hz mostly sucks, it's just a big gimmick to mask poor rasterization advances in the Blackwell series.
 
Joined
Feb 24, 2013
Messages
193 (0.04/day)
System Name Vaksdal Venom
Processor Ryzen 7 7800X3D
Motherboard Gigabyte B650 Aorus Elite AX V2
Cooling Thermalright Peerless Assassin 120 SE
Memory G.Skill Flare X5 DDR5-6000 32GB (2x16GB) CL30-38-38-96
Video Card(s) MSI GeForce RTX 3080 SUPRIM X
Storage WD Black SN750 500GB, Samsung 840 EVO 500GB, Samsung 850 EVO 500GB, Samsung 860 EVO 1TB
Display(s) Viewsonic XG2431
Case Lian Li O11D Evo XL (aka Dynamic Evo XL) White
Audio Device(s) ASUS Xonar Essence STX
Power Supply Corsair HX1000 PSU - 1000 W
Mouse Logitech G703 Hero
Software Windows 10 Home 64-bit
That's crazy, right? People complaining about generated frames and whatnot, when the company that has those features also has the fastest cards in both RT and raster.
I don't see what those two things have to do with one another. Complainers want GPU manufacturers to focus less on frame insertion and more on actual raw performance gains from one generation to the next.
It's a completely valid complaint.
I mean, look at those "performance" comparisons from the 5000-series marketing: comparing the 4000 series against a card that inserts 3x more frames. That's absurd at best.
 
Joined
Dec 24, 2010
Messages
584 (0.11/day)
Location
mississauga, on, Canada
System Name YACS amd
Processor 5800x,
Motherboard gigabyte x570 aorus gaming elite.
Cooling bykski GPU, and CPU, syscooling p93x pump
Memory corsair vengeance pro rgb, 3600 ddr4 stock timings.
Video Card(s) xfx merc 310 7900xtx
Storage kingston kc3000 2TB, amongst others. Fanxiang s770 2TB
Display(s) benq ew3270u, or acer XB270hu, acer XB280hk, asus VG 278H,
Case lian li LANCOOL III
Audio Device(s) obs,
Power Supply FSP Hydro Ti pro 1000w
Mouse logitech g703
Keyboard durogod keyboard. (cherry brown switches)
Software win 11, win10pro.
Code:
DLSS4 = gpu.modelNumber > 5000

Sorry. Couldn't resist. :p
I like it, I like it a lot… but where's the declaration that the variable DLSS4 is a boolean? And the declaration that gpu.modelNumber is an int? LOL.

/possibly-wrong-have-not-programmed-in-20-years… so I'm not sure what happens when variables are not declared in the latest "fancy" programming languages… maybe AI compilers will clean that up (lol).
In any case, I like it when people can refactor code into something shorter… it means they can debug code, which is more important than writing code, IMO.
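For what it's worth, here's roughly how the joke would look with explicit declarations in a statically typed language like C; the gpu struct and its modelNumber field are made up just for the example:

Code:
#include <stdbool.h>
#include <stdio.h>

/* Hypothetical type, invented for the joke -- not a real API. */
struct gpu_info {
    int modelNumber;   /* declared as an int */
};

int main(void) {
    struct gpu_info gpu = { .modelNumber = 5070 };
    bool DLSS4 = gpu.modelNumber > 5000;   /* declared as a boolean */
    printf("DLSS4: %s\n", DLSS4 ? "true" : "false");
    return 0;
}

In dynamically typed languages the undeclared one-liner works as-is; the boolean type is simply inferred at runtime.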
 
Joined
Apr 10, 2020
Messages
549 (0.32/day)
Everything about rendering games is fake. You do know that right? It's just different methods of fakery.
I don't care if it's fake or not; all I care about is the image quality I see on screen and the performance. All I know is that native rasterization at high resolutions (I mainly game at 3160x3160 per eye in VR) looks much better than DLSS, FSR, or XeSS, or than Valve's reprojection method or Nvidia's "optical flow" reprojection. DLSS frame generation not being available in VR says a lot about the quality of inserted frames. You can fool your brain when looking at a monitor, but it's hard to do the same on 10-megapixel panels inches away from your eyes.
 
Joined
Apr 13, 2022
Messages
1,230 (1.23/day)
Something's not right. Not a single slide showing pure rasterization performance of the new GPUs vs. the 40 series. I wonder why? :rolleyes: Is it because it sucks, as predicted based on shader counts and memory bus widths (with the exception of the 5090)?
And this 4:1 frame generation: if it's anything like Nvidia's "optical flow" reprojection used with OpenXR in VR (frame generation), where extrapolating 30 Hz to 90 Hz mostly sucks, it's just a big gimmick to mask poor rasterization advances in the Blackwell series.
Rasterization is dead. You just don't know it yet.

Rasterization is just a way of painting something. That's it. There is more than one way. With CUDA, Nvidia turned into an AI company, so these are AI cards, period. However, AI is another way to paint a game if you want to. As Nvidia is the market leader, they are going to drag the entire industry to this. In the future, AI and all these tricks will be how you render a game, and rasterization will be as dead as the horse and buggy. You don't have a choice in this; it's already happening. Rasterization is on the way out and will be gone.

Once that's done even the engine and other things are going to move to AI. You just don't realize it yet. Everything is moving to AI and the cloud and PC gamers still have their heads in the sand about what's been going on for years now even though the companies involved have been talking about it openly.
 
Joined
Apr 10, 2020
Messages
549 (0.32/day)
Rasterization is dead. You just don't know it yet.

Rasterization is just a way of painting something. That's it. There is more than one way. With CUDA, Nvidia turned into an AI company, so these are AI cards, period. However, AI is another way to paint a game if you want to. As Nvidia is the market leader, they are going to drag the entire industry to this. In the future, AI and all these tricks will be how you render a game, and rasterization will be as dead as the horse and buggy. You don't have a choice in this; it's already happening. Rasterization is on the way out and will be gone.

Once that's done even the engine and other things are going to move to AI. You just don't realize it yet. Everything is moving to AI and the cloud and PC gamers still have their heads in the sand about what's been going on for years now even though the companies involved have been talking about it openly.
I'll believe it when I see it. Sure, maybe 10 or 15 years down the road rasterization will be dead; until then I'm still gonna buy GPUs on the basis of how fast the GPU can rasterize. Software companies don't move as fast as hardware devs want them to, if at all. It took 10 years for Nvidia to push CUDA into general software. We've had hyperthreading in hardware for ages, and it's only today that parallel computing has really been implemented to some extent, and in most cases still poorly, I might add.
 
Joined
Apr 13, 2022
Messages
1,230 (1.23/day)
I'll believe it when I see it. Sure, maybe 10 or 15 years down the road rasterization will be dead; until then I'm still gonna buy GPUs on the basis of how fast the GPU can rasterize. Software companies don't move as fast as hardware devs want them to, if at all. It took 10 years for Nvidia to push CUDA into general software. We've had hyperthreading in hardware for ages, and it's only today that parallel computing has really been implemented to some extent, and in most cases still poorly, I might add.
Then you'll get run over. There's a reason all the performance is quoted in TOPS now. You don't have a choice in this. You're seeing it now. Clinging to rasterization is like humping a dead pig now. Sure, you can do it, but it doesn't mean you aren't humping a dead pig.
 
Joined
Apr 10, 2020
Messages
549 (0.32/day)
Then you'll get run over. There's a reason all the performance is quoted in TOPS now. You don't have a choice in this. You're seeing it now. Clinging to rasterization is like humping a dead pig now. Sure, you can do it, but it doesn't mean you aren't humping a dead pig.
I've been in this hobby since the Amiga/Commodore 64 days and I'm still here. New tech comes and goes: lots of hype, some changes from time to time once the dust settles, as the game goes on. No need to rush.
 
Joined
Apr 13, 2022
Messages
1,230 (1.23/day)
I've been in this hobby since the Amiga/Commodore 64 days and I'm still here. New tech comes and goes: lots of hype, some changes from time to time once the dust settles, as the game goes on. No need to rush.
Same, and I remember when people threw fits about Steam and digital distribution, and here we are. Nvidia is an AI company; they keep saying it. These are AI cards that are moving more and more of painting the game over to AI. The industry is following them. You may not like it, but rasterization is dead. The move is already in process, and gamers do not get a vote in it. The only vote is to stop gaming on the PC. Do you game on PC? Then you're voting for AI, no rasterization, and cloud gaming.
 
Joined
Jul 13, 2016
Messages
3,391 (1.09/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Before you guys make stupid comments about latency, the latency is actually the same (Nvidia has a video comparing DLSS FG vs Multi-Frame DLSS FG). They also just released Reflex 2 which further reduces latency by 50%.
So more generated frames but LOWER latency at the same time.

Bold choice to take Nvidia's marketing at face value given it's common knowledge that their numbers are always fanciful.

Wait for reviews, period.
 
Joined
Sep 13, 2022
Messages
232 (0.27/day)
I like it, I like it a lot… but where's the declaration that the variable DLSS4 is a boolean? And the declaration that gpu.modelNumber is an int? LOL.

/possibly-wrong-have-not-programmed-in-20-years… so I'm not sure what happens when variables are not declared in the latest "fancy" programming languages… maybe AI compilers will clean that up (lol).
In any case, I like it when people can refactor code into something shorter… it means they can debug code, which is more important than writing code, IMO.
You'll have to direct those questions to the OP, because the only thing I did was a very little (and off-topic) refactoring, based on the assumption that the original code was written in a sane language. By sane I mean one where true is true and not something like #define true whatever. You see, as a regular reader of The Daily WTF, I have Opinions(TM) regarding the best ways to write code. :laugh:
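For anyone who hasn't run into that particular horror, a minimal sketch of the kind of thing I mean; deliberately pathological and not taken from any real codebase:

Code:
#include <stdio.h>

/* The classic Daily WTF move: redefine the language out from under yourself. */
#define true 0

int main(void) {
    if (true) {
        printf("this never prints\n");
    } else {
        printf("welcome to the insane language\n");   /* this is what runs */
    }
    return 0;
}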
 
Joined
Apr 10, 2020
Messages
549 (0.32/day)
Same, and I remember when people threw fits about Steam and digital distribution, and here we are. Nvidia is an AI company; they keep saying it. These are AI cards that are moving more and more of painting the game over to AI. The industry is following them. You may not like it, but rasterization is dead. The move is already in process, and gamers do not get a vote in it. The only vote is to stop gaming on the PC. Do you game on PC? Then you're voting for AI, no rasterization, and cloud gaming.
Cloud gaming's gonna be a thing when latency, bandwidth and stability aren't problems anymore. I can see most PC and console gamers moving to the cloud in 10 or 20 years' time, but not just yet. The proof is MSFS 2024, a nearly 100% cloud-based game. If a flight simulator on Intel servers can't stream fast and stably enough, we're still far away from achieving satisfactory results for the spoiled PC gaming crowd. I believe it will be a combo of streaming and locally stored data when it comes to open-world games for some time, before everything moves into the cloud.
 
Joined
Jul 13, 2016
Messages
3,391 (1.09/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage P5800X 1.6TB 4x 15.36TB Micron 9300 Pro 4x WD Black 8TB M.2
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) JDS Element IV, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse PMM P-305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
Yep, called it many months ago: a gazillion interpolated frames. Screw it, just modify the driver so that it always reports 99999999 FPS. Why keep doing this? That's the end game anyway.

Might as well just have the AI completely simulate the game at that point, no need to buy or install it. According to Nvidia you can have your AI simulated game upscaled with AI enabled DLSS with AI FG, AI textures, AI Animations, AI compression, AI, AI, and more AI. All the while the pasta you are eating was designed by AI, manufactured by AI, the grain grown and picked by AI, and even the factory operation optimized by AI. AI can be used to teach other AI and AI can be used to check the quality of work done by AI. That's the future Nvidia is pushing. I have to wonder where the humans come into the equation. What they are describing are AIs to replace humans, not supplement them. The highly specialized agents are designed to replace professionals. All this tech costs money of course and seeing as the rich are not the generous type I have a pretty good idea of who the primary beneficiaries are.
 
Joined
Apr 14, 2018
Messages
747 (0.30/day)
The difference in detail between the new transformer-based models and the old CNN-based models seems huge.

Old vs New:

At 4:26 he says that the new models require four times more compute during inference. Inference takes only a small part of the whole frame time, so the final performance impact won't be nearly as dramatic.

Interested to see a review of how the quality/performance of the new models compares to the old models.

If the second image is the upscaled version, it's very over-sharpened, aliasing is far worse, and "details" that don't exist are being added. Easily a worse end result.
 
Joined
Apr 13, 2022
Messages
1,230 (1.23/day)
Cloud gaming's gonna be a thing when latency, bandwidth and stability aren't problems anymore. I can see most PC and console gamers moving to the cloud in 10 or 20 years' time, but not just yet. The proof is MSFS 2024, a nearly 100% cloud-based game. If a flight simulator on Intel servers can't stream fast and stably enough, we're still far away from achieving satisfactory results for the spoiled PC gaming crowd. I believe it will be a combo of streaming and locally stored data when it comes to open-world games for some time, before everything moves into the cloud.
The catch is what the PC gaming crowd wants doesn't matter one damn bit. PC crowd wants raster cards and free nvidia. The 5090 is an AI, ML, DL, NN card you can run games on and priced as such because gamers do not matter. All the other cards are now AI cards as well because gamers do not matter. Rasterization is being tossed out now for AI and other ways to render because gamers do not matter.

Gamers can toddler stomp footsies in the corner all they want and it doesn't change squat.

Something's not right. Not a single slide showing pure rasterization performance of the new GPUs vs. the 40 series. I wonder why? :rolleyes: Is it because it sucks, as predicted based on shader counts and memory bus widths (with the exception of the 5090)?
And this 4:1 frame generation: if it's anything like Nvidia's "optical flow" reprojection used with OpenXR in VR (frame generation), where extrapolating 30 Hz to 90 Hz mostly sucks, it's just a big gimmick to mask poor rasterization advances in the Blackwell series.
Let me put it this way. You yourself see it happening here. And then you spin around and say it won't happen and rasterization will stay. You see it with your own eyes and talk about it and then deny reality because you don't want it to be true. But that's the issue. It is true. And if you want to game on your PC you have to eat it now. And if you don't want to eat it you have to get off the PC. In the end, nvidia won and did exactly what they have been telling you they would do, were doing, and now did do!
 
Joined
Apr 14, 2018
Messages
747 (0.30/day)
The catch is what the PC gaming crowd wants doesn't matter one damn bit. PC crowd wants raster cards and free nvidia. The 5090 is an AI, ML, DL, NN card you can run games on and priced as such because gamers do not matter. All the other cards are now AI cards as well because gamers do not matter. Rasterization is being tossed out now for AI and other ways to render because gamers do not matter.

Gamers can toddler stomp footsies in the corner all they want and it doesn't change squat.


Let me put it this way. You yourself see it happening here. And then you spin around and say it won't happen and rasterization will stay. You see it with your own eyes and talk about it and then deny reality because you don't want it to be true. But that's the issue. It is true. And if you want to game on your PC you have to eat it now. And if you don't want to eat it you have to get off the PC. In the end, nvidia won and did exactly what they have been telling you they would do, were doing, and now did do!

This is an interesting way to state that you don't understand that rasterization or ray tracing has to exist for AI/frame gen/upscalers to do what they do. A game has to be rendered via rasterization or ray/path tracing in order to interpolate additional frames. And seeing as no card is truly capable of ray/path tracing in real time without the assistance of denoisers and upscalers, rasterization is going nowhere.
 
Joined
Apr 13, 2022
Messages
1,230 (1.23/day)
Might as well just have the AI completely simulate the game at that point, no need to buy or install it. According to Nvidia you can have your AI simulated game upscaled with AI enabled DLSS with AI FG, AI textures, AI Animations, AI compression, AI, AI, and more AI. All the while the pasta you are eating was designed by AI, manufactured by AI, the grain grown and picked by AI, and even the factory operation optimized by AI. AI can be used to teach other AI and AI can be used to check the quality of work done by AI. That's the future Nvidia is pushing. I have to wonder where the humans come into the equation. What they are describing are AIs to replace humans, not supplement them. The highly specialized agents are designed to replace professionals. All this tech costs money of course and seeing as the rich are not the generous type I have a pretty good idea of who the primary beneficiaries are.
The end game is games designed by prompt, created by an AI engine, rendered by AI, and served from the cloud. The cloud part is up there. Games are already being rendered more and more by AI. Parts of games are already being designed by AI. This isn't distant future; parts of it are already in place, and it has been speeding up. People just refuse to admit it because they think they are special snowflakes because of their gaming PC, and so it won't happen, despite the fact that PC gaming is what's leading the way to this future, dragging everything else with it.
 