Monday, January 6th 2025

NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

With the GeForce RTX 50-series "Blackwell" generation, NVIDIA is introducing the new DLSS 4 technology. The most groundbreaking feature of DLSS 4 is multi-frame generation. The technology relies on generative AI to predict up to three frames ahead of a conventionally rendered frame, which itself could be the result of super resolution. Since DLSS SR can effectively upscale 1 pixel into 4 (i.e., turn a 1080p render into 4K output), and DLSS 4 generates the following three frames, DLSS 4 effectively has a pixel generation factor of 1:15 (15 of every 16 pixels are generated outside the rendering pipeline). When it launches alongside the GeForce RTX 50-series later this month, over 75 game titles will be ready for DLSS 4. Multi-frame generation is a feature exclusive to "Blackwell."
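As a back-of-the-envelope check of that ratio, here is a minimal sketch (the resolutions and three-frame multiplier are the example figures above, not a statement about how the pipeline is implemented):

```python
# Back-of-the-envelope arithmetic for the 1:15 pixel generation factor.
# Example figures: 1080p internal render, 4K DLSS SR output, and three
# AI-generated frames per conventionally rendered frame.

render_w, render_h = 1920, 1080    # internal render resolution
output_w, output_h = 3840, 2160    # DLSS SR output resolution
generated_frames = 3               # multi-frame generation

frames_total = 1 + generated_frames                  # 1 rendered + 3 generated
rendered_pixels = render_w * render_h                # pixels actually rendered
output_pixels = output_w * output_h * frames_total   # pixels displayed

ratio = output_pixels / rendered_pixels              # 16.0
print(f"{ratio:.0f} output pixels per rendered pixel")
print(f"{ratio - 1:.0f} of every {ratio:.0f} pixels are generated")  # 15 of 16
```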

145 Comments on NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

#126
londiste
Capitan HarlockDepending on the game, 4K smooths edges on its own without any AA needed, so it's not true that AA is always necessary.
No, resolution has absolutely nothing to do with edge aliasing.
Like I said before, your perception of aliasing might depend on it, but that is a different topic, and not a factor at normal monitor pixel densities.
Capitan HarlockRTX is not always necessary; it's just like PhysX used to be a thing in the past.
PhysX is a whole different topic. It did not go away because the idea was bad, but primarily because CPUs gained more and more cores, and physics engines that could be parallelized well enough got much more performance to play with. NVIDIA hobbled PhysX for a while so that the CPU version ran single-threaded, but in the end gave up and made it work properly. PhysX itself did not go away; the forced-to-run-on-GPU variant did.
developer.nvidia.com/physx-sdk
en.wikipedia.org/wiki/PhysX#PhysX_in_video_games
#127
Blueberries
SOAREVERSORThe $2,000 ask is not for gamers. It's for pros. The $1,000 ask is for the in-between. Everything below is for gamers. People need to get that through their skulls, and fast. There is a reason AMD is not competing with those cards: they are not gaming cards. The companies are telling us this. Yet PC gamers refuse to accept what the specs, the price, and the companies themselves keep saying. Because they are spoiled fucking shits.
That's just objectively false. Anything "RTX" is a gaming card and will not work with professional drivers.
#130
SOAREVERSOR
BlueberriesMy mistake then, that must have changed in recent years.

That's not to say the RTX 5090 is not marketed as a gaming card though.
It's marketed as a prosumer card. CUDA does not need drivers. Quadro drivers are only for CAD/CAM. You're either a liar or a fucking idiot. The 5090 is a prosumer card. Hell, even the 5080 straddles that line. Below that are the gaming cards. Get a clue.

Hint: CUDA is a toolkit, usually run on Linux. developer.nvidia.com/cuda-downloads: that right there is the justification for the 5090. You might have missed it. That was the justification for the 8800 GTX, and it's only gotten more extreme since Titan. That's why GPUs get export-banned to China. Not because 5090s will let their gold farmers in WoW out-gold-farm our gold farmers.

Rasterization is dead. You're buying an AI product and it's fakery. Make no mistake about it.
#131
Whitestar
SOAREVERSORThe $2,000 ask is not for gamers. It's for pros.
I present to you the Nvidia 5090 page, tadaaaa! GeForce RTX 5090 Graphics Cards | NVIDIA
It's the fastest gaming card ever. It's marketed towards gamers. Please explain how it's not a gaming card other than "The companies are telling us this".
SOAREVERSORIt's marketed as a prosumer card. You're either a liar or a fucking idiot.
Again, in what way would you say they are not trying to attract gamers with that page? Good luck!

Oh, and that last part: I have a feeling you'd better apologize or delete it.
#132
igormp
BagerklestyneNothing looks better than native; it can look as close to native as possible, but it can't look better. Can't. It's interpreted math. That's not progress (getting closer is). That's like saying AI-upscaling a picture makes it look better at the same resolution.
I agree with your words, but disagree with what you meant. Reminder that the only actually "native" thing for us would be seeing something with your own eyes.
Renders are just math that tries to approximate what we have in the real world. Our current approximations for games (in raw raster) are reasonable, but far from good given the real-time constraints. We can achieve better quality with longer renders that do more math, and more precise math (still far from real life, but improving), but that would be useless for games given the speed.

Upscaling with machine learning is nothing more than another way to attempt this approximation, in a non-deterministic way.
BagerklestyneThing is, that's taken the suboptimal, time-degraded image and restored it.

If you took a picture of the same subject today with a quality modern camera at the same detail level (pretend for argument's sake the original was 6000x4000 pixels, so a camera that uses the same resolution), then you couldn't improve upon the picture taken without interpolation, which would no longer be the original image.
Any representation that's not the real thing is still a suboptimal representation. Even a modern camera at 6000x4000, though considered good nowadays, still can't capture all the detail present.
In a few years, that same 6000x4000 image will also look degraded, and we will find ways to improve it with future tech.
BlueberriesThat's just objectively false. Anything "RTX" is a gaming card and will not work with professional drivers.
Their professional lineup is also called "RTX", without the GeForce branding, since they dropped the "Quadro/Tesla" monikers:
www.nvidia.com/en-us/design-visualization/rtx-6000/
SOAREVERSORIt's marketed as a prosumer card. CUDA does not need drivers. Quadro drivers are only for CAD/CAM. You're either a liar or a fucking idiot. The 5090 is a prosumer card. Hell, even the 5080 straddles that line. Below that are the gaming cards. Get a clue.

Hint: CUDA is a toolkit, usually run on Linux. developer.nvidia.com/cuda-downloads: that right there is the justification for the 5090. You might have missed it. That was the justification for the 8800 GTX, and it's only gotten more extreme since Titan. That's why GPUs get export-banned to China. Not because 5090s will let their gold farmers in WoW out-gold-farm our gold farmers.

Rasterization is dead. You're buying an AI product and it's fakery. Make no mistake about it.
FWIW, CUDA does need drivers. And on Linux it's a single driver for all product lines, be it Quadro, Tesla, or GeForce (only the Tegra ones get a different driver).
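For illustration, a minimal sketch of why (assuming a Linux machine with the NVIDIA driver installed): the driver API entry points live in libcuda.so, which ships with the driver package rather than the CUDA toolkit, so nothing CUDA-related loads without it.

```python
import ctypes

# libcuda.so is installed by the NVIDIA *driver* package, not by the CUDA
# toolkit. If no driver is present, this load fails, which is the point:
# CUDA user-space code cannot run without the driver.
cuda = ctypes.CDLL("libcuda.so.1")

assert cuda.cuInit(0) == 0, "cuInit failed (no usable NVIDIA driver?)"

version = ctypes.c_int()
cuda.cuDriverGetVersion(ctypes.byref(version))
major, minor = version.value // 1000, (version.value % 1000) // 10
print(f"CUDA driver API version: {major}.{minor}")
```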
#134
Dawora
N3utroI'd bet the people saying they don't like "fake frames" wouldn't be able to tell AI-generated gameplay from non-AI-generated gameplay at the same framerate in a blind test. They'd probably even say the AI-generated gameplay looks better than the original at the same framerate.
People say that because they use AMD, and NVIDIA is now ranked #1 in FG/upscaling.
They don't like FG or DLSS because AMD is clearly #2 now.

If AMD were better than NVIDIA at FG/upscaling, AMD fans would love it.
#135
Krit
A little bit closer to the truth!

"I was right about the 5090. It's BAD" (video)

#136
kapone32
DaworaPeople say that because they use AMD, and NVIDIA is now ranked #1 in FG/upscaling.
They don't like FG or DLSS because AMD is clearly #2 now.

If AMD were better than NVIDIA at FG/upscaling, AMD fans would love it.
We don't need it. Buy the card that fits your resolution.
#137
mouacyk
Do I want smoother past frames, or do I want future frames to arrive faster? I want both, but I prefer the latter over the former. Leather jacket is primarily selling the former this generation, disguised as both.
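To make that distinction concrete, here is a rough sketch of the trade-off, assuming interpolation-style frame generation (where the newest rendered frame is held back until the in-between frames have been shown; the one-frame hold-back figure is an assumption for illustration):

```python
# Rough latency arithmetic for interpolation-style frame generation.
# Assumption: generated frames sit between two rendered frames, so the
# newest real frame is held back by roughly one base frame interval.
# Extrapolating future frames would avoid this cost, but is harder to
# do without artifacts.

def frame_gen_stats(base_fps: float, multiplier: int) -> None:
    base_frame_ms = 1000.0 / base_fps
    output_fps = base_fps * multiplier
    added_latency_ms = base_frame_ms  # hold-back of one base frame
    print(f"{base_fps:.0f} fps base x{multiplier}: "
          f"{output_fps:.0f} fps shown, ~{added_latency_ms:.1f} ms extra latency")

frame_gen_stats(120, 2)  # 240 fps shown, ~8.3 ms extra latency
frame_gen_stats(60, 4)   # 240 fps shown, ~16.7 ms extra latency
```

Smoothness goes up either way; only extrapolation would also help frames "arrive faster".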
#138
londiste
KritA little bit closer to the truth!

"I was right about the 5090. It's BAD" (video)

tl;dw?
I would be quite surprised if he found the 5090 to be good.
#139
Raffles
Legacy-ZACorrect, so get a 165Hz monitor, lock it to that Hz, and enjoy 6ms latency. :D

This is why I can never get used to older systems anymore; the high refresh rates ruined me for life, and I NEED a better GPU to keep things near that 165Hz in AAA titles. These technologies from NVIDIA will ensure it, although perhaps at a little graphical loss, but from what I saw, that doesn't seem to be the case anymore.

Anyway, we will see from W1zzard's testing. :)
This is actually the main reason why I refuse to play games at over 60fps. I don't want to get used to it and have 60fps start feeling like 30fps, since I'm a console gamer and a PC gamer equally. If 120 becomes the new 60, suddenly even nice smooth 60fps console games might start to feel more like 30 ;)

And 60 feels perfectly smooth to me, visually. To me, above 60Hz for games is a law of diminishing returns, other than latency. Below 60, well, that's another matter ;)
#140
oddrobert
Od1sseasProgress denier, I see.
Fake frames have been in hi-fi home theatres for decades; most people don't use them because the result is worse than the source 99% of the time. Some anime upscales were OK-ish, but nothing remarkable.
NVIDIA just took that, put it in drivers, slapped an "AI deep learning" $$ name on it, and priced it double.
Hi-fi marketing won't work here, same as it didn't work with phones.

#141
londiste
oddrobertFake frames have been in hi-fi home theatres for decades; most people don't use them because the result is worse than the source 99% of the time. Some anime upscales were OK-ish, but nothing remarkable.
TV and movies work at a fixed framerate, and deviations from that are not well received even when fake frames are not involved. The soap opera effect is the best-known problem.
Games, on the other hand, benefit from increased framerate, and getting more frames is valued and preferred.

And you are too late: regardless of whether you or I like it, it has already worked. NVIDIA introduced frame generation more than two years ago now. AMD followed last year. Consoles somewhat surprisingly have not adopted it yet, but they are on somewhat older hardware as well.
#142
tpa-pr
It's a bit disappointing that upscaling went from "a neat technology to increase the longevity of older hardware" to "a requirement to get decent performance" even as in-game graphical fidelity has plateaued. I'd be happy to use upscaling if it seemed warranted (e.g. some of the Cyberpunk photo-realism mods I've seen make me think that yes, a GPU is going to need help rendering that), but when we have games that look the same as counterparts from 5 years ago while requiring exponentially more resources, it seems less like "advancement" and more like "cheating".

EDIT: And just to add, I don't think it's fair to call people critical of upscaling "progress deniers". Being critical of a new technology doesn't automatically mean they're in favour of stagnation. Criticism of a technology can lead to improvement if enough people are heard.
#143
oddrobert
So yes, it's here, and we will be able to max out our monitors' refresh rates with it. You still need strong base performance, though, because the best effect is at 2x (still with artefacts); at 4x and 8x it can be very sloppy.
So if you get 120 fps in pure raster and trigger it to make that 240, it's OK-ish. From a 60 fps base it's questionable (rough arithmetic sketched below).
So it's a gimmick you only want to use if the game already runs very well, which makes for very questionable use cases. But putting it front and centre when describing a GPU is smoke and mirrors.
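A rough sketch of those numbers (illustrative arithmetic only, not measurements):

```python
# Illustrative numbers: frame generation multiplies the presentation rate,
# but the time between real game updates (input sampling, simulation) is
# still set by the base framerate.

for base_fps, mult in [(120, 2), (60, 4), (30, 8)]:
    shown_fps = base_fps * mult
    shown_frame_ms = 1000.0 / shown_fps  # how smooth motion looks
    real_update_ms = 1000.0 / base_fps   # how stale input can feel
    print(f"base {base_fps:3d} fps x{mult}: looks like {shown_fps} fps "
          f"({shown_frame_ms:.1f} ms/frame), reacts like {base_fps} fps "
          f"({real_update_ms:.1f} ms between real updates)")
```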

Anything over $500 for a GPU is high end. NVIDIA is clearly pointing the player base toward the 4070 Ti Super/5070 Ti to upscale its profits.
And that's OK; it's a company that wants to make a profit. But then make products that are genuinely good, without capitalizing on marketing and consumer disinformation.
I hope people will just stop being gamers, at least for the next six-ish years. Vote with your wallets, people, please.
But if you're at 30 fps at a lower resolution and then blow it up in resolution and frames, it will look like crap, or generated slop at best.
#144
oddrobert
The PS5 is something like a 6700 XT, so in the GPU department, price-for-performance progress has been slower than I would like, of course.
Ngreedia has no value proposition at reasonable prices at all (by design), aiming for £600+ hi-fi gimmicks.

So anything like a 6700 XT will do just enough until the next game consoles.
Intel delivered the B580, which is just that for a good price in the USA; here the 6700 XT is still cheaper.
That is real progress for most customers.

And the PS6 is still 1 or 2 GPU generations away, same with next-gen GAAFET transistors on chips, etc.
A £300 6700 XT will do just fine for another 3 years; the B580 the same.

I just wish there were something much better available at £400-500.
The 7800 XT looks great if the price is right.

Seeing NVIDIA's huge margins, new players will be taking some of them for sure, and I'm waiting for them so I can hand over my money.
If AMD delivers a performant GPU for the price, it will be there; same for Intel if it's available at a good price (they claim it is, but it isn't good value in the EU at all).
#145
zigzag
tpa-prwhen we have games that look the same as counterparts from 5 years ago while requiring exponentially more resources, it seems less like "advancement" and more like "cheating".
It's just that in-game graphics have reached that level of fidelity. Offline 3D rendering reached that level long ago: visually minor improvements require exponentially more compute.

There used to be a huge visual difference between low and high graphics settings in games. In modern games that difference is nowhere near as big. Low settings often look as good as games from 5 years ago, and high settings are not much better, but the difference in GPU requirements between running low and high settings remains big.

EDIT: Think of a modern game that you consider optimized, then compare visual fidelity and FPS between its low and high settings. It's probably a small graphics-quality difference and a big FPS difference.