Monday, January 6th 2025

NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts
With the GeForce RTX 50-series "Blackwell" generation, NVIDIA is introducing the new DLSS 4 technology. The most groundbreaking feature being introduced with DLSS 4 is multi-frame generation. The technology relies on generative AI to predict up to three frames ahead of a conventionally rendered frame, which in and of itself could be a result of super resolution. Since DLSS SR can effectively upscale 1 pixel into 4 (i.e. turn a 1080p render into 4K output), and DLSS 4 generates the following three frames, DLSS 4 effectively has a pixel generation factor of 1:15 (15 in every 16 pixels are generated outside the rendering pipeline). When it launches alongside the GeForce RTX 50-series later this month, over 75 game titles will be ready for DLSS 4. Multi-frame generation is a feature exclusive to "Blackwell."
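For what it's worth, the 1:15 figure checks out arithmetically. A quick back-of-the-envelope sketch in Python, assuming the 1080p-to-4K example above and three generated frames per rendered one:

```python
# Sanity check of the 1:15 pixel generation factor, assuming a 1080p
# internal render, 4K super-resolution output, and 3 generated frames.
rendered_w, rendered_h = 1920, 1080     # conventionally rendered frame
output_w, output_h = 3840, 2160         # DLSS SR output resolution
generated_frames = 3                    # multi-frame generation

rendered_pixels = rendered_w * rendered_h
total_output_pixels = output_w * output_h * (1 + generated_frames)
generated_pixels = total_output_pixels - rendered_pixels

print(f"1 : {generated_pixels // rendered_pixels}")  # -> 1 : 15
# i.e. 15 of every 16 output pixels originate outside the rendering pipeline
```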
145 Comments on NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts
Like I said before, your perception of aliasing might, but that is a different topic, and not a thing at normal monitor pixel densities. PhysX is a whole different topic. It did not go away because the idea was bad, but primarily because CPUs got more and more cores, and physics engines that parallelized well enough got much more performance to play with. Nvidia hobbled PhysX for a while so that the version running on CPUs was single-threaded, but in the end gave up and made it work properly. PhysX itself did not go away; the forced-to-run-on-GPU variant did.
developer.nvidia.com/physx-sdk
en.wikipedia.org/wiki/PhysX#PhysX_in_video_games
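To illustrate the multi-core point, here's a toy sketch in Python (nothing to do with the actual PhysX SDK linked above): integrating many independent bodies is embarrassingly parallel, which is exactly the kind of work that stopped needing a GPU once CPUs grew enough cores.

```python
# Toy sketch only, not the PhysX API: physics updates for independent
# bodies parallelize cleanly across CPU cores.
from concurrent.futures import ProcessPoolExecutor

DT = 1.0 / 60.0  # fixed 60 Hz physics timestep
GRAVITY = -9.81

def integrate_chunk(bodies):
    """Semi-implicit Euler step for a list of ((x, y), (vx, vy)) bodies."""
    out = []
    for (x, y), (vx, vy) in bodies:
        vy += GRAVITY * DT                      # update velocity first
        out.append(((x + vx * DT, y + vy * DT), (vx, vy)))
    return out

def step(bodies, workers=4):
    """One physics step, with the body list split across worker processes."""
    chunk = max(1, len(bodies) // workers)
    parts = [bodies[i:i + chunk] for i in range(0, len(bodies), chunk)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return [b for part in pool.map(integrate_chunk, parts) for b in part]

if __name__ == "__main__":
    bodies = [((float(i), 100.0), (0.0, 0.0)) for i in range(10_000)]
    print(step(bodies)[0])  # first body after one 60 Hz step
```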
That's not to say the RTX 5090 is not marketed as a gaming card though.
Hint: CUDA is a toolkit, usually run on Linux. developer.nvidia.com/cuda-downloads, that right there is the justification for the 5090. You might have missed it. That was the justification for the 8800 GTX, and it has only gotten more extreme since Titan. That's why GPUs get export-banned to China, not because 5090s will let their gold farmers in WoW out-farm our gold farmers.
Rasterization is dead. You're buying an AI product and it's fakery. Make no mistake about it.
It's the fastest gaming card ever. It's marketed towards gamers. Please explain how it's not a gaming card other than "The companies are telling us this". Again, in what way would you say they are not trying to attract gamers with that page? Good luck!
Oh, and that last part: I have a feeling you'd better apologize or delete it.
Renders are just math that tries to approximate what we have in the real world. Our current approximations for games (in raw raster) are reasonable, but far from good given the real-time constraints. We can achieve better quality with longer renders that do more math, and more precise math (still far from real life, but improving), but that would be useless for games given the speed.
Upscaling with machine learning is nothing more than another way to attempt this approximation, non-deterministically. Any representation that's not the real thing is still a sub-optimal representation. Even a modern camera at 6000x4000, though considered good nowadays, still can't capture every detail present.
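As a minimal illustration of the contrast being drawn (hypothetical example, not DLSS itself): classical upscaling applies a fixed, hand-derived rule, while an ML upscaler swaps that rule for learned weights.

```python
# Minimal sketch: deterministic 2x nearest-neighbour upscaling. Each source
# pixel becomes a 2x2 block, so 1 real pixel yields 4 output pixels. An ML
# upscaler replaces this fixed rule with a trained network (plus motion
# vectors), which is where the approximation discussed above comes in.
import numpy as np

def upscale_2x_nearest(img: np.ndarray) -> np.ndarray:
    return np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)

lowres = np.array([[0.0, 1.0],
                   [1.0, 0.0]])
print(upscale_2x_nearest(lowres))
```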
That same 6000x4000 will, in some years, also look like a degraded image that we will find ways to improve with future tech.

Their professional lineup is also called "RTX", without the GeForce branding, since they dropped the "Quadro/Tesla" monikers:
www.nvidia.com/en-us/design-visualization/rtx-6000/

FWIW, CUDA does need drivers. And on Linux it's a single driver for all product lines, be it Quadro, Tesla, or GeForce (only the Tegra ones get a different driver).
If they make a 5040, does that mean it will get 3080 performance?? ;)
So they don't like FG or DLSS because AMD is clearly #2 now.
If AMD is better than Nvidia at FG/upscaling, AMD fans will love it.
I was right about the 5090. It's BAD
I would be quite surprised if he found the 5090 was good.
And 60 feels perfectly smooth to me, visually. To me, above 60 Hz for games is a case of diminishing returns, other than latency. Below 60, well, that's another matter ;)
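The diminishing returns are easy to put numbers on: each step up in refresh rate shaves off fewer milliseconds of frame time than the last. Quick arithmetic:

```python
# Frame time shrinks hyperbolically with refresh rate, hence the
# diminishing returns above 60 Hz.
for hz in (30, 60, 120, 144, 240):
    print(f"{hz:>3} Hz -> {1000 / hz:6.2f} ms per frame")
# 30->60 Hz saves ~16.7 ms per frame; 120->240 Hz saves only ~4.2 ms.
```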
Nvidia just took what was already used in drivers, slammed an "AI deep learning" $$ name on it, and priced it double.
Hi-fi marketing won't work here, same as it didn't work with phones.
Games, on the other hand, benefit from increased framerates, and getting more frames is valued and preferred.
And you are too late: regardless of whether you or I like it, it has already worked. Nvidia introduced frame generation more than two years ago now. AMD followed last year. Consoles, somewhat surprisingly, have not adopted it yet, but they are also on somewhat older hardware.
EDIT: And just to add, I don't think it's fair to call people critical of upscaling "progress deniers". Being critical of a new technology doesn't automatically mean they're in favour of stagnation. Being critical of a technology could lead to improvement if enough people are heard.
So if you get 120 FPS pure raster and trigger it to make that 240, it's OK, ish. At a 60 FPS base it's questionable.
So it's a gimmick you want to use only if the game already runs very well, and with very questionable use cases otherwise. But putting it first when describing a GPU is smoke and mirrors.
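A simplified way to see why the base framerate matters (this model ignores FG's own overhead and queueing, so take it as a sketch): generated frames raise the displayed rate, but responsiveness stays tied to the rendered rate.

```python
# Sketch: frame generation multiplies displayed FPS, but input latency
# still tracks the interval between *rendered* frames (FG overhead ignored).
def with_frame_gen(base_fps, generated_per_rendered=1):
    displayed = base_fps * (1 + generated_per_rendered)
    real_frame_interval_ms = 1000 / base_fps
    return displayed, real_frame_interval_ms

for base in (120, 60, 30):
    shown, interval = with_frame_gen(base)
    print(f"{base} FPS base -> {shown} FPS shown, ~{interval:.1f} ms between real frames")
# 120 -> 240 shown at ~8.3 ms; 60 -> 120 at ~16.7 ms; 30 -> 60 at ~33.3 ms.
```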
Anything more than $500 for a GPU is high-end. Nvidia is clearly pointing players toward the 4070 Ti Super/5070 Ti to upscale its profits from the player base.
And it's OK, it's a company that wants to make a profit, but it should keep making products that are genuinely good, without marketing spin and capitalizing on consumer disinformation.
I hope people will just stop being gamers, at least for the next six-ish years. Vote with your wallets, people, please.
But if you're at 30 FPS at a lower resolution and then blow it up in resolution and frames, it will look like crap, or generated slop at best.
Ngreedia has no value proposition at reasonable prices at all (by design), aiming for £600+ hi-fi gimmicks.
So anything like a 6700 XT will do just enough until the next game consoles.
Intel delivered the B580, which is just that for a good price in the USA; here the 6700 XT is still cheaper.
Which is real progress for most customers.
And the PS6 is still one or two GPU generations away, same with next-gen GAAFET transistors on chips, etc.
A £300 6700 XT will do just fine for another three years. Same for the B580.
Just wish there were something much better available at 400-500.
The 7800 XT looks great if the price is right.
Seeing Nvidia's huge margins, there will be new players taking some of them for sure, and I'm waiting to hand them my money.
If AMD delivers a performant GPU at the right price, the money will go there; same for Intel if it's available at a good price, as they claim (it isn't good value in the EU at all).
There used to be a huge visual difference between low and high graphics settings in games. In modern games that difference is nowhere near as big. Low settings often look as good as games from five years ago, and high settings are not much better, yet the gap in GPU requirements between running low and high settings remains large.
EDIT: Think of a modern game that you consider optimized. Then compare visual fidelity and FPS between low and high settings in that game. It's probably going to be a small graphics-quality difference and a big FPS difference.