
NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

Joined
Oct 30, 2022
Messages
247 (0.30/day)
Location
Australia
System Name Blytzen
Processor Ryzen 7 7800X3D
Motherboard ASRock B650E Taichi Lite
Cooling Deepcool LS520 (240mm)
Memory G.Skill Trident Z5 Neo RGB 64 GB (2 x 32 GB) DDR5-6000 CL30
Video Card(s) Powercolor 6800XT Red Dragon (16 gig)
Storage 2TB Crucial P5 Plus SSD, 80TB spinning rust in a NAS
Display(s) MSI MPG321URX QD-OLED (32", 4k, 240hz), Samsung 32" 4k
Case Coolermaster HAF 500
Audio Device(s) Logitech G733 and a Z5500 running in a 2.1 config (I yeeted the mid and 2 satellites)
Power Supply Corsair HX850
Mouse Logitech G502X lightspeed
Keyboard Logitech G915 TKL tactile
Benchmark Scores Squats and calf raises
That's just flat out wrong though. Have you seen image restoration for example?



4k at 32", definitely needs AA.
Thing is, that has taken the sub-optimal, time-degraded image and restored it.

If you took a picture of the same subject today with a quality modern camera at the same detail level (pretend, for argument's sake, the original was 6000x4000 pixels, so a camera with the same resolution), then you couldn't improve upon the picture without interpolation, and an interpolated image is no longer the original image.

Image restoration is recreating the best possible picture, or returning it as close to the original as you can, which is exactly what frame generation/upscaling does: it tries to get as close to the original as it can.

The difference with image restoration is that the original image when first created (so at its best) might not be as good as what could be produced today.

Mathematically speaking though, if it was digital and the file wasn't corrupted, you couldn't restore it to be better without interpolating data.
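The interpolation point above can be made concrete with a toy sketch (hypothetical code, not any actual restoration tool): linearly upscaling a one-dimensional row of pixel values produces in-between values that are computed from their neighbours, not recovered from the original data.

```python
# Minimal sketch: doubling a tiny grayscale "image" row by linear
# interpolation. The in-between values are invented (averages of
# neighbours); they never existed in the original data.

def upscale_row_2x(row):
    """Linearly interpolate a 1-D list of pixel values to roughly 2x length."""
    out = []
    for a, b in zip(row, row[1:]):
        out.append(a)
        out.append((a + b) / 2)  # synthesized value: average of neighbours
    out.append(row[-1])
    return out

original = [10, 20, 40]
upscaled = upscale_row_2x(original)
print(upscaled)  # [10, 15.0, 20, 30.0, 40] -- 15.0 and 30.0 are synthesized
```

However plausible the synthesized values look, they are a guess about the scene, which is the mathematical sense in which an interpolated image cannot be "better than" the original data.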
 
Joined
Feb 3, 2017
Messages
3,879 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Depending on the game, 4K smooths edges on its own without any AA needed, so it's already not true that AA is always necessary.
No, resolution has absolutely nothing to do with edge aliasing.
Like I said before, your perception of aliasing might, but that is a different topic. And not a thing with normal monitor pixel densities.
RTX is not always necessary; it's just like PhysX used to be in the past.
PhysX is a whole different topic. It did not go away because the idea was bad, but primarily because CPUs got more and more cores, and physics engines that could be parallelized well enough got much more performance to play with. Nvidia hobbled PhysX for a while so that the version running on CPUs was single-threaded, but in the end gave up and made it work properly. PhysX itself did not go away; the forced-to-run-on-GPU variant did.
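The parallelization point can be sketched with a toy example (not PhysX itself, just an illustration of the idea): per-particle physics updates are independent of each other, so they map cleanly onto multiple CPU workers.

```python
# Toy sketch of an "embarrassingly parallel" physics step: each
# particle integrates independently, which is why more CPU cores
# directly helped CPU physics engines. (In a real engine this is
# native code across cores; Python's GIL limits actual speedup here.)
from concurrent.futures import ThreadPoolExecutor

def step(particle, dt=0.016):
    """One Euler integration step under gravity for (position, velocity)."""
    pos, vel = particle
    return (pos + vel * dt, vel - 9.81 * dt)

particles = [(0.0, 5.0), (1.0, 3.0), (2.0, -1.0)]
with ThreadPoolExecutor() as pool:
    particles = list(pool.map(step, particles))
print(particles)
```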
 
Joined
Sep 15, 2016
Messages
493 (0.16/day)
The $2,000 ask is not for gamers; it's for pros. The $1,000 ask is for the in-between. The ones below that are for gamers. People need to get that through their skulls, and fast. There is a reason AMD is not competing with those cards: they are not gaming cards. The companies are telling us this. Yet PC gamers refuse to accept what the specs, the price, and the companies themselves keep saying. Because they are spoiled fucking shits.

That's just objectively false. Anything "RTX" is a gaming card and will not work with professional drivers.
 
Joined
Feb 3, 2017
Messages
3,879 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Joined
Apr 13, 2022
Messages
1,255 (1.23/day)
My mistake then, that must have changed in recent years.

That's not to say the RTX 5090 is not marketed as a gaming card though.
It's marketed as a prosumer card. CUDA does not need drivers. Quadro drivers are only for CAD/CAM. You're either a liar or a fucking idiot. The 5090 is a prosumer card. Hell, even the 5080 straddles that line. Below that are the gaming cards. Get a clue.

Hint: CUDA is a toolkit, usually run on Linux. https://developer.nvidia.com/cuda-downloads — that right there is the justification for the 5090. You might have missed it. That was the justification for the 8800 GTX, and it's only gotten more extreme since Titan. That's why GPU exports to China get banned; not because 5090s will let their gold farmers in WoW out-farm our gold farmers.

Rasterization is dead. You're buying an AI product and it's fakery. Make no mistake about it.
 
Joined
Feb 24, 2013
Messages
196 (0.05/day)
System Name Vaksdal Venom
Processor Ryzen 7 7800X3D
Motherboard Gigabyte B650 Aorus Elite AX V2
Cooling Thermalright Peerless Assassin 120 SE
Memory G.Skill Flare X5 DDR5-6000 32GB (2x16GB) CL30-38-38-96
Video Card(s) MSI GeForce RTX 3080 SUPRIM X
Storage WD Black SN750 500GB, Samsung 840 EVO 500GB, Samsung 850 EVO 500GB, Samsung 860 EVO 1TB
Display(s) Viewsonic XG2431
Case Lian Li O11D Evo XL (aka Dynamic Evo XL) White
Audio Device(s) ASUS Xonar Essence STX
Power Supply Corsair HX1000 PSU - 1000 W
Mouse Logitech G703 Hero
Software Windows 10 Home 64-bit
The 2000 ask is not for gamers. It's for pros.
I present to you the Nvidia 5090 page, tadaaaa! GeForce RTX 5090 Graphics Cards | NVIDIA
It's the fastest gaming card ever. It's marketed towards gamers. Please explain how it's not a gaming card other than "The companies are telling us this".
It's marketed as a prosumer card. You're either a liar or a fucking idiot.
Again, in what way would you say they are not trying to attract gamers with that page? Good luck!

Oh and that last part; I have a feeling you'd better apologize or delete it.
 
Joined
May 10, 2023
Messages
526 (0.84/day)
Location
Brazil
Processor 5950x
Motherboard B550 ProArt
Cooling Fuma 2
Memory 4x32GB 3200MHz Corsair LPX
Video Card(s) 2x RTX 3090
Display(s) LG 42" C2 4k OLED
Power Supply XPG Core Reactor 850W
Software I use Arch btw
Nothing looks better than native; it can look as close to native as possible, but it can't look better. Can't. It's interpolated math. That's not progress (getting closer is). That's like saying AI-upscaling a picture makes it look better at the same resolution.
I agree with your words, but disagree with what you meant. Reminder that the only actual "native" thing for us would be seeing something with your own eyes.
Renders are just math that tries to approximate what we have in the real world. Our current approximations for games (in raw raster) are reasonable, but far from good given the real-time constraints. We can achieve better quality with longer renders that do more math, and more precise math (still far from real life, but improving), but that would be useless for games given the speed.

Upscaling with machine learning is just another way to attempt this approximation, non-deterministically.
Thing is, that has taken the sub-optimal, time-degraded image and restored it.

If you took a picture of the same subject today with a quality modern camera at the same detail level (pretend, for argument's sake, the original was 6000x4000 pixels, so a camera with the same resolution), then you couldn't improve upon the picture without interpolation, and an interpolated image is no longer the original image.
Any representation that's not the real thing is still a sub-optimal representation. Even a modern camera at 6000x4000, though considered good nowadays, still can't capture all the details present.
That same 6000x4000 image will, in some years, also look degraded, and we will find ways to improve it with future tech.
That's just objectively false. Anything "RTX" is a gaming card and will not work with professional drivers.
Their professional lineup is also called "RTX", without the GeForce branding, since they dropped the "Quadro/Tesla" monikers.
It's marketed as a prosumer card. CUDA does not need drivers. Quadro drivers are only for CAD/CAM. You're either a liar or a fucking idiot. The 5090 is a prosumer card. Hell, even the 5080 straddles that line. Below that are the gaming cards. Get a clue.

Hint: CUDA is a toolkit, usually run on Linux. https://developer.nvidia.com/cuda-downloads — that right there is the justification for the 5090. You might have missed it. That was the justification for the 8800 GTX, and it's only gotten more extreme since Titan. That's why GPU exports to China get banned; not because 5090s will let their gold farmers in WoW out-farm our gold farmers.

Rasterization is dead. You're buying an AI product and it's fakery. Make no mistake about it.
FWIW, CUDA does need drivers. And on Linux it's a single driver for all product lines, be it Quadro, Tesla or GeForce (only the Tegra ones get a different driver).
 
Joined
Dec 26, 2006
Messages
3,907 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Joined
Sep 19, 2014
Messages
129 (0.03/day)
I'd bet the people saying they don't like "fake frames" wouldn't be able to tell AI-generated gameplay from non-AI-generated gameplay at the same framerate in a blind test. They'd probably even say the AI-generated gameplay looks better than the original at the same framerate.
People say that because they use AMD, and Nvidia is now ranked #1 in FG/upscaling. They don't like FG or DLSS because AMD is clearly #2 now.

If AMD were better than Nvidia at FG/upscaling, AMD fans would love it.
 
Joined
May 29, 2017
Messages
510 (0.18/day)
Location
Latvia
Processor AMD Ryzen™ 7 5700X
Motherboard ASRock B450M Pro4-F R2.0
Cooling Arctic Freezer A35
Memory Lexar Thor 32GB 3733Mhz CL16
Video Card(s) PURE AMD Radeon™ RX 7800 XT 16GB
Storage Lexar NM790 2TB + Lexar NS100 2TB
Display(s) HP X34 UltraWide IPS 165Hz
Case Zalman i3 Neo + Arctic P12
Audio Device(s) Airpulse A100 + Edifier T5
Power Supply Sharkoon Rebel P20 750W
Mouse Cooler Master MM730
Keyboard Krux Atax PRO Gateron Yellow
Software Windows 11 Pro
A little bit closer to the truth!

I was right about the 5090. It’s BAD

 
Joined
Jun 2, 2017
Messages
9,655 (3.46/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
People say that because they use AMD, and Nvidia is now ranked #1 in FG/upscaling. They don't like FG or DLSS because AMD is clearly #2 now.

If AMD were better than Nvidia at FG/upscaling, AMD fans would love it.
We don't need it. Buy the card that fits your resolution.
 
Joined
Feb 1, 2013
Messages
1,276 (0.29/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
Do I want smoother past frames, or do I want future frames to arrive faster? I want both, but I prefer the latter over the former. Leather jacket is primarily selling the former this generation, under the guise of both.
 
Joined
Feb 3, 2017
Messages
3,879 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
A little bit closer to the truth!

I was right about the 5090. It’s BAD

tl;dw?
I would be quite surprised if he found the 5090 good.
 
Joined
Mar 2, 2013
Messages
9 (0.00/day)
Correct, so, get a 165Hz monitor, lock it to that Hz, enjoy 6ms latency. :D

This is why I can never get used to older systems anymore; high refresh rates ruined me for life, and I NEED a better GPU to keep near that 165 Hz in AAA titles. These technologies from Nvidia will ensure it, perhaps at a little graphical loss, but from what I saw, that doesn't seem to be the case anymore.

Anyway, we will see from W1zzard's testing. :)

This is actually the main reason why I refuse to play games at over 60fps. I don't want to get used to it, and for 60fps to feel like 30fps, as I happen to be a console gamer and PC gamer equally. If 120 becomes the new 60, suddenly even nice smooth 60fps console games might start to feel more like 30 ;)

And 60 feels perfectly smooth to me, visually. To me, above 60hz for games is a law of diminishing returns, other than latency. Below 60, well that's another matter ;)
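The latency figures quoted in this exchange follow from simple frame-time arithmetic (a rough sketch only; real input latency involves more than just the display's frame time):

```python
# Frame time in milliseconds for a given refresh rate / framerate.
# 165 Hz works out to roughly 6 ms per frame, matching the "6ms
# latency" figure quoted above; 60 Hz is ~16.7 ms.

def frame_time_ms(fps):
    """Time between frames, in milliseconds, at a given fps."""
    return 1000.0 / fps

for hz in (30, 60, 120, 165, 240):
    print(f"{hz} Hz -> {frame_time_ms(hz):.1f} ms per frame")
```

The diminishing-returns point is visible in the numbers: going from 30 to 60 Hz saves ~16.7 ms per frame, while going from 165 to 240 Hz saves only ~1.9 ms.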
 

oddrobert

New Member
Joined
Dec 28, 2024
Messages
8 (0.30/day)
Progress denier, I see.
Fake frames have been in hi-fi home theatres for decades; most people don't use them because the result is just worse than the source 99% of the time. Some anime upscales were OK-ish, but nothing remarkable.
Nvidia just took that, used it in drivers, slapped an AI deep learning $$ name on it, and priced it double.
Hi-fi marketing won't work here, same as it didn't work with phones.

 

Joined
Feb 3, 2017
Messages
3,879 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Fake frames are in Hi-Fi home theatres for decades, most people don't use it, because it is just worse than source 99%. Some upscales to anime were OK ish, but nothing noticeable.
TV and movies work at a fixed framerate, and deviations from it are not well received even when fake frames are not involved; the soap opera effect is the best-known problem.
Games, on the other hand, benefit from increased framerate, and getting more frames is valued and preferred.

And you are too late: regardless of whether you or I like it, it has already worked. Nvidia introduced frame generation more than two years ago, and AMD followed last year. Consoles somewhat surprisingly have not adopted it yet, but they are on somewhat older hardware as well.
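As a rough illustration of what a generated frame is (a deliberately naive sketch; actual DLSS/FSR frame generation uses motion vectors and neural networks, not plain blending):

```python
# The simplest conceivable "generated frame": a linear blend of two
# rendered frames. This is what cheap TV motion smoothing roughly
# does; game frame generation is far more sophisticated, but the
# generated frame is still synthesized rather than rendered.

def blend(frame_a, frame_b, t=0.5):
    """Blend two frames (lists of pixel values) at interpolation factor t."""
    return [a * (1 - t) + b * t for a, b in zip(frame_a, frame_b)]

prev_frame = [0, 100, 200]   # pixel values of the previous rendered frame
next_frame = [50, 100, 100]  # pixel values of the next rendered frame
generated = blend(prev_frame, next_frame)
print(generated)  # [25.0, 100.0, 150.0]
```

Naive blending like this is exactly what causes ghosting on fast motion, which is one reason game implementations rely on motion vectors instead.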
 
Joined
Jan 12, 2023
Messages
249 (0.33/day)
System Name IZALITH (or just "Lith")
Processor AMD Ryzen 7 7800X3D (4.2Ghz base, 5.0Ghz boost, -30 PBO offset)
Motherboard Gigabyte X670E Aorus Master Rev 1.0
Cooling Deepcool Gammaxx AG400 Single Tower
Memory Corsair Vengeance 64GB (2x32GB) 6000MHz CL40 DDR5 XMP (XMP enabled)
Video Card(s) PowerColor Radeon RX 7900 XTX Red Devil OC 24GB (2.39Ghz base, 2.99Ghz boost, -30 core offset)
Storage 2x1TB SSD, 2x2TB SSD, 2x 8TB HDD
Display(s) Samsung Odyssey G51C 27" QHD (1440p 165Hz) + Samsung Odyssey G3 24" FHD (1080p 165Hz)
Case Corsair 7000D Airflow Full Tower
Audio Device(s) Corsair HS55 Surround Wired Headset/LG Z407 Speaker Set
Power Supply Corsair HX1000 Platinum Modular (1000W)
Mouse Logitech G502 X LIGHTSPEED Wireless Gaming Mouse
Keyboard Keychron K4 Wireless Mechanical Keyboard
Software Arch Linux
It's a bit disappointing that upscaling went from "a neat technology to increase the longevity of older hardware" to "a requirement to get decent performance" when in-game graphic fidelity has plateaued. I'd be happy to use upscaling if it seemed warranted (eg: some of the Cyberpunk photo-realism mods I've seen make me think that yes, a GPU is going to need help rendering that), but when we have games that look the same as counterparts from 5 years ago requiring exponentially more resources it seems less like "advancement" and more "cheating".

EDIT: And just to add, I don't think it's fair to call people critical of upscaling "progress deniers". Being critical of a new technology doesn't automatically mean they're in favour of stagnation. Being critical of a technology could lead to improvement if enough people are heard.
 

oddrobert

New Member
Joined
Dec 28, 2024
Messages
8 (0.30/day)
So yes, it's here, and we will be able to max out monitor refresh rates with it, but you still need strong base performance, because it looks best at 2x (still with artefacts); at 4x and 8x it can be very sloppy.
So if you get 120 fps in pure raster and want to trigger it to reach 240, that's OK-ish. At a 60 fps base it's questionable. And if you're at 30 fps at a lower resolution and then blow it up in both resolution and frames, it will look like crap, or generated slop at best.
So it's a gimmick you want to use only if the game already runs very well, with very questionable use cases otherwise. Putting it front and centre when describing a GPU is smoke and mirrors.

Anything more than $500 for a GPU is high end. Nvidia clearly points toward the 4070 Ti Super/5070 Ti to upscale its profits from the player base.
And that's OK; it's a company that wants to make a profit. But it should keep making products that are genuinely good, without marketing spin and consumer disinformation.
I hope people will just stop being gamers, at least for the next 6-ish years; vote with your wallets, people, please.
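The base-framerate argument above can be sketched numerically (an assumed simple model: displayed fps scales with the multiplier, while responsiveness stays tied to the rendered frames):

```python
# Why a high base framerate matters with frame generation: the
# multiplier raises displayed fps, but input is only sampled on
# rendered (base) frames, so a 30 fps base still *feels* like 30 fps.

def with_framegen(base_fps, multiplier):
    """Return (displayed fps, ms per rendered frame) for a framegen multiplier."""
    displayed_fps = base_fps * multiplier
    base_frame_time_ms = 1000.0 / base_fps
    return displayed_fps, base_frame_time_ms

for base in (30, 60, 120):
    fps, ms = with_framegen(base, 4)
    print(f"base {base} fps -> displayed {fps} fps, ~{ms:.1f} ms per rendered frame")
```

On this model a 30 fps base shown at 120 fps still carries ~33 ms between rendered frames, while a 120 fps base shown at 480 fps carries ~8 ms, which is the poster's "use it only when the game already runs well" point.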
 

oddrobert

New Member
Joined
Dec 28, 2024
Messages
8 (0.30/day)
The PS6 will be something like a 6700 XT, so in the GPU department, progress per price has been slower than I would like, of course.
Ngreedia has no value proposition at reasonable prices at all (by design), aiming for £600+ hi-fi gimmicks.

So anything like a 6700 XT will do just enough until the next game consoles.
Intel delivered the B580, which is exactly that for a good price in the USA; here the 6700 XT is still cheaper.
That is real progress for most customers.

And the PS6 is still one or two GPU generations away, same with next-gen GAAFET transistors on chips, etc.
A £300 6700 XT will do just OK for another three years; same for the B580.

I just wish there were something much better available at £400-500.
The 7800 XT looks great if the price is right.


Seeing Nvidia's huge margins, new players will take a share of them for sure, and I'm waiting to hand them my money.
If AMD delivers a performant GPU for the price, I'll be there; same for Intel if it's available at the good prices they claim (it isn't good value in the EU at all).
 
Joined
Sep 21, 2023
Messages
49 (0.10/day)
when we have games that look the same as counterparts from 5 years ago requiring exponentially more resources it seems less like "advancement" and more "cheating".
It's just that in-game graphics have reached that level of fidelity. Offline 3D rendering reached it long ago: visually minor improvements require exponentially more compute.

There used to be a huge visual difference between low and high graphics settings in games. In modern games that difference is not nearly as big: low settings often look as good as games from five years ago, and high settings are not much better, but the gap in GPU requirements between running low and high settings remains large.

EDIT: Think of a modern game that you consider optimized, then compare visual fidelity and FPS between its low and high settings. You'll probably find a small graphics-quality difference and a big FPS difference.
 