
NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

Joined
Sep 15, 2011
Messages
6,815 (1.40/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Multi-frame generation is a feature exclusive to "Blackwell."
Scumbag nGreedia is scumbag.
Actually, this was to be expected for a callous greedy corporation such as them. Nothing surprising here.
 
Joined
Feb 23, 2019
Messages
6,122 (2.85/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3600 CL14
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
Just what we need, more fake frames.
 
Joined
May 13, 2016
Messages
89 (0.03/day)
Hahaha, of course they have another series-exclusive "feature"... Never mind that Lossless Scaling has offered multi-frame generation for ANY GPU for quite a while now as a downloadable app.
There is basically 0 progress in hardware between the generations, just brute force: more power, more cores, more expensive.
 
Joined
Feb 23, 2019
Messages
6,122 (2.85/day)
Location
Poland
Processor Ryzen 7 5800X3D
Motherboard Gigabyte X570 Aorus Elite
Cooling Thermalright Phantom Spirit 120 SE
Memory 2x16 GB Crucial Ballistix 3600 CL16 Rev E @ 3600 CL14
Video Card(s) RTX3080 Ti FE
Storage SX8200 Pro 1 TB, Plextor M6Pro 256 GB, WD Blue 2TB
Display(s) LG 34GN850P-B
Case SilverStone Primera PM01 RGB
Audio Device(s) SoundBlaster G6 | Fidelio X2 | Sennheiser 6XX
Power Supply SeaSonic Focus Plus Gold 750W
Mouse Endgame Gear XM1R
Keyboard Wooting Two HE
Using fake frames as a marketing tool to present your new products... Okay then.
Nvidia wants to sell you their AI tech, not raster performance
 
Joined
Oct 28, 2012
Messages
1,204 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD SN550 1TB / WD SATA SSD 1TB / WD Black SN750 1TB / Seagate 2TB / WD Book 4TB back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
Yep, called it many months ago: a gazillion interpolated frames. Screw it, just modify the driver so that it always reports 99,999,999 FPS, why keep doing this? That's the endgame anyway.
At this point, I think expecting a massive uplift in rasterization is not facing the reality of the GPU market, haha. Not a single GPU maker seems able to pull it off; it seems we are entering a three-year cycle, yet with a lower or equal raw performance uplift compared to what we got a few decades ago when the cycle was yearly.

People somehow expected AMD to be the savior, the champion of true gamers, with their MCM expertise, but that doesn't seem to be happening. And with UDNA they will stop having a gaming/compute split, and they are spending a lot of time developing software tricks to increase performance (and becoming more software-oriented generally; FSR4 wasn't needed according to some people, but it's coming anyway and seems to be exclusive to RDNA 4).

Every actor is doing the same thing: software tricks are their current battlefield, and nobody wants to be caught sleeping, even if forum dwellers keep saying that raw raster is where the real money is.
 
Joined
Jul 24, 2024
Messages
310 (1.83/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
I'm not sure the cynicism works, chief. Note that NV did not disable any of the new features on Ada hardware, and they also allowed the new enhanced DLSS super resolution and ray reconstruction features to run on hardware as old as Turing. MFG obviously requires faster hardware only present in Blackwell, and the one thing people hyperfocused on the 4x frame generation mode haven't figured out is that, with this ample optical-flow performance, lowering the generation factor to 3 or back to 2 will probably result in a more accurate image with frame generation on than Ada can manage.
Since the RTX 4090 has AI TOPS similar to the RTX 5070 (Nvidia's own claim), forgive me, but I am somehow unable to see why at least the RTX 4090 is not capable of handling the newest FG technology. Nvidia once again screwed its own customers, as it did back with the RTX 3000 series & DLSS 3. In other words: in your face, RTX <5000 owners. I understand Nvidia, I mean Jensen. It's all about the money. Create something not needed, persuade others of its necessity, then sit back and earn money. Next year release a new generation, make a subtle improvement, and don't forget to cut older versions off from access to it. It's greedy as hell, but people don't give a f*, so it works for Nvidia.

Did you like his new jacket? Must have been designed by so called AI LJM (Large Jacket Model) itself.

I can't wait to see the tests, but nobody with an existing GeForce RTX card is being shafted out of features, and that's a first for NV in a very long time. People who bought an RTX 2080 almost 7 years ago are getting new features, while the 5700 XT can barely run games since it's down-level hardware below the 12_2 feature set and the driver support is terrible.
The RX 5700 XT is a lower-tier performer than the RTX 2080 (Ti), with a correspondingly lower launch price. AMD's current best upscaling technology is also supported on RX 5000 and higher SKUs, and on Nvidia's and Intel's GPUs too. The only things the RX 5000 series lacks are RT and FG support. You can't blame AMD and Intel for not supporting DLSS, as it's another manufacturer's locked, proprietary technology. AMD and Intel developed their own technologies and open-sourced them.

And once again, what is wrong with the driver support? I've had no problems with AMD drivers for years. My only problems were with the newest titles when I forgot, or was too lazy, to update drivers. When talking about driver support, please also mention how Nvidia's drivers (more than once) let their own GPUs literally get destroyed just by playing games. Only weeks ago Nvidia f*cked up the overlay in the new Nvidia app, causing a significant performance loss (up to 15%) in games. AFAIK, it has not been fixed YET. The only solution Nvidia was able to come up with was disabling overlays by default.

If you say A, please also say B. AMD drivers were garbage a long time ago.

Think of it this way: I’m running at a lower res but I still get a higher quality image than native.
I’m running FG, but the latency is the same as not using DLSS 4.
240FPS in 4K in CP2077 with PATH TRACING.
Yes, that’s progress.
It's impossible to think about it the way you do, because it defies physics and logic. Lower resolution means less data rendered, less data at hand. You cannot make something higher quality by artificially upscaling it and extrapolating/interpolating data between, before, or after. Data is data, and guessing in between is still guessing. Guessed data ALWAYS carries a portion of uncertainty, because nothing can really predict the future.

But if it has the same latency, what's the point of having 400 fps instead of 100?


It's not lower, it's more generated frames with the SAME latency
Exactly. Input latency simply cannot improve with FG technology.
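
A back-of-envelope sketch of why that holds (my own illustrative Python, with assumed numbers: the one-frame hold-back and the ~3 ms generation cost are guesses, not published NVIDIA figures). The presented frame rate scales with the generation factor, while input latency stays tied to the rendered frame interval:

```python
def presented_fps(rendered_fps: float, gen_factor: int) -> float:
    """Presented frame rate with a 2x/3x/4x frame-generation factor."""
    return rendered_fps * gen_factor

def approx_latency_ms(rendered_fps: float, holdback_frames: float = 1.0,
                      gen_overhead_ms: float = 3.0) -> float:
    """Very rough latency model: interpolation holds back one rendered frame
    (so it has two real frames to blend between) plus a fixed generation cost.
    Both numbers are assumptions for illustration only."""
    frame_time = 1000.0 / rendered_fps
    return frame_time * (1.0 + holdback_frames) + gen_overhead_ms

base = 100  # rendered FPS
print(f"no FG : {base} FPS presented, ~{1000.0 / base:.1f} ms render-side latency")
for factor in (2, 3, 4):
    print(f"{factor}x FG: {presented_fps(base, factor):.0f} FPS presented, "
          f"~{approx_latency_ms(base):.1f} ms render-side latency")
```

Whatever the exact overhead turns out to be, the point stands: the 4x column changes motion smoothness, not responsiveness.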

Upscaling is better than native when native relies on previous AA technologies like older DLSS and especially TAA.


DLSS 4 looks better than native = Progress
DLSS FG doubles performance for the same latency (vs native with no DLSS SR and no Reflex) = Progress
DLSS MFG triples performance for the same latency = Progress.

This is “fake” progress only in the eyes of AMD fanboys or low IQ individuals
Look, people have opinions and they have a right to have them.
Calling others fanboys... okay... I get it, but telling people they are dumb because they have a different opinion? Not nice.

Do you actually understand how FG works? FG interpolates between rendered frames and basically multiplies these interpolations with the help of an accelerated neural network (using motion vectors calculated from the real rasterized frames that the interpolated ones sit between). It improves framerate, that's true, but most of what you see is not a rendered, rasterized image but a guessed one.

In no way can this approach generate lower latency than native rendering because hardware needs additional time to process FG.
This latency penalty can be mitigated by additional hardware resources, which is what Nvidia did.

The reason some people doubt this is progress is that, instead of improving rasterized performance, Nvidia brought artificiality into the game and created enormous, power-hungry GPUs that do more guessing than rasterizing. Rasterization performance is thus progressing very slowly.
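
For anyone who wants the mechanism spelled out, here is a toy NumPy sketch of the interpolation idea described above. It is my own simplification, not NVIDIA's actual DLSS FG pipeline (which runs a trained network on dedicated hardware and handles occlusion, disocclusion, and shading changes); it just shows the "guess a frame between two real frames by pushing pixels along motion vectors" concept:

```python
import numpy as np

def generate_intermediate(prev_frame: np.ndarray, motion_vectors: np.ndarray,
                          t: float = 0.5) -> np.ndarray:
    """Guess a frame a fraction t of the way between two rendered frames.
    prev_frame: HxWx3 image; motion_vectors: HxWx2 per-pixel offsets (in
    pixels) describing how content moves between the two real frames.
    Each output pixel simply gathers from prev_frame a fraction t of the
    way along its motion vector -- pure guessing, no new rendered data."""
    h, w, _ = prev_frame.shape
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_y = np.clip(np.round(ys + t * motion_vectors[..., 1]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs + t * motion_vectors[..., 0]).astype(int), 0, w - 1)
    return prev_frame[src_y, src_x]

# Tiny usage example with a synthetic frame and uniform rightward motion.
frame = np.random.rand(4, 6, 3)
mv = np.zeros((4, 6, 2))
mv[..., 0] = 2.0                      # everything moved 2 px to the right
mid = generate_intermediate(frame, mv)  # guessed half-way frame
```

The guessed frame is only ever a rearrangement of data that was already rendered, which is exactly the "uncertainty" point made above.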
 
Last edited:
Joined
Apr 2, 2008
Messages
466 (0.08/day)
System Name -
Processor Ryzen 9 5900X
Motherboard MSI MEG X570
Cooling Arctic Liquid Freezer II 280 (4x140 push-pull)
Memory 32GB Patriot Steel DDR4 3733 (8GBx4)
Video Card(s) MSI RTX 4080 X-trio.
Storage Sabrent Rocket-Plus-G 2TB, Crucial P1 1TB, WD 1TB sata.
Display(s) LG Ultragear 34G750 nano-IPS 34" utrawide
Case Define R6
Audio Device(s) Xfi PCIe
Power Supply Fractal Design ION Gold 750W
Mouse Razer DeathAdder V2 Mini.
Keyboard Logitech K120
VR HMD Er no, pointless.
Software Windows 10 22H2
Benchmark Scores Timespy - 24522 | Crystalmark - 7100/6900 Seq. & 84/266 QD1 |
Progress denier. I see
It isn't progress when a) corps are ripping their customers off, b) they are using AI to mask true perf, and c) the visual quality of this 'masking' has been demonstrably shown to be glitchy. It's got to the point where some game devs are using frame-gen as a crutch to not optimise their games.
 
Last edited:
Joined
Aug 12, 2020
Messages
129 (0.08/day)
System Name Main
Processor i9-10900kf @4.3Ghz
Motherboard MSI Z490 Unify
Cooling 4x Noctua 140mm + 200mm Noctua, 280GTS Xflow x2 + EK : X2 RES 250 Advanced, D5, CPU : Supremacy Evo
Memory 4x8GB G.Skill 4200 CL16
Video Card(s) Inno 3D 3070 @Alphacool watercooled
Storage Crucial P5 1TB + FURY Renegade 2TB + 256GB SSD (OS), HDD (Seagate): 1TB + 2TB cold storage
Display(s) 27" LG 144hz
Case Thermaltake core X5
Audio Device(s) Home cinema 5.1.2 : vsx-930 + 5 klipsch 100w + 2 jamo bipolar + subwoofer jamo 150w
Power Supply Seasonic 650w gold
Mouse G502 wireless
Keyboard Corsair K68
Software W11 x64
FSR3 creates blur when you move; I don't understand why people use it. It also adds a lot of latency, so for FPS games it's unplayable (Stalker 2).
It isn't progress when a) corps are ripping their customers off, b) they are using AI to mask true perf, and c) the visual quality of this 'masking' has been demonstrably shown to be glitchy. It's got to the point where some game devs are using frame-gen as a crutch to not optimise their games.
Indeed, and games look way worse than in the ~2014-2018 era, despite big GPUs and lots of VRAM.
 
Joined
Apr 2, 2008
Messages
466 (0.08/day)
System Name -
Processor Ryzen 9 5900X
Motherboard MSI MEG X570
Cooling Arctic Liquid Freezer II 280 (4x140 push-pull)
Memory 32GB Patriot Steel DDR4 3733 (8GBx4)
Video Card(s) MSI RTX 4080 X-trio.
Storage Sabrent Rocket-Plus-G 2TB, Crucial P1 1TB, WD 1TB sata.
Display(s) LG Ultragear 34G750 nano-IPS 34" utrawide
Case Define R6
Audio Device(s) Xfi PCIe
Power Supply Fractal Design ION Gold 750W
Mouse Razer DeathAdder V2 Mini.
Keyboard Logitech K120
VR HMD Er no, pointless.
Software Windows 10 22H2
Benchmark Scores Timespy - 24522 | Crystalmark - 7100/6900 Seq. & 84/266 QD1 |
By the way, am I the only one who has the feeling that the 5090 is ultra-slow garbage barely doing 30 fps in 4K when DLSS is disabled? It's not me, it's Nvidia's own claim.
I am getting the popcorn ready for the Hardware Unboxed review.
 
Joined
Feb 24, 2013
Messages
193 (0.04/day)
System Name Vaksdal Venom
Processor Ryzen 7 7800X3D
Motherboard Gigabyte B650 Aorus Elite AX V2
Cooling Thermalright Peerless Assassin 120 SE
Memory G.Skill Flare X5 DDR5-6000 32GB (2x16GB) CL30-38-38-96
Video Card(s) MSI GeForce RTX 3080 SUPRIM X
Storage WD Black SN750 500GB, Samsung 840 EVO 500GB, Samsung 850 EVO 500GB, Samsung 860 EVO 1TB
Display(s) Viewsonic XG2431
Case Lian Li O11D Evo XL (aka Dynamic Evo XL) White
Audio Device(s) ASUS Xonar Essence STX
Power Supply Corsair HX1000 PSU - 1000 W
Mouse Logitech G703 Hero
Software Windows 10 Home 64-bit
Maybe a bit off-topic, but can someone point me in the direction of a good deep dive into frame generation with regards to input lag and vsync?

I'm running BFI on my monitor you see (Viewsonic XG2431), and that requires fps>hz. It also requires vsync being on. Currently I try to run all my games at a minimum of 75hz/fps (no ray tracing). So I'm basically wondering how much input lag frame generation will introduce given that scenario.
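
Not a substitute for a proper deep dive, but here is a rough, assumption-heavy estimate for that scenario (75 rendered FPS, vsync on). The hold-back, generation cost, and vsync penalty below are illustrative guesses, not measurements of DLSS FG or of the XG2431:

```python
base_fps = 75                       # rendered frame rate currently targeted
frame_time_ms = 1000 / base_fps     # ~13.3 ms per rendered frame

holdback_ms    = frame_time_ms      # interpolation waits for the next real frame
gen_cost_ms    = 3.0                # assumed per-generated-frame processing cost
vsync_extra_ms = frame_time_ms      # rough worst-case queueing penalty with vsync on

print(f"added input lag ≈ {holdback_ms + gen_cost_ms:.1f} ms on top of today, "
      f"up to ~{holdback_ms + gen_cost_ms + vsync_extra_ms:.1f} ms with vsync back-pressure")
```

In other words, at a 75 FPS base you would likely be adding somewhere between one and two rendered-frame times of lag; Reflex and the higher presented rate feeding BFI can mask some of that, but only testing (the Blur Busters material on the XG2431's BFI tuning is a good starting point) will tell you whether it's acceptable.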
 
Joined
Dec 26, 2013
Messages
191 (0.05/day)
Processor Ryzen 7 5800x3D
Motherboard Gigabyte B550 Gaming X v2
Cooling Thermalright Phantom Spirit 120 SE
Memory Corsair Vengeance LPX 2x32GB 3600Mhz C18
Video Card(s) XFX RX 6800 XT Merc 319
Storage Kingston KC2500 2TB NVMe + Crucial MX100 256GB + Samsung 860QVO 1TB + Samsung Spinpoint F3 500GB HDD
Display(s) Samsung CJG5 27" 144 Hz QHD
Case Phanteks Eclipse P360A DRGB Black + 3x Thermalright TL-C12C-S ARGB
Audio Device(s) Logitech X530 5.1 + Logitech G35 7.1 Headset
Power Supply Cougar GEX850 80+ Gold
Mouse Razer Viper 8K
Keyboard Logitech G105
So, exactly the same thing that is available even in a 3rd-party tool like Lossless Scaling for both old and new AMD, Nvidia, and Intel GPUs, made exclusive to the 5000 series...
Very Nvidia-ish indeed.
 
Joined
Aug 12, 2020
Messages
129 (0.08/day)
System Name Main
Processor i9-10900kf @4.3Ghz
Motherboard MSI Z490 Unify
Cooling 4x Noctua 140mm + 200mm Noctua, 280GTS Xflow x2 + EK : X2 RES 250 Advanced, D5, CPU : Supremacy Evo
Memory 4x8GB G.Skill 4200 CL16
Video Card(s) Inno 3D 3070 @Alphacool watercooled
Storage Crucial P5 1TB + FURY Renegade 2TB + 256GB SSD (OS), HDD (Seagate): 1TB + 2TB cold storage
Display(s) 27" LG 144hz
Case Thermaltake core X5
Audio Device(s) Home cinema 5.1.2 : vsx-930 + 5 klipsch 100w + 2 jamo bipolar + subwoofer jamo 150w
Power Supply Seasonic 650w gold
Mouse G502 wireless
Keyboard Corsair K68
Software W11 x64
Maybe a bit off-topic, but can someone point me in the direction of a good deep dive into frame generation with regards to input lag and vsync?

I'm running BFI on my monitor you see (Viewsonic XG2431), and that requires fps>hz. It also requires vsync being on. Currently I try to run all my games at a minimum of 75hz/fps (no ray tracing). So I'm basically wondering how much input lag frame generation will introduce given that scenario.
ThreatInteractive on YouTube talks about frame generation and UE5 problems; maybe that can help you.
 
Joined
Apr 2, 2008
Messages
466 (0.08/day)
System Name -
Processor Ryzen 9 5900X
Motherboard MSI MEG X570
Cooling Arctic Liquid Freezer II 280 (4x140 push-pull)
Memory 32GB Patriot Steel DDR4 3733 (8GBx4)
Video Card(s) MSI RTX 4080 X-trio.
Storage Sabrent Rocket-Plus-G 2TB, Crucial P1 1TB, WD 1TB sata.
Display(s) LG Ultragear 34G750 nano-IPS 34" utrawide
Case Define R6
Audio Device(s) Xfi PCIe
Power Supply Fractal Design ION Gold 750W
Mouse Razer DeathAdder V2 Mini.
Keyboard Logitech K120
VR HMD Er no, pointless.
Software Windows 10 22H2
Benchmark Scores Timespy - 24522 | Crystalmark - 7100/6900 Seq. & 84/266 QD1 |
Joined
Dec 5, 2013
Messages
653 (0.16/day)
Location
UK
Yep, more blurriness for unoptimised games?? Sounds great.
They need to relearn how to optimise them better, then. The gaslighting here, though, is unreal, where "native" can only possibly mean the worst examples of Unreal's TAA blur-fest (which was never necessary to force on in the first place), whilst DLSS is constantly declared "better than native" like a new, unquestionable religious dogma. Yet go back and compare half the games released this year: even against TAA, DLSS often shows very obvious visual aberrations, from weirdly greyed-out telephone wires that look worse than those in Half-Life 2 to irritating-as-hell jittery HUDs/UIs, etc, and we're all supposed to "pretend" not to see them...

At best, DLSS is a tool in a toolbox that should make a game run better at almost the same visual quality (and sometimes does). At worst, it can (and sometimes does) look worse than the same native resolution it's trying to approximate, plus it has reduced the incentive for developers to do any optimisation at all. Imagine DLSS, FSR, etc, weren't invented; then developers would have no choice but to put in more effort (as they did in the past during multiple periods of hardware stagnation). That should have been the healthy "baseline norm" optimisation we should be enjoying for post-2020 AAA's, with DLSS gains on top (aka "A rising tide lifts all boats"), not a "Now we can churn out sh*tty performance turds with less effort and more excuses than ever before!" bait & switch, which is exactly how half of last year's games ended up. Half the unhealthy gushing hype here, though, every time nVidia announces a new version involves cheering it on as a crutch rather than an enhancement, and that's what people are (rightfully) calling out with "More Fake Frame BS" comments, ie, it's "progress" that has also noticeably regressed the AAA gaming industry in other areas...
 
Joined
Dec 12, 2016
Messages
2,015 (0.68/day)
I guess the summary is that Nvidia has moved on from traditional raster rendering to AI/RT/DLSS rendering. If you buy a 5000 series card, expect to adopt this new render method or fail to get your money's worth.

For the rest of us who just want to game old school on an old school budget, Intel and AMD will do just fine.
 
Joined
Sep 13, 2020
Messages
165 (0.10/day)
Progress denier. I see
Nvidia loves to compare a game struggling at 30FPS to it running at 200FPS with DLSS and sh1t. Sure, that's impressive. But what would actually be impressive is if the game could be run natively at 150FPS.
Also, I couldn't care less about RT. Devs are abusing fake-frames, fake-res, temporalAA and sh1t, to push out poorly optimized games, relying heavily on GPU generative techs instead of proper optimization. None of this is necessary to make a great-looking game.

Now, to play AAA games, you need all of Nvidia’s fancy tech just to end up with worse graphics, more artifacts and blur —when a properly optimized game would look and run better natively.

I’d happily trade all this progress for the neanderthal approach of running games at high specs without AI-driven trickery.
 
Joined
Feb 14, 2012
Messages
1,854 (0.39/day)
Location
Romania
I don't consider faking things progress, especially when it comes with noticeable impacts on quality and latency.
If you like it, and you can't notice the impacts yourself, I won't stop you, but what I'm interested in is raster.
I play Warhammer: Darktide with DLSS on Quality and don't notice any input lag. With frame gen on, there is a bit of input lag, but nothing that ruins my gameplay or aim.
DLSS/FSR/XeSS and "fake frames" are a godsend for people with weaker hardware, but also for game devs that don't try as hard to optimize their games, so a win-win?
 
Joined
Jun 14, 2020
Messages
3,738 (2.24/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
I guess the summary is that Nvidia has moved on from traditional raster rendering to AI/RT/DLSS rendering. If you buy a 5000 series card, expect to adopt this new render method or fail to get your money's worth.

For the rest of us who just want to game old school on an old school budget, Intel and AMD will do just fine.
AMD's emphasis was on RT this time around, so yeah, nope.
 
Joined
Jul 24, 2024
Messages
310 (1.83/day)
System Name AM4_TimeKiller
Processor AMD Ryzen 5 5600X @ all-core 4.7 GHz
Motherboard ASUS ROG Strix B550-E Gaming
Cooling Arctic Freezer II 420 rev.7 (push-pull)
Memory G.Skill TridentZ RGB, 2x16 GB DDR4, B-Die, 3800 MHz @ CL14-15-14-29-43 1T, 53.2 ns
Video Card(s) ASRock Radeon RX 7800 XT Phantom Gaming
Storage Samsung 990 PRO 1 TB, Kingston KC3000 1 TB, Kingston KC3000 2 TB
Case Corsair 7000D Airflow
Audio Device(s) Creative Sound Blaster X-Fi Titanium
Power Supply Seasonic Prime TX-850
Mouse Logitech wireless mouse
Keyboard Logitech wireless keyboard
At best, DLSS is a tool in a toolbox that should make a game run better at almost the same visual quality (and sometimes does). At worst, it can (and sometimes does) look worse than the same native resolution it's trying to approximate, plus it has reduced the incentive for developers to do any optimisation at all. Imagine DLSS, FSR, etc, weren't invented; then developers would have no choice but to put in more effort (as they did in the past during multiple periods of hardware stagnation). That should have been the healthy "baseline norm" optimisation we should be enjoying for post-2020 AAA's, with DLSS gains on top (aka "A rising tide lifts all boats"), not a "Now we can churn out sh*tty performance turds with less effort and more excuses than ever before!" bait & switch, which is exactly how half of last year's games ended up. Half the unhealthy gushing hype here, though, every time nVidia announces a new version involves cheering it on as a crutch rather than an enhancement, and that's what people are (rightfully) calling out with "More Fake Frame BS" comments, ie, it's "progress" that has also noticeably regressed the AAA gaming industry in other areas...
Nvidia loves to compare a game struggling at 30FPS to it running at 200FPS with DLSS and sh1t. Sure, that's impressive. But what would actually be impressive is if the game could be run natively at 150FPS.
Also, I couldn't care less about RT. Devs are abusing fake-frames, fake-res, temporalAA and sh1t, to push out poorly optimized games, relying heavily on GPU generative techs instead of proper optimization. None of this is necessary to make a great-looking game.

Now, to play AAA games, you need all of Nvidia’s fancy tech just to end up with worse graphics, more artifacts and blur —when a properly optimized game would look and run better natively.

I’d happily trade all this progress for the neanderthal approach of running games at high specs without AI-driven trickery.

Exactly my point of view. Finally, some people understand how DLSS and FG help game devs neglect proper optimization.

One would think that with the current capabilities of GPUs we would be using supersampling (rendering at higher than native resolution and shrinking it down to native) instead of rendering at low resolution and upscaling.
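
To put rough numbers on that trade-off, here is a quick pixel-budget comparison (the 66.7% per-axis scale for DLSS Quality is the commonly cited figure and is used here as an assumption):

```python
native_4k    = 3840 * 2160                        # ~8.3 M pixels rendered per frame
dlss_quality = (3840 * 2 // 3) * (2160 * 2 // 3)  # ~3.7 M rendered, then upscaled to 4K
ssaa_2x2     = (3840 * 2) * (2160 * 2)            # ~33 M rendered, then downscaled to 4K

for name, px in (("native 4K", native_4k),
                 ("DLSS Quality internal", dlss_quality),
                 ("2x2 supersampled 4K", ssaa_2x2)):
    print(f"{name:>22}: {px / 1e6:5.1f} M pixels ({px / native_4k:.2f}x native cost)")
```

Since rendering cost scales roughly with pixel count, supersampling is about 4x the native load, which is exactly why the industry went the other way.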

I play Warhammer: Darktide with DLSS on Quality and don't notice any input lag. With frame gen on, there is a bit of input lag, but nothing that ruins my gameplay or aim.
DLSS/FSR/XeSS and "fake frames" are a godsend for people with weaker hardware, but also for game devs that don't try as hard to optimize their games, so a win-win?
Exactly. To achieve a reasonable framerate with lower-tier hardware at QHD or 4K resolutions. It's a win for game devs but definitely not for gamers, given the poor game optimization. Look at Wukong, such a badly optimized game. Even the mighty RTX 4090 struggles badly in Wukong. Check out how the game looks and compare it to how it runs on most modern cards. Have you seen better graphics at much better performance? I have, without any doubt.

What comes next? Devs will get even lazier. Textures and other graphics assets generated by AI, no more designers needed. Same for the game script, soundtrack, etc. Give us $79.99 for this shiny new title, just ramp up the upscaling, and you'll be fine. Is this what you pay for when you purchase a game?
 
Joined
Jan 19, 2023
Messages
267 (0.37/day)
TBH, once FSR4 is out with improved image quality, alongside Anti-Lag 2, it seems like AMD can be at least on par when it comes to upscaling and FG. Achieving the same three generated frames per real one should not be hard for them, considering you can do that now by enabling FSR 3.1 in-game and AFMF in the driver. Yeah, it looks bad, but that's mostly due to FSR image quality, which should be fixed with FSR4, and due to the fact that AFMF doesn't use motion vectors.
It's not like I'm excited, but it's better than last time around, when they were missing FG for a year.
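
For what it's worth, the stacking arithmetic behind "three generated frames per real one" works out if you assume each layer simply doubles the presented rate (a simplification; in practice AFMF's pacing and its lack of motion vectors make it messier, as noted above):

```python
rendered_fps  = 60
after_fsr3_fg = rendered_fps * 2    # in-game FSR 3.1 frame generation: 1 generated per real frame
after_afmf    = after_fsr3_fg * 2   # driver-level AFMF doubles the presented stream again
generated_per_real = after_afmf / rendered_fps - 1
print(f"{after_afmf} FPS presented, {generated_per_real:.0f} generated frames per rendered frame")
```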
 