
NVIDIA's DLSS 2.0 is Easily Integrated into Unreal Engine 4.26 and Gives Big Performance Gains

Joined
Sep 17, 2014
Messages
22,275 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
This situation is not a matter of free choice. Nvidia and the others decided to justify their attention by extracting more money from people, inventing needs that aren't real. Games with predefined effects are more than beautiful enough when their creators are also good artists.

I would even describe the imposition of real-time calculated effects as violence against consumers' personal budgets. Once Nvidia and the others decided that every model is RTX (or DXR, or whatever name you choose), they left people no right to choose. Yes, today we have the option to disable it when playing games... but we pay for it through the increased price of the hardware, without anyone asking whether we want to own it.

Well... the history of computer graphics has a few examples of completely failed technological advancements that initially lots of people were crazy about.
3DFX.
PhysX
VR - it's been launched and re-launched how many times now? It still hasn't stuck as anything more than a niche.
...

And what about API adoption? Some APIs were dragged out almost to infinity (DirectX 9.0c and DX11), while others were barely used and are still only just gaining ground, like DX12 and DX10. Or Vulkan.

There is a whole industry behind this with varying demands and investments, and developers have so much to choose from; it's really not as simple as you might think. Budgets are limited, and every feature costs money and dev time. Time to market is usually what kills projects. The whole premise of RT was that developers would be able to do less work on a scene, having calculations done for them, etc. But realistically, you still have to design scenes, and you're just adding another layer of effects that has to be implemented in the whole process.

Another potential hurdle for RT is the state of the world right now. This was supposed to be Ampere's glory moment, RT's 'getting big' generation. What do we have? GPUs that are priced out of the market or simply not there at all, consoles that launch with new hardware but no games to show it off (and similar availability issues), and a global pandemic keeping us home with lots of time to not enjoy it. The stars for RT are definitely not aligned, and this whole situation will probably set it back a few years, if not more. After all, what are devs going to target now? New-gen hardware? Or the stuff everybody already has? If you want sales, the last thing you want is for half your consumer base to feel like 'have-nots'.
 
Joined
Jul 13, 2016
Messages
3,242 (1.07/day)
Processor Ryzen 7800X3D
Motherboard ASRock X670E Taichi
Cooling Noctua NH-D15 Chromax
Memory 32GB DDR5 6000 CL30
Video Card(s) MSI RTX 4090 Trio
Storage Too much
Display(s) Acer Predator XB3 27" 240 Hz
Case Thermaltake Core X9
Audio Device(s) Topping DX5, DCA Aeon II
Power Supply Seasonic Prime Titanium 850w
Mouse G305
Keyboard Wooting HE60
VR HMD Valve Index
Software Win 10
"The DLSS 2.0 has managed to bring anywhere from 60-180% increase in frame rate, depending on the scene. These are rather impressive numbers and it goes to show just how well NVIDIA has managed to build its DLSS 2.0 technology. For a full overview, please refer to the blog post."

Impressive if you take them at their word, which would be extremely ill advised given this is Nvidia we are talking about. How many times do people have to fall for Nvidia marketing numbers?

Hardware Unboxed found that you get anywhere from a 12% boost to a 60% boost at 4K (so best-case scenario) depending on the game. I don't know where they are getting the 60-180%, but it seems like complete BS. You cannot gain more performance from upscaling than you lost by increasing the resolution in the first place: if dropping from 4K to 1440p doubles my frame rate, then rendering at 1440p and upscaling back to 4K can at best win that difference back. The only scenario where you could gain far more than 60% is if you took an old AF video card and tried to run it at 4K, by which point the card doesn't support DLSS anyway.
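The bound argument above can be put into a quick back-of-the-envelope sketch. All frame rates here are made-up illustrative numbers, not measurements:

```python
# If a game is fully pixel-bound, rendering at 1440p and upscaling to 4K
# can at best approach the native-1440p frame rate, minus whatever the
# upscaling pass itself costs per frame. That caps the possible gain.

def max_upscale_gain(native_fps_hi_res, native_fps_lo_res, upscale_cost_ms=0.0):
    """Upper bound on the relative FPS gain from upscaling, as a fraction."""
    lo_res_frame_ms = 1000.0 / native_fps_lo_res + upscale_cost_ms
    best_fps = 1000.0 / lo_res_frame_ms
    return best_fps / native_fps_hi_res - 1.0

# Hypothetical: 40 fps at native 4K, 80 fps at native 1440p,
# and ~1.5 ms per frame for the upscaling pass itself.
gain = max_upscale_gain(40, 80, upscale_cost_ms=1.5)
print(f"max gain: {gain:.0%}")  # -> max gain: 79%
```

With a free upscaler the ceiling would be exactly the +100% that native 1440p gives back; any real upscaling cost only lowers it.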
 
Joined
Sep 1, 2020
Messages
2,305 (1.52/day)
Location
Bulgaria
Nvidia marketing numbers
The RTX 3090 is not 18 teraflops! It's rated at 36 teraflops, but then where is my 2.25x better frame rate versus the weak RX 6800 with its mere 16 teraflops? There is good marketing, and then there is an uncovered lie that can no longer be covered up... Yet the GPU database still carries the fake teraflops Nvidia declared to the world.
 
Joined
Oct 15, 2019
Messages
584 (0.32/day)
We have been able to render tons of rays for ages; rendering isn't the issue, the speed at which you can do it is.
Yes, exactly. My point in replying to @TumbleGeorge 's post was to illustrate how far we have come in raytracing performance. It used to be that you would need thousands of machines, and even then you wouldn't be able to run anything close to real time, for various reasons related to scale. Now one can run even fully raytraced games at under 500W for a complete system, with vastly superior performance compared to the first economically viable cinema applications. His claim that raytraced real-time 4K graphics is impossible FOREVER:
Hardware is too weak now and will not be enough in the future either. Never!
Is just complete and utter horse shit when we extrapolate the trend in raytracing performance (per watt and per $) forward from the early render farms to current modern GPUs.
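The extrapolation argument boils down to compound growth. A tiny sketch with a purely illustrative improvement factor and time span (not sourced figures):

```python
# Hypothetical: if raytracing throughput per watt improved ~10,000x over
# ~20 years (render farms -> one consumer GPU), what annual rate does
# that imply? The inputs are illustrative assumptions, not measurements.

def implied_annual_rate(improvement_factor, years):
    """Compound annual growth rate implied by a total improvement factor."""
    return improvement_factor ** (1.0 / years) - 1.0

rate = implied_annual_rate(10_000, 20)
print(f"implied annual improvement: {rate:.0%}")  # -> ~58% per year
```

At any sustained double-digit annual rate, "never" is a hard position to defend for a fixed target like real-time 4K.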
 
Joined
Oct 28, 2012
Messages
1,184 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
This situation is not a matter of free choice. Nvidia and the others decided to justify their attention by extracting more money from people, inventing needs that aren't real. Games with predefined effects are more than beautiful enough when their creators are also good artists.

I would even describe the imposition of real-time calculated effects as violence against consumers' personal budgets. Once Nvidia and the others decided that every model is RTX (or DXR, or whatever name you choose), they left people no right to choose. Yes, today we have the option to disable it when playing games... but we pay for it through the increased price of the hardware, without anyone asking whether we want to own it.
Right now I think it's bigger than just "for gamers". Unity and Unreal have clients who are architects, product designers, the automotive industry, even filmmakers. The guys who made that Star Wars ray tracing demo (ILMxLAB) are not really game developers; they do VR movies and big-scale entertainment in Walt Disney parks (like the Millennium Falcon ride, which uses ray tracing). The demo was running on four GPUs from a DGX workstation that didn't even have RT cores.
There's a big convergence going on between those industries and gaming. The need for ray-traced real-time graphics may well have been pushed by actors outside of gaming. It wouldn't be the first time that "Hollywood" tech made it into games: for PBR (physically based rendering), Epic looked at the work Disney did for Wreck-It Ralph and found a way to adapt the principle for real-time graphics.

SGI, the company behind OpenGL and Maya, were the guys who helped the dinosaurs in Jurassic Park become a reality, and down the line all the work they did for movies ended up benefiting video games in some way. They worked with Sega, and the N64 hardware was made by them. As a matter of fact, Nvidia and ATI blurred the line with a few of their patents when they started making their GPUs:
SGI, Nvidia Bury the Hatchet, Agree to Share Patents - Computer Business Review (cbronline.com)
SGI v. AMD: Chief Judge Rader on Claim Construction | Patently-O

Even Apple, the special snowflake that's not that big into AAA gaming, ignored the 3D-screen fad, and didn't jump into VR as fast as Google and everyone else, is already interested in RT. That's interesting because Apple doesn't really do bleeding edge until something is actually usable in some way (Microsoft was first with Windows on ARM, but it wasn't ready at all; Apple was late, but did it the right way).
Ray Tracing with Metal - WWDC 2019 - Videos - Apple Developer
 
Joined
Sep 1, 2020
Messages
2,305 (1.52/day)
Location
Bulgaria
Right now I think it's bigger than just "for gamers". Unity and Unreal have clients who are architects, product designers, the automotive industry, even filmmakers. The guys who made that Star Wars ray tracing demo (ILMxLAB) are not really game developers; they do VR movies and big-scale entertainment in Walt Disney parks (like the Millennium Falcon ride, which uses ray tracing). The demo was running on four GPUs from a DGX workstation that didn't even have RT cores.
There's a big convergence going on between those industries and gaming. The need for ray-traced real-time graphics may well have been pushed by actors outside of gaming. It wouldn't be the first time that "Hollywood" tech made it into games: for PBR (physically based rendering), Epic looked at the work Disney did for Wreck-It Ralph and found a way to adapt the principle for real-time graphics.

SGI, the company behind OpenGL and Maya, were the guys who helped the dinosaurs in Jurassic Park become a reality, and down the line all the work they did for movies ended up benefiting video games in some way. They worked with Sega, and the N64 hardware was made by them. As a matter of fact, Nvidia and ATI blurred the line with a few of their patents when they started making their GPUs:
SGI, Nvidia Bury the Hatchet, Agree to Share Patents - Computer Business Review (cbronline.com)
SGI v. AMD: Chief Judge Rader on Claim Construction | Patently-O

Even Apple, the special snowflake that's not that big into AAA gaming, ignored the 3D-screen fad, and didn't jump into VR as fast as Google and everyone else, is already interested in RT. That's interesting because Apple doesn't really do bleeding edge until something is actually usable in some way (Microsoft was first with Windows on ARM, but it wasn't ready at all; Apple was late, but did it the right way).
Ray Tracing with Metal - WWDC 2019 - Videos - Apple Developer
Yes, but professionals just use the latest Nvidia Quadro or AMD Radeon Pro, depending on the software, to get better and faster results. I'm absolutely sure they have enough money for that and don't use gaming-oriented cards for their work like poor semi-pro or non-pro users do. A professional's time is more expensive than hardware. Hardware is a consumable: you use it while it's profitable, then you pay to throw it away as waste... and it usually gets sold second-hand in poor countries like mine.
 
Joined
Oct 28, 2012
Messages
1,184 (0.27/day)
Processor AMD Ryzen 3700x
Motherboard asus ROG Strix B-350I Gaming
Cooling Deepcool LS520 SE
Memory crucial ballistix 32Gb DDR4
Video Card(s) RTX 3070 FE
Storage WD sn550 1To/WD ssd sata 1To /WD black sn750 1To/Seagate 2To/WD book 4 To back-up
Display(s) LG GL850
Case Dan A4 H2O
Audio Device(s) sennheiser HD58X
Power Supply Corsair SF600
Mouse MX master 3
Keyboard Master Key Mx
Software win 11 pro
Yes, but professionals just use the latest Nvidia Quadro or AMD Radeon Pro, depending on the software, to get better and faster results. I'm absolutely sure they have enough money for that and don't use gaming-oriented cards for their work like poor semi-pro or non-pro users do. A professional's time is more expensive than hardware. Hardware is a consumable: you use it while it's profitable, then you pay to throw it away as waste... and it usually gets sold second-hand in poor countries like mine.
Well, sadly capitalism and technical breakthroughs are a pair. They saw an opportunity to bring that tech to market and jumped on it... RT and machine learning are a bit odd: really bleeding edge with only small benefits for gamers, but for freelancers? The value is insane. You could say there's a competition between gamers and the 3D illustrators, motion designers, and artists/engineers doing AI at home. I follow a lot of artists who have rigs that would make people on tech forums jealous, but they don't game on them.
Remember how smartphones were meant for professionals at first, how they were too expensive to ever become mainstream, and now every teenager has one? Tech evolves and what we do with it does as well. Eventually the entry point will become low enough, but with everything from phones to cars competing for chip manufacturing, I don't know what will happen. Nvidia switched to a two-year release cycle, and with Intel joining the TSMC fun, things might become more and more complex.
 
Joined
Sep 17, 2014
Messages
22,275 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Yes, exactly. My point in replying to @TumbleGeorge 's post was to illustrate how far we have come in raytracing performance. It used to be that you would need thousands of machines, and even then you wouldn't be able to run anything close to real time, for various reasons related to scale. Now one can run even fully raytraced games at under 500W for a complete system, with vastly superior performance compared to the first economically viable cinema applications. His claim that raytraced real-time 4K graphics is impossible FOREVER:

Is just complete and utter horse shit when we extrapolate the trend in raytracing performance (per watt and per $) forward from the early render farms to current modern GPUs.

Never is a big thing, but if you look at what we're pushing forward in gaming with RT now, it's not thát impressive at all. The game may look nice, but that is only in small part due to RT. It's a different environment.
 
Joined
Oct 15, 2019
Messages
584 (0.32/day)
Never is a big thing, but if you look at what we're pushing forward in gaming with RT now, it's not thát impressive at all. The game may look nice, but that is only in small part due to RT. It's a different environment.
Yeah, fully path-traced games are virtually non-existent right now, but once the adoption rate of 3070+-level GPUs passes 50% in a couple of years, someone can bring such a game to market and people will buy it. I would guess the next console generation is the one to really enable the change, in 5-7 years.
 
Joined
Jan 21, 2020
Messages
109 (0.06/day)
Well.... history for computer graphics has a few examples of completely failed technological advancements that initially lots of people were all crazy about.
PhysX
well, you can repeat that old lie a hundred times and it will not be any more true than it was yesterday. funny thing is, physx is integrated in engines like unreal engine, unity and others. it's been like that for years now. dozens of games use it. physx is actually a successful technology simply based on adoption, even though games no longer bear the physx sticker. you've been playing games with the nasty physx that you hate so much, without even knowing it, for a long time :D i wonder how that must feel, ouch! and even though it's being phased out in favor of chaos once that is ready, it is still there at the moment:
Chaos Physics is a lightweight physics solver, and when it is production ready, it will replace PhysX. To learn more about Chaos, read the following documents.

but honestly, everyone here knows only too well how full of hot air you are. i mean .. the big boss himself slapped you for telling lies multiple times just under this single article, with style and gusto i might add
It's really simple, magic does not happen; it gets precooked on Nvidia's DGX farms, and if they didn't do that for you, you're simply out of luck. In that case the best you get is a small update to what AMD FidelityFX also offers - NON proprietary. A simple blurry upscale with sharpening overlays.

Let's stop beating around the bush. If you lack Nvidia's extended support for DLSS it's nothing special at all and never will be, no matter how much misdirecting marketing they pour over it.
That's not how DLSS works since v2.0. It's game-agnostic now and doesn't require per-title training

But then the gains are hardly as great, right?
Actually DLSS 2.0 works much better. Surprised you missed that.

What makes a huge difference is that the game now feeds motion vectors to the algorithm, which solves the TAA movement problem

comedy gold
 
Joined
Sep 17, 2014
Messages
22,275 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
well, you can repeat that old lie a hundred times and it will not be any more true than it was yesterday. funny thing is, physx is integrated in engines like unreal engine, unity and others. it's been like that for years now. dozens of games use it. physx is actually a successful technology simply based on adoption, even though games no longer bear the physx sticker. you've been playing games with the nasty physx that you hate so much, without even knowing it, for a long time :D i wonder how that must feel, ouch! and even though it's being phased out in favor of chaos once that is ready, it is still there at the moment:


but honestly, everyone here knows only too well how full of hot air you are. i mean .. the big boss himself slapped you for telling lies multiple times just under this single article, with style and gusto i might add






comedy gold
Comedy; so can you tell me now what DLSS 2.0's implementation will do in terms of performance versus different versions?

Cool story! But keep quoting, maybe you'll get those results that way. I'm not a fan of getting marketing force-fed to me until I eventually believe nonsense. Apparently that is what some here prefer, before seeing the actual numbers. To each their own ;) Maybe it says more about the overall level of conversation in here than anything? We're looking at an Nvidia announcement claiming big performance gains, but with no numbers to back it up, limited to one engine, and combined with an Nvidia card. So yes, comparisons matter.

The PhysX implementation that is non-proprietary is the CPU PhysX library, not the GPU one that accelerated pretty neat effects in a handful of games. In that sense we have replacements now, but overall the push for good in-game physics engines is a very low priority altogether. Funny how PhysX is getting phased out regardless, don't you think? I also love your assumptions, because you apparently were really eager to make a hate post towards my person before actually thinking about it. I would love more and broader adoption of GPU PhysX, because the fact remains that what's left of it on the CPU is extremely limited.

Oh, btw, your link actually supports the argument that PhysX is dead; the new library isn't based on it either:
"Chaos Physics is a Beta feature that is the light-weight physics simulation solution used in Fortnite"

'Ouch'... yeah, you really burned shit bro, damn! Hope you feel good about it.
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
DLSS 2.0 has proven that it works
2.0 is essentially NV surrendering on the NN idea (which was 1.0)
2.0 is essentially a TAA derivative, with all its weaknesses (blur, loss of detail, terrible handling of small, quickly moving objects) and strengths (improved lines, details that improve over time).

Source:

Still to remember:
DLSS 2.0's processing eats a good half of the performance headroom gained from running at a lower resolution.

AMD has a range of CROSS-PLATFORM, CROSS-VENDOR tech with excellent results (FidelityFX CAS and checkerboard rendering in particular) that simply isn't hyped into oblivion by FUD lovers.
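The "TAA derivative" mechanism this post describes can be sketched in a few lines: accumulate each new frame into a history buffer after reprojecting the history along per-pixel motion vectors. This is an illustrative toy only; real TAA/DLSS-style pipelines add sub-pixel jitter, neighborhood clamping, and (in DLSS 2.0's case) a learned blend instead of a fixed alpha.

```python
# Toy temporal accumulation with motion-vector reprojection, the core
# idea shared by TAA-family techniques. Frames are 2D lists of floats.

def temporal_accumulate(history, current, motion, alpha=0.1):
    """Blend the current frame with the motion-reprojected history.

    history, current: 2D lists (rows of floats), same size
    motion: 2D list of (dy, dx) per-pixel motion in whole pixels
    alpha: weight of the new frame; a small alpha accumulates more
           history (sharper static detail, more ghosting on fast motion,
           which matches the blur/ghosting weaknesses noted above)
    """
    h, w = len(current), len(current[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            dy, dx = motion[y][x]
            # Fetch where this pixel was in the previous frame,
            # clamped to the frame borders.
            py = min(max(y - dy, 0), h - 1)
            px = min(max(x - dx, 0), w - 1)
            row.append(alpha * current[y][x] + (1 - alpha) * history[py][px])
        out.append(row)
    return out
```

With correct motion vectors, static and smoothly moving detail sharpens over frames; with missing or wrong vectors (small fast objects), stale history bleeds through, which is exactly the failure mode being argued about.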
 
Joined
Nov 6, 2019
Messages
38 (0.02/day)
2.0 is essentially NV surrendering on the NN idea (which was 1.0)
2.0 is essentially a TAA derivative, with all its weaknesses (blur, loss of detail, terrible handling of small, quickly moving objects) and strengths (improved lines, details that improve over time).

Source:

Still to remember:
DLSS 2.0's processing eats a good half of the performance headroom gained from running at a lower resolution.

AMD has a range of CROSS-PLATFORM, CROSS-VENDOR tech with excellent results (FidelityFX CAS and checkerboard rendering in particular) that simply isn't hyped into oblivion by FUD lovers.


Calling DLSS 2.0 blurry, terrible, etc., and then suggesting that primitive upscaling such as FidelityFX CAS with a sharpening filter provides "excellent results". I don't know if serious or... You can test it in Cyberpunk: AMD's upscaling looks like a blurry mess no matter what settings you try, and the performance uplift is laughable at best. Not to mention, no raytracing.

Still remember this:

DLSS 2.0 > anything AMD can offer, both image-quality and performance wise.

Death Stranding with DLSS 2.0 on looks better than native. But you tried.
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Death Stranding with DLSS 2.0 on looks better than native
Yeah, and let me pick a screenshot with a huge blurry face that has barely any texture on it, but has eyebrows so TAA can shine at improving lines, while nobody catches its weaknesses because there is nothing else on the screen. That sounds convincing, thanks. :D

Oh, a user with barely any posts is triggered about a lack of butt-kissing for terrifyingly overhyped NV tech, color me surprised... ;)
 
Joined
Nov 6, 2019
Messages
38 (0.02/day)
Yeah, and let me pick a screenshot with a huge blurry face that has barely any texture on it, but has eyebrows so TAA can shine at improving lines, while nobody catches its weaknesses because there is nothing else on the screen. That sounds convincing, thanks. :D

Oh, a user with barely any posts is triggered about a lack of butt-kissing for terrifyingly overhyped NV tech, color me surprised... ;)


There is only one triggered guy here: the one who has gone so far as to check other accounts and cite post counts as a way to validate his delusions. Imagine stooping so low. Thanks for the laughs.
 
Joined
Sep 15, 2011
Messages
6,680 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
And stuck at 24 FPS ever since :)
You are not "stuck". And the movie industry isn't "stuck". Look at all the best TV shows nowadays: they are all shot at 24fps in 4K (GoT, The Expanse, American Gods, The Mandalorian, etc.).
Did you have a chance to watch "The Hobbit" at 48fps?? A DISASTER. The whole movie effect was gone; it was like watching a live session on TV.
23.976fps is the perfect format for movies; I wouldn't want it any other way. For live TV, like F1, football, concerts, etc., yeah, give me 60 or even 120fps.
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
one guy triggered
A single screenshot that is not a blurry bob in its original state would suffice, triggered guy with 14 posts. Oh, sorry, it's 15... :D
 
Joined
Aug 2, 2011
Messages
1,458 (0.30/day)
Processor Ryzen 9 7950X3D
Motherboard MSI X670E MPG Carbon Wifi
Cooling Custom loop, 2x360mm radiator,Lian Li UNI, EK XRes140,EK Velocity2
Memory 2x16GB G.Skill DDR5-6400 @ 6400MHz C32
Video Card(s) EVGA RTX 3080 Ti FTW3 Ultra OC Scanner core +750 mem
Storage MP600 Pro 2TB,960 EVO 1TB,XPG SX8200 Pro 1TB,Micron 1100 2TB,1.5TB Caviar Green
Display(s) Alienware AW3423DWF, Acer XB270HU
Case LianLi O11 Dynamic White
Audio Device(s) Logitech G-Pro X Wireless
Power Supply EVGA P3 1200W
Mouse Logitech G502X Lightspeed
Keyboard Logitech G512 Carbon w/ GX Brown
VR HMD HP Reverb G2 (V2)
Software Win 11
A single screenshot that is not a blurry bob in its original state would suffice, triggered guy with 14 posts. Oh, sorry, it's 15... :D

Imagine trying to use post count on an internet forum as a form of put down.
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Imagine trying to use post count on an internet forum as a form of put down.
Oh, that was purely in the context of the "getting triggered" discussion.
A person who barely posts could not leave a post largely unimpressed with DLSS without comment.
I think the number of posts is somewhat relevant here, is it not? :D
 
Joined
Sep 17, 2014
Messages
22,275 (6.02/day)
Location
The Washing Machine
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling Thermalright Peerless Assassin
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
You are not "stuck". And the movie industry isn't "stuck". Look at all the best TV shows nowadays: they are all shot at 24fps in 4K (GoT, The Expanse, American Gods, The Mandalorian, etc.).
Did you have a chance to watch "The Hobbit" at 48fps?? A DISASTER. The whole movie effect was gone; it was like watching a live session on TV.
23.976fps is the perfect format for movies; I wouldn't want it any other way. For live TV, like F1, football, concerts, etc., yeah, give me 60 or even 120fps.

You are correct; the so-called "soap opera effect" is what they call that.

It's a matter of what you're used to. Not all film is 24Hz either, and the Hobbit was 48Hz, wasn't it?

My experience, especially in cinema, is that the low frame rate causes lots of problems in fast motion; things feel like a slideshow sometimes. Does it feel better in some way? I really can't say that it does. The Hobbit was a strange exception to a rule, so it stood out, and I agree that as an isolated example it wasn't pretty. But I also think higher frame rates were never explored for film simply because they mean a major cost increase (many more frames to edit) and there is no apparent 'demand' for them. Which can also be explained as: we're used to it, and we don't know whether it could be better at higher FPS. Still, gaming tells us that higher frame rates are a direct win for motion clarity and immersion.

As much as preference is what it is, keep in mind our minds and eyes are very capable of getting comfortable with pretty shitty things; just look back at film itself... very old footage is almost painful to watch now. We have adjusted.
 

MxPhenom 216

ASIC Engineer
Joined
Aug 31, 2010
Messages
13,006 (2.51/day)
Location
Loveland, CO
System Name Ryzen Reflection
Processor AMD Ryzen 9 5900x
Motherboard Gigabyte X570S Aorus Master
Cooling 2x EK PE360 | TechN AM4 AMD Block Black | EK Quantum Vector Trinity GPU Nickel + Plexi
Memory Teamgroup T-Force Xtreem 2x16GB B-Die 3600 @ 14-14-14-28-42-288-2T 1.45v
Video Card(s) Zotac AMP HoloBlack RTX 3080Ti 12G | 950mV 1950Mhz
Storage WD SN850 500GB (OS) | Samsung 980 Pro 1TB (Games_1) | Samsung 970 Evo 1TB (Games_2)
Display(s) Asus XG27AQM 240Hz G-Sync Fast-IPS | Gigabyte M27Q-P 165Hz 1440P IPS | Asus 24" IPS (portrait mode)
Case Lian Li PC-011D XL | Custom cables by Cablemodz
Audio Device(s) FiiO K7 | Sennheiser HD650 + Beyerdynamic FOX Mic
Power Supply Seasonic Prime Ultra Platinum 850
Mouse Razer Viper v2 Pro
Keyboard Corsair K65 Plus 75% Wireless - USB Mode
Software Windows 11 Pro 64-Bit
The RTX 3090 is not 18 teraflops! It's rated at 36 teraflops, but then where is my 2.25x better frame rate versus the weak RX 6800 with its mere 16 teraflops? There is good marketing, and then there is an uncovered lie that can no longer be covered up... Yet the GPU database still carries the fake teraflops Nvidia declared to the world.

Probably because teraflops only measures a single part of the GPU's function, not the entire data path in and out of the GPU. It's mostly used for marketing, especially on consoles, to give those morons something to fight over.

Teraflops: a unit of computing speed equal to one trillion (10^12) floating-point operations per second.

That number does not tell the whole story of a GPU's performance.
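That definition also makes the marketing number trivial to reproduce: theoretical FP32 TFLOPS is just shader ALUs × clock × 2, since a fused multiply-add counts as two operations. A quick sketch using the approximate published boost-clock specs (clocks rounded; real frame rates depend on the whole pipeline, not this one number):

```python
# Theoretical FP32 throughput: shader ALUs x clock (GHz) x 2 ops/cycle
# (one fused multiply-add = two floating-point operations).
# Clocks below are approximate published boost specs.

def fp32_tflops(shader_units, clock_ghz, ops_per_cycle=2):
    return shader_units * clock_ghz * ops_per_cycle / 1000.0

rtx_3090 = fp32_tflops(10496, 1.70)  # ~35.7 TFLOPS
rx_6800 = fp32_tflops(3840, 2.10)    # ~16.1 TFLOPS
print(f"RTX 3090: {rtx_3090:.1f} TFLOPS, RX 6800: {rx_6800:.1f} TFLOPS")
print(f"paper ratio: {rtx_3090 / rx_6800:.2f}x")
```

The paper ratio never shows up as a frame-rate ratio precisely because the rest of the data path (memory bandwidth, caches, scheduling, how often both FP32 paths are actually busy) doesn't scale with it.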
 

r9

Joined
Jul 28, 2008
Messages
3,300 (0.56/day)
System Name Primary|Secondary|Poweredge r410|Dell XPS|SteamDeck
Processor i7 11700k|i7 9700k|2 x E5620 |i5 5500U|Zen 2 4c/8t
Memory 32GB DDR4|16GB DDR4|16GB DDR4|32GB ECC DDR3|8GB DDR4|16GB LPDDR5
Video Card(s) RX 7800xt|RX 6700xt |On-Board|On-Board|8 RDNA 2 CUs
Storage 2TB m.2|512GB SSD+1TB SSD|2x256GBSSD 2x2TBGB|256GB sata|512GB nvme
Display(s) 50" 4k TV | Dell 27" |22" |3.3"|7"
VR HMD Samsung Odyssey+ | Oculus Quest 2
Software Windows 11 Pro|Windows 10 Pro|Windows 10 Home| Server 2012 r2|Windows 10 Pro
You get a huge FPS boost with graphics quality so close that you have to zoom in and analyze pixel by pixel to tell the difference, and in some scenarios DLSS 2.0 even looks better.
So why is everybody stuck on how it's done?! As long as you don't have to rub two dicks together, who cares.
 
Joined
Jul 9, 2015
Messages
3,413 (1.00/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
You get a huge FPS boost with graphics quality so close that you have to zoom in and analyze pixel by pixel to tell the difference
4K is 2.2+ times the pixels of 1440p, naturally.
Even 1440p is plenty of pixels for most screens.

Still, the bush in the frequently shared Death Stranding "improvement" screenshot from sites overhyping the tech has very visibly blurred/lost detail, while at the same time the long grass on the right is improved, as one would expect from a TAA derivative. Shrug.
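The "2.2+ times" figure is simple pixel arithmetic:

```python
# Pixel counts behind the resolution comparison above.
uhd = 3840 * 2160   # "4K" UHD
qhd = 2560 * 1440   # 1440p
print(uhd / qhd)    # -> 2.25
```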
 

Aquinus

Resident Wat-man
Joined
Jan 28, 2012
Messages
13,161 (2.82/day)
Location
Concord, NH, USA
System Name Apollo
Processor Intel Core i9 9880H
Motherboard Some proprietary Apple thing.
Memory 64GB DDR4-2667
Video Card(s) AMD Radeon Pro 5600M, 8GB HBM2
Storage 1TB Apple NVMe, 4TB External
Display(s) Laptop @ 3072x1920 + 2x LG 5k Ultrafine TB3 displays
Case MacBook Pro (16", 2019)
Audio Device(s) AirPods Pro, Sennheiser HD 380s w/ FIIO Alpen 2, or Logitech 2.1 Speakers
Power Supply 96w Power Adapter
Mouse Logitech MX Master 3
Keyboard Logitech G915, GL Clicky
Software MacOS 12.1
Death Stranding with DLSS 2.0 on looks better than the native. But you tried.
Maybe we should call it Upscaling 3.0. When push comes to shove, that's all it really is. A pig with lipstick is still a pig. :laugh:

To be clear, this isn't a bad strategy when your hardware can't run native. It's just a little amusing to say it's as good as running native.
 