Wednesday, February 3rd 2021

NVIDIA's DLSS 2.0 is Easily Integrated into Unreal Engine 4.26 and Gives Big Performance Gains

When NVIDIA launched the second iteration of its Deep Learning Super Sampling (DLSS) technique, which upscales lower resolutions using deep learning, everyone was impressed by the quality of the rendering it put out. However, have you ever wondered how it all looks from the developer side of things? Games usually need millions of lines of code, and even small features are not always easy to implement. Today, thanks to Tom Looman, a game developer working with Unreal Engine, we get to see what the DLSS 2.0 integration process looks like, and how big the performance benefits coming from it are.

In the blog post, you can take a look at the example game shown by the developer. Integration with Unreal Engine 4.26 is easy: you compile your project against a special UE4 RTX branch and supply an AppID, which you can apply for on NVIDIA's website. Right now you are probably wondering how the performance looks. Well, the baseline for the results was the TXAA sampling technique used in the game demo. DLSS 2.0 managed to bring anywhere from a 60% to a 180% increase in frame rate, depending on the scene. These are rather impressive numbers, and they go to show just how well NVIDIA has built its DLSS 2.0 technology. For a full overview, please refer to the blog post.
Source: Tom Looman Blog

74 Comments on NVIDIA's DLSS 2.0 is Easily Integrated into Unreal Engine 4.26 and Gives Big Performance Gains

#51
evernessince
"The DLSS 2.0 has managed to bring anywhere from 60-180% increase in frame rate, depending on the scene. These are rather impressive numbers and it goes to show just how well NVIDIA has managed to build its DLSS 2.0 technology. For a full overview, please refer to the blog post."

Impressive if you take them at their word, which would be extremely ill-advised given this is Nvidia we are talking about. How many times do people have to fall for Nvidia marketing numbers?

Hardware Unboxed found that you get anywhere from a 12% boost to a 60% boost at 4K (so best-case scenario) depending on the game. I don't know where they are getting the 60-180%, but it seems like complete BS. You cannot gain more performance from upscaling than you would have lost by increasing the resolution instead. If I take a 50% performance hit by bumping my resolution to 4K from 2K, the most I can gain is that 50% back through upscaling 2K to 4K. The only scenario where you could gain more than 60% is if you took an old AF video card and tried to run it at 4K, by which point the card doesn't support DLSS anyway.
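For what it's worth, here is a quick toy model of that ceiling argument (all numbers below are made up for illustration, not taken from the article, the blog post, or Hardware Unboxed): assume the pixel-bound part of the frame scales with pixel count, keep a fixed resolution-independent cost per frame, and charge something for the upscaling pass itself.

```python
# Toy model of the upscaling ceiling argument above. All numbers are hypothetical.

def fps(pixel_work_ms: float, fixed_ms: float) -> float:
    """Frames per second for a given per-frame GPU cost in milliseconds."""
    return 1000.0 / (pixel_work_ms + fixed_ms)

NATIVE_4K_MS = 20.0   # hypothetical pixel-bound work at 3840x2160
FIXED_MS     = 4.0    # hypothetical resolution-independent work per frame
UPSCALE_MS   = 1.5    # hypothetical cost of the upscaling pass itself

native = fps(NATIVE_4K_MS, FIXED_MS)
for name, scale in [("Quality (~67%)", 0.67), ("Balanced (~58%)", 0.58), ("Performance (50%)", 0.50)]:
    internal_ms = NATIVE_4K_MS * scale * scale     # pixel count scales with the square of the axis scale
    ceiling = fps(internal_ms, FIXED_MS)           # best case: a free, perfect upscale
    with_cost = fps(internal_ms + UPSCALE_MS, FIXED_MS)
    print(f"{name}: native {native:.0f} fps, ceiling {ceiling:.0f} fps "
          f"(+{100 * (ceiling / native - 1):.0f}%), with upscale cost {with_cost:.0f} fps "
          f"(+{100 * (with_cost / native - 1):.0f}%)")
```

The ceiling is exactly the frame rate you would get by simply rendering at the internal resolution, which is the bound described here; how much of it any upscaler actually recovers depends on how pixel-bound the scene is and on what the upscaling pass costs.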
Posted on Reply
#52
TumbleGeorge
evernessinceNvidia marketing numbers
The RTX 3090 is not 18 teraflops! It has 36 teraflops, so where is my 2.25x better frame rate over the weak RX 6800 with its mere 16 teraflops... Good marketing exists, and so does an uncovered lie, because it is not possible to cover this one up... Yet the GPU database keeps listing the fake teraflops that Nvidia declares to the world.
Posted on Reply
#53
Dredi
Vayra86We have been able to render tons of rays for ages already; the rendering isn't the issue, the speed at which you can is the issue.
Yes, exactly. My point in replying to @TumbleGeorge's post was to illustrate how far we have come in raytracing performance. It used to be that you would need thousands of machines, and even then you wouldn't be able to run anything even close to real time, for various reasons related to scale. Now one can run even fully raytraced games on a complete system drawing less than 500 W, with vastly superior performance compared to the first economically viable cinema applications. His point about raytraced real-time 4K graphics being impossible FOREVER:
TumbleGeorgeHardware is too weak now and will not be enough in the future. Never!
Is just complete and utter horse shit when we extrapolate forward the trend in raytracing performance (per watt and per $) from the early render farms to the current modern GPUs.
Posted on Reply
#54
dyonoctis
TumbleGeorgeThis situation is not your free will. Nvidia and others decided to justify their intentions to get more money from people by presenting unreal needs. Games with predefined effects are more than beautiful enough when their creators are also good artists.

I would even describe the imposition of real-time calculated effects as violence against consumers' personal budgets. Because once Nvidia and others decided that all models are RTX (or DXR, or whatever different name they choose), they no longer leave people the right to choose. Yes, today we have the option to disable it when playing games... but we pay for it with the increased price of the hardware, without anyone asking whether we want to own it.
Right now I think that it's bigger than just "for gamers". Unity and Unreal have clients that are architects, product designers, the automotive industry, even filmmakers. The guys who made that Star Wars ray tracing demo (ILMxLab) are not really game developers; they are doing VR movies and big-scale entertainment in Walt Disney parks (like the Millennium Falcon ride, which uses ray tracing). The demo was running on four GPUs from a DGX workstation that didn't even have RT cores.
There's a big convergence going on between those industries and gaming. The need for ray-traced real-time graphics might have been pushed by actors outside of gaming. That wouldn't be the first time that "Hollywood" tech made it into games: for PBR (physically based rendering), Epic looked at the work Disney did for Wreck-It Ralph and found a way to adapt the principle for real-time graphics.

SGI, the guys who made OpenGL and Maya, were the guys who helped the dinosaurs in Jurassic Park become a reality, and down the line all the work they did for movies ended up benefiting video games in some way. They worked with Sega, and the N64 hardware was made by them. As a matter of fact, Nvidia and ATI blurred the line with a few of their patents when they started making their GPUs:
SGI, Nvidia Bury the Hatchet, Agree to Share Patents - Computer Business Review (cbronline.com)
SGI v. AMD: Chief Judge Rader on Claim Construction | Patently-O

Even Apple, the special snowflake that's not that big into AAA gaming, ignored the 3D screen fad, and didn't jump into VR as fast as Google and everyone else, is already interested in RT. It's interesting because Apple doesn't really do bleeding edge until something is actually usable in some way. (Microsoft was first with Windows on ARM, but it wasn't ready at all; Apple was late, but did it the right way.)
Ray Tracing with Metal - WWDC 2019 - Videos - Apple Developer
Posted on Reply
#55
TumbleGeorge
dyonoctisRight now I think that it's bigger than just "for gamers". Unity and Unreal have clients that are architects, product designers, the automotive industry, even filmmakers. The guys who made that Star Wars ray tracing demo (ILMxLab) are not really game developers; they are doing VR movies and big-scale entertainment in Walt Disney parks (like the Millennium Falcon ride, which uses ray tracing). The demo was running on four GPUs from a DGX workstation that didn't even have RT cores.
There's a big convergence going on between those industries and gaming. The need for ray-traced real-time graphics might have been pushed by actors outside of gaming. That wouldn't be the first time that "Hollywood" tech made it into games: for PBR (physically based rendering), Epic looked at the work Disney did for Wreck-It Ralph and found a way to adapt the principle for real-time graphics.

SGI, the guys who made OpenGL and Maya, were the guys who helped the dinosaurs in Jurassic Park become a reality, and down the line all the work they did for movies ended up benefiting video games in some way. They worked with Sega, and the N64 hardware was made by them. As a matter of fact, Nvidia and ATI blurred the line with a few of their patents when they started making their GPUs:
SGI, Nvidia Bury the Hatchet, Agree to Share Patents - Computer Business Review (cbronline.com)
SGI v. AMD: Chief Judge Rader on Claim Construction | Patently-O

Even Apple, the special snowflake that's not that big into AAA gaming, ignored the 3D screen fad, and didn't jump into VR as fast as Google and everyone else, is already interested in RT. It's interesting because Apple doesn't really do bleeding edge until something is actually usable in some way. (Microsoft was first with Windows on ARM, but it wasn't ready at all; Apple was late, but did it the right way.)
Ray Tracing with Metal - WWDC 2019 - Videos - Apple Developer
Yes, but professionals just use the latest Nvidia Quadro or AMD Radeon Pro, depending on the software used, to get better and faster results. I'm absolutely sure that they have enough money for that and don't use gaming-oriented cards for their work like poor semi-pro or non-pro users do. Professionals' time is more expensive than hardware. Hardware is a consumable: you use it while it's profitable, and after that you pay to throw it away as waste... and it's usually sold second-hand in poor countries like mine.
Posted on Reply
#56
dyonoctis
TumbleGeorgeYes, but professionals just use the latest Nvidia Quadro or AMD Radeon Pro, depending on the software used, to get better and faster results. I'm absolutely sure that they have enough money for that and don't use gaming-oriented cards for their work like poor semi-pro or non-pro users do. Professionals' time is more expensive than hardware. Hardware is a consumable: you use it while it's profitable, and after that you pay to throw it away as waste... and it's usually sold second-hand in poor countries like mine.
Well, sadly capitalism and technical breakthroughs are a pair. They saw an opportunity to bring that tech to market and jumped on it... RT and machine learning are a bit odd; they're really bleeding edge with only small benefits for gamers, but for freelancers? The value is insane. You could say that there's a competition between gamers and 3D illustrators, motion designers, and artists/engineers doing AI at home. I'm following a lot of artists who have rigs that would make people on tech forums jealous, but they don't game on them.
Remember how smartphones were meant for professionals at first, how they were too expensive to ever become mainstream, and now every teenager has one? Tech evolves, and what we do with it does as well. Eventually the entry point will become low enough, but with everything from phones to cars competing for chip manufacturing, I don't know what will happen. Nvidia switched to a two-year release cycle, and with Intel joining the TSMC fun, things might become more and more complex.

Posted on Reply
#57
Vayra86
DrediYes, exactly. My point in replying to @TumbleGeorge's post was to illustrate how far we have come in raytracing performance. It used to be that you would need thousands of machines, and even then you wouldn't be able to run anything even close to real time, for various reasons related to scale. Now one can run even fully raytraced games on a complete system drawing less than 500 W, with vastly superior performance compared to the first economically viable cinema applications. His point about raytraced real-time 4K graphics being impossible FOREVER:

Is just complete and utter horse shit when we extrapolate forward the trend in raytracing performance (per watt and per $) from the early render farms to the current modern GPUs.
Never is a big claim, but if you look at what we're pushing forward in gaming with RT now, it's not thát impressive at all. The game may look nice, but that is only in small part due to RT. It's a different environment.
Posted on Reply
#58
Dredi
Vayra86Never is a big claim, but if you look at what we're pushing forward in gaming with RT now, it's not thát impressive at all. The game may look nice, but that is only in small part due to RT. It's a different environment.
Yeah, fully path-traced games are virtually non-existent right now, but when the adoption rate of 3070+ level GPUs is over 50% in a couple of years, someone can bring something to the market and people will buy it. I would guess that the next console generation is the one to really enable the change, in 5-7 years.
Posted on Reply
#59
Jinxed
Vayra86Well.... the history of computer graphics has a few examples of completely failed technological advancements that initially lots of people were all crazy about.
PhysX
well, you can repeat that old lie a hundred times and it will not be any more true than it was yesterday. funny thing is, physx is integrated in engines like unreal engine, unity and others. it's been like that for years now. dozens of games use it. physx is actually a successful technology based simply on adoption, even though games no longer bear the physx sticker. you've been playing games with the nasty physx that you hate so much, without even knowing it, for such a long time :D i wonder how that must feel like, ouch! and even though it's being phased out in favor of chaos once that is ready, it is still there at the moment:
Chaos Physics is a lightweight physics solver, and when it is production ready, it will replace PhysX. To learn more about Chaos, read the following documents.
docs.unrealengine.com/en-US/InteractiveExperiences/Physics/ChaosPhysics/index.html

but honestly, everyone here knows only too well how full of hot air you are. i mean .. the big boss himself slapped you for telling lies multiple times just under this single article, with style and gusto i might add
Vayra86It's really simple: magic does not happen, it gets precooked on Nvidia's DGX farms, and if they didn't do that for you, you're simply out of luck; in that case the best you get is a small update to what AMD FidelityFX also has on offer - NON-proprietary. A simple blurry upscale with sharpening overlays.

Let's stop beating around the bush. If you lack Nvidia's extended support for DLSS, it's nothing special at all and never will be, no matter how much misdirecting marketing they pour over it.
W1zzardThat's not how DLSS works since v2.0. It's game agnostic now, doesn't require per-title training
Vayra86But then the gains are hardly as great, right?
W1zzardActually DLSS 2.0 works much better. Surprised you missed that.

What makes a huge difference is that the game now feeds motion vectors to the algorithm, which solves the TAA movement problem
comedy gold
Posted on Reply
#60
Vayra86
Jinxedwell, you can repeat that old lie a hundred times and it will not be any more true than it was yesterday. funny thing is, physx is integrated in engines like unreal engine, unity and others. it's been like that for years now. dozens of games use it. physx is actually a successful technology based simply on adoption, even though games no longer bear the physx sticker. you've been playing games with the nasty physx that you hate so much, without even knowing it, for such a long time :D i wonder how that must feel like, ouch! and even though it's being phased out in favor of chaos once that is ready, it is still there at the moment:

docs.unrealengine.com/en-US/InteractiveExperiences/Physics/ChaosPhysics/index.html

but honestly, everyone here knows only too well how full of hot air you are. i mean .. the big boss himself slapped you for telling lies multiple times just under this single article, with style and gusto i might add

comedy gold
Comedy? So can you tell me now what DLSS 2.0's implementation will do in terms of performance against different versions?

Cool story! But keep quoting, maybe you'll get those results that way. I'm not a fan of getting marketing force-fed to me so that eventually I'll believe nonsense. Apparently that is what some here do prefer, before seeing the actual numbers. To each their own ;) Maybe it signifies the overall level of conversation in here more than anything? We're looking at an Nvidia announcement specifying big performance gains, but with no numbers to back it up, and limited to an engine combined with an Nvidia card. So yes, comparisons matter.

The PhysX implementation that is non-proprietary is the CPU PhysX library, not the GPU one that accelerated pretty neat effects in a handful of games. In that sense, we have replacements now, but overall the push for good in-game physics engines is a very low priority altogether. Funny how PhysX is getting phased out regardless, don't you think? I also love your assumptions, because you apparently were really eager to make a hate post towards my person before actually thinking about it. I would love broader adoption of GPU PhysX, because the fact remains that what's left of it on the CPU is extremely limited.

Oh btw, your link actually provides more support for the argument that PhysX is dead; the new library isn't based on it either:
"Chaos Physics is a Beta feature that is the light-weight physics simulation solution used in Fortnite"

'Ouch'... yeah you really burned shit bro, damn! Hope you feel good about it.
Posted on Reply
#61
medi01
watzupkenDLSS 2.0 has proven that it works
2.0 is essentially NV surrendering on the NN idea (which was 1.0).
2.0 is essentially a TAA derivative, with all of its weaknesses (blur, loss of detail, terrible handling of small, quickly moving objects) and strengths (improved lines, details that improve over time).

Source:
www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors

Still worth remembering:
DLSS 2 processing eats a good half of the performance bump gained from running at lower resolutions.

AMD has a wide range of CROSS-PLATFORM, CROSS-VENDOR tech with excellent results (FidelityFX CAS and checkerboard rendering in particular) that is simply not hyped into oblivion by FUD lovers.
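To make the "TAA derivative" point concrete, here is a minimal sketch of temporal accumulation with motion-vector reprojection, the mechanism TAA-style upscalers share (a generic toy, not NVIDIA's actual algorithm; the blend factor, nearest-neighbour upsample, and lack of sample jittering are my own simplifications):

```python
import numpy as np

def temporal_upsample_step(history, current_low, motion, alpha=0.1):
    """One frame of TAA-style temporal accumulation.

    history     : (H, W) previously accumulated full-resolution frame
    current_low : (h, w) current frame rendered at a lower internal resolution
    motion      : (H, W, 2) per-pixel motion vectors in output pixels (x, y)
    alpha       : fraction of the new frame blended in each step
    """
    H, W = history.shape
    # Naive upsample of the low-res frame to output resolution (nearest neighbour).
    up = np.kron(current_low, np.ones((H // current_low.shape[0],
                                       W // current_low.shape[1])))
    # Reproject: for every output pixel, fetch where it came from in the previous frame.
    ys, xs = np.indices((H, W))
    src_y = np.clip((ys - motion[..., 1]).round().astype(int), 0, H - 1)
    src_x = np.clip((xs - motion[..., 0]).round().astype(int), 0, W - 1)
    reprojected = history[src_y, src_x]
    # Exponential blend: most of the output comes from accumulated history, which is
    # why detail builds up over a few frames and why bad motion vectors (thin,
    # fast-moving objects, disocclusions) show up as smearing or lost detail.
    return alpha * up + (1.0 - alpha) * reprojected

# Minimal usage: a static 64x64 scene rendered internally at 32x32, zero motion.
rng = np.random.default_rng(0)
scene = rng.random((64, 64))
history = np.zeros((64, 64))
motion = np.zeros((64, 64, 2))
for _ in range(30):
    history = temporal_upsample_step(history, scene[::2, ::2], motion)
```

Real implementations also jitter the camera sub-pixel every frame so the history accumulates genuinely new samples, and they reject or clamp history where the motion vectors are unreliable, which is exactly where the blur and detail loss mentioned above come from.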
Posted on Reply
#62
SaLaDiN666
medi012.0 is essentially NV surrendering on the NN idea (which was 1.0).
2.0 is essentially a TAA derivative, with all of its weaknesses (blur, loss of detail, terrible handling of small, quickly moving objects) and strengths (improved lines, details that improve over time).

Source:
www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors

Still worth remembering:
DLSS 2 processing eats a good half of the performance bump gained from running at lower resolutions.

AMD has a wide range of CROSS-PLATFORM, CROSS-VENDOR tech with excellent results (FidelityFX CAS and checkerboard rendering in particular) that is simply not hyped into oblivion by FUD lovers.
Calling DLSS 2.0 blurry, terrible, etc., and then suggesting that primitive upscaling such as FidelityFX CAS with a sharpening filter provides "excellent results". I don't know if you're serious or... You can test it in Cyberpunk: AMD's upscaling looks like a blurry mess no matter what settings you try, and the performance uplift is laughable at best. Not to mention, no raytracing.

Still remember this:

DLSS 2.0 > anything AMD can offer, both image quality and performance wise.


Death Stranding with DLSS 2.0 on looks better than the native. But you tried.
Posted on Reply
#63
TumbleGeorge
SaLaDiN666looks better than the native
LoL! Maybe game creators will save money and time by making games fast and ugly, because DLSS 2.0 will fix that. :D
Posted on Reply
#64
medi01
SaLaDiN666Death Stranding with DLSS 2.0 on looks better than the native
Yeah, and let me pick that screenshot with a huge blurry face that has barely any texture on it, but has eyebrows, so that TAA can shine at improving lines while nobody catches its weaknesses, as there is nothing else on the screen. That would sound convincing, thanks. :D

Oh, a user with barely any posts is triggered about a lack of butt-kissing of terrifyingly overhyped NV tech, color me surprised... ;)
Posted on Reply
#65
SaLaDiN666
medi01Yeah, and let me pick that screenshot with a huge blurry face that has barely any texture on it, but has eyebrows, so that TAA can shine at improving lines while nobody catches its weaknesses, as there is nothing else on the screen. That would sound convincing, thanks. :D

Oh, a user with barely any posts is triggered about a lack of butt-kissing of terrifyingly overhyped NV tech, color me surprised... ;)
www.eurogamer.net/articles/digitalfoundry-2020-image-reconstruction-death-stranding-face-off

There is only one guy triggered here: the one who has gone so far as to check other accounts and mention post counts as a way to validate his delusions. Imagine stooping so low. Thanks for the laughs.
Posted on Reply
#66
Prima.Vera
Vayra86And stuck at 24 FPS ever since :)
You are not "stuck". Or the Movie industry isn't "stuck". Look at all the best TV shows nowadays. They are all shoot on 24fps in 4K (GOT, the Expanse, American Gods, The Mandalorian, etc, etc).
Did you have a chance to watch "The Hobbit" on 60fps?? A DISASTER. The whole movie effect was gone, was like watching a live session on TV.
23.99fps is the perfect format for the movies, I wouldn't want any other way. For Live TV, like F1, football, concerts, etc, yeah, give me 60 or even 120fps.
Posted on Reply
#67
Slizzo
medi01A single screen that is not a blurry blob in its original state would suffice, triggered guy with 14 posts, oh, sorry, it's 15... :D
Imagine trying to use post count on an internet forum as a form of put down.
Posted on Reply
#68
Vayra86
Prima.VeraYou are not "stuck". Nor is the movie industry "stuck". Look at all the best TV shows nowadays. They are all shot at 24 fps in 4K (GOT, The Expanse, American Gods, The Mandalorian, etc., etc.).
Did you have a chance to watch "The Hobbit" at 60 fps?? A DISASTER. The whole movie effect was gone; it was like watching a live session on TV.
23.99 fps is the perfect format for movies; I wouldn't want it any other way. For live TV, like F1, football, concerts, etc., yeah, give me 60 or even 120 fps.
You are correct; the so-called soap opera effect is what they call that.

It's a matter of what you're used to. Not all film is 24 Hz either, and the Hobbit was 48 Hz, wasn't it?

My experience, especially in the cinema, is that the low framerate causes lots of problems in fast motion. Things feel like a slideshow sometimes. Does it feel better in some way? I really can't say that it does. The Hobbit was a strange exception to a rule, so it stood out, and I agree that as an isolated example it wasn't pretty. But I also think higher refresh rates were never explored for film simply because they mean a major cost increase, as you have many more frames to edit, and there is no apparent 'demand' for it. Which can also be explained as: we're used to it, so we don't know if it can be better at higher FPS. But still, gaming tells us that higher framerates are a direct win for motion clarity and immersion.

As much as preference is what it is, keep in mind our minds and eyes are very capable of getting comfortable with pretty shitty things; just look back on film itself... If you look at very old footage now, it's almost painful. We have adjusted.
Posted on Reply
#69
MxPhenom 216
ASIC Engineer
TumbleGeorgeThe RTX 3090 is not 18 teraflops! It has 36 teraflops, so where is my 2.25x better frame rate over the weak RX 6800 with its mere 16 teraflops... Good marketing exists, and so does an uncovered lie, because it is not possible to cover this one up... Yet the GPU database keeps listing the fake teraflops that Nvidia declares to the world.
Probably because teraflops are determined from only a single part of the GPU's function, not the entire data path in and out of the GPU. It's mostly used for marketing, especially on consoles, to give those morons something to fight over.

Teraflops: a unit of computing speed equal to one million million (10^12) floating-point operations per second.

That number does not tell the whole story of a GPU's performance.
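For reference, the headline figure is just shader count × clock × 2 FLOPs per cycle (one fused multiply-add). A quick sketch with approximate public specs (boost clocks vary by board and workload, so treat the results as ballpark):

```python
def fp32_tflops(shader_cores: int, boost_clock_ghz: float) -> float:
    """Theoretical peak FP32: cores x clock x 2 FLOPs per cycle (one fused multiply-add)."""
    return shader_cores * boost_clock_ghz * 2 / 1000.0

# Approximate public specs; real clocks vary by board and workload.
print(f"RTX 3090: {fp32_tflops(10496, 1.70):.1f} TFLOPS")  # ~35.7
print(f"RX 6800:  {fp32_tflops(3840, 2.10):.1f} TFLOPS")   # ~16.1
# Ampere's doubled FP32 rate is only reachable when its shared FP32/INT32 datapath
# isn't doing integer work, which is one more reason the headline number and actual
# frame rates don't line up.
```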
Posted on Reply
#70
r9
You get a huge FPS boost with graphics quality so close that you have to zoom in and analyze pixel by pixel to tell the difference, and in some scenarios DLSS 2.0 even looks better.
So why is everybody stuck on how it's done?! As long as you don't have to rub two dicks together, who cares.
Posted on Reply
#71
medi01
r9You get a huge FPS boost with graphics quality so close that you have to zoom in and analyze pixel by pixel to tell the difference
4K has 2.25 times the pixels of 1440p, naturally.
Even 1440p is plenty of pixels for most screens.
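(Quick arithmetic behind that pixel-count figure, for anyone who wants to check it:)

```python
pixels_4k    = 3840 * 2160   # 8,294,400
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_1080p = 1920 * 1080   # 2,073,600
print(pixels_4k / pixels_1440p)  # 2.25 -> 4K pushes 2.25x the pixels of 1440p
print(pixels_4k / pixels_1080p)  # 4.0  -> and 4x the pixels of 1080p
```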

Still, the bush in the frequently shared Death Stranding "improvement" screenshot from sites overhyping the tech is very visibly blurred, with lost details; at the same time, the long grass on the right is improved, as one would expect from a TAA derivative, shrug.
Posted on Reply
#72
Aquinus
Resident Wat-man
SaLaDiN666Death Stranding with DLSS 2.0 on looks better than the native. But you tried.
Maybe we should call it Upscaling 3.0. When push comes to shove, that's all it really is. A pig with lipstick is still a pig. :laugh:

To be clear, this isn't a bad strategy when your hardware can't run native. It's just a little amusing to say it's as good as running native.
Posted on Reply