Wednesday, February 3rd 2021
NVIDIA's DLSS 2.0 is Easily Integrated into Unreal Engine 4.26 and Gives Big Performance Gains
When NVIDIA launched the second iteration of its Deep Learning Super Sampling (DLSS) technique, which uses deep learning to upscale lower-resolution renders, everyone was impressed by the quality of the images it puts out. However, have you ever wondered how it all looks from the developer's side of things? Games usually need millions of lines of code, and even small features are not always easy to implement. Today, thanks to Tom Looman, a game developer working with Unreal Engine, we have found out what the DLSS 2.0 integration process looks like and how big the performance benefits are.
In the blog post, you can take a look at the example game shown by the developer. Integration with Unreal Engine 4.26 is easy: you just compile your project against a special UE4 RTX branch and apply for an AppID on NVIDIA's website. Right now you are probably wondering how the performance looks. Well, the baseline for the results was the TXAA sampling technique used in the game demo. DLSS 2.0 managed to bring anywhere from a 60-180% increase in frame rate, depending on the scene. These are rather impressive numbers, and it goes to show just how well NVIDIA has built its DLSS 2.0 technology. For a full overview, please refer to the blog post.
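For context on where those frame-rate gains come from: DLSS renders the scene at a reduced internal resolution and upscales it to the target output. The scale factors below are NVIDIA's published DLSS 2.0 quality modes; the function name and structure are just for illustration, not from the blog post. A minimal sketch:

```python
# Internal render scale per DLSS 2.0 quality mode (per axis, per NVIDIA's
# published figures; not taken from the blog post).
DLSS_SCALE = {
    "quality": 2 / 3,          # ~66.7% of output resolution
    "balanced": 0.58,
    "performance": 0.5,
    "ultra_performance": 1 / 3,
}

def internal_resolution(output_w, output_h, mode):
    """Return the resolution the GPU actually renders at before DLSS upscales."""
    s = DLSS_SCALE[mode]
    return round(output_w * s), round(output_h * s)

# At 4K output, Quality mode renders internally at 1440p,
# and Performance mode at 1080p:
print(internal_resolution(3840, 2160, "quality"))      # (2560, 1440)
print(internal_resolution(3840, 2160, "performance"))  # (1920, 1080)
```

Since shading cost scales roughly with pixel count, Performance mode shades only a quarter of the pixels of native 4K, which is where headline gains in that 60-180% range become plausible.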
Source:
Tom Looman Blog
74 Comments on NVIDIA's DLSS 2.0 is Easily Integrated into Unreal Engine 4.26 and Gives Big Performance Gains
and those who weren't impressed enough were shunned from future Founders Editions
News at 11:00!
Oh, and blur? Loss of details? You are not sitting far enough from your screen!
The only alternative for AMD is Sapphire Trixx Boost, which Sapphire claims works in every game. It may not yield the same quality as DLSS upscaling, but I feel the benefit is that you can select how low a resolution you want via a slider bar, and, if I take Sapphire's word for it, it works in every game.
Let's stop beating around the bush. Without Nvidia's extended support, DLSS is nothing special at all and never will be, no matter how much misdirecting marketing they pour over it.
Dear god... Name a single "full ray tracing" effect that you cannot find in this game, running on 8 Jaguar cores with a 7870 slapped on top:
Since 2.0 it's really not as bad as people say it is; people just like to complain. If you don't like it, don't use it; if you do, then use it. Problem solved. 1.0 was pretty bad, though.
What makes a huge difference is that the game now feeds motion vectors to the algorithm, which solves the TAA movement problem.
You still need a per-title implementation, so you still need Nvidia's support going forward, and we haven't got a complete overview of what the performance gain truly is without any DGX pre render support. Nvidia is doing a fine job muddying those waters.
I'm not denying the technology is nice, but this is not quite as fire and forget as it looks.
As for DLSS, have you people seen prototypes of lossy video encoders based on AI/deep learning? It's hard to believe what they cook up from tiny amount of data. These technologies are effective and here to stay.
Doesn't change the fact that on a local PC I want raw power and losslessness, not compressed textures, reduced shaders, upscaled images, a lossy compressed monitor connection, etc. I might as well use streaming instead. I prefer a lossless lower resolution to a baked-up pseudo-high resolution.