Tuesday, August 22nd 2023

NVIDIA Announces DLSS 3.5 Ray Reconstruction Technology, Works on GeForce 20 and Newer

At Gamescom 2023, NVIDIA is releasing several new updates for the GeForce gaming crowd. These are led by the announcement of the new DLSS 3.5 Ray Reconstruction feature, available this Fall for all GeForce RTX GPUs (RTX 20-series and later). DLSS 3.5 introduces a new feature called "Ray Reconstruction" that's specifically designed to improve the way ray traced elements look in games. While traditional rasterization shades every single pixel of every frame, real-time ray tracing cannot do that for performance reasons. During rendering, only a few rays are shot in a coarse grid, which leaves empty "black" gaps between the ray outputs. To fill those in, a denoiser runs various algorithms to literally fill in the blanks.
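To get a feel for how sparse that input is, here's a minimal sketch of our own (purely illustrative, with an arbitrary 2x2 grid stride) comparing rasterization's full pixel coverage against coarse-grid ray samples:

```python
import numpy as np

# Illustrative only: rasterization shades every pixel, while real-time ray
# tracing shoots rays on a coarse grid and leaves the rest empty ("black").
H, W = 8, 8                    # tiny frame for the example
stride = 2                     # hypothetical: one ray per 2x2 pixel block

raster = np.ones((H, W))       # rasterization covers 100% of the pixels

rt = np.zeros((H, W))          # the ray tracing output starts empty
rt[::stride, ::stride] = 1.0   # rays are traced only on the coarse grid

coverage = float(rt.astype(bool).mean())
print(f"pixels with a real ray result: {coverage:.0%}")  # 25%
# The remaining 75% is what the denoiser has to fill in.
```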

With DLSS 3.5, NVIDIA introduces a new denoiser that's optimized to work hand-in-hand with DLSS 2 upscaling, to provide image quality that is both better-looking and more correct. The feature relies on the Tensor Cores (not the RT cores, we asked), so it is available on all GeForce RTX graphics cards (Turing and newer).

The picture below shows the traditional way to do RT effects. Please note that DLSS 2 upscaling is enabled here—the image is composited at low resolution first and then scaled to native size.

In the first step, the engine creates the geometry and materials, but without any shading. This information is used to build the BVH acceleration structure for ray tracing, which helps determine where rays intersect with world geometry. Next, a number of rays are cast and their paths are traced to calculate intersections, possibly letting them bounce, sometimes even several times. These results are then fed to the denoiser, which turns the individual pixels into a continuous image that looks like a ray traced reflection, shadow, lighting, or ambient occlusion effect. With upscaling enabled, the denoiser generates output at the lower render resolution, not the final native resolution; the denoiser isn't even aware of the final resolution. On top of that, the upscaler doesn't know anything about rays, it just sees the pixel output from the denoiser, so all the original ray tracing data is lost at that stage.
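The order of those steps is the crux of the matter. Below is a rough structural sketch (our own toy stand-ins, not engine or driver code) that mirrors the sequence described above; only the order of the calls is meaningful:

```python
import numpy as np

# Toy stand-ins for the stages described above. Note that the denoiser works
# at the low render resolution, and the upscaler only ever sees its pixel
# output, never the raw ray data.
RENDER_RES, NATIVE_RES = 4, 8           # e.g. internal 1080p scaled to 4K, shrunk here

def rasterize(res):                     # geometry + materials, no RT shading yet
    return np.full((res, res), 0.5)

def trace_rays(res, stride=2):          # sparse, noisy ray results on a coarse grid
    img = np.zeros((res, res))
    img[::stride, ::stride] = 1.0
    return img

def denoise(noisy):                     # fills the gaps; a simple box filter stands in
    out = noisy.copy()
    for y in range(noisy.shape[0] - 1):
        for x in range(noisy.shape[1] - 1):
            out[y, x] = noisy[y:y + 2, x:x + 2].mean()
    return out

def upscale(low_res, native_res):       # nearest-neighbour stand-in for DLSS 2
    factor = native_res // low_res.shape[0]
    return np.kron(low_res, np.ones((factor, factor)))

gbuffer   = rasterize(RENDER_RES)
rt_noisy  = trace_rays(RENDER_RES)      # BVH traversal and bounces abstracted away
rt_clean  = denoise(rt_noisy)           # all per-ray information is lost after this
frame_low = 0.5 * gbuffer + 0.5 * rt_clean
frame_out = upscale(frame_low, NATIVE_RES)
print(frame_out.shape)                  # (8, 8): the native-resolution frame
```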
The biggest problem with denoisers is that they rely on previous frames to "collect" enough pixel data for the final image. This means that the RT output is an average of several previous frames. The slide above details such problematic cases. For example, the mirror on a moving car gets blended across several frames, which results in ghosting artifacts. Subtle illumination effects and reflections are affected too, and simply look smeared out.
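Why the averaging causes ghosting is easy to see with a tiny temporal-accumulation sketch (the blend factor is an arbitrary example value, not something NVIDIA has published): a moving highlight keeps leaving energy at its old positions.

```python
import numpy as np

# Illustrative temporal accumulation: each new frame is blended into a history
# buffer. A moving highlight leaves residual energy at its old positions,
# which shows up on screen as ghosting / smearing.
alpha = 0.2                    # hypothetical weight of the newest frame
history = np.zeros(12)         # a single 12-pixel "scanline" for simplicity

for frame in range(6):
    current = np.zeros(12)
    current[frame] = 1.0       # a bright reflection moving one pixel per frame
    history = alpha * current + (1.0 - alpha) * history

print(np.round(history[:7], 3))
# The old positions still hold noticeable energy; that trail is the ghosting
# behind the mirror of a moving car.
```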

NVIDIA's innovation with DLSS 3.5 is that the denoising and upscaling steps are combined into a single pass that has more information available, which promises a higher-quality output image. The low-res output is combined with the output from rasterization, the ray tracing steps, and the motion vectors, and everything is painted directly into a high-res output image, 4K in this case. The DLSS 3.5 algorithm also takes previous frames into account (temporal feedback), just like DLSS 2. Once upscaling is completed, another pass is made for the DLSS 3 Frame Generation feature (when enabled).
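Structurally, the difference can be sketched like this (our own toy code, not NVIDIA's API; in the real feature, the combined pass is a neural network running on the Tensor Cores):

```python
import numpy as np

def upscale(img, factor):                    # nearest-neighbour stand-in
    return np.kron(img, np.ones((factor, factor)))

# Classic path: denoise first, then hand only the finished pixels to the upscaler.
def classic_path(rt_samples, gbuffer, factor):
    denoised = np.full_like(rt_samples, rt_samples.mean())   # toy denoiser; ray data lost here
    return upscale(0.5 * denoised + 0.5 * gbuffer, factor)

# Ray Reconstruction path: a single pass that still sees the raster output, the
# raw ray samples, motion vectors, and temporal history when it produces the
# high-resolution frame. A weighted blend stands in for the network here.
def ray_reconstruction(rt_samples, gbuffer, motion_hint, history, factor):
    combined = 0.4 * rt_samples + 0.3 * gbuffer + 0.2 * history + 0.1 * motion_hint
    return upscale(combined, factor)

low = np.random.rand(4, 4)
print(classic_path(low, low, 2).shape)                  # (8, 8)
print(ray_reconstruction(low, low, low, low, 2).shape)  # (8, 8)
```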

Here are some results provided by NVIDIA that show how DLSS 3.5 Ray Reconstruction promises to enhance RT fidelity over classic denoising techniques.
Ray Reconstruction has a negligible performance cost of its own; in frame-rate comparisons NVIDIA showed, taken on an RTX 40-series GPU, DLSS 3.5 RR offers marginally higher frame-rates than DLSS 3 FG. NVIDIA made it clear that DLSS 3.5 is not a performance-enhancing feature; the focus is on image quality. Depending on the scene, performance will be virtually identical, slightly better, or slightly worse. In theory, game developers could reduce the number of rays when DLSS 3.5 is enabled, which would lower the RT performance hit and improve frame-rates, still with improved image quality. There's no hand-holding for that though; it is purely a game-developer decision and outside the scope of NVIDIA's DLSS 3.5 implementation.

DLSS 3.5 will not only be available in games, but also in professional applications such as D5 Render, where it will enable real-time previews with stunning detail.
When it releases this Fall, DLSS 3.5 will be enabled on all GeForce RTX GPUs through a driver update. There are now three distinct subsets of DLSS: Super Resolution (SR), the core image upscaling tech; Frame Generation (FG), introduced with DLSS 3, which doubles frame-rates by generating alternate frames using AI; and now the new Ray Reconstruction (RR) feature. DLSS 3.5 RR will work on all RTX GPUs, as all generations include Tensor Cores. On the older RTX 20-series "Turing" and RTX 30-series "Ampere," DLSS 3.5 will work exactly like it does on the latest RTX 40-series "Ada," except that FG won't be available. Games with support for Ray Reconstruction will have an additional "enable Ray Reconstruction" checkbox, just like there's an "enable Frame Generation" checkbox. We confirmed with NVIDIA that running DLAA with Ray Reconstruction is supported; you don't have to use the upscaler at all times.
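Put differently, the three features are independent toggles, and only Frame Generation carries a hardware restriction. Here's a hypothetical settings sketch (the names and structure are ours, not NVIDIA's SDK) that captures the compatibility rules above:

```python
from dataclasses import dataclass

# Hypothetical in-game graphics settings; the support logic reflects the
# compatibility described above.
@dataclass
class DlssSettings:
    super_resolution: bool = True     # DLSS SR upscaling, RTX 20 and newer
    ray_reconstruction: bool = False  # DLSS RR, RTX 20 and newer
    frame_generation: bool = False    # DLSS FG, RTX 40-series only
    dlaa: bool = False                # native-res AA; RR can be combined with it

def is_supported(settings: DlssSettings, rtx_generation: int) -> bool:
    if rtx_generation < 20:
        return False                  # no Tensor Cores, no DLSS at all
    if settings.frame_generation and rtx_generation < 40:
        return False                  # FG remains exclusive to the RTX 40-series
    return True

print(is_supported(DlssSettings(ray_reconstruction=True), 20))                          # True
print(is_supported(DlssSettings(ray_reconstruction=True, frame_generation=True), 30))   # False
```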

While the naming is a bit confusing, it's great to see that NVIDIA keeps improving their technology. There's no news yet regarding AMD's FSR 3; perhaps an announcement will come at Gamescom. From a technical standpoint, we'd classify Ray Reconstruction as "DLSS 2.5," because it has absolutely nothing to do with DLSS 3 Frame Generation and is closely interlinked with DLSS 2 upscaling. It seems NVIDIA is now releasing all similar technologies under their established "DLSS" brand name, which is further segregated by feature. For example, "DLSS 3 Frame Generation" is only supported on GeForce 40; this announcement does not change that. The new "DLSS 3.5 Ray Reconstruction" works on GeForce 20 and newer though, just like "DLSS 2 Upscaling" does.

89 Comments on NVIDIA Announces DLSS 3.5 Ray Reconstruction Technology, Works on GeForce 20 and Newer

#1
W1zzard
Whoops comments were broken, fixed now
#2
Chomiq
Is this a joke? DLSS 3 requires 40-series but somehow 3.5 runs on 20-series and up?
#3
Calmmo
Chomiq: Is this a joke? DLSS 3 requires 40-series but somehow 3.5 runs on 20-series and up?
That's Xbox-tier product naming.
#4
abrfilho
Chomiq: Is this a joke? DLSS 3 requires 40-series but somehow 3.5 runs on 20-series and up?
I guess you can think of it like feature level support.
#5
ZoneDymo
I have no idea how to read that compatibility graph...


Also, qq: if the RT was high enough quality, would we even need a denoiser anymore?
#6
Suspecto
Chomiq: Is this a joke? DLSS 3 requires 40-series but somehow 3.5 runs on 20-series and up?
DLSS 3 runs on any ray tracing card; only for frame generation is the 4xxx series required.
#7
Chomiq
Suspecto: DLSS 3 runs on any ray tracing card; only for frame generation is the 4xxx series required.
#8
ZoneDymo
Suspecto: DLSS 3 runs on any ray tracing card; only for frame generation is the 4xxx series required.
ermm isn't that exactly what DLSS 3 is though?
#9
Dixevil
This is the best; can't wait for Cyberpunk in September.
#10
Calmmo
@btarunr

There's an NVIDIA video up now that you could also add to the article ;0

Suspecto: DLSS 3 runs on any ray tracing card; only for frame generation is the 4xxx series required.
"All RTX GPU's" they say so in the video, it works on real frames, FG is just gonna keep doing it's own interpolation thing as per usual.
Essentially this replaces the traditional denoiser(s).
#11
nguyen
September is gonna be interesting.
Though August with BG3 is no less interesting :)
#12
Vayra86
Chomiq: Is this a joke? DLSS 3 requires 40-series but somehow 3.5 runs on 20-series and up?
Nvidia is going for the mass confusion angle.
#13
Unregistered
After 3 generations of RT cores and a 2000€ card, they still cannot render RT properly. Just buy it.
#14
oxrufiioxo
Yeah, they should have branded Frame Generation as its own thing; the upscaling portion still works on all 3 generations of RTX GPUs.
#15
Vayra86
Xex360: After 3 generations of RT cores and a 2000€ card, they still cannot render RT properly. Just buy it.
Forever in beta. It was clear from the get-go this wasn't gonna fly...
#16
Rudolph2109
ChomiqIs this a joke? DLSS 3 requires 40-series but somehow 3.5 runs on 20-series and up?
DLSS3 never required a 40-series card at any point.
Frame generation requires a 40-series card, and while you need DLSS3 to run frame generation, you do not need frame generation to run DLSS3.
It's quite simple.
#17
TheDeeGee
NVIDIA is so ahead of their game, it's insane.

AMD has pretty much come to a standstill; even Intel is making faster progress with Arc lately.
#18
Easo
Seems Cyberpunk is becoming a tool to test new fancy stuff which, if anything, adds to the longevity of the game.
Of course Source engine games shine in that regard too, can't forget about them.
TheDeeGee: NVIDIA is so ahead of their game, it's insane.

AMD has pretty much come to a standstill; even Intel is making faster progress with Arc lately.
Yeah. I understand that in terms of brute force rasterization they are not really behind, but it is seemingly clear that RT & Co are the future deciding factors.
#19
cvaldes
While AMD continues to chase pure rasterization performance, Nvidia has diversified its approach in leveraging other differentiated silicon toward the future. More importantly, Nvidia has built an enormous universe of developer infrastructure. It's not just a bunch of peculiar circuits on a silicon die.

While initial DLSS efforts were largely dismissed by many gaming enthusiasts, many of these people didn't foresee that Nvidia would actually put in the effort to make it better. You walk before you run. Some people don't get this.

Nvidia's timing here is enviable. With the sudden AI craze, Nvidia is several years ahead of everyone else in the machine learning world. And again, it's not just the silicon. They have the software tools ready for others to build with.

I'd give Intel a better chance than AMD to put some pressure on Nvidia in the machine learning realm. All of the big revenue opportunities over the next ten years are outside the consumer gaming world.
#20
evernessince
Rudolph2109: DLSS3 never required a 40-series card at any point.
Frame generation requires a 40-series card, and while you need DLSS3 to run frame generation, you do not need frame generation to run DLSS3.
It's quite simple.
DLSS 3.0 IS frame generation. Review outlets noted no noticeable changes aside from that over prior DLSS versions.

The naming is horrid. DLSS refers to a singular feature, Deep Learning Super Sampling. Adding features under the DLSS umbrella that aren't DLSS was, and increasingly is, a terrible idea.
cvaldes: While AMD continues to chase pure rasterization performance, Nvidia has diversified its approach in leveraging other differentiated silicon toward the future. More importantly, Nvidia has built an enormous universe of developer infrastructure. It's not just a bunch of peculiar circuits on a silicon die.

While initial DLSS efforts were largely dismissed by many gaming enthusiasts, many of these people didn't foresee that Nvidia would actually put in the effort to make it better. You walk before you run. Some people don't get this.

Nvidia's timing here is enviable. With the sudden AI craze, Nvidia is several years ahead of everyone else in the machine learning world. And again, it's not just the silicon. They have the software tools ready for others to build with.

I'd give Intel a better chance than AMD to put some pressure on Nvidia in the machine learning realm. All of the big revenue opportunities over the next ten years are outside the consumer gaming world.
They really aren't. LevelOneTechs have demonstrated that AMD's AI performance is excellent and AI models run on AMD cards are more accurate than Nvidia cards. Wendell demonstrated a model running on both and the AMD card produced more correct results.
#21
cvaldes
evernessince: They really aren't. LevelOneTechs have demonstrated that AMD's AI performance is excellent and AI models run on AMD cards are more accurate than Nvidia cards. Wendell demonstrated a model running on both and the AMD card produced more correct results.
Interesting.

Well, I'll put most of my money on Nvidia. (I am an indirect shareholder of most of the big semiconductor companies anyhow).

That said, I'm not sure if some vlogger hawking desk mats is really qualified to be testing ML models on GPUs. If it were that simple, wouldn't everyone be flocking to AMD to buy up their better value cards? So cherry picking through YT videos to come up with a couple of examples where Nvidia falls short isn't a ringing endorsement.

And as I understand a lot of these ML algorithms are tailored to hardware. If you pull some random software package off the shelf, it may favor one architecture over another, just like games have differing performance based on GPU architectures.

And it's not like the CTOs of Wal-Mart, Daimler-Benz, etc. are just surfing YouTube for hardware recommendations. They bring in hardware for testing to ascertain what works best for them. For these big companies, I'm sure they tried Nvidia, AMD, Intel, and probably some smaller players as well. And they continually evaluate the technology in this rapidly evolving segment. In 2025, when they buy new hardware, I'm sure it'll be based on recent experience, not what they decided in 2020.

But this is all elementary...

Anyhow, the industry would be best served by strong competition amongst these companies, including startups that aren't in the mainstream conversation yet. But Nvidia will own this space for the next couple of years.
#22
evernessince
cvaldes: Interesting.

Well, I'll put most of my money on Nvidia. (I am an indirect shareholder of most of the big semiconductor companies anyhow).

That said, I'm not sure if some vlogger hawking desk mats is really qualified to be testing ML models on GPUs. If it were that simple, wouldn't everyone be flocking to AMD to buy up their better value cards? So cherry picking through YT videos to come up with a couple of examples where Nvidia falls short isn't a ringing endorsement.

And as I understand a lot of these ML algorithms are tailored to hardware. If you pull some random software package off the shelf, it may favor one architecture over another, just like games have differing performance based on GPU architectures.

But this is all elementary...

Anyhow, the industry would be best served by strong competition amongst these companies, including startups that aren't in the mainstream conversation yet. But Nvidia will own this space for the next couple of years.
Level1Techs is a professional-focused channel. I'd recommend you go check them out if you think they may be biased; they are, after all, the ones that hook up GamersNexus, Linus, and many other tech tubers with their HEDT, server, and network setups. Most of their content focuses on the enterprise. Once you see their content and the tone of that content, you'll understand.
#23
cvaldes
evernessince: Level1Techs is a professional-focused channel. I'd recommend you go check them out if you think they may be biased; they are, after all, the ones that hook up GamersNexus, Linus, and many other tech tubers with their HEDT, server, and network setups. Most of their content focuses on the enterprise. Once you see their content and the tone of that content, you'll understand.
I'll take a deeper look someday.

I poked my head into their Q&A forum; it looked like a hundred other computer forums I've seen going way back to the Usenet days. TPU was once more like that. In fact, if the forum section were better laid out, I could see myself spending some of my limited free time there, at least for a couple of years before moving on.

Online communities scale extremely poorly and the Level1Techs forum is still at a size where it might be worth visiting. So I thank you for that.
#24
Luminescent
And yet the biggest performance upgrade every new generation gets is from a smaller TSMC node.
At least Nvidia is trying, AMD mostly copies.
This is good news, we could see even cheaper AMD cards :laugh:
#25
gffermari
I was expecting FSR 3.0 this very day 1 of Gamescom, and I got DLSS 3.5.
Sorry, but that's a goal for nVidia.