
Death Stranding with DLSS 2.0 Enables 4K-60 FPS on Any RTX 20-series GPU: Report

Joined
Aug 6, 2017
Messages
7,412 (2.92/day)
Location
Poland
System Name Purple rain
Processor 10.5 thousand 4.2G 1.1v
Motherboard Zee 490 Aorus Elite
Cooling Noctua D15S
Memory 16GB 4133 CL16-16-16-31 Viper Steel
Video Card(s) RTX 2070 Super Gaming X Trio
Storage SU900 128,8200Pro 1TB,850 Pro 512+256+256,860 Evo 500,XPG950 480, Skyhawk 2TB
Display(s) Acer XB241YU+Dell S2716DG
Case P600S Silent w. Alpenfohn wing boost 3 ARGBT+ fans
Audio Device(s) K612 Pro w. FiiO E10k DAC,W830BT wireless
Power Supply Superflower Leadex Gold 850W
Mouse G903 lightspeed+powerplay,G403 wireless + Steelseries DeX + Roccat rest
Keyboard HyperX Alloy SilverSpeed (w.HyperX wrist rest),Razer Deathstalker
Software Windows 10
Benchmark Scores A LOT
It does look pretty good in their own screenshots. Give it time; DLSS wasn't great in the beginning either.
I wonder if they have a genuine interest in following DLSS, or if it's just a lazy way to show they're doing anything.
 

bug

Joined
May 22, 2015
Messages
13,470 (4.02/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I wonder if they have a genuine interest in following DLSS, or if it's just a lazy way to show they're doing anything.
AMD can't do DLSS; they lack the compute ability to tackle that (I mean the training part). I'm hoping they're genuinely trying a different approach. That's how proper solutions are born: by pitting different solutions against each other and seeing which works best.
 
Joined
May 15, 2020
Messages
697 (0.46/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
AMD can't do DLSS; they lack the compute ability to tackle that (I mean the training part). I'm hoping they're genuinely trying a different approach. That's how proper solutions are born: by pitting different solutions against each other and seeing which works best.
I'm not completely sure, but from what I understand, DLSS is based on a neural network trained by Nvidia on their premises, which is then only used for interpolation on the graphics card. It's the only way that makes sense. If it were trained locally, that would require a lot of resources and, at first, the results would be bad.

FWIW, you can build, train and deploy neural networks on the CPU too, not only on GPUs. I'm pretty sure it could be done on AMD GPUs; I'm just unsure what the performance hit would be.

In the case of Nvidia, I think the approach was the inverse: they had some AI capabilities sitting on the die (due to pro-market requirements) and they tried to find a nice way to use them for gaming. What is unclear to me is what it costs Nvidia to train a NN for each game. I'm guessing it's pretty big, otherwise they would've done more games by now. The advantage, however, is that there is no hit on the graphical part of the card. I have a hard time seeing how AMD could come up with a better solution, or even a decent one, that doesn't involve AI.
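To make that split concrete, here's a minimal sketch of the "train offline on the vendor's big hardware, ship only the weights, run inference on the player's GPU" idea. Everything here (the toy network, the training data, the file name) is made up for illustration; it is not Nvidia's model or pipeline:

```python
# Illustrative sketch of the offline-training / on-device-inference split.
# Nothing here is Nvidia's actual model or code.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    """Toy convolutional upscaler: low-res frame in, 2x-resolution frame out."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 4 = 2x2 upscale factor worth of channels
            nn.PixelShuffle(2),                  # rearrange channels into a 2x larger image
        )

    def forward(self, lowres):
        return self.body(lowres)

# --- Vendor side (offline): expensive training against "ideal" high-res targets ---
def train_offline(model, pairs, epochs=10):
    opt = torch.optim.Adam(model.parameters(), lr=1e-4)
    loss_fn = nn.L1Loss()
    for _ in range(epochs):
        for lowres, highres in pairs:        # (low-res render, ground-truth high-res render)
            opt.zero_grad()
            loss = loss_fn(model(lowres), highres)
            loss.backward()
            opt.step()
    torch.save(model.state_dict(), "upscaler_weights.pt")  # what ships with the driver

# --- Player side (runtime): cheap inference only, no training ---
def upscale_frame(model, lowres_frame):
    model.eval()
    with torch.no_grad():
        return model(lowres_frame)
```

The point is that the expensive loop runs once, offline; what ships is just a set of weights, and the per-frame cost on the player's side is a single forward pass.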
 

bug

I'm not completely sure, but from what I understand, DLSS is based on a neural network trained by Nvidia on their premises, which is then only used for interpolation on the graphics card. It's the only way that makes sense. If it were trained locally, that would require a lot of resources and, at first, the results would be bad.

FWIW, you can build, train and deploy neural networks on the CPU too, not only on GPUs. I'm pretty sure it could be done on AMD GPUs; I'm just unsure what the performance hit would be.

In the case of Nvidia, I think the approach was the inverse: they had some AI capabilities sitting on the die (due to pro-market requirements) and they tried to find a nice way to use them for gaming. What is unclear to me is what it costs Nvidia to train a NN for each game. I'm guessing it's pretty big, otherwise they would've done more games by now. The advantage, however, is that there is no hit on the graphical part of the card. I have a hard time seeing how AMD could come up with a better solution, or even a decent one, that doesn't involve AI.
Well, deep neural networks have been around in theory for ages (if you can think of a 3-layer network, there's no reason you can't think of a 25+ layer one). Training them, however, is just way tougher on the hardware. Whatever AMD could do with their OpenCL implementations, Nvidia can do at least 10x faster with CUDA and specialized hardware. Considering AMD is behind in this area, they need to move faster, not slower.

Also, per-title training was only needed for DLSS 1. Starting with DLSS 2, it seems Nvidia has trained their networks well enough that per-title training is no longer a requirement (though I'm pretty sure they still do it to work out kinks here and there).

That's why I'm guessing that if AMD is to respond to DLSS, they shouldn't get sucked into this "who trains DNNs faster" game and should work from another angle (if there is one).
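As an aside on the "easy to write down, hard to train" point, a throwaway sketch (nothing DLSS-specific implied, just an illustration of how trivially a deep net is defined versus how much more it costs to train):

```python
# Defining a deep network is the easy part; training it is where the hardware cost lives.
import torch.nn as nn

def mlp(depth, width=256, d_in=64, d_out=10):
    layers = [nn.Linear(d_in, width), nn.ReLU()]
    for _ in range(depth - 2):
        layers += [nn.Linear(width, width), nn.ReLU()]
    layers.append(nn.Linear(width, d_out))
    return nn.Sequential(*layers)

shallow = mlp(3)    # the classic 3-layer network
deep    = mlp(25)   # structurally just as easy to write down...

count = lambda m: sum(p.numel() for p in m.parameters())
print(count(shallow), count(deep))  # ...but far more parameters (and compute) to train
```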
 
Well, deep neural networks have been around in theory for ages (if you can think of a 3-layer network, there's no reason you can't think of a 25+ layer one). Training them, however, is just way tougher on the hardware. Whatever AMD could do with their OpenCL implementations, Nvidia can do at least 10x faster with CUDA and specialized hardware. Considering AMD is behind in this area, they need to move faster, not slower.
It's true NNs have been around for a long time; it's also true they sat mostly unused for most of that time :)
Also, per-title training was only needed for DLSS 1. Starting with DLSS 2, it seems Nvidia has trained their networks well enough that per-title training is no longer a requirement (though I'm pretty sure they still do it to work out kinks here and there).
Are you sure about that? In that case I don't understand why only a few games are supported.
That's why I'm guessing that if AMD is to respond to DLSS, they shouldn't get sucked into this "who trains DNNs faster" game and should work from another angle (if there is one).
I'm pretty sure there is no angle other than some form of AI that would give a fast and compact solution to good upscaling. You need to add/create detail, and any procedural/algorithmic solution to that sounds bound to fail.
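For reference, this is roughly what a purely procedural upscaler amounts to: a resample plus a sharpening pass. It can only emphasize detail that survived the resize, never invent new detail, which is the limitation being pointed at (file names below are made up):

```python
# A purely procedural upscale: bicubic resample plus an unsharp mask, no learned detail.
from PIL import Image, ImageFilter

def naive_upscale(path, scale=2):
    img = Image.open(path)
    up = img.resize((img.width * scale, img.height * scale), Image.BICUBIC)
    # Unsharp mask only emphasizes edges that survived the resize; it cannot add detail.
    return up.filter(ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))

naive_upscale("frame_1080p.png", scale=2).save("frame_4k_naive.png")
```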
 

bug

It's true NNs have been around for a long time; it's also true they sat mostly unused for most of that time :)
2-, 3- or 4-layer NNs have actually been in widespread use for basic classification of stuff (think user profiling). DNNs have been around just as long, but they've been intractable for the most part.

Are you sure about that? In that case I don't understand why only a few games are supported.
I meant training is no longer necessary for each and every title. Games obviously still have to integrate the DLSS library to make use of the feature.

I'm pretty sure there is no angle other than some form of AI that would give a fast and compact solution to good upscaling. You need to add/create detail, and any procedural/algorithmic solution to that sounds bound to fail.
I wouldn't know; I've been out of touch with all that for quite some time.
 
2-, 3- or 4-layer NNs have actually been in widespread use for basic classification of stuff (think user profiling). DNNs have been around just as long, but they've been intractable for the most part.
This is going a bit OT, but still: I understand you to mean that NNs have been known for a long time; it's just that they weren't widely adopted for most of that time. 15 years ago everybody used other algorithms for calculating semantic distances, and other probabilistic/stochastic models, to model and classify user profiles.
Only with Google's TensorFlow did NNs really come alive for the masses.
 

bug

This is going a bit OT, but still: I understand you to mean that NNs have been known for a long time; it's just that they weren't widely adopted for most of that time. 15 years ago everybody used other algorithms for calculating semantic distances, and other probabilistic/stochastic models, to model and classify user profiles.
Only with Google's TensorFlow did NNs really come alive for the masses.
No, I mean NNs have actually been in widespread use for various pattern-recognition-related stuff, just not the deep sort. Whether that qualifies as "for the masses", I don't know.
By contrast, DNNs have been almost a no-show until recently.
 
Joined
Jul 9, 2015
Messages
3,413 (1.03/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
DLSS 2.0
DLSS 2.0 works as follows:[13]


  • The neural network is trained by Nvidia on supercomputers, using "ideal", ultra-high-resolution images of video games together with low-resolution images of the same games. The result is stored in the video card driver. It is said that Nvidia uses DGX-1 servers to perform the training of the network.
  • The neural network stored in the driver compares the actual low-resolution image with the reference and produces a full high-resolution result. The inputs used by the trained neural network are the low-resolution, aliased images rendered by the game engine and the low-resolution motion vectors from the same images, also generated by the game engine. The motion vectors tell the network which direction objects in the scene are moving from frame to frame, in order to estimate what the next frame will look like.[14]
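Read as pseudo-code, the per-frame inference step those bullets describe looks roughly like this. The model, channel layout and shapes below are purely illustrative stand-ins, not Nvidia's actual network; the previous-frame input is included on the assumption that temporal history feeds the network alongside the motion vectors:

```python
# Rough, illustrative sketch of the inference step: pre-trained network takes the
# low-res aliased frame plus motion vectors (and history) and emits a high-res frame.
import torch
import torch.nn as nn

class UpscaleNet(nn.Module):
    """Stand-in for the pre-trained network shipped with the driver."""
    def __init__(self, scale=2):
        super().__init__()
        # inputs: 3 channels current low-res frame + 2 channels motion vectors
        #         + 3 channels previous output, warped into the current frame
        self.net = nn.Sequential(
            nn.Conv2d(3 + 2 + 3, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 3 * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, lowres, motion, prev_warped):
        x = torch.cat([lowres, motion, prev_warped], dim=1)
        return self.net(x)

# Per frame: the game engine hands over the aliased low-res render and its motion
# vectors; the previous output is reprojected with those vectors so accumulated
# detail can be reused (shown here already warped and downsampled, for simplicity).
model = UpscaleNet(scale=2).eval()
lowres = torch.rand(1, 3, 540, 960)
motion = torch.rand(1, 2, 540, 960)
prev   = torch.rand(1, 3, 540, 960)
with torch.no_grad():
    highres = model(lowres, motion, prev)   # -> shape (1, 3, 1080, 1920)
```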
 
DLSS 2.0
DLSS 2.0 works as follows:[13]


  • The neural network is trained by Nvidia on supercomputers, using "ideal", ultra-high-resolution images of video games together with low-resolution images of the same games. The result is stored in the video card driver. It is said that Nvidia uses DGX-1 servers to perform the training of the network.
  • The neural network stored in the driver compares the actual low-resolution image with the reference and produces a full high-resolution result. The inputs used by the trained neural network are the low-resolution, aliased images rendered by the game engine and the low-resolution motion vectors from the same images, also generated by the game engine. The motion vectors tell the network which direction objects in the scene are moving from frame to frame, in order to estimate what the next frame will look like.[14]
Thanks for doing the research and posting this. That means my lazy-logic assumption, made without reading the spec, was correct: it works on a per-game basis.
 
DLSS 2.0
DLSS 2.0 works as follows:[13]


  • The neural network is trained by Nvidia on supercomputers, using "ideal", ultra-high-resolution images of video games together with low-resolution images of the same games. The result is stored in the video card driver. It is said that Nvidia uses DGX-1 servers to perform the training of the network.
  • The neural network stored in the driver compares the actual low-resolution image with the reference and produces a full high-resolution result. The inputs used by the trained neural network are the low-resolution, aliased images rendered by the game engine and the low-resolution motion vectors from the same images, also generated by the game engine. The motion vectors tell the network which direction objects in the scene are moving from frame to frame, in order to estimate what the next frame will look like.[14]
Correct.
As I understand it, when it's a botched job it's entirely on Nvidia, not the developer. If it's well done, it's because Nvidia took the time to optimize it.
I don't think the game developer plays any part in the process, except for the actual amount of time they give Nvidia to work on it, which is probably a big factor.

I think it was me who linked it to you one day, actually :nutkick:
 

bug

Thanks for doing the research and posting this. That means my lazy-logic assumption, made without reading the spec, was correct: it works on a per-game basis.
Once again, it doesn't. The neural network is now smart/good enough that you don't need a different instance for each game. That's the difference from DLSS 1.0.
 
Joined
Mar 21, 2016
Messages
2,368 (0.78/day)
This is FidelityFX: https://gpuopen.com/fidelityfx-cas/
A sharpening filter, mostly. It says it does up/downscaling as well, but it's unclear how it does that.
I think, like DLSS, it boils down a bit to implementation, along with whatever hardware revisions they've made in the spec. Something tells me it can look good or bad in either case. FidelityFX can do more than just sharpening, though; it can also help with reflections. AMD's website on it shows and explains well enough what's possible with it. Like I said above, tech of this nature also depends on how well it's implemented, or whether it is at all in the first place. Raw rasterization performance is always preferable to these sorts of things if you want to cover both past and present game libraries. Eventually RTRT will be more of a talking point, though right now it's a joke because neither the hardware nor the software is mature enough. I think by the end of this console generation we'll end up with discrete RTRT hardware that has matured a lot, so the PS6 generation of consoles should be slick.
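For what it's worth, the contrast-adaptive part of CAS boils down to scaling the sharpening strength by the inverse of local contrast, so hard edges don't ring. A very rough sketch of that idea (this is not AMD's actual shader, just the gist in a few lines):

```python
# Rough sketch of contrast-adaptive sharpening: sharpen less where local contrast
# is already high, more in flat-ish areas. Single-channel image for simplicity.
import numpy as np
from scipy.ndimage import minimum_filter, maximum_filter, convolve

def cas_like_sharpen(img, strength=0.8):
    """img: float32 array in [0, 1], shape (H, W)."""
    local_min = minimum_filter(img, size=3)
    local_max = maximum_filter(img, size=3)
    contrast = local_max - local_min               # ~0 in flat areas, ~1 on hard edges
    amount = strength * (1.0 - contrast)           # adapt: back off where contrast is high
    # basic 3x3 sharpening pass (centre boosted, neighbours subtracted)
    kernel = np.array([[ 0, -1,  0],
                       [-1,  5, -1],
                       [ 0, -1,  0]], dtype=np.float32)
    sharpened = convolve(img, kernel, mode="nearest")
    return np.clip(img + amount * (sharpened - img), 0.0, 1.0)
```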

Correct.
As I understand it, when it's a botched job it's entirely on Nvidia, not the developer. If it's well done, it's because Nvidia took the time to optimize it.
I don't think the game developer plays any part in the process, except for the actual amount of time they give Nvidia to work on it, which is probably a big factor.

I think it was me who linked it to you one day, actually :nutkick:
That in itself is a big issue if true: you'll end up with bigger developers getting better results, while the not-so-AAA and indie developers get treated much differently by Nvidia. I can't blame them for doing so, but that will skew expectations and make DLSS's inherent perks vary a lot from case to case, and I imagine the same holds true for AMD's FidelityFX.
 