
GeForce NOW Gains NVIDIA DLSS 2.0 Support In Latest Update

Joined
Mar 31, 2020
Messages
1,519 (0.88/day)
NVIDIA's game streaming service GeForce NOW has gained support for NVIDIA Deep Learning Super Sampling (DLSS) 2.0 in the latest update. With DLSS 2.0, games are rendered at a lower resolution, and the tensor cores found in RTX-series graphics cards then run a neural network that reconstructs sharp, higher-resolution images. The introduction of DLSS 2.0 to GeForce NOW should allow graphics quality to be improved on existing server hardware and deliver a smoother, stutter-free gaming experience. NVIDIA announced that Control would be the first game on the platform to support DLSS 2.0, with additional games such as MechWarrior 5: Mercenaries and Deliver Us The Moon to support the feature in the future.
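For a rough feel of the mechanism the article describes, here is a minimal sketch: the game shades a lower-resolution frame, and a separate reconstruction step produces the full-resolution output. All names here are made up for illustration, and plain nearest-neighbour upscaling stands in for the actual neural network DLSS runs on the tensor cores (which also uses motion vectors and previous frames).

```python
import numpy as np

def fake_render(height, width):
    """Pretend to shade a frame; real rendering cost scales with height * width."""
    rng = np.random.default_rng(0)
    return rng.random((height, width, 3), dtype=np.float32)

def reconstruct(frame, factor=2):
    """Stand-in for the learned upscaler: duplicate each pixel factor x factor times."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

internal = fake_render(1080, 1920)         # the game shades only a 1080p frame
output = reconstruct(internal, factor=2)   # presented at 3840x2160
print(internal.shape, "->", output.shape)  # (1080, 1920, 3) -> (2160, 3840, 3)
```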



View at TechPowerUp Main Site
 
Joined
Feb 11, 2009
Messages
5,572 (0.96/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Aorus Elite DDR4
Cooling Tuniq Tower 120 -> Custom watercooling loop
Memory Corsair (4x2) 8GB 1600MHz -> Crucial (8x2) 16GB 3600MHz
Video Card(s) AMD RX480 -> RX7800XT
Storage Samsung 750 Evo 250GB SSD + WD 1TB x 2 + WD 2TB -> 2TB NVMe SSD
Display(s) Philips 32-inch LPF5605H (television) -> Dell S3220DGF
Case Antec 600 -> Thermaltake Tenor HTPC case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
Can someone explain DLSS to me....

This is how I understand it:
There is a "server" with a reference 16k image.
It lays that image against what you are seeing and then tries to make your (low res) image look as much as possible like the 16k image.
So does that mean you have to be online to even get DLSS support and/or to help it make the DLSS support for the game you are playing better?

Also why does DLSS On improve performance?
Does it mean that when you select 4k res in a game with DLSS On you are actually running it just on for example 1080p but DLSS "upscales" it to 4k with good quality?

And also, if this is how it works, why do games even need to support it? why cant Nvidia just make a 16k reference image themselves and have that communicate with the users to train the "server" in using it.
Why would this not be something you can just turn on in the Nvidia Control Panel for every game?
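On the performance question above, the arithmetic alone suggests the answer (these numbers are illustrative, not from the thread): if a 4K output is rendered internally at 1080p, the GPU shades only a quarter of the pixels per frame.

```python
# Shading cost scales roughly with pixel count, so rendering internally at
# 1080p for a 4K output cuts per-frame shading work to about a quarter.
native_4k = 3840 * 2160  # 8,294,400 pixels per frame at native 4K
internal = 1920 * 1080   # 2,073,600 pixels per frame at 1080p
print(f"{native_4k:,} vs {internal:,} pixels -> {native_4k / internal:.1f}x fewer pixels shaded")
```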
 
Joined
Jun 14, 2010
Messages
634 (0.12/day)
Location
City 217
Processor AMD Phenom II X4 925
Motherboard Asus M4A78LT-M
Cooling Ice Hammer IH-4***
Memory 2x4GB DDR3 Corsair
Video Card(s) Asus HD7870 2GB
Storage 500GB SATAII Samsung | 500GB SATAII Seagate
Display(s) 23" LG 23EA63V-P
Case Thermaltake V3 Black Edition
Audio Device(s) VIA VT1708S
Power Supply Corsair TX650W
Software Windows 10 x64
Can someone explain DLSS to me....

This is how I understand it:
There is a "server" with a reference 16k image.
It lays that image against what you are seeing and then tries to make your (low res) image look as much as possible like the 16k image.
So does that mean you have to be online to even get DLSS support and/or to help it make the DLSS support for the game you are playing better?

Also why does DLSS On improve performance?
Does it mean that when you select 4k res in a game with DLSS On you are actually running it just on for example 1080p but DLSS "upscales" it to 4k with good quality?

And also, if this is how it works, why do games even need to support it? why cant Nvidia just make a 16k reference image themselves and have that communicate with the users to train the "server" in using it.
Why would this not be something you can just turn on in the Nvidia Control Panel for every game?
It's done "manually" by Nvidia using their super duper rig for each individual game to ensure everything looks good. The results are then incorporated into the drivers and replicated locally using your Nvidia GPU.
 
It's done "manually" by Nvidia using their super duper rig for each individual game to ensure everything looks good. The results are then incorporated into the drivers and replicated locally using your Nvidia GPU.

Right, so what I was thinking was more or less correct.
And it's up to Nvidia to choose which games the system learns before rolling it out to consumers, so some games may never be chosen by them. That is kinda sad to think about.
 
so some games may never be chosen by them. That is kinda sad to think about
Look on the bright side: that's only an issue at the moment. As the tech and software mature, it might become universal, like image sharpening or integer scaling.
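For contrast, integer scaling is simple enough to be game-agnostic, which is why it already lives as a global driver toggle: every source pixel just becomes an exact NxN block, with no per-game training involved. A tiny sketch:

```python
import numpy as np

# Integer scaling: duplicate each pixel into an exact NxN block (np.kron
# performs the duplication), so no game-specific model is needed.
frame = np.arange(6, dtype=np.uint8).reshape(2, 3)        # toy 2x3 "frame"
scaled = np.kron(frame, np.ones((3, 3), dtype=np.uint8))  # 3x integer scale
print(frame.shape, "->", scaled.shape)                    # (2, 3) -> (6, 9)
```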
 
Look on the bright side: that's only an issue at the moment. As the tech and software mature, it might become universal, like image sharpening or integer scaling.

I was thinking of some Folding@home kind of deal where all Nvidia users could help, but someone (Nvidia probably) would still have to get those super-high-quality images first for it to work.
 
Joined
Sep 15, 2011
Messages
6,761 (1.39/day)
Processor Intel® Core™ i7-13700K
Motherboard Gigabyte Z790 Aorus Elite AX
Cooling Noctua NH-D15
Memory 32GB(2x16) DDR5@6600MHz G-Skill Trident Z5
Video Card(s) ZOTAC GAMING GeForce RTX 3080 AMP Holo
Storage 2TB SK Platinum P41 SSD + 4TB SanDisk Ultra SSD + 500GB Samsung 840 EVO SSD
Display(s) Acer Predator X34 3440x1440@100Hz G-Sync
Case NZXT PHANTOM410-BK
Audio Device(s) Creative X-Fi Titanium PCIe
Power Supply Corsair 850W
Mouse Logitech Hero G502 SE
Software Windows 11 Pro - 64bit
Benchmark Scores 30FPS in NFS:Rivals
Where can we find some real comparison shots (NOT the ones marketed by nGreedia) with the new DLSS 2.0 for PC?
 
Joined
Sep 17, 2014
Messages
22,673 (6.05/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Right, so what I was thinking was more or less correct.
And it's up to Nvidia to choose which games the system learns before rolling it out to consumers, so some games may never be chosen by them. That is kinda sad to think about.

Yes, so far everything AI and deep learning translates into a titanic amount of effort for minimal gains, and it's certainly not 'self-learning' in the pure sense of the word. Unless endless crunching on a per-case basis is somehow intelligent. :p
 
Joined
Jan 8, 2017
Messages
9,504 (3.27/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 MHz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
And also, if this is how it works, why do games even need to support it? Why can't Nvidia just make a 16K reference image themselves and have that communicate with the users to train the "server" on using it?
Why would this not be something you can just turn on in the Nvidia Control Panel for every game?

Here is the less technical explanation:

The game is run on a render farm at very high resolutions, and stills from it are fed into a program that tries to generate a model that generalizes how the images are supposed to look. That model is then shipped to your machine through a driver update, where it is fed frames from the game running locally at a lower resolution, and hopefully it can scale them up to look like the original high-resolution images generated back on the render farm.

Emphasis on the word "hopefully".

You don't have to be connected to a server; you just need to download the model once (ideally). The model differs from game to game for accuracy purposes. You could make one global model, but the results would be worse.
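To make the render farm -> driver update -> local GPU flow concrete, here is a minimal PyTorch sketch. Everything in it is invented for illustration (the tiny network, the synthetic "stills", the file name); the real DLSS 2.0 model, training data, and inputs (motion vectors, temporal history) are far more involved.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    """Toy 2x super-resolution CNN, a stand-in for the per-game model."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * 4, 3, padding=1),  # 4 = 2x2 upscale factor
            nn.PixelShuffle(2),                  # rearrange channels into 2x resolution
        )

    def forward(self, x):
        return self.body(x)

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# "Render farm": pristine high-res stills plus their downscaled counterparts.
hi = torch.rand(8, 3, 64, 64)
lo = F.interpolate(hi, scale_factor=0.5, mode="bilinear", align_corners=False)

# Offline training: teach the model to turn low-res frames into high-res ones.
for step in range(200):
    opt.zero_grad()
    loss = F.l1_loss(model(lo), hi)
    loss.backward()
    opt.step()

torch.save(model.state_dict(), "game_model.pt")  # "shipped" with the driver

# "Your PC": load the model once, then upscale locally rendered frames
# without any server connection.
model.load_state_dict(torch.load("game_model.pt"))
with torch.no_grad():
    frame = torch.rand(1, 3, 540, 960)  # toy low-res frame
    upscaled = model(frame)             # shape: (1, 3, 1080, 1920)
```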
 
:roll:

Someone wake me up if they get to that point before abandoning DLSS.

Do keep in mind, you're talking about the company that pretty much controls discrete GPU progress in this market.

I agree, the per-title optimization is utterly disgusting and useless. But this is Nvidia. Look at their drivers: they bring day-one game-ready stuff every time, for reasons that vary, but it's still there, and they don't really miss a lot of titles at all. And in terms of pushing the performance envelope... I do think this is the direction in the near future anyway if we want more, faster, bigger. Those nanometers won't get much smaller, bigger chips are not for everyone's wallet, and they are completely counterproductive when growth is the norm. If you need more volume, you need smaller dies.
 