Wednesday, May 19th 2021

NVIDIA Adds DLSS Support To 9 New Games Including VR Titles

NVIDIA DLSS adoption continues at a rapid pace, with a further 9 titles adding the game-changing, performance-accelerating, AI and Tensor Core-powered GeForce RTX technology. This follows the addition of DLSS to 5 games last month, and the launch of Metro Exodus PC Enhanced Edition a fortnight ago. This month, DLSS comes to No Man's Sky, AMID EVIL, Aron's Adventure, Everspace 2, Redout: Space Assault, Scavengers, and Wrench. And for the first time, DLSS comes to Virtual Reality headsets in No Man's Sky, Into The Radius, and Wrench.

By enabling NVIDIA DLSS in each, frame rates are greatly accelerated, giving you smoother gameplay and the headroom to enable higher-quality effects and rendering resolutions, and raytracing in AMID EVIL, Redout: Space Assault, and Wrench. For gamers, only GeForce RTX GPUs feature the Tensor Cores that power DLSS, and with DLSS now available in 50 titles and counting, GeForce RTX offers the fastest frame rates in leading triple-A games and indie darlings.
Complete Games List
  • AMID EVIL
  • Aron's Adventure
  • Everspace 2
  • Metro Exodus PC Enhanced Edition
  • No Man's Sky
  • Redout: Space Assault
  • Scavengers
  • Wrench
  • Into The Radius VR
Source: NVIDIA

49 Comments on NVIDIA Adds DLSS Support To 9 New Games Including VR Titles

#2
ozzyozzy
And still no DLSS for Microsoft Flight Simulator 2020
#3
ZoneDymo
again I hate this marketing, you can just say "if you run the game at 1080p instead of 4k, your performance will be better", shocking news.....

"By enabling NVIDIA DLSS in each, frame rates are greatly accelerated, giving you smoother gameplay and the headroom to enable higher-quality effects and rendering resolutions, and raytracing"

What they mean is: "By lowering the resolution in each, frame rates are greatly accelerated, giving you smoother gameplay and the headroom to enable higher-quality effects and rendering resolutions, and raytracing, and you won't suffer a loss in quality from lowering the resolution because DLSS will make it look as good as if it was actually running at a higher resolution."
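To put rough numbers on that, here's a minimal sketch of the resolution math; the per-mode scale factors below are the commonly cited DLSS 2.0 values, my assumption rather than anything NVIDIA states in this article:

```python
# Back-of-the-envelope: internal render resolution vs. DLSS output
# resolution, per quality mode. Scale factors are the commonly cited
# DLSS 2.0 values (an assumption, not taken from the article).

DLSS_SCALE = {
    "Quality": 2 / 3,            # 4K output -> 2560x1440 internal
    "Balanced": 0.58,            # 4K output -> ~2227x1253 internal
    "Performance": 0.5,          # 4K output -> 1920x1080 internal
    "Ultra Performance": 1 / 3,  # 4K output -> 1280x720 internal
}

def internal_resolution(out_w, out_h, mode):
    """Resolution the GPU actually rasterizes before the AI upscale."""
    scale = DLSS_SCALE[mode]
    return round(out_w * scale), round(out_h * scale)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    fraction = (w * h) / (3840 * 2160)
    print(f"{mode}: renders {w}x{h} ({fraction:.0%} of native pixels), "
          f"AI-upscaled to 3840x2160")
```

So "4K DLSS Performance" is literally a 1080p render (25% of the native pixel count) reconstructed to 2160p, which is exactly where the big frame-rate headroom comes from.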
#4
Zubasa
ozzyozzy: And still no DLSS for Microsoft Flight Simulator 2020
That game still runs on DX11 and is heavily CPU-bottlenecked anyway.
#5
Legacy-ZA
I am extremely impressed with DLSS 2.0; these types of technologies, further refined, are going to be a great boon to us, especially for high refresh rate displays.
#6
Sithaer
Legacy-ZA: I am extremely impressed with DLSS 2.0; these types of technologies, further refined, are going to be a great boon to us, especially for high refresh rate displays.
Tbh the reason I'm interested in DLSS 2.0 is that I rarely upgrade my GPU, and I don't buy high-end stuff either, more like budget to mid-range at most.
So for example, take the lowest-tier card that supports DLSS: in ~3 years' time (that's how long I usually keep my cards), it could mean the difference between playable and unplayable in new games without having to drop the in-game settings to potato level.

From what I saw in comparison vids I kinda like DLSS 2.0; in games like Cyberpunk it even looks better than native to me (more detail displayed on vegetation).
It also seems to work pretty well in Metro Exodus Enhanced.

When or if the time finally comes for me to upgrade my GPU, this might be the deciding factor, unless AMD's version is good enough and has similar support in games.
Otherwise I might go back to Nvidia just for DLSS.
#7
Mussels
Freshwater Moderator
Hell, I got a 3090 cause I figured DLSS would make it last for years.

Seeing it in VR titles is awesome.
#8
dj-electric
ZoneDymo: again I hate this marketing, you can just say "if you run the game at 1080p instead of 4k, your performance will be better", shocking news.....
And what about the fact that those 1080p renderings use AI to look pretty close to how native 4K does? What about that point?
Gracefully missing the entire point of DLSS, aren't we?
#9
LemmingOverlord
From all the DLSS demos I've seen, I get the feeling DLSS renders things at a lower res and then upsamples the output... Can someone explain this a little better?
#10
z1n0x
Well, DLSS did its job, that is, to sell RTX 2k/3k GPUs.
Going forward, when an open alternative arrives, DLSS will be joining PhysX in the realm of dead proprietary tech.

edit: the first game on the list though, kinda suggestive :laugh:
#11
pat-roner
Just did a playthrough of Metro Exodus, the EE version, and used DLSS 2.0 Performance to play at 4K on my LG monitor with a 3080. It looked absolutely amazing, and with absolutely maxed settings it ran at around 100-120 FPS easily. In general a very nice experience.
#12
ZoneDymo
dj-electric: And what about the fact that those 1080p renderings use AI to look pretty close to how native 4K does? What about that point?
Gracefully missing the entire point of DLSS, aren't we?
no... come on man, I explained this many times already: they don't make the marketing about the "upscaling" being done, because they feel that sounds bad or negative, so they just act like 4K and 4K DLSS are the same thing, but they're not.

The marketing could have been "we have 1080p and it looks like this, and this is what it looks like with DLSS upscaled to 4K, way nicer right? so just run the game at a lower res and use this tech", but nope, they don't do that....
#13
pat-roner
ZoneDymo: no... come on man, I explained this many times already: they don't make the marketing about the "upscaling" being done, because they feel that sounds bad or negative, so they just act like 4K and 4K DLSS are the same thing, but they're not.

The marketing could have been "we have 1080p and it looks like this, and this is what it looks like with DLSS upscaled to 4K, way nicer right? so just run the game at a lower res and use this tech", but nope, they don't do that....
Care to showcase? In all the videos I've seen of it, and in my own experience, it's hard to tell a difference. But feel free to put your money where your mouth is...
#14
ZoneDymo
pat-roner: Care to showcase? In all the videos I've seen of it, and in my own experience, it's hard to tell a difference. But feel free to put your money where your mouth is...
I'm sorry, but I have to ask you to learn to read, because I never said anything about the quality; again, that is not the point.
The point is that the marketing makes it about performance, but it's disingenuous because 4K DLSS isn't 4K.
The marketing should be about visual fidelity at the same resolution, because that is what it does, but they don't do that.
#15
ratirt
ZoneDymo: I'm sorry, but I have to ask you to learn to read, because I never said anything about the quality; again, that is not the point.
The point is that the marketing makes it about performance, but it's disingenuous because 4K DLSS isn't 4K.
The marketing should be about visual fidelity at the same resolution, because that is what it does, but they don't do that.
Some people don't understand or don't want to understand. What can you do about it?

What I'm curious about is: is the AI hardware acceleration really necessary? It has been pointed out many times; NV claims it is necessary and that it uses specific hardware, aka Tensor Cores, but from what I've read and watched online, this matter has never been cleared up. AMD, on the other hand, makes it open and says no AI hardware acceleration is needed (yet we have to see it in action). All of this seems weird.
Is NV covering up the fact that it is not needed, and just using this HW AI acceleration to boost sales of the new hardware?
I remember NV saying the G-Sync module was a must in the monitor for any NV GPU to use it properly, and yet now we have FreeSync aka G-Sync Compatible and NV cards work just fine.
Weird stuff. Really weird stuff.
#16
ZoneDymo
ratirt: Some people don't understand or don't want to understand. What can you do about it?

What I'm curious about is: is the AI hardware acceleration really necessary? It has been pointed out many times; NV claims it is necessary and that it uses specific hardware, aka Tensor Cores, but from what I've read and watched online, this matter has never been cleared up. AMD, on the other hand, makes it open and says no AI hardware acceleration is needed (yet we have to see it in action). All of this seems weird.
Is NV covering up the fact that it is not needed, and just using this HW AI acceleration to boost sales of the new hardware?
I remember NV saying the G-Sync module was a must in the monitor for any NV GPU to use it properly, and yet now we have FreeSync aka G-Sync Compatible and NV cards work just fine.
Weird stuff. Really weird stuff.
Kinda similar iirc to how RTX Voice was claimed to be Tensor Core-powered at first and then it was "just CUDA", so yeah, maybe it's a ploy to make all that seem more important or revolutionary than it really is.
#17
las
DLSS support is exploding. Good to see.

Native support in the two most used game engines already.
ZoneDymo: again I hate this marketing, you can just say "if you run the game at 1080p instead of 4k, your performance will be better", shocking news.....
Educate yourself before spreading misinformation. Try some first-hand experience before you talk BS. 4K DLSS can look better than 4K native, as in sharper text, sharper textures, etc. Oh, and around 75% higher performance.
Legacy-ZA: I am extremely impressed with DLSS 2.0; these types of technologies, further refined, are going to be a great boon to us, especially for high refresh rate displays.
The tech is great when the implementation is great. Glad to see native support in Unity and Unreal Engine.

AMD and GTX owners hate DLSS but RTX owners love it. Wonder why.
beautyless: Wow! They use Ryzen.
Yeah, and Intel.
#18
Vayra86
ZoneDymo: again I hate this marketing, you can just say "if you run the game at 1080p instead of 4k, your performance will be better", shocking news.....
Hold on now, you missed the disclaimer. Only if Nvidia says you can.
#19
ZoneDymo
las: DLSS support is exploding. Good to see. [...] Educate yourself before spreading misinformation. Try some first-hand experience before you talk BS. 4K DLSS can look better than 4K native, as in sharper text, sharper textures, etc. Oh, and around 75% higher performance. [...] AMD and GTX owners hate DLSS but RTX owners love it. Wonder why.
"educate yourself" is quite ironic coming from someone lacking reading comprehension....
#20
swirl09
I don't like that they are using Performance mode in their graphs; the drop in IQ is very noticeable at that level. I just played through Exodus EE at 4K Extreme, and I couldn't tell the difference between native 4K and 4K with DLSS on Quality while playing; the boost in FPS was substantial, however.

You can see in screenies side by side that Quality DLSS actually has slightly sharper vegetation at mid range. But if I need to sit and play spot-the-difference with stills, then as far as I'm concerned that's a win for DLSS. You can easily tell when using any lower DLSS setting; it's not bad, but I didn't buy a 4K screen to play with a vaseline filter. Still better than the original version of DLSS, which made the game look like Borderlands.

I would like to see DLSS in more titles. Not sure how many it's in now, but I know the last time they announced it was coming to several titles, it basically amounted to a grand total of 1 title per month since its announcement in 2018, which isn't great! Not to mention the X2 version or whatever it was called that was supposed to bump IQ.

Also, the trailer they released for NMS is very misleading: half the trees are missing on the DLSS side!
#21
applejack
z1n0x: Well, DLSS did its job, that is, to sell RTX 2k/3k GPUs.
Going forward, when an open alternative arrives, DLSS will be joining PhysX in the realm of dead proprietary tech.

edit: the first game on the list though, kinda suggestive :laugh:
We use DLSS because it is available today. Whenever a worthy alternative arrives, we'll use that too.
Also, nothing is "dead" about PhysX. It has become open source, it's built into popular game engines, and it is widely used on all platforms.
Just search your installed games folders for physx*.dll files; most likely you'll find some.
Modern games rarely utilize PhysX GPU acceleration, I'll give you that; however, its CPU performance (and CPU hardware) has come a long way and is not inferior to competing engines in any way.
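If you want to actually run that check, here's a minimal sketch; the Steam library path is just an example of a common install location (my assumption, adjust it to your own setup):

```python
# Scan a games folder for bundled PhysX DLLs, per the suggestion above.
# The path is an assumed default Steam library location, not universal.
from pathlib import Path

games_dir = Path(r"C:\Program Files (x86)\Steam\steamapps\common")
for dll in sorted(games_dir.rglob("physx*.dll")):
    print(dll)
```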
#22
95Viper
Stay on topic and discuss the topic....
DO NOT throw insults at each other, as it does not help the discussion.

Thank You and keep it civil.
#23
las
DLSS improves performance and visuals, example:
www.rockpapershotgun.com/outriders-dlss-performance

You should stick to Quality or Balanced, and you should never use motion blur with DLSS enabled.

"DLSS Quality makes the solar panel look ten times sharper, and also makes the grate on the floor much more distinct. Everything looks sharper, too."
ZoneDymo"educate yourself" is quite ironic coming from someone lacking reading comprehension....
Another mad AMD owner :laugh: maybe FSR will be out and working by 2025, I highly doubt an RX 480 gets support tho :laugh:
In every single thread about DLSS, the only people who talk shit are AMD and GTX owners. Nothing new here :roll:

I know it's hard to accept new technology when you can't use it :D
#24
Vayra86
las: Another mad AMD owner :laugh: maybe FSR will be out and working by 2025, I highly doubt an RX 480 gets support tho :laugh:
In every single thread about DLSS, the only people who talk shit are AMD and GTX owners. Nothing new here :roll:

I know it's hard to accept new technology when you can't use it :D
Talking shit, or just discussing the technology for what it truly is? Most of what's been said here are facts, yours included. One does not exclude the other, and like most I also have a love-hate relationship with this technology. It's great for what it does; it's pretty shit that Nvidia needs to apply its special sauce before you can use it. Because 1.0 or 2.0, that is still the case, no matter how much they speak of easy and integrated.

The same thing applies to pretty much every technology on RTX cards, mind. RT? Great tech. If it doesn't kill your FPS and Nvidia decides your game is the chosen one to get it.

Other than that, you can epeen all you want about haves and have-nots, but that is the epitome of sad and disgusting all at the same time. Not the best form. Did it occur to you that many potential buyers have been waiting it out because (much like myself, tbh) there really wasn't much to be had at all? Turing was utter shite compared to Pascal, and Ampere was available for about five minutes. And if you have a life, there's more to it than having the latest and greatest, right?
ZoneDymo: Kinda similar iirc to how RTX Voice was claimed to be Tensor Core-powered at first and then it was "just CUDA", so yeah, maybe it's a ploy to make all that seem more important or revolutionary than it really is.
Duh.

We're still talking about the same rasterized graphics, now with a few more post effects on top that require dedicated hardware to work without murdering performance altogether. Let's not fool each other.
#25
ratirt
Vayra86: It's great for what it does; it's pretty shit that Nvidia needs to apply its special sauce before you can use it. Because 1.0 or 2.0, that is still the case, no matter how much they speak of easy and integrated.

The same thing applies to pretty much every technology on RTX cards, mind. RT? Great tech. If it doesn't kill your FPS and Nvidia decides your game is the chosen one to get it.
I'm curious if DLSS ends up like the G-Sync module: no dedicated hardware needed, and NV cards can use FreeSync freely. Is FSR gonna force NV to do the same thing with DLSS, or will they just skip DLSS altogether?
That would have been something.