Wednesday, May 19th 2021

NVIDIA Adds DLSS Support To 9 New Games Including VR Titles

NVIDIA DLSS adoption continues at a rapid pace, with a further 9 titles adding the game-changing, performance-accelerating, AI and Tensor Core-powered GeForce RTX technology. This follows the addition of DLSS to 5 games last month, and the launch of Metro Exodus PC Enhanced Edition a fortnight ago. This month, DLSS comes to No Man's Sky, AMID EVIL, Aron's Adventure, Everspace 2, Redout: Space Assault, Scavengers, and Wrench. And for the first time, DLSS comes to Virtual Reality headsets in No Man's Sky, Into The Radius, and Wrench.

Enabling NVIDIA DLSS in each greatly accelerates frame rates, giving you smoother gameplay and the headroom to enable higher-quality effects, higher rendering resolutions, and ray tracing in AMID EVIL, Redout: Space Assault, and Wrench. For gamers, only GeForce RTX GPUs feature the Tensor Cores that power DLSS, and with DLSS now available in 50 titles and counting, GeForce RTX offers the fastest frame rates in leading triple-A games and indie darlings.
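For context on what DLSS actually trades off: it renders the game at a lower internal resolution and upscales to your output resolution. A minimal sketch of that relationship, assuming the commonly cited per-axis scale factors (NVIDIA may tune these per title and per DLSS version):

```python
# Approximate internal render resolutions for DLSS quality modes.
# Per-axis scale factors are the commonly cited values; actual
# factors may vary per title and per DLSS version.
DLSS_MODES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Return the (width, height) DLSS renders at before upscaling."""
    scale = DLSS_MODES[mode]
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)
```

The lower the internal resolution, the bigger the frame-rate headroom — which is where the extra performance for effects and ray tracing comes from.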
Complete Games List
  • AMID EVIL
  • Aron's Adventure
  • Everspace 2
  • Metro Exodus PC Enhanced Edition
  • No Man's Sky
  • Redout: Space Assault
  • Scavengers
  • Wrench
  • Into The Radius VR
Source: NVIDIA

49 Comments on NVIDIA Adds DLSS Support To 9 New Games Including VR Titles

#26
las
Vayra86Talking shit, or just discussing the technology for what it truly is. Most of what's been said here are facts, yours included. One does not exclude the other, and like most I also have a love-hate relationship with this technology. It's great for what it does; it's pretty shit that Nvidia needs to apply its special sauce before you can use it. Because 1.0 or 2.0, that is still the case, no matter how much they speak of easy and integrated.

The same thing applies to pretty much every technology on RTX cards, mind. RT? Great tech. If it doesn't kill your FPS and Nvidia decides your game is the chosen one to get it.

Other than that, you can epeen all you want about haves or have-nots, but that's the epitome of sad and disgusting all at the same time. Not the best form. Did it occur to you that many potential buyers have been waiting it out because (much like myself, tbh) there really wasn't much to be had at all? Turing was utter shite compared to Pascal, and Ampere was available for about five minutes. And if you have a life, there's more to it than having the latest and greatest, right?



Duh.

We're still talking about the same rasterized graphics, now with a few more post effects on top that require dedicated hardware to work without murdering performance altogether. Let's not fool each other.
Just tired of reading shit about DLSS from people who have absolutely zero experience with it.

What was so great about Pascal? Only the 1080 Ti was truly great in my eyes; today it's not even considered mid-end tho, and the 1070/1080 was/is beaten by the RTX 2060 6GB, especially in DLSS-supported titles.

My 980 Ti with custom firmware performed on 1070Ti/1080 level. 1620 MHz 3D clocks. Did not even consider Pascal when it first came out because of this fact.
My 1080 Ti performed only on 2070 level overall and lacked RTX features. 3080 was a night and day upgrade. Biggest in many years.

I never use RT, or actually I do in Metro Exodus EE because it's needed, but on the lowest setting, with all other settings maxed out (aka ultra) + DLSS Quality for 150-200 fps gameplay at 1440p. Looks sharp and crisp. If I disable DLSS, fps drops to 85-120 or so. Huge difference on a 1440p/165 Hz monitor.

Some people like RT tho, and DLSS makes it usable. RT performance is beyond crap on AMD cards, with no FSR to boost performance. So yeah, RT on AMD cards is still pretty much a NO-GO.
#27
ratirt
lasWhat was so great about Pascal? Only the 1080 Ti was truly great in my eyes; today it's not even considered mid-end tho, and the 1070/1080 was/is beaten by the RTX 2060 6GB, especially in DLSS-supported titles.

My 980 Ti with custom firmware performed on 1070Ti/1080 level. 1620 MHz 3D clocks. Did not even consider Pascal when it first came out because of this fact.
Yeah. Too bad DLSS doesn't work on those Pascals, right? That would have been something, and that's the point.
If FSR does work on those (and it seems it will), you may see them in a totally different light in comparison to 2000 and 3000 series.
#28
las
ratirtYeah. Too bad DLSS doesn't work on those Pascals, right? That would have been something, and that's the point.
If FSR does work on those (and it seems it will), you may see them in a totally new light in comparison to 2000 and 3000 series.
I highly doubt FSR will boost performance as much as DLSS, and it will take years before support is widespread anyway. It took years before Nvidia nailed it. AMD has way fewer resources than Nvidia.

DLSS 3.0 should work in all games that have TAA, which is a lot of them.

Also, with native support in Unreal Engine and Unity, most AAA games are pretty much covered going forward.

AMD is years behind on both RT and this. FSR is not going to come out and make 5-year-old GPUs relevant again.

AMD has talked about FSR for a long time now and shown nothing. Why do you think that is? Talk is cheap.
#29
ratirt
lasI highly doubt FSR will boost performance as much as DLSS, and it will take years before support is widespread anyway. It took years before Nvidia nailed it. AMD has way fewer resources than Nvidia.

DLSS 3.0 should work in all games that have TAA, which is a lot of them.

Also, with native support in Unreal Engine and Unity, most AAA games are pretty much covered going forward.

AMD is years behind on both RT and this. FSR is not going to come out and make 5-year-old GPUs relevant again.

AMD has talked about FSR for a long time now and shown nothing. Why do you think that is? Talk is cheap.
You are being very arrogant about this, you know. It took 2 years before NV improved DLSS and got more support for it. DLSS implementation is not as simple as you describe it to be, and FSR is supposedly totally different and easier to implement. I wouldn't write off FSR just yet, because it is not NV-exclusive like DLSS but rather more open. I'd wait and see what it does and how it performs. Not too long, though; everything points to June, which is just around the corner.
It doesn't even have to give as big a performance boost as you claim DLSS does. All FSR needs is easy implementation for any given game, which it will have, and it's open source, which will give all available cards a performance boost regardless. Even if FSR gave half the performance of what DLSS can, it would still be a win for a huge number of users.
#30
las
ratirtYou are being very arrogant about this, you know. It took 2 years before NV improved DLSS and got more support for it. DLSS implementation is not as simple as you describe it to be, and FSR is supposedly totally different and easier to implement. I wouldn't write off FSR just yet, because it is not NV-exclusive like DLSS but rather more open. I'd wait and see what it does and how it performs. Not too long, though; everything points to June, which is just around the corner.
It doesn't even have to give as big a performance boost as you claim DLSS does. All FSR needs is easy implementation for any given game, which it will have, and it's open source, which will give all available cards a performance boost regardless. Even if FSR gave half the performance of what DLSS can, it would still be a win for a huge number of users.
Why? Because I'm stating facts? I don't claim anything; there are literally numbers and tests about DLSS everywhere. The fps boost on average is 75% or so at Quality/Balanced modes and can surpass 100% using Performance modes, but that's pretty useless for most people, unless they are using an 8K display, maybe.

Well I'm just tired of hearing AMD talk about this feature without showing anything at all. Talk is cheap.
#31
ratirt
lasWhy? Because I'm stating facts? I don't claim anything; there are literally numbers and tests about DLSS everywhere. The fps boost on average is 75% or so at Quality/Balanced modes and can surpass 100% using Performance modes, but that's pretty useless for most people, unless they are using an 8K display, maybe.

Well I'm just tired of hearing AMD talk about this feature without showing anything at all. Talk is cheap.
I wasn't saying DLSS is a flop of a feature; those facts are widely known, and you don't need to bring them up every time by stating the obvious. I'm referring to your FSR 'facts', based on what NV has been through with DLSS, even though they are totally different techniques. You're claiming AMD will have to go through the same struggle for support, and that the first release will be crap like DLSS 1.0 was, even though FSR is not out yet. There are no facts for that. DLSS implementation was a struggle due to the way it has to be implemented.

If you are tired, then lie down and close your eyes for a minute. How can they show something that's coming out next month? People are just excited about seeing this actually happening. What's wrong with that? DLSS is a great feature, no doubt about it, and everybody knows those facts, but there are none for FSR, and your claims about it are bold.
#32
las
ratirtI wasn't saying DLSS is a flop of a feature; those facts are widely known, and you don't need to bring them up every time by stating the obvious. I'm referring to your FSR 'facts', based on what NV has been through with DLSS, even though they are totally different techniques. You're claiming AMD will have to go through the same struggle for support, and that the first release will be crap like DLSS 1.0 was, even though FSR is not out yet. There are no facts for that. DLSS implementation was a struggle due to the way it has to be implemented.

If you are tired, then lie down and close your eyes for a minute. How can they show something that's coming out next month? People are just excited about seeing this actually happening. What's wrong with that? DLSS is a great feature, no doubt about it, and everybody knows those facts, but there are none for FSR, and your claims about it are bold.
Because AMD talks about FSR yet shows nothing, as in nothing at all. They are doing damage control because of DLSS 2.x's success.

If AMD were ANYWHERE NEAR a release date, why not show a few glimpses of the tech? Not promising, if you ask me. So yeah, expect years before implementation is widespread and works. We can revive this thread when FSR vs DLSS comparisons are out in 6-12 months. Hopefully. If we ever see a game feature both DLSS and FSR, that is. If not, around a 75% gain should be the goal, which is what DLSS Quality mode delivers.

An open standard is great, but I highly doubt FSR will be able to deliver what DLSS already delivers. Again, I will be amazed if AMD does that. And then an AMD GPU is once again up for consideration next time.
#33
ratirt
lasBecause AMD talks about FSR yet shows nothing, as in nothing at all. They are doing damage control because of DLSS 2.x's success.

If AMD were ANYWHERE NEAR a release date, why not show a few glimpses of the tech? Like I said, talk is cheap
Yes, they are showing nothing, and yet you already know the 'facts' about the tech?
AMD said June, so if they are saying it's going to be released in June, why would I doubt that? If NV said a DLSS 2.1 release was coming in July, would you say 'nowhere near' as well if they didn't show anything?
To be fair, I'm excited for DLSS 2.1 to show what it can do. Honestly, it might be a killer, but I will refrain from any speculation or guesses.
If anything, AMD's talk may turn out to be cheap, not people being excited about it and sharing thoughts.
#34
las
ratirtYes, they are showing nothing, and yet you already know the 'facts' about the tech?
AMD said June, so if they are saying it's going to be released in June, why would I doubt that? If NV said a DLSS 2.1 release was coming in July, would you say 'nowhere near' as well if they didn't show anything?
To be fair, I'm excited for DLSS 2.1 to show what it can do. Honestly, it might be a killer, but I will refrain from any speculation or guesses.
If anything, AMD's talk may turn out to be cheap, not people being excited about it and sharing thoughts.
DLSS 2.1 is already out; I think it was mostly about 8K gaming (Ultra Performance mode) and VR. Not that exciting for most people.
DLSS 3.0 is what is going to be good; all games with TAA support will be able to use DLSS.

Nvidia doesn't talk about DLSS, they act. They released 2.0 out of nowhere, and the same with 2.1. They TALK when it's FINISHED. AMD talks first.

Nvidia does not like to talk about DLSS much without RT being mentioned too. DLSS was supposed to be a feature to make RT useful, but most people use DLSS to boost performance immensely with RT disabled instead.

DLSS works great as AA on top of boosting performance too. Jaggies go away. Text and textures sharpen.
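For anyone wondering why an upscaler also removes jaggies: temporal upscalers like DLSS accumulate jittered samples across frames, so aliasing averages out over time. A toy sketch of the accumulation idea (a big simplification of the real network-driven blend):

```python
def temporal_accumulate(samples, blend=0.1):
    """Blend each new jittered sample into a running history value;
    over many frames the aliasing noise averages out."""
    history = samples[0]
    for s in samples[1:]:
        history = blend * s + (1 - blend) * history
    return history

# An edge pixel whose single-sample coverage flickers 0/1 each frame
# (true coverage 0.5) settles near 0.5 instead of shimmering:
print(temporal_accumulate([0.0, 1.0] * 50))
```

The blend weight and the per-pixel history are assumptions for illustration; the real thing also uses motion vectors to reproject the history, which is why DLSS needs engine integration.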

www.rockpapershotgun.com/outriders-dlss-performance

So yeah, if AMD can bring anything that's even remotely close, I will be amazed. I guess we will find out, in some months, maybe... :laugh:

I DON'T HATE AMD. I have a full AMD RIG in my house - I just HATE when people TALK. ACT PLEASE.
#35
JohnSuperXD
DLSS deteriorates graphics in the game. It's not magic like many suggest, and turning it on and off makes a clear difference in how the game looks. Especially when you stop and move repeatedly, I've noticed it has a sort of delay effect where you can still see the last frame.
Is it great that DLSS boosts frame rate? Yes.
But is it enjoyable as a gamer? No. For me there is more negative than positive, especially when it comes to motion: when you stop and move, it leaves that trail from past frames, and you see a filter on the frame that makes it look furry and low resolution.
#36
Mussels
Freshwater Moderator
Sorry guys, I can't hear the fighting over here with all the imaginary 8K 165Hz gaming - do I need to slap someone, or can we play nice?
#37
ratirt
lasDLSS 2.1 is already out; I think it was mostly about 8K gaming (Ultra Performance mode) and VR. Not that exciting for most people.
DLSS 3.0 is what is going to be good; all games with TAA support will be able to use DLSS.
Which games support DLSS 2.1? 8K VR gaming only?
BTW, the claim that TAA support automatically means DLSS 3.0 is fake news, or at least wishful thinking. Nothing has been confirmed by NV yet.
lasNvidia doesn't talk about DLSS, they ACT. They released 2.0 out of nowhere, and the same with 2.1. They TALK when it's FINISHED. AMD talks first.

Nvidia does not like to talk about DLSS much without RT being mentioned too. DLSS was supposed to be a feature to make RT useful, but people use DLSS to boost performance immensely with RT disabled instead.

DLSS works great as AA on top of boosting performance too. Jaggies go away. Text and textures sharpen.
NV said they were going to release DLSS 2.0 and that it was in the works for a while before its release. Not shocking, though, considering how DLSS 1.0 was received.
MusselsSorry guys, I can't hear the fighting over here with all the imaginary 8K 165Hz gaming - do I need to slap someone, or can we play nice?
What are you on about? We're just talking and sharing info. Isn't that right @las?
For instance, I had no idea DLSS 2.1 is already in play here and that it's a dedicated VR feature.
#38
las
ratirtWhich games support DLSS 2.1? 8K VR gaming only?
BTW, the claim that TAA support automatically means DLSS 3.0 is fake news, or at least wishful thinking. Nothing has been confirmed by NV yet.


NV said they were going to release DLSS 2.0 and that it was in the works for a while before its release. Not shocking, though, considering how DLSS 1.0 was received.
www.tomshardware.com/news/marvels-avengers-dlss-21-support-performance-tested

A 7-month-old article; again, nothing important for most people, since version 2.0 has most covered.

I never "praised" 2.1 :D I have no use for it. I only have experience with 1.0, which sucked, and especially 2.0, which is great.

The first time I tried DLSS was in OG Metro Exodus, I think. Disabled it instantly. Blurry. DLSS 2.0 opened my eyes. It's an amazing tech when implemented right.

I can see that I sound like an Nvidia fanboy. I simply look forward to seeing FSR, and I'm tired of reading rumours about it. I want to see what they can do without dedicated hardware. I said that if they can match or even get close to what DLSS 2.0 can do, I will be amazed.

I just hope FSR support will take off faster than DLSS did.
#39
ratirt
lasA 7-month-old article; again, nothing important for most people, since version 2.0 has most covered.

I never "praised" 2.1 :D I have no use for it. I only have experience with 1.0, which sucked, and especially 2.0, which is great.

The first time I tried DLSS was in OG Metro Exodus, I think. Disabled it instantly. Blurry. DLSS 2.0 opened my eyes. It's an amazing tech when implemented right.
DLSS 1.0 was blurry, but not all games exhibited that as much. It specifically depended on the game, I think, if I reach into my memory banks.
Thanks for sharing the article, btw :) Too bad it's just one game. I'll dig in deeper for changes vs DLSS 2.0.
lasI can see that I sound like an Nvidia fanboy. I simply look forward to seeing FSR, and I'm tired of reading rumours about it. I want to see what they can do without dedicated hardware. I said that if they can match or even get close to what DLSS 2.0 can do, I will be amazed.
There's nothing wrong with being a fan of a feature, or anything for that matter. I wouldn't use 'fanboy' in your case, although some people here might take it that way.
Frankly, I'm amazed AMD is releasing it regardless of how good or bad it will be. I thought this would take ages. They have pulled out all the stops to catch up.
#40
Vayra86
lasJust tired of reading shit about DLSS from people who have absolutely zero experience with it.

What was so great about Pascal? Only the 1080 Ti was truly great in my eyes; today it's not even considered mid-end tho, and the 1070/1080 was/is beaten by the RTX 2060 6GB, especially in DLSS-supported titles.

My 980 Ti with custom firmware performed on 1070Ti/1080 level. 1620 MHz 3D clocks. Did not even consider Pascal when it first came out because of this fact.
My 1080 Ti performed only on 2070 level overall and lacked RTX features. 3080 was a night and day upgrade. Biggest in many years.

I never use RT, or actually I do in Metro Exodus EE because it's needed, but on the lowest setting, with all other settings maxed out (aka ultra) + DLSS Quality for 150-200 fps gameplay at 1440p. Looks sharp and crisp. If I disable DLSS, fps drops to 85-120 or so. Huge difference on a 1440p/165 Hz monitor.

Some people like RT tho, and DLSS makes it usable. RT performance is beyond crap on AMD cards, with no FSR to boost performance. So yeah, RT on AMD cards is still pretty much a NO-GO.
Pascal was a price/perf/watt/temp balancing act that had never been executed as well before. High-clocking, relatively small dies with fantastic performance on a solid node. Excellent boost technology that just kept on giving. No frills or BS technology to hide something sub-par, because quite frankly there wasn't anything.

Sort of the place where RDNA2 is right now, as well.
lasI DON'T HATE AMD. I have a full AMD RIG in my house - I just HATE when people TALK. ACT PLEASE.
We're quite the same then, but this is a forum, which is meant for talking :)

As much as Nvidia is ready to pre-empt and pioneer new technology, they're also the proprietary bosses that we never like to see. It's when DLSS can move to mainstream and become GPU-agnostic that we can start cheering universally, IMHO. I'm totally unimpressed when companies fine-tune a single title with manual work. Wooptiedoo, tons of hours went into making something nice, except you're done looking at it after a tiny fraction of said hours. That math doesn't check out and won't survive the market.
#41
Mussels
Freshwater Moderator
If I recall, the key to AMD's implementation is that it's not the entire scene at once - but items within the scene.
Nvidia's does the entire screen at once, whereas AMD can leave, say, Miranda's ass in Mass Effect in full 4K, while anything in motion or in the far distance could be dropped to 720p.

Both approaches have their merits, and I look forward to seeing both polished results fight it out. It's rare for a technology to give us better performance, and not less.
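A hypothetical sketch of the per-item idea described above; the function name, thresholds, and scale values are all invented for illustration and are not AMD's actual algorithm:

```python
# Hypothetical per-item resolution picker: static, nearby items stay
# at full scale; fast-moving or distant ones drop to a lower scale.
# Thresholds and scales are invented purely for illustration.
def pick_scale(distance: float, speed: float) -> float:
    if distance > 100.0 or speed > 5.0:
        return 0.5   # e.g. a 720p-class render for a 1440p target
    return 1.0       # full output resolution

scene = [
    ("foreground character", 3.0, 0.0),
    ("distant mountains", 500.0, 0.0),
    ("passing ship", 40.0, 12.0),
]
for name, dist, speed in scene:
    print(name, pick_scale(dist, speed))
```

The intuition is the same either way: spend pixels where the eye lingers, and cut them where motion or distance hides the loss.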
#42
ratirt
MusselsIf I recall, the key to AMD's implementation is that it's not the entire scene at once - but items within the scene.
Nvidia's does the entire screen at once, whereas AMD can leave, say, Miranda's ass in Mass Effect in full 4K, while anything in motion or in the far distance could be dropped to 720p.

Both approaches have their merits, and I look forward to seeing both polished results fight it out. It's rare for a technology to give us better performance, and not less.
And that is why I want FSR to be great. Miranda's ass is gold and should be left alone, although I've never played the game :P
On a side note: 720p? Does that apply for all resolutions, or are you talking specifically about a 1080p drop to 720p?
#43
las
Vayra86Pascal was a price/perf/watt/temp balancing act that had never been executed as well before. High-clocking, relatively small dies with fantastic performance on a solid node. Excellent boost technology that just kept on giving. No frills or BS technology to hide something sub-par, because quite frankly there wasn't anything.

Sort of the place where RDNA2 is right now, as well.


We're quite the same then, but this is a forum, which is meant for talking :)

As much as Nvidia is ready to pre-empt and pioneer new technology, they're also the proprietary bosses that we never like to see. It's when DLSS can move to mainstream and become GPU-agnostic that we can start cheering universally, IMHO. I'm totally unimpressed when companies fine-tune a single title with manual work. Wooptiedoo, tons of hours went into making something nice, except you're done looking at it after a tiny fraction of said hours. That math doesn't check out and won't survive the market.
Pascal was called Paxwell for a reason. It was mostly just Maxwell on a smaller process, which allowed for higher clocks, plus a few optimizations.
Custom 980 Tis beat custom 1070s out of the box. Maxwell overclocked way better than Pascal overall. Maxwell had a lot left in the tank; Pascal did not. GPU Boost worked much better on Pascal, so headroom was lower.

GM200 especially had insane OC potential and headroom. Custom versions delivered like 15-20% more performance than reference, and you could easily gain an additional 15-20% on top.

There was a night and day difference between a reference 980 Ti and a custom card with a max OC. Nvidia gimped the reference card a lot. Although it still hit around 1450 MHz post-OC, it could get noisy; custom cards were much cooler and quieter post-OC. The reference 980 Ti only ran around 1200 MHz 3D clocks at stock. That's like 300-400 MHz lower than it potentially could.

At 1400-1450 MHz you were beating the 1070 with ease. At 1550-1600 MHz you were around 1080 level, and 6GB vs 8GB never really meant anything back then; the 980 Ti had a 384-bit bus after all, while the 1070/1080 only had 256-bit. Today 6 vs 8 might matter, but none of those cards can max out games at 1440p anyway, so the VRAM requirement won't be as high. Pretty much no games break 6GB usage at 1440p maxed out today. Only modded ones.

The 1080 Ti is the only Pascal card I remember as being truly great. Yet I would barely consider it mid-end today.

I AM NOT SAYING the 1080 is a bad card, I'm just saying it's nothing spectacular. Maxwell was great too, and IMO there was not much difference between Maxwell and Pascal.

The 970 was insane value even with the "3.5GB". It still holds up very well today in 1080p gaming. It was almost too close to the 980. Nvidia separated the x70 and x80 more after the 900 series as a result (I guess - because the 970 sold like crazy compared to the 980).

The best GPUs of the last 10 years are probably the 980 Ti, 1080 Ti, 7970 and 7950. Random order. All of them aged very well and/or overclocked a lot.

Hell, the 7970 was released like 3 times: 7970 -> 7970 GHz Edition -> 280X.
I remember when Catalyst 12.11 hit; Tahiti really took off then.

7970 @ 1200/1600 was my last AMD GPU.
#44
Vayra86
lasThe best GPUs of the last 10 years are probably the 980 Ti, 1080 Ti, 7970 and 7950. Random order. All of them aged very well and/or overclocked a lot.
Exactly, and of those only the Pascal range was finally on a shrunk node after all those years, which summed up to my conclusion that it was the best combination of metrics.

Long post to say you actually mostly agree ;)
#45
95Viper
You were asked to discuss the topic at hand and not the people/members who post.
Topic seems to be each other's opinions and who is right.
Any more insults/jabs will be dealt with.
Now, stay on the topic.

Thank You.
#46
Mussels
Freshwater Moderator
ratirtAnd that is why I want FSR to be great. Miranda's ass is gold and should be left alone, although I've never played the game :p
On a side note: 720p? Does that apply for all resolutions, or are you talking specifically about a 1080p drop to 720p?
I meant that they have the freedom to decide the min and max resolution (perhaps even in game settings) for foreground and background, at a game-setting or driver-override level.
#47
ratirt
MusselsI meant that they have the freedom to decide the min and max resolution (perhaps even in game settings) for foreground and background, at a game-setting or driver-override level.
That would be nice if they do. I'm really curious how this one will work.
#48
nguyen
Seems like the DLSS adoption rate is accelerating; that's good.
4K high-refresh gaming is becoming more accessible now thanks to DLSS; even a 3060 Ti can get good framerates (without RT, of course).
#49
las
nguyenSeems like the DLSS adoption rate is accelerating; that's good.
4K high-refresh gaming is becoming more accessible now thanks to DLSS; even a 3060 Ti can get good framerates (without RT, of course).
Yes, but the implementation is still the most important part. It can be very good (it mostly is, when DLSS 2.x is used), but the COD Warzone implementation is not great and I don't use it there (aim is off and it's slightly blurry).