Wednesday, July 1st 2020
Death Stranding with DLSS 2.0 Enables 4K-60 FPS on Any RTX 20-series GPU: Report
Ahead of its PC release on July 14, testing of a pre-release build by Tom's Hardware reveals that "Death Stranding" will offer 4K at 60 frames per second on any NVIDIA RTX 20-series graphics card if DLSS 2.0 is enabled. NVIDIA's performance-enhancing feature renders the game at a resolution lower than that of the display, and uses AI to reconstruct details. We've detailed DLSS 2.0 in an older article. The PC version has a frame-rate limit of 240 FPS, ultra-wide resolution support, and a photo mode (unsure if it's an Ansel implementation). It has rather relaxed recommended system requirements for 1080p 60 FPS gaming (sans DLSS).
Source: Tom's Hardware
If it goes like DLSS 1.0, RTX, PhysX, HairWorks, or GameWorks, it's not going to shake the world.
Big meh.
Also, old games scale better with resolution because they were made to actually render at the target resolution; most modern games basically frame-scale a 1080p world, with a few, like GTA V, giving you the option to use 100% frame scaling.
And yes, I played Control with DLSS 2.0 and all the RTX effects, and it looks a little better than native res.
DLSS 2.0 has already shed the per-title training that DLSS 1.0 carried. I can only guess where DLSS 3 or 4 will go.
Picture quality, though, is subjective, because there's no reference. It's just comparing one approximation of the desired result (no DLSS) with another approximation (DLSS).
At the end of the day, it's just (clever) tricks that allow us to game at higher settings than the hardware could otherwise push. Does anyone remember how manufacturers were stuck for a while trying to do better SSAA until MSAA came along? Guess what, MSAA is still not on par with SSAA, but in the meantime we've come to call MSAA too taxing and become willing to accept even more IQ compromises (e.g. TAA). Why? Because of performance.
So... seems to me DLSS 2.0 remains overhyped. I'm sure Nvidia's marketing money will bury this, though. This isn't strictly true. Depending on the model and the training set, it's very easy to get trapped in local optima. The big issue with this is that it's very hard to know whether you have. Moreover, when getting close to a local optimum, more training does very little to improve inference quality.
Don't get me wrong, DLSS is good tech, but it's still an approximation. There's just nothing inherent to it that makes it better at solving this particular problem than other approximations.
I imagine it's pretty easy to avoid getting stuck in a local optimum if you train multiple times, applying the inputs in a different order each time. Or if you develop an automated tool to analyze artifacts for you.
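That multiple-runs idea can be sketched with a toy example. Below is a hypothetical Python illustration (the loss function and every name here are made up for the sketch, nothing from DLSS): gradient descent converges to whichever basin it starts in, and random restarts simply keep the best run found.

```python
import random

# Toy 1-D "loss" with two basins: a shallow local minimum near w ~ 1.9
# and a deeper global minimum near w ~ -2.1. A real network's loss surface
# is vastly higher-dimensional, but the trap is the same.
def loss(w):
    return w ** 4 - 8 * w ** 2 + 3 * w

def grad(w):
    return 4 * w ** 3 - 16 * w + 3

def gradient_descent(w0, lr=0.01, steps=500):
    """Plain gradient descent; it settles into whichever basin w0 is in."""
    w = w0
    for _ in range(steps):
        w -= lr * grad(w)
    return w

# Random restarts: "train" several times from different starting points
# (the analogue of reshuffled inputs) and keep the best result.
random.seed(0)
best_w = None
best_loss = float("inf")
for _ in range(10):
    w = gradient_descent(random.uniform(-3.0, 3.0))
    if loss(w) < best_loss:
        best_w, best_loss = w, loss(w)
```

A single run starting near w = 1.9 would happily report the shallow minimum; only comparing several restarts reveals that it was a local one, which is exactly the "hard to know if you're trapped" problem from the comment above.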
What makes DLSS better is that the expensive computation for the approximation is done by Nvidia on their servers, and the driver only has to apply the result when rendering. It doesn't guarantee any level of quality (though for only a second version, I think it does pretty well), but it does guarantee more performance.
If you worry about loss of quality, you should worry more about variable rate shading. And since that's part of DirectX, it's going to be everywhere.
With respect to developing a tool to analyze artifacts, that's actually pretty hard too, because artifacts are subjective. You're going to get artifacts regardless; which ones are actually acceptable?
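To make that concrete, here's a small sketch in plain Python (toy grayscale rows, not any real tool): PSNR, the usual objective quality metric, can score two reconstructions identically even though one has mild spread-out noise and the other has two conspicuous bright speckles — which is why an automated artifact analyzer can't just rank by a number.

```python
import math

def psnr(reference, reconstruction, peak=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to reference."""
    mse = sum((r - x) ** 2 for r, x in zip(reference, reconstruction)) / len(reference)
    if mse == 0:
        return float("inf")
    return 10 * math.log10(peak ** 2 / mse)

reference = [100, 100, 100, 100, 100, 100, 100, 100]

# Same mean squared error (16), very different perceptually:
uniform_noise = [104, 96, 104, 96, 104, 96, 104, 96]   # gentle dither everywhere
two_speckles  = [108, 100, 100, 100, 108, 100, 100, 100]  # two glaring spots
```

Both reconstructions get the same PSNR, yet most viewers would call the speckled one the worse artifact; deciding which errors are acceptable stays a judgment call.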
If you had an accurate model that could easily let you identify the global optimum, you'd be able to build a mathematical model to achieve it and, thus, express it as a regular algorithm; you wouldn't need neural networks in the first place. You've basically described all upscalers. This doesn't even make sense: if I'm going to use an upscaler, why wouldn't I complain about quality if there are better upscalers out there?
Edit: Oh, I see what's going on here. The game is actually pretty good, but we pretend it's bad because Nvidia has something to do with it. My bad.
Sorry for the confusion; I mentioned the 1070 I used for 4K gaming and how underpowered it was.
I've looked at screenshots of DLSS 2.0 and it seems more like a sharpening filter. Those do exist, and damn, they are sharp; especially for text they're a pleasure to use.
DLSS 2.0 is the absolute best image reconstruction technique in gaming, yet people are moaning "but it's not real 4K". Just educate yourself and appreciate the technology; you don't have to complain about every damn thing Nvidia-related.
What a joke.
1. This is a proprietary technology, limited to only certain hardware.
2. Because of this, you pay a premium to Nvidia, whereas the competitor provides a dumbed-down version of an upscaler that works across hardware.
Anyway, regardless of the image quality, it is a fact that it is not true 4K. That is the whole point of DLSS.
That said, DLSS at very high resolutions is a lot better than TAA most of the time. Definitely a middle ground preferable to newer AA in general.
Resident Evil (1, reboot) took the cake. It has an internal render-resolution slider... even at max it's STILL noticeably below your native display res, like rendering 720p content on a 1080p screen. And if that isn't enough... some games place a constant chromatic aberration horror on top of that. Want some LSD with your game? FidelityFX, you say? God no. It's far worse; you're better off using some SweetFX filters from 2010. I'm not even joking.
I'll happily pay a premium for features that actually improve the experience, and in the case of DLSS there is virtually free performance on the table, so is it really a premium? Some beg to differ.
Not that I've jumped on Turing at this point, and never would have for DLSS alone either... but the tech is pretty impressive by now. IF you can use it. That has been the deal breaker up until now, but apparently they're working to make it game-agnostic. And note... the competitor only provides a dumbed-down version because it's cheap, it's easy, and it somehow signifies 'they can do it too' when in reality they really can't. Let's call it what it is, okay?
Guess I'll have a crack at being Postman Pat too.
Beyond that, you have two choices when it comes to graphics cards, pick your poison and quit whining year in year out.
Scroll down and compare native resolution + TAA vs DLSS 2.0 with a slider, then look at the performance gains: near 40%.
This is crazy. The image is clearer and more detailed, and runs way faster.
Dunno what AMD FidelityFX is, but it looks like a mess.
A sharpening filter, mostly. It says it does up/downscaling as well, but it's unclear how.
[Side-by-side screenshot comparison: FidelityFX vs DLSS 2.0]
Lol, and this is the AMD version of Nvidia's DLSS that's so much better according to the red-team fanbase.
It's just that AMD doesn't talk about upscaling in FidelityFX. I'm pretty sure this is mostly a sharpening filter plus some classic interpolation (e.g. bi-/trilinear, nearest neighbor). Well, when someone claims "better than the original", it's pretty clear they don't know what they're talking about. Hint: there's no "original" in CGI ;)
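For illustration, here's roughly what "classic interpolation plus a sharpening filter" amounts to, as a 1-D Python sketch. These helpers are made up for the example (AMD hasn't published FidelityFX's internals in these terms): linear interpolation stretches the signal, then an unsharp mask boosts contrast around edges — which is also where the tell-tale halo/overshoot artifacts come from.

```python
def upscale_linear(row, factor):
    """Linearly interpolate a row of pixel values to roughly factor-x length."""
    out = []
    n = len(row)
    for i in range((n - 1) * factor + 1):
        pos = i / factor
        lo = int(pos)
        hi = min(lo + 1, n - 1)
        t = pos - lo
        out.append((1 - t) * row[lo] + t * row[hi])
    return out

def unsharp_mask(row, amount=0.5):
    """Sharpen by adding back the difference from a 3-tap box blur."""
    blurred = [
        (row[max(i - 1, 0)] + row[i] + row[min(i + 1, len(row) - 1)]) / 3
        for i in range(len(row))
    ]
    return [p + amount * (p - b) for p, b in zip(row, blurred)]

edge = [10, 10, 10, 200, 200, 200]       # a hard edge at half resolution
upscaled = upscale_linear(edge, 2)        # longer, smoother row
sharpened = unsharp_mask(upscaled)        # contrast boosted around the edge
```

Note the values just below and above the edge overshoot past the originals after sharpening — that undershoot/overshoot is the halo people notice on sharpened upscales, and it's one reason a plain filter chain looks "like a mess" next to a reconstruction approach.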
True 4k is still an approximation of a geometric model as seen through the rendering pipeline.