Me: "Pffft, a walking simulator, who needs such a thing"
Covid-19: "Hi!"
Me: "Wow, it's like actually being outside!"
Hah but you never walked outside at TWO HUNDRED FORTY EF PEE ES
The tech press should not be spreading lies. 4K is 4K, and DLSS is not. You can claim your image looks as good; that's fine, I disagree. But then say "it looks as good as 4K", using actual English that has meaning, instead of marketing speak.
Many modern games use rendering techniques that look blurry and don't scale with resolution anymore. Play Detroit, for example, and you'll notice that much of the image doesn't improve when you switch from 1440p to 4K. Play old games and it is totally different. That is why DLSS is being pushed: it is easy to fool people playing chronically blurry games like Detroit, Death Stranding, Control and FF15.
I personally prefer the visuals of last gen: games like Mass Effect 3 and old Unreal games before the AA craze, where resolution actually leads to a proper image. The fetish for realistic graphics has led to transparency techniques for hair and vegetation that are blurry and horrible, imo. I'd rather play Trials of Mana at 4K 120fps and enjoy crisp visuals than play most of the recent stuff we are getting. I even prefer the visuals of Fortnite over DLSS games.
It seems the market is diverging a bit. I love Riot Games' commitment to fast performance and clean rendering techniques.
Inclined to agree. Some games are so blurry it's painful. I even kill AA altogether when only temporal AA is available. Yes, the jaggies are gone... along with the sharpness. It gets even worse when there's a (fixed!) sharpening pass on top of that; suddenly you're looking at a cartoon. SOMETIMES, T(x)AA looks good. But whenever it does, it also hits perf pretty hard.
That said, DLSS at very high resolutions is a lot better than TAA most of the time. It's definitely a middle ground preferable to newer AA in general.
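For what it's worth, the blur-vs-jaggies trade-off described above is easy to reproduce. Here is a minimal sketch, not any engine's actual shader: the 3x3 box blur, the blend weight, and the half-texel jitter are all stand-in assumptions, used only to show temporal accumulation plus a fixed sharpening pass.

```python
import numpy as np

def taa_blend(history, current, alpha=0.1):
    # Exponential accumulation: each frame keeps 90% of the history.
    # Jitter averages out jaggies, but it averages out fine detail too.
    return (1.0 - alpha) * history + alpha * current

def sharpen(img, amount=0.5):
    # Fixed unsharp mask: img + amount * (img - blur(img)).
    # A 3x3 box blur stands in for the engine's kernel (assumption).
    pad = np.pad(img, 1, mode="edge")
    blur = sum(pad[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

# A one-pixel-wide vertical edge, jittered a texel each frame:
frame = np.zeros((8, 8))
frame[:, 4] = 1.0
history = frame.copy()
for _ in range(8):
    jittered = np.roll(frame, 1, axis=1)  # crude stand-in for sub-pixel jitter
    history = taa_blend(history, jittered)
result = sharpen(history)  # the fixed pass then overshoots around the edge
```

Run it on that one-pixel edge and you can watch the accumulation smear the edge energy across two columns while the fixed sharpen pass rings around it, which is the "cartoon" look.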
Resident Evil (1, reboot) took the cake. It has an internal render resolution slider... and even at max it's STILL noticeably below your native display res, like rendering 720p content on a 1080p display. And if that isn't enough, some games smear a constant chromatic aberration horror on top of that. Want some LSD with your game?
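To put numbers on such a slider: render scale applies per axis, so pixel count falls with the square of it. A quick sketch (the 0.67 cap is an illustrative assumption, not the game's actual value):

```python
def internal_resolution(display_w, display_h, render_scale):
    # Scale applies per axis; pixel count falls with the square of it.
    return round(display_w * render_scale), round(display_h * render_scale)

# A slider capping out around 0.67 on a 1080p display lands near 720p:
w, h = internal_resolution(1920, 1080, 0.67)   # (1286, 724)
pixel_ratio = (w * h) / (1920 * 1080)          # well under half the native pixels
```

So a cap that sounds like "two thirds of native" actually renders fewer than half the pixels, which is why it reads as 720p-on-1080p.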
I agree. Needing Tensor cores/AI to perform this upscaling sounds better in theory; in practice, however, even a dumbed-down version like FidelityFX is capable of improving performance without significant loss in image quality. If this supposed AI upscaling were significantly better in terms of image quality and ease of implementation, DLSS 1.0 would not have failed in the first place. To me, it is the usual Nvidia strategy: bring in proprietary technology to retain their foothold.
Most people wouldn't have that much to complain about if:
1. This were not a proprietary technology, limited to only certain hardware
2. You didn't pay a premium to Nvidia because of it, while the competitor provides a dumbed-down upscaler that works across hardware
Anyway, regardless of the image quality, it is a fact that it is not true 4K; rendering below native resolution is the whole point of DLSS.
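For contrast with the AI approach: a "dumb" spatial upscaler of the kind discussed here is essentially just interpolation. A minimal sketch, using bilinear resampling as a stand-in kernel (an assumption; FidelityFX's actual filter differs):

```python
import numpy as np

def bilinear_upscale(img, out_h, out_w):
    # Plain bilinear resample: no AI, no history, just interpolation.
    h, w = img.shape
    ys = np.linspace(0, h - 1, out_h)
    xs = np.linspace(0, w - 1, out_w)
    y0 = np.floor(ys).astype(int); x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

# Render at roughly 67% scale, then spatially upscale to "native" 1080p:
low = np.random.default_rng(0).random((724, 1286))
native = bilinear_upscale(low, 1080, 1920)
```

No history buffer, no motion vectors, no network: every output pixel is a weighted average of four input pixels. That is why this kind of upscaler is cheap and hardware agnostic, and also why it cannot reconstruct detail the way a temporal/AI approach claims to.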
FidelityFX, you say? God no. It's far worse; you're better off using some SweetFX filters from 2010. I'm not even joking.
I'll happily pay a premium for features that actually improve the experience, and in the case of DLSS there is virtually free performance on the table. So is it really a premium? Some beg to differ.
Not that I jumped on Turing at this point, and I never would've for DLSS alone either... but the tech is pretty impressive by now. IF you can use it, that is; that has been the deal breaker up until now, but apparently they're working to make it game agnostic. And note: the competitor only provides a dumbed-down version because it's cheap, it's easy, and it somehow signifies 'they can do it too', when in reality they really can't. Let's call it what it is, okay?