What I can't fathom is how you don't understand that you can't convince me a frame rate difference is very noticeable while one in image quality isn't; you can't quantify that. That's entirely within the realm of subjectivity, and no matter how you spin it, "4K vs DLSS will still be the same visual fidelity" will never be a valid point you could use. You simply can't prove it to be true.
The native resolution is not 4K, and no amount of reconstruction or machine learning will ever bring back that lost detail. You can interpolate the missing pixels, sharpen the image, etc., but nothing will bring back the original 3840x2160 image. So I can't say whether the difference would be noticeable, but what I can say with absolute certainty is that the image reconstructed with DLSS will look worse in every measurable way.
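As a toy illustration of that information-loss point (a minimal sketch with made-up 1D "pixel" data, not how DLSS itself works): once you throw away samples, two different originals can become indistinguishable, so no upscaler can guarantee it recovers the right one.

```python
# Toy sketch of irreversible downsampling (illustrative only, not DLSS).
def downscale(pixels):
    """2x downscale of a 1D 'image': keep every other sample."""
    return pixels[::2]

def upscale_linear(small, size):
    """Upscale back to `size` by linear interpolation between kept samples."""
    out = []
    for i in range(size):
        pos = i / 2                      # position in the downscaled signal
        lo = int(pos)
        hi = min(lo + 1, len(small) - 1)
        frac = pos - lo
        out.append((1 - frac) * small[lo] + frac * small[hi])
    return out

# Two different originals that happen to share their even-indexed samples...
a = [0, 200, 0, 200, 0, 200, 0, 200]
b = [0,  10, 0,  10, 0,  10, 0,  10]

# ...become identical after downscaling: the odd samples are simply gone.
assert downscale(a) == downscale(b)

# So any upscaler sees the same input for both; the 200s are unrecoverable
# and get replaced by interpolated guesses.
restored = upscale_linear(downscale(a), len(a))
print(restored)
```

DLSS makes far smarter guesses than linear interpolation, but the point stands: the discarded samples are gone, and the reconstruction is inference, not recovery.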
It's not about 4K vs DLSS; it's about 4K vs DLSS on an older game versus a newer game, and THAT makes no difference: DLSS on an older game looks the same as DLSS on a newer game, since it's the same technique applied to both.
Let's go back a tad, shall we:
M2B said that in the future he expects blind tests to be done, and that many people won't be able to see a difference between native 4K and DLSS.
You then responded that the same could be done for frame rate: the difference between 90 and 120 fps is hardly noticeable to the vast majority, but if you were to therefore claim the cards producing those numbers are of the same value, "all hell would break loose",
implying that you can't just gloss over the minor differences, because they do matter.
Then I responded that it's not a fair comparison, because that frame rate difference indeed won't matter for that hypothetical game. You can buy, for example, an RX580 or an RTX2080Ti to play CSGO, and both will easily run that game at 200 fps, so it won't matter which you get. BUT if you were to play something a tad more demanding, suddenly it does matter.
If one only played CSGO, nobody would mind someone claiming that the 200 fps of the RX580 versus the 300 fps of the RTX2080Ti doesn't matter and that they should just buy the cheaper of the two cards.
But if people do play more, suddenly that difference in performance will matter, and then you can't make the previous claim anymore, because you would be giving bad advice.
However, when it comes to the difference in visual fidelity between 4K and DLSS in CSGO or a "tad more demanding" game, the difference remains the same; it does not scale the way fps can and does, and so the "all hell breaking loose" argument falls away.
Either way, we're getting massively off track here. Or are we?
It's not entirely in the realm of subjectivity though, is it? If a pixel is black, it's black; there is nothing subjective about that.
A 4K image and a 1440p DLSS 2.0 image can be compared pixel for pixel and analysed for how "the same" they are, and that removes the subjectivity.
How much worse does B look than A? Subjective. But if a program analyses the two images and says they are 99.9% the same, then it becomes hard to argue subjectivity.
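That kind of objective check is easy to sketch. Below is a hypothetical `percent_same` helper on toy grayscale data; real comparisons would use full images and metrics like PSNR or SSIM, but the principle of counting matching pixels is the same.

```python
# Minimal sketch of an objective pixel-for-pixel comparison.
# `percent_same` is a hypothetical helper; real tools use PSNR/SSIM.
def percent_same(img_a, img_b, tolerance=0):
    """Percentage of pixels whose values differ by at most `tolerance`.

    Images are flat lists of grayscale values of equal length.
    """
    if len(img_a) != len(img_b):
        raise ValueError("images must have the same resolution")
    matches = sum(1 for a, b in zip(img_a, img_b) if abs(a - b) <= tolerance)
    return 100.0 * matches / len(img_a)

native   = [0, 0, 128, 255, 255, 64]   # toy "native 4K" pixels
upscaled = [0, 0, 130, 255, 250, 64]   # toy "reconstructed" pixels

print(percent_same(native, upscaled))               # exact matches only
print(percent_same(native, upscaled, tolerance=5))  # allow small differences
```

A score like "99.9% the same" would come out of exactly this kind of measurement, which is a number, not an opinion.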
And on the last point, remember that this is also about how things look in motion, not just a fixed image, AND that DLSS is trained against 16K reference images, which are quite a bit higher resolution with more detail than 4K.