No denying that, and no argument there ... going above 100% render resolution which actually improves the IQ at the cost of performance (hands down, the best AA there is) ... Of course, with enough resources, I've got NO PROBLEM with going above 100% render resolution.
It's nice to see you say it; I'm baffled at the lengths some go to, either arguing that better than 100%/native isn't possible, or refusing to admit it.
It's about native rendering vs upscaling (or rather, reduction scaling!). This is NOT about native rendering vs going above 100% render resolution which actually improves the IQ at the cost of performance (hands down, the best AA there is).
I will NEVER go below 100% render resolution.
That's quite separate from the point I was trying to make about 100%+ / supersampling, which I'm glad can be put to rest now. However, suffice it to say there have been games where upscaling has absolutely provided better IQ (example below), but I suspect that won't occur much, or at all, moving forward, because upscaling implementations now finally include a native AA option like DLAA, FSR Native AA etc. And because those AA solutions are so much better than TAA, they are of course better than upscaling from a lower res. Same algorithm, more data.
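To put rough numbers on "same algorithm, more data", here's a quick back-of-the-envelope sketch in Python. The per-axis scale factors are the commonly cited values for the usual quality presets (treat them as approximate, they vary slightly between DLSS/FSR/XeSS versions); the point is just how much input the AA/upscaling pass actually gets at a 4K output.

```python
# Back-of-the-envelope: how many input pixels the temporal AA / upscaling
# algorithm gets to work with at a 4K output. Per-axis scale factors are
# the commonly cited preset values (approximate, for illustration only).
OUTPUT_W, OUTPUT_H = 3840, 2160

presets = {
    "Native AA (DLAA / FSR Native AA)": 1.0,
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 0.333,
}

output_px = OUTPUT_W * OUTPUT_H
for name, scale in presets.items():
    w, h = int(OUTPUT_W * scale), int(OUTPUT_H * scale)
    px = w * h
    print(f"{name:<34} {w}x{h}  {px / 1e6:.2f} MP  ({px / output_px:.0%} of output pixels)")
```

Quality mode hands the algorithm well under half the pixels that a native AA pass gets, so with the same algorithm the native pass should basically always come out ahead.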
My example where upscaling has provided better IQ goes as follows.
Game AAA releases with a rendering pipeline dependent on TAA to not break visual effects, so you cannot outright disable TAA (or another temporally underpinned AA solution) in the game options; the developer made sure of this so their game doesn't look visually broken. The standard TAA that ships with the game has a very blurry resolve and its own temporal artefacts like smearing and ghosting. The game has also implemented DLSS, and because the standard TAA is that bad, and the AA algorithm in DLSS is that good, DLSS Quality mode provides superior image quality, higher detail retention, image stability etc.
Article covering some examples here.
However, it is true that DLSS can produce a better than native image, it just depends on the game. At 4K using the Quality mode, for example, we'd say that around 40% of the time DLSS was indeed producing a better than native image. Often this comes down to the native image, on a handful of occasions native rendering wasn't great, likely due to a poor TAA implementation. But on other occasions the native image looks pretty good, it's just that DLSS looks better or has an advantage at times.
Again, this is very unlikely to happen now that we can use AA algorithms like FSR Native AA / XeSS Native AA / DLAA, which give undeniably better results than starting from a lower base resolution (same algorithm, more data). But that scenario above certainly played out in multiple games, because the stock TAA was straight garbage, so bad it fumbled even with craploads more pixels to work with.
Image quality in a given game, on a given monitor/panel, sits on a spectrum from sub-native all the way through massively supersampled. And because better than native is categorically possible, it's fair to say it isn't inherently limited to one single way of achieving it.
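And just to illustrate the cost side of that spectrum, a tiny sketch (assuming the render-resolution setting scales each axis, which is the common convention): pixel count, and very roughly shading cost, grows with the square of the render scale, which is why supersampling is both the best AA there is and the most expensive.

```python
# Relative pixel cost across the sub-native -> supersampled spectrum.
# Assumes the render-resolution setting scales each axis, so pixel count
# (and very roughly shading cost) grows with the square of the scale.
for scale_pct in (50, 67, 75, 100, 125, 150, 200):
    scale = scale_pct / 100
    print(f"{scale_pct:>3}% render scale -> {scale * scale:.2f}x native pixel count")
```

So 150% is already about 2.25x the native pixel load and 200% is 4x: fine if you have the resources, brutal if you don't.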