So you're basically saying that some games implement TAA better than others, and poor implementations can cause blurriness or ghosting, so DLSS can actually look better than the native-resolution image if the native image uses a poor TAA implementation.
Why would game developers, especially those sponsored by Nvidia, have a poor TAA implementation? I don't know; maybe others do.
Because most games are made in UE4/5. UE4's TAA is not bad but not great, and UE5's TSR is better, but neither compares to DLSS. You're also forgetting DLAA mode, which is basically native resolution, and FSR3 literally has a Native AA mode.
Studios won't waste time developing and fine-tuning new TAA techniques when DLSS, FSR and XeSS are more or less plug-and-play. And even when they do, as in Insomniac's case, it still looks worse.
The fact of the matter is that DLSS, XeSS and even FSR are state-of-the-art temporal upscaling/temporal AA solutions. You can run them at native resolution if you like.
If you don't like temporal solutions as a whole, then I don't know what to tell you. Turn them off where you can and enjoy a native, no-AA image.
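For anyone unsure what "temporal" means here: all of these techniques (TAA, TSR, DLSS, FSR, XeSS) blend the current frame with an accumulated history of previous frames. A minimal sketch of that accumulation, with illustrative names and numbers that aren't from any real engine, shows why a naive blend produces the ghosting people complain about:

```python
# Minimal sketch of the exponential temporal accumulation at the heart of
# TAA-style techniques (illustrative only; real implementations also
# reproject the history with motion vectors and clamp/reject stale samples):
#   history = alpha * current + (1 - alpha) * history

def accumulate(history, current, alpha):
    """Blend the current frame into the history buffer per pixel."""
    return [alpha * c + (1.0 - alpha) * h for h, c in zip(history, current)]

# A 1-pixel-wide bright feature moves one pixel to the right each frame.
frames = [
    [0.0, 1.0, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
]

history = frames[0]
for frame in frames[1:]:
    history = accumulate(history, frame, alpha=0.1)

# With a small alpha the feature's old positions still hold most of the
# energy, while the current position is faint: that trail is ghosting.
print([round(v, 2) for v in history])  # → [0.0, 0.81, 0.09, 0.1]
```

The small alpha is what smooths aliasing across frames; the trail it leaves on motion is exactly what good implementations spend their effort suppressing with reprojection and history rejection, which is where poor implementations fall down.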
So a game that screws up rendering due to a poor TAA implementation means that DLSS4 improves image quality in all titles?
Nah, DLSS4 improves image quality in games where the devs massively cock up. TAA is a scourge on GPU rendering and frankly shouldn't exist. In games that don't use it, DLSS4 does not improve image quality.
It is indeed great news for Nvidia owners, especially midrangers. I look forward to the extra boost out of my 4060 mini PC.
With AMD locking FSR4 to RDNA4 GPUs, that's going to be yet another uphill battle for Team Red.
If not TAA, then what is the better solution? MSAA? FXAA?