I thought you might be trolling, so I checked out your statement. My goodness, you're not wrong.
I don't play many games anymore, so I'm a bit 'meh' about these things. But what I'm seeing called DLSS is basically a pretty version of 1080p. It looks like what I see when my wife watches Murder, She Wrote at 720p on my 4K OLED.
That's nothing like 4K. So why not just play at 1080p instead?
Actually, you're not too far off. Ultra Performance DLSS targets 33% of the output resolution per axis, which at 4K (3840 x 2160) works out to roughly 1267 x 712, ever so slightly below 720p. Because the scaling applies to both axes, the pixel count shrinks by the square of that factor: 0.33² ≈ 0.11, so only around 11% of the pixels remain (8.3 MP / 8,294,400 px → 0.9 MP / 902,104 px). That internal image is then run through the AI filter and output as a 4K signal. This is why it looks like some sort of glorified 720p: that's essentially what it is.
The same proportion applies if you use Ultra Performance DLSS at 1080p, with, needless to say, catastrophic results. A quick cheat sheet for DLSS resolution scaling (there's a little script after the list if you want to check the numbers yourself):
Quality: 66.7% per axis, ~44% of the pixels
Balanced: 58% per axis, ~34% of the pixels
Performance: 50% per axis, 25% of the pixels
Ultra Performance: 33% per axis, ~11% of the pixels
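If you want to sanity-check these numbers, here's a quick Python sketch (my own arithmetic, not anything from NVIDIA's SDK; the per-axis factors are just the commonly cited ones). It truncates rather than rounds, which is how you land on the 1267 x 712 figure above:

```python
# Reproduces the DLSS cheat sheet above for 4K and 1080p outputs.
# These are the commonly cited per-axis scale factors, not values
# pulled from NVIDIA's SDK; treat the output as back-of-the-envelope.
DLSS_MODES = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.33,
}

def internal_resolution(out_w, out_h, scale):
    """Internal render size and the fraction of output pixels it contains."""
    w, h = int(out_w * scale), int(out_h * scale)  # truncate, matching 1267 x 712
    return w, h, (w * h) / (out_w * out_h)

for out_w, out_h, label in [(3840, 2160, "4K"), (1920, 1080, "1080p")]:
    print(f"{label} ({out_w} x {out_h}) output:")
    for mode, scale in DLSS_MODES.items():
        w, h, frac = internal_resolution(out_w, out_h, scale)
        print(f"  {mode:<17} -> {w} x {h} ({frac:.1%} of the pixels)")
```

Run it and the 1080p rows make the 'catastrophic' point for me: Ultra Performance at 1080p renders something like 633 x 356 internally and asks the upscaler to invent the rest.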
Personally, I think no one should be using DLSS below the Quality setting, and Balanced should be reserved for situations where your graphics card realistically cannot cope with the workload. The Performance modes are strictly for ultra-high-resolution gaming, IMO.