Yeah, that makes sense, as the game assets are being scaled up and stored in that scaled-up state, which would take up much more VRAM.
You're not following what I said. No games are being made with assets in native 8k. The game engines and drivers are scaling the existing assets up to the display resolution. I'm not saying it looks bad, but it's NOT native 8k.
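To make the distinction concrete, here's a minimal sketch of what "scaled up to the display resolution" means, using a plain bilinear resize as a stand-in for the upscaler (DLSS/FSR are far more sophisticated temporal methods, but the principle is the same). The sizes and names here are purely illustrative:

```python
# Minimal sketch: the GPU renders at a lower internal resolution, then
# a resize pass produces the display-resolution image. The output is
# display-sized, but no new detail is created by the resize itself.
import numpy as np

def bilinear_upscale(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    in_h, in_w = img.shape[:2]
    # Map each output pixel back to a fractional source coordinate.
    ys = np.linspace(0, in_h - 1, out_h)
    xs = np.linspace(0, in_w - 1, out_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, in_h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, in_w - 1)
    wy = (ys - y0)[:, None, None]
    wx = (xs - x0)[None, :, None]
    # Blend the four neighbouring source pixels per output pixel.
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bot = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Scaled-down stand-in (same 2x-per-axis ratio as 4k -> 8k) to keep
# memory sane; the real case would be (2160, 3840) -> (4320, 7680).
internal = np.random.rand(216, 384, 3).astype(np.float32)
display = bilinear_upscale(internal, 432, 768)
print(display.shape)  # (432, 768, 3): bigger image, same information
```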
Maybe. It would be interesting to see. However, I'm going to stand firm on my stance that the compute for native 8k doesn't exist yet in consumer-level parts. The only way it's going to happen anytime soon is if AMD and Nvidia bring back multi-GPU support in a way that is transparent to the software.
Imo it doesn't really make sense to speak about whether or not assets are 8k, aside from textures (which would require a LOT more VRAM! rough math below), as you would have to have your face right up against the asset to tell the difference anyway. However, the higher resolution lets you see everything in greater detail across the board. Stuff that is muddy and not very detailed 10 meters away in 4k? Looks comparatively very detailed in 8k.
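Here's the back-of-the-envelope on texture VRAM, assuming uncompressed RGBA8 (4 bytes per pixel) and a full mip chain (~4/3 overhead). Real games use block compression (BC7 and the like, roughly 4x smaller), which shrinks all the numbers but keeps the same ratio between resolutions:

```python
# Rough VRAM cost of a single square texture, uncompressed RGBA8 with
# a full mip chain. Each doubling of texture resolution costs ~4x.
def texture_mib(side_px: int, bytes_per_px: int = 4) -> float:
    base = side_px * side_px * bytes_per_px
    return base * 4 / 3 / 2**20  # mip chain adds ~1/3 on top of the base level

for side in (2048, 4096, 8192):
    print(f"{side}x{side}: {texture_mib(side):7.1f} MiB")
# 2048x2048:    21.3 MiB
# 4096x4096:    85.3 MiB
# 8192x8192:   341.3 MiB
```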
It's true that native 8k would be immensely hard to do, and probably won't ever be doable without multi-GPU, which I can't see making a comeback given how heavily games rely on temporal effects today (I personally held onto it for as long as possible with 1080 Ti SLI, and when I switched to a single 2080 Ti, it was quite the downgrade in performance in games that did support SLI).
That said, I don't see why you would necessarily want to chase native 8k. Just like 8k DLSS looks better than native 4k while also being a fair bit less demanding, 16k DLSS would do the same vs native 8k.
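For context on why 8k DLSS can undercut native 8k: the commonly documented per-axis scale factors for the DLSS presets put the internal render well below the output resolution (exact factors can vary per title, so treat these as approximate):

```python
# Approximate per-axis render scale for the DLSS presets. "8k DLSS" in
# Performance mode renders internally at 4k, which is why it can cost
# less than native 8k while still resolving more detail than native 4k.
SCALE = {"Quality": 1 / 1.5, "Balanced": 1 / 1.72,
         "Performance": 1 / 2.0, "Ultra Performance": 1 / 3.0}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(7680, 4320, "Performance"))        # (3840, 2160): 4k
print(internal_res(7680, 4320, "Ultra Performance"))  # (2560, 1440)
```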
Same Cyberpunk scene as before, but now in 4k without any upscaling: 38 fps, vs 47 fps with 8k DLSS. All the same settings, aside from resolution and DLSS.