GUYS,
Here's how 8K applies to consoles:
#1 - 8K capability simply means the GPU's output meets the HDMI 2.1 spec and has enough link bandwidth to carry an 8K signal
#2 - it's up to 8K@30Hz (30FPS) without DSC, or up to 8K@120Hz (120FPS) with DSC, using 4:4:4 10-bit video (see the HDMI 2.1 article on en.wikipedia.org, and the rough bandwidth math after this list)
#3 - DSC (Display Stream Compression) is "visually lossless" compression, meaning it's technically lossy but designed so you can't see the loss, and both the GPU and the Monitor/HDTV must support DSC for it to work
#4 - 8K is almost completely pointless. By the math (see the distance calculation after this list), someone with 20/20 vision cannot see ANY difference beyond about 0.75x the screen diagonal. That's FOUR FEET (48") on a roughly 65" HDTV. Practically speaking you only notice a difference at roughly TWO FEET, or about 0.38x the diagonal, and only with very fine detail like small text. That's really a DESKTOP situation like video editing, where you'd need to be 12" or closer to a 32" desktop monitor to see the difference.
#5 - 4K requires you to be about 1.5x the diagonal or closer to see any benefit over lower resolutions like a normal 1080p monitor. That's closer than EIGHT FEET on a 65" HDTV.
#6 - to summarize, you should be closer than 1.5x the diagonal for 4K and closer than 0.75x for 8K, and practically speaking you need to be even closer than that
#7 - more specifically, the 1.5x diagonal distance is where sitting any FURTHER BACK makes everything above 1920x1080 (1080p) pointless. The exception is that 4K HDTVs may have superior video processing, so the image may still look better due to that processing (though you should never leave motion smoothing or any other video processing on for a game input).
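Here's the back-of-the-envelope bandwidth math behind #2. This is just a sketch: the ~10% blanking overhead, the 42.67 Gb/s effective link rate, and the 3.75:1 DSC ratio are round-number assumptions on my part, not the exact FRL/CVT-R2 timings.

```python
# Back-of-the-envelope HDMI 2.1 bandwidth check. The blanking overhead
# and effective link rate (48 Gb/s FRL after 16b/18b encoding) are
# approximations, not exact timing-standard numbers.

H, V = 7680, 4320          # 8K active pixels
BPP = 30                   # 4:4:4 chroma, 10 bits per channel
BLANKING = 1.10            # assume ~10% blanking overhead
LINK_GBPS = 42.67          # approximate effective HDMI 2.1 data rate

def needed_gbps(refresh_hz, dsc_ratio=1.0):
    """Approximate video data rate in Gb/s, optionally DSC-compressed."""
    return H * V * BPP * refresh_hz * BLANKING / dsc_ratio / 1e9

for hz, dsc in [(30, 1.0), (60, 1.0), (120, 1.0), (120, 3.75)]:
    rate = needed_gbps(hz, dsc)
    label = f"8K@{hz}Hz" + (" with DSC (~3.75:1, 30->8 bpp)" if dsc > 1 else "")
    print(f"{label}: ~{rate:.1f} Gb/s ->",
          "fits" if rate <= LINK_GBPS else "does NOT fit")
```

8K@30 squeaks through uncompressed at roughly 33 Gb/s, while 8K@60 and 8K@120 blow past the link and need DSC, which matches the spec table.

And here's the distance math behind #4 through #6, assuming a 16:9 panel and the standard 1 arcminute resolving limit for 20/20 vision:

```python
import math

# At what distance does one pixel subtend 1 arcminute (the 20/20 limit)?
# Beyond that distance, the next resolution step up buys you nothing.

DIAGONAL_IN = 65.0
ONE_ARCMIN = math.radians(1 / 60)                 # ~0.000291 rad
width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)   # width of a 16:9 panel

for name, h_pixels in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    pitch = width_in / h_pixels                   # pixel pitch in inches
    d = pitch / math.tan(ONE_ARCMIN)              # distance where pitch = 1 arcmin
    print(f"{name}: pixels resolvable inside ~{d:.0f} in"
          f" (~{d / DIAGONAL_IN:.2f}x diagonal)")
```

On a 65" set that works out to roughly 101" (1.56x the diagonal) for 1080p, 51" (0.78x) for 4K, and 25" (0.39x) for 8K, which is where the 1.5x and 0.75x rules of thumb above come from.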
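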
There's just no practical reason to produce more than 4K content for consoles. Sure, even at 4K you can see some ALIASING (jaggies) in video games, but it's far more efficient to apply a small amount of anti-aliasing than to push the render resolution higher (quick pixel math below), and even then you'd never have a reason to go all the way up to 8K.
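The pixel arithmetic makes the point; the AA cost figure is a rough ballpark on my part, not a measurement:

```python
# Going 4K -> 8K quadruples the pixels you have to shade, while a
# post-process AA pass (FXAA/TAA-style) costs a small fixed slice of
# the frame budget (a few percent, very roughly).
px_4k = 3840 * 2160
px_8k = 7680 * 4320
print(f"4K: {px_4k / 1e6:.1f} Mpx, 8K: {px_8k / 1e6:.1f} Mpx,"
      f" ratio {px_8k / px_4k:.0f}x")   # -> 8.3 Mpx vs 33.2 Mpx, 4x
```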
Update:
Since DYNAMIC RESOLUTION on consoles is getting easy to implement, I could see them scripting games to render at a higher internal resolution and then downscale before sending the video to the screen. In theory maybe even at 8K, if it's just an automated script and it's an OLDER GAME that needs little processing. Even then, it would probably upscale dynamically to whatever it can afford, then downscale back to 4K before sending out the video signal (a toy sketch of the idea is below). Even on an 8K HDTV you'd probably still keep the link at 4K@120Hz, especially if either the console or the TV lacks DSC.
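A toy sketch of that dynamic-resolution idea. The frame-time thresholds, scale limits, and fake frame times are all made up for illustration; no console SDK actually exposes it like this:

```python
# Toy dynamic-resolution loop (illustrative only; real engines use
# their own heuristics). The internal render scale floats with
# frame-time headroom, but the output is always rescaled to a fixed
# 4K signal, so the HDMI link never has to renegotiate to 8K.

TARGET_MS = 1000 / 120        # 120 FPS frame budget
OUTPUT_RES = (3840, 2160)     # fixed 4K output to the display
MAX_SCALE = 2.0               # up to 8K internal (2x per axis) for light scenes

def next_render_scale(scale, last_frame_ms):
    """Nudge the internal resolution scale toward the frame-time budget."""
    if last_frame_ms > TARGET_MS * 0.95:    # near/over budget: drop resolution
        scale *= 0.9
    elif last_frame_ms < TARGET_MS * 0.70:  # lots of headroom: raise it
        scale *= 1.05
    return max(0.5, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [5.0, 5.5, 6.0, 9.5, 8.3, 4.9]:  # fake frame times
    scale = next_render_scale(scale, frame_ms)
    w, h = int(OUTPUT_RES[0] * scale), int(OUTPUT_RES[1] * scale)
    print(f"render {w}x{h} (scale {scale:.2f}) -> output {OUTPUT_RES}")
```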
Also, pushing higher bandwidth THROUGH the HDMI cable (8K vs 4K) requires a much higher quality cable (the Ultra High Speed certified kind). Anyway, I'm bending over backwards to think of situations where 8K would work, but again, practically speaking there's no point no matter how powerful the console gets, since it's a human VISUAL limitation on top of any processing limitations.
LOL... maybe upscale to 8K. I bet the game will render at most 4K.
You'd never want the HDTV to upscale game content after it's been rendered, since running it through the TV's video processor adds latency, which means gaming lag. I'm not sure there'd be any obvious benefit to upscaling content that's already 4K anyway (games or normal video, for that matter).