Dude. We're not even at 4K mainstream yet. Hell, even 1440p isn't. PS6 will be 4K@120 in the best case.
Features and quality will drive GPUs, not resolution. If resolution were a driver we'd be there by now. Once you get up over 120 PPI, you can't tell the difference sitting at a desk. The only reason to push for 8K would be movie-theater-sized screens to game on.
I'd argue even ~100 PPI is fine for most people. I know *some* people have 20/20 vision, but in reality I think most don't. I certainly don't.
There is a reason the 27'' 1440p display rules the roost... it's kinda perfect. Yeah, you can start arguing for 42'' 4K monitors/TVs on your desk, but to me that's a lil' extreme bc it's too big to take in all at once. JMO.
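To put rough numbers on the density talk, here's a quick sketch (my own back-of-the-envelope math, just the standard PPI formula) for a few common setups:

```python
import math

def ppi(h_px: int, v_px: int, diag_in: float) -> float:
    """Pixels per inch from resolution and diagonal screen size."""
    return math.hypot(h_px, v_px) / diag_in

# Common desk setups mentioned in the thread
setups = {
    '27" 1440p': (2560, 1440, 27),
    '27" 4K':    (3840, 2160, 27),
    '42" 4K':    (3840, 2160, 42),
}

for name, (w, h, d) in setups.items():
    print(f"{name}: {ppi(w, h, d):.0f} PPI")

# Output:
# 27" 1440p: 109 PPI
# 27" 4K: 163 PPI
# 42" 4K: 105 PPI
```

So a 27'' 1440p panel sits right around that ~110 PPI sweet spot, and a 42'' 4K panel lands at essentially the same density, just physically huge.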
This is why I'll never even understand running native 4K for gaming, given the extra power/perf needed. I run 1440p and upscale it to 4K, and I honestly don't think *most* people need more.
I sit 5-6' from a 65'' 4K OLED, depending on if I'm doing the gamer meme. I'd reckon you'd have to sit closer than 5' to notice a difference from native 1440p... and most don't do that bc then you can't see the whole screen.
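For the viewing-distance side of it, here's a rough estimate using the common 1-arcminute rule of thumb for 20/20 acuity (a simplification, and everyone's eyes differ):

```python
import math

def acuity_distance_ft(h_px: int, v_px: int, diag_in: float, arcmin: float = 1.0) -> float:
    """Distance (feet) beyond which a ~1 arcminute eye can no longer
    resolve individual pixels -- a rule of thumb, not hard science."""
    pixel_pitch_in = diag_in / math.hypot(h_px, v_px)   # inches per pixel
    return pixel_pitch_in / math.tan(math.radians(arcmin / 60)) / 12

for name, res in {"4K": (3840, 2160), "1440p": (2560, 1440)}.items():
    print(f'65" {name}: pixels blend past ~{acuity_distance_ft(*res, 65):.1f} ft')

# Output:
# 65" 4K: pixels blend past ~4.2 ft
# 65" 1440p: pixels blend past ~6.3 ft
```

By that math, at 5-6' you're already past the point where 4K adds visible detail and hovering right around the 1440p threshold, which roughly matches the "closer than 5 feet" claim.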
As an obscene perfectionist, I think my setup is perfect. Obviously no two people are the same, but I feel solid recommending something like that, as it truly is the most economical path to a great experience.
Now, getting into the era of 1440p->4K upscaling with RT... which essentially requires something like a 4090... that part sucks. I'll eat it next-gen though, bc that's whatcha' do if it's yer' thing.
8K though? Good luck with that. Next-gen's absurd high-end will be about running 4K native RT. Again, I don't think most people need it, and certainly most can't afford it, and I think that's okay.
While next-gen certainly is about features (again, I believe the 9070 XT is the bottom of the new paradigm: 1440p raster, 1080p RT, or 960p->1440p upscaled RT), I think most people overestimate the PS6.
The most we can hope for is 1080p->4K upscaling in demanding games, and doing that would require something with the grunt of a 5080, or a next-gen (9216 SP?) 192-bit/18GB chip.
My hope is the PS6 essentially uses AMD's 256-bit setup (similar to a 7900 XTX but with RT/FSR improvements), packed dense and clocked super low, making it comparable to the desktop 192-bit parts (5080).
The thing people truly do not understand, but will very soon, is that RT will become STANDARDIZED. You will NEED a card capable of this stuff if you want to run those games in any decent way.
Right now the 9070 XT will be the cheapest route to 60fps at any kind of common resolution (listed above). Yes, you can upscale from a lower rez and/or use a lesser card with lower settings, but that's not the point.
People can argue what's acceptable to a point, but too many just do not understand the shift that is happening. There is a reason next-gen's LOW-END (the market segment the 9060 serves today) will be like a 9070 XT.
I'm honestly not trying to fight with people who don't get it, only to prepare them. There is a reason the 9070 XT does what it does, a reason the 3nm stack will be what it is, and a reason the PS6 will do what it does.
It may honestly catch some people off guard, but that's why I caveat literally everything with *think about what you buy right now*, because for many people it just ain't gonna do what they want pretty soon.
Again, IN THOSE GAMES. Not all games are those games, but more and more will be (especially once the PS6 lands), and I'm also trying to be considerate of people who don't want to be limited in the games they play.
Cards like the 5090 and 4090 should NOT exist; they are very customer- and market-unfriendly, as they take away a ton of wafer production volume for a small number of chips and give developers an excuse to optimize their games much more poorly. Chips above 400-450 mm² should be left for the "professional" GPUs.
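On the wafer-volume point, here's a crude sketch with the classic dies-per-wafer approximation (300mm wafers, defect/yield losses ignored, and the die areas below are my own ballpark figures, not official numbers):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diam_mm: float = 300) -> int:
    """Classic dies-per-wafer approximation: usable wafer area divided by
    die area, minus an edge-loss term. Ignores defects and yield."""
    r = wafer_diam_mm / 2
    return math.floor(math.pi * r**2 / die_area_mm2
                      - math.pi * wafer_diam_mm / math.sqrt(2 * die_area_mm2))

# Ballpark die sizes for illustration only
for name, area in {"~600 mm^2 flagship-class die": 600,
                   "~350 mm^2 upper-mid die": 350}.items():
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per 300mm wafer")

# Output:
# ~600 mm^2 flagship-class die: ~90 candidate dies per 300mm wafer
# ~350 mm^2 upper-mid die: ~166 candidate dies per 300mm wafer
```

Roughly double the wafer area per chip before yield even enters the picture (and yield gets worse as die area grows), which is the volume trade-off being argued here.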
As I have said many times, the 4090 exists because it is literally the foundation of 1440p->4K upscaled RT, which to many people is the (reasonable) grail.
This will trickle down next-gen to $1000 cards. And the gen after that. And the gen after that.
The cards above that next-gen (36GB?) will be about native 4K RT. The 5090 is a weird freakin' thing that I agree is mostly a novelty of what's currently possible, but not a tier.
The generational leap after that? We'll probably all be gaming in the cloud.
