When did giving less require less optimisation?
A lesser-quality game, left unoptimised across several tiers of GPU and CPU hardware, requires more of said hardware to run well. Remember he said:
"if you give lazy developers/engineers more, why should they optimise? A dangerous path."
So yeah, if you give them hardware that's way overspecced relative to consoles which (in theory) already run well and look good enough, with gobs of unnecessary VRAM, CPU power, GPU compute, etc., why would they optimise further?
I mean, we're already bordering on some systems having 2x+ the physical hardware a console needs to run a game at XYZ settings, potentially feeding it unnecessarily (the big VRAM boogeyman), and that could be a dangerous path. Nek minnit we've re-entered the era of craptacular budget/low-end cards shipping with 2/3/4x the VRAM they can reasonably use relative to their compute power. None of this is to deny that some current cards could use more VRAM than they have, but advocating for all of them to have shittonnes doesn't actually solve the problem; it exacerbates it.
My solution? Let AIBs make double-memory-capacity cards at their own markup. Then people who want their GPU to last an almost unreasonable amount of time, like 6+ years or 3+ generations, can spend big on VRAM without buying the top SKU, and keep running max textures far beyond the point where their GPU can run max anything else. Would a 20GB 3080 be cool? Sure! Would I have bought one at launch for $100-200 USD more than the 10GB, given the choice? Hell no. Some might have, and I don't begrudge them that, which is exactly why the option could be useful and well received. It would also get ahead of any boogeyman arguments by giving the everyman the option to buy double the VRAM for a GPU that can only make purposeful use of it in niche situations: modding, a run of multiple fucking terrible ports, or playing the ultra long game.