I don't get the notion that insufficient VRAM turns the card into dead weight. Surely you know you can drop textures from ultra to very high. I mean, come on, it's not the end of the world.
Except that's not the issue I'm talking about. We all know you can tune settings; anyone who keeps cards for a long time is already well versed in that. I'm talking about years 4-6 of a GPU's life, when it has an easy four more in it and can and will run games on the current API just fine, but VRAM prevents it from being a GPU you can still play on.
For Ampere, this is already happening with all the 8GB GPUs right now. Depending on what you play, they're either there already, or you'll find yourself tweaking far more than you'd like, or you'll find you still can't eliminate stutter entirely.
There's no denying this. I've experienced it across several GPUs over the past few decades: how they run out of juice, what you can do to stave that off, and what you can't do to eliminate the issue entirely. I've also seen which types of cards last FAR longer and return far greater value for money. They're never the midrangers with so-so VRAM. Those always go obsolete first, and by then their resale value is already in the shitter.
You mentioned Hogwarts. Now let's get real here. Say you bought an Ampere GPU. It was released some TWO years ago; Hogwarts is about the same age. That's not a great lifetime for a GPU to already be cutting back on settings, most notably texture quality, which costs next to no GPU performance while doing a lot for image quality. It's easy to run at maximum: all you really need is VRAM, and it always helps the presentation quite a bit, which can't be said of various post effects. It's not the end of the world, no, but can you really say you've got a well-balanced GPU at that point, if it can otherwise quite easily produce enough frames?
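To put rough numbers on the "textures mostly cost VRAM, not GPU grunt" point, here's a back-of-envelope sketch. The figures assume BC7 compression at 1 byte per texel and a full mip chain, which are typical ballpark values, not numbers from any specific game:

```python
def texture_mib(size: int, bytes_per_texel: float = 1.0) -> float:
    """Approximate VRAM footprint of one square texture in MiB.

    BC7-compressed textures use 8 bits (1 byte) per texel; a full
    mip chain adds roughly one third on top of the base level.
    """
    base_bytes = size * size * bytes_per_texel
    return base_bytes * 4 / 3 / 2**20

# One 4K texture is ~21 MiB; the 2K version is a quarter of that.
print(round(texture_mib(4096), 1))  # ~21.3
print(round(texture_mib(2048), 1))  # ~5.3

# A scene streaming ~400 distinct 4K materials already wants
# ~8.3 GiB for textures alone, before buffers and render targets.
print(round(400 * texture_mib(4096) / 1024, 1))  # ~8.3
```

That's why dropping one texture tier frees so much memory while barely moving frame rates: sampling a smaller texture costs the shader essentially nothing extra, but the capacity it needs falls by 4x per tier.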
I believe we can and should expect more of products that keep rapidly increasing in price.