So much salt was spilled on these forums by people aggressively defending Nvidia's controversial choice to launch a flagship with only 10GB.
I honestly wasn't expecting 10GB to be inadequate this year; my prediction was some point in the next 12 months. But I'm not really surprised, since I already ran out of VRAM and was forced to dial back settings on my RTX 2060 (Doom Eternal, Q1 2020).
Hehe, look at this topic. It's either not true or it doesn't exist... in some people's minds it's just not fathomable that games can ask for upwards of 10GB now that Nvidia has announced a 3080 with exactly that amount.
Somehow oblivious to the fact that the same company has been shipping 11GB cards in its top-end offerings for quite a while now. Somehow oblivious to the fact that PCs (and a 3080 most certainly) tend to push higher detail settings than the upcoming consoles, which already reserve more than 10GB for VRAM alone.
This is going to be fun, because I find myself on the opposite side of that fence. Objectively, a 3080 with just 10GB was a shit balance, and this underlines it. That fact won't change no matter how many apologists jump into the gap. The competitor does, after all, control the console hardware, and will obviously push devs to use the full amount, because 'why not'. Devs happily accepted RTX money bags, and they'll happily accept texture money bags too, and rightly so.
The link to Hairworks was made... and as much as I defended Nvidia back then for pressing that advantage when the competitor dropped the ball, and as much as I defend Nvidia's GameWorks too (because honestly, it's an industry effort anyone can match), I now defend AMD for pushing high VRAM requirements, because devs will obviously use it to bring higher fidelity. Whether you want or need that is a similar question to the one Hairworks raised. But I dare say textures are a bit bigger a deal than hair physics. What's more, Hairworks can simply be turned off. You don't turn off textures. So reviews will measure cards at that same texture level: not a good sign for the leather jacket.
Imagine buying a 3080 and having to consider a texture quality reduction right from the get-go. As it stands, Godfall might even release before these GPUs are properly delivered. Yep. I've bought a few boxes' worth of popcorn.
Sorry guys... but really. The cognitive dissonance is pretty strong here, and it looks neither good nor objective. Oh yeah, the eternal 'meh, allocated doesn't mean used' has already made the rounds too, even though we KNOW games stutter when VRAM is taxed. Puzzling, indeed. This is a 700+ dollar, top-end card we're talking about, not some shitty x60... wake the hell up already. Wouldn't you rather have a few years of actual VRAM headroom going forward, instead of wondering whether each release will exceed what you've got? I know I would.
Just one little thought on top here...
'If it looks too good to be true, it usually is.'
Remember our first response to the perf/dollar of the 3080?

I think it fits well. Nvidia wanted to pre-empt these PR statements and get you pre-ordering, with your mindshare invested in their Ampere card. Seems pretty clear now, especially with the constant rumors of higher-capacity cards around the corner. But do keep living in denial, buy that 3080, and let us know how it works out. It won't be my loss...
Note that my only motive is that of a customer looking for the best deal of the current gen. Nothing else. I couldn't give a rat's behind what color the card is, what jacket the CEO wears, or whether it has tits or not.