I can only laugh when I see people cherry-picking numbers to force cards like the 3070 into having VRAM issues by running the most demanding games today at 4K/UHD, sometimes with RT on top, just to run the VRAM dry. Not a real-world scenario at all. The GPU itself would not even be able to run these settings if it had 16 GB of VRAM. GPU power is the problem, not VRAM.
In most cases the GPU is the limiting factor, not the VRAM. Besides, 95% of PC gamers are using 1440p or less (Steam HW Survey), so native 4K/UHD fps numbers mean little for the majority of PC gamers, and 96% of people on the Steam HW Survey have 12 GB of VRAM or less. Developers make games to earn money, and you don't sell many games if you make them for a 4-5% market share.
Besides, most games today look almost identical on the high and ultra presets, while high uses a lot less VRAM. Often motion blur and depth of field are pushed up on ultra presets and use more memory while looking worse to the end user. Even the medium preset looks great in most new games, and sometimes even more "clean" than high and especially ultra, which is filled with blur, DoF and all kinds of features that tank performance but don't necessarily make the visuals "better".
This is something I can agree with because it's also what I'm personally experiencing with a 3060 Ti, aka a lowly shitty 8 GB card according to some people around here.
'and yes, I did pick this card over a 6700 XT..'
People keep bringing up the 4K argument even though it's still not a common resolution to game at. Maybe on forums/tech sites like TPU it is, but in general, nope. It's also a moot point for me, playing on a 2560x1080 monitor, which has fewer pixels than a 1440p monitor.
In newer games I run out of GPU power way before running out of VRAM, especially with Unreal Engine 5 games, and those are gonna be more and more common, just like the UE 3-4 games before them.
'sure, the devs could use very high-res textures in the future, but the engine itself is very GPU heavy if Nanite and Lumen are used, and that's gonna be the limiting factor first'
Immortals of Aveum was seriously choking my GPU on high settings even with DLSS, but it had no VRAM-related issues. Same with A Plague Tale: Requiem
'in-house engine, but still'
which is an amazing-looking game with great textures, and it had zero VRAM issues with RT off
'it did not even have RT when I was playing it at release'
but the game was heavily limited by my GPU.
Older or lighter games are also a moot point because those are easy to run anyway, even maxed out.
Sure, there's always that edge case with badly optimized games at launch, but even then it's not a big deal to turn settings down a notch or wait for a fix, like with The Last of Us, which runs perfectly fine by now even on 8 GB cards at 1440p ~high settings, so I'm all good whenever I decide to play/buy it.
While I do agree this VRAM issue is a real thing,
in my opinion it's way overblown, with people always finding a reason for it or pushing their cards where they shouldn't even be performing in the first place.
Personally, I would have no problem buying a 12 GB card like a 4070/Super if I could afford it, because I'm sure it would easily last me my usual ~3 years, if not more.
'this console generation at least, plus I'm not planning on upgrading my resolution/monitor either'