How so? VRAM debate aside, the x70 parts have traditionally been cut-down x04 parts. The bus width is the only thing you can argue.
GTX 670 = cut-down GK104
GTX 970 = cut-down GM204
GTX 1070 Ti = cut-down GP104
RTX 2070 Super = cut-down TU104
RTX 3070 Ti = fully enabled GA104
RTX 4070 Ti = fully enabled AD104
Now we are complaining that they gave us an extra card based on a cut-down AD103, but that's what it should have been to begin with?
You sort of answer your own question. The 670, 970, 1070 Ti, 2070 Super, and 3070 Ti all have 256-bit memory buses. The 4070 Ti is a 192-bit card with 12GB of VRAM and an $800 MSRP. Not only is the price high (people expect a lot at $800), it also gives you less memory bandwidth than the previous-gen 3070 Ti. It's pretty easy to see why people say the 4070 Ti Super is what the card should have been originally: it's the only xx104 die on that list with a 192-bit bus.
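For the arithmetic behind the bandwidth claim: peak memory bandwidth is bus width (in bytes) times the effective per-pin data rate. A quick sketch using the commonly published GDDR6X speeds (19 Gbps for the 3070 Ti, 21 Gbps for the 4070 Ti; treat those clocks as assumptions here):

```python
def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes transferred per cycle times data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 3070 Ti: 256-bit bus, GDDR6X at 19 Gbps
print(peak_bandwidth_gbs(256, 19))  # 608.0 GB/s
# 4070 Ti: 192-bit bus, faster GDDR6X at 21 Gbps
print(peak_bandwidth_gbs(192, 21))  # 504.0 GB/s
```

Even with the faster memory, the narrower bus leaves the newer card around 100 GB/s short of its predecessor.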
I'd also like to point out that, accounting for inflation, the MSRP of the 970 comes out to $432.75. Nvidia's margins are absolutely ridiculous. People should demand more, a lot more, for $800.
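As a sanity check on that figure: the GTX 970 launched at $329 in 2014, so the $432.75 number implies a cumulative inflation factor of roughly 1.315 (the exact factor is an assumption, it depends on which CPI window you use):

```python
# Rough check of the inflation-adjusted figure quoted above.
# Assumptions: $329 launch MSRP (GTX 970, 2014) and a cumulative
# CPI factor of ~1.3154 from 2014 to the time of writing.
msrp_2014 = 329.00
cpi_factor = 1.3154
adjusted = round(msrp_2014 * cpi_factor, 2)
print(adjusted)  # 432.77, close to the $432.75 quoted above
```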
I don't necessarily recall anyone suggesting 16GB was a requirement to lift immediate performance limitations across the board. At best, a handful of newer games are showing signs of using more than 8GB/12GB (whether at 1080p, 1440p, etc.). Furthermore, we witnessed two reviews (one from HU; I can't recall the other) where a card with less VRAM maintained excellent performance, but at the cost of compromised visual quality, via texture streaming or dynamic asset swapping. Whether it's adaptive quality, compression, or dynamic pruning, in some examples the card with less VRAM observably rendered lower levels of texture detail, and it didn't look pretty. If I'm selecting "high" quality settings, I'd like my selection to pay off and not be quietly compromised in favour of pushing the FPS "ranking" slide.
The good news: it's a minor issue at the moment, hence not of grave concern. 8GB still works great at 1080p and 12GB at 1440p, although we don't have much material evidence comparing every GPU-intensive game to see what level of visual-quality compromise is taking shape. Lucky for me, the games I play at 1440p on 10GB and 11GB cards (3080, 2080 Ti) are holding up well, although in rare instances enforced and poorly administered "smart" asset swapping is noticeable. All in all, there is strong context here compelling the need to lift hardware limitations, especially since extra VRAM isn't going to stretch production costs to any significant degree. Mid-tier graphics cards already cost a bomb; those MSRPs are big enough to strap on 16GB without feeling a pinch.
Please re-read my original comment and the content you quoted in between edits.
"I can't say I'm a fan of the argument that because said issue is rare makes it's a non-issue. What you are implying is that VRAM will only be an issue once game devs start making games that perform terribly on most people's systems, which is simply illogical and is unlikely to ever happen. It's a condition that nearly impossible to statisfy because it's not how the market works. Software follows the hardware, not the other way around. Game devs have chimmed in that having to optimize games for the 8 VRAM buffers for years as VRAM size has stagnated has crimped creativity and created technical strain on their teams. Makes sense, Nvidia first needs to increase VRAM allotments being the majority market leader in order for devs to be able to utilize said VRAM. At the end of the day the logic is circular, saying VRAM shouldn't be increased because games don't need it means games won't be made requiring more VRAM."
Regardless of how you quantify the strain on developers or end users, you cannot expect games to utilize additional VRAM until cards come out with more VRAM.
FYI, we witnessed five reviews of games running into VRAM limitations, not two. That's not a lot, but that's beside the point. Statements like "x and y VRAM cards still work fine at 1080p", or only looking at which games are impacted by lack of VRAM right now, are the textbook definition of not seeing the forest for the trees, for the reasons demonstrated above.