It's all relative, isn't it?
Newer games with higher-resolution assets, making use of more features, are what's driving up VRAM requirements. Only a few short years ago, 10GB was enough even at 4K max settings. People who bought 3080s probably skipped the 40-series, and they've been suffering with 10GB for a good year or more, insofar as "suffering" here means little more than the minor inconvenience of having to compromise on some graphics settings.
I think 16GB is the new sweet spot, in that it will be enough for max or high settings over a decent GPU lifespan. 20 and 24GB sure feel like overkill when the consoles are making do with somewhere between 9GB and 12.5GB of their unified memory for graphics, depending on how much of it the game needs as system RAM (rough sketch below). Throw some headroom on top of that and a 12GB card is probably fine for lower resolutions, 16GB should handle 4K, and by the time games actually need 20 or 24GB, cards like the 7900 series and the 3090/4090 will lack the raw compute to run 4K max settings anyway.
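To make that 9GB-12.5GB figure concrete, here's a back-of-envelope sketch. The 16GB unified pool and the ~3.5GB OS reserve are my rough assumptions (the real reserve varies by console and isn't publicly documented to the decimal), and the "system RAM" figures are made-up examples:

```python
# Back-of-envelope console VRAM estimate. All figures are illustrative
# assumptions: 16GB unified memory, ~3.5GB reserved by the OS, and the
# rest split between CPU-side ("system RAM") data and graphics data.
TOTAL_UNIFIED_GB = 16.0
OS_RESERVE_GB = 3.5  # rough assumption; actual reserve varies by console
GAME_POOL_GB = TOTAL_UNIFIED_GB - OS_RESERVE_GB  # ~12.5GB for the game

# Hypothetical games needing different amounts of CPU-side memory:
for system_ram_gb in (0.0, 2.0, 3.5):
    vram_gb = GAME_POOL_GB - system_ram_gb
    print(f"{system_ram_gb:.1f}GB as system RAM -> ~{vram_gb:.1f}GB left for graphics")
```

That prints 12.5GB, 10.5GB, and 9.0GB respectively, which is where the range in the paragraph above comes from.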
We're all speculating, and this is a speculation thread anyway, but as someone with friends working at multiple game studios: there's a strong focus on developing for the majority, which means devs are targeting consoles and midrange GPUs at most. If you have more GPU power on tap you can get higher framerates and/or resolution, but don't expect much beyond that except in rare edge cases like CP2077, where Nvidia dumped millions of dollars of effort and cooperation with CDPR into what was more a marketing stunt than a practical preview of what all games will look like.