I don't want to buy another 3080. I know it's a damn good GPU, but 10 GB isn't much by the look of things, even at 1080p, for the foreseeable future.
Why do you people have these irrational fears? Where do you get your figures to determine a very good GPU is bad just because of a number you know nothing about?
And why should VRAM size increase so drastically between generations anyway? Does each pixel on your screen need exponentially more data in order to be rendered?
Let's do some math for a moment:
Consider 4K (3840x2160). Now assume we're rendering a perfect scene with high detail: we run 8x MSAA (8 samples per pixel), every object on screen has 4 layers of textures, every sample interpolates on average 4 texels, and every object is unique, so every texel is unique. That gives a whopping 128 texel fetches per rendered pixel (far more than any game would ever do), yet it still totals just 3037.5 MB uncompressed*. (Keep in mind that means every piece of grass, rock, etc. is unique.) So in a realistic scenario with objects repeating, lots of off-screen nearby objects cached (mip levels and AF), etc., ~5 GB of textures, ~1.5 GB of meshes and ~1 GB of temporary buffers would still not fill 8 GB of VRAM, let alone 10 GB. Throw in 12-bit HDR and it would still not be that bad.
*) Keep in mind that with MSAA, lots of the same texels will be sampled. And normal maps are usually much lower resolution and are very highly compressible.
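If you want to check that figure yourself, here is the arithmetic spelled out as a quick Python snippet. The only assumption I'm adding is 3 bytes of color data (24-bit) per texel, which is what makes the number come out to exactly 3037.5 MB:

# Back-of-the-envelope check of the 3037.5 MB figure above.
# Assumed here, not stated above: 3 bytes of color data per texel.
pixels = 3840 * 2160              # 4K resolution
samples_per_pixel = 8             # 8x MSAA
texture_layers = 4                # texture layers per object
texels_per_sample = 4             # texels interpolated per sample
bytes_per_texel = 3               # assumption: 24-bit color data

texel_fetches = samples_per_pixel * texture_layers * texels_per_sample   # = 128
total_bytes = pixels * texel_fetches * bytes_per_texel
print(total_bytes / (1024 * 1024))                                       # = 3037.5 MB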
So the only logical conclusion is that if a game struggles with 10 GB of VRAM at 1080p, the game is either very poorly designed or the driver is buggy. And as we usually see in such comparisons, it's typically another bottleneck slowing it down.
The effect of VRAM limitations cannot be measured in average FPS alone like TPU does. No offense to W1zzard here, but the issue is more complex: it also requires frame time analysis for every game at every setting, which is hard to read and takes forever to benchmark.
If a game is actually running out of VRAM and the GPU starts swapping because of that, the FPS wouldn't just drop a few percent or show slightly higher variance in frame times; it would completely collapse.
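To show what I mean by frame time analysis (a rough illustrative sketch in Python with made-up numbers, not how TPU or anyone else actually benchmarks): compare the average FPS with the "1% low" FPS on a hypothetical frame time log. A single big swap-induced spike barely dents the average but wrecks the lows.

# Illustrative sketch: average FPS vs. "1% low" FPS from a hypothetical
# frame time log in milliseconds. Numbers are made up for demonstration.
def fps_stats(frame_times_ms):
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    slowest = sorted(frame_times_ms)[-max(1, n // 100):]   # slowest 1% of frames
    low_1pct_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps

# 999 smooth ~10 ms frames plus one 200 ms stutter (e.g. a texture swap-in):
frames = [10.0] * 999 + [200.0]
avg, low = fps_stats(frames)
print(round(avg, 1), round(low, 1))   # ~98.1 FPS average, but only ~34.5 FPS 1% low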
When Nvidia releases their usual GPU refreshes with slightly higher clocks and memory speeds, and those cards keep scaling at 4K, we can safely conclude that VRAM size isn't the bottleneck.
So whenever these besserwissers on YouTube make their click-bait videos about an outlier game that drops 20-30%, it's a bug, not a lack of VRAM.