Absolutely true, everything you say. This is what people who bought into AMD's VRAM marketing don't realize. Right around the time that marketing push started, a few AMD-sponsored games were rushed out that ate tons of VRAM, more than they should have. I bet this was done to support the claim. Funny enough, those games were patched over time and VRAM usage dropped (image quality even went up slightly, and the massive FSR shimmering and artifact issues got less severe too).
AMD has done this several times before. Look up the Shadow of Mordor texture pack, an AMD-sponsored game. The texture pack needed 6GB of VRAM but the textures looked identical, so it forced many Nvidia GPUs (and plenty of AMD's own) to run out of VRAM with no actual improvement in texture quality. ->
https://www.pcgamer.com/spot-the-di...rdor-ultra-hd-textures-barely-change-a-thing/
Allocation does not mean required amount. It's that simple. How allocation works depends on the game engine.
I can only laugh when I see people cherry-picking numbers to force cards like the 3070 into having VRAM issues by running the most demanding games of today at 4K/UHD, sometimes with RT on top, just to run the VRAM dry. That's not a real-world scenario at all. The GPU itself would not even be able to run those settings even if it had 16GB of VRAM. GPU power is the problem, not VRAM.
In most cases the GPU is the limiting factor, not the VRAM. Besides, 95% of PC gamers use 1440p or less (Steam HW Survey), so native 4K/UHD fps numbers mean little for the majority of PC gamers, and 96% of people on the Steam HW Survey have 12GB of VRAM or less. Developers make games to earn money, and you don't sell many games if you build them for a 4-5% market share.
Besides, most games today look almost identical on the high and ultra presets, while high uses a lot less VRAM. Often motion blur and depth of field are pushed on ultra presets, using more memory while looking worse to the end user. Even the medium preset looks great in most new games, and sometimes even "cleaner" than high and especially ultra, which is filled with blur, DoF and all kinds of features that tank performance but don't necessarily make the visuals better.
Next-gen games look great regardless of preset. Take Alan Wake 2, for example; it looks good even on low.
Avatar is one of the best looking games today and it doesn't need a lot of VRAM. TPU's performance review of Avatar: Frontiers of Pandora takes a closer look at image quality, VRAM usage, and performance (the game also supports AMD FSR 3 Frame Generation and NVIDIA DLSS) ->
https://www.techpowerup.com/review/avatar-fop-performance-benchmark/
"Our VRAM testing would suggest that Avatar is a VRAM hog, but that's not exactly true. While we measured over 15 GB at 4K "Ultra" and even 1080p "Low" is a hard hitter with 11 GB, you have to consider that these numbers are allocations, not "usage in each frame." The Snowdrop engine is optimized to use as much VRAM as possible and only evict assets from VRAM once that is getting full. That's why we're seeing these numbers during testing with the 24 GB RTX 4090. It makes a lot of sense, because unused VRAM doesn't do anything for you, so it's better to keep stuff on the GPU, once it's loaded. Our performance results show that there is no significant performance difference between RTX 4060 Ti 8 GB and 16 GB, which means that 8 GB of VRAM is perfectly fine, even at 4K. I've tested several cards with 8 GB and there is no stuttering or similar, just some objects coming in from a distance will have a little bit more texture pop-in, which is an acceptable compromise in my opinion."
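The eviction behavior the quote describes (keep assets resident after loading, only throw things out under memory pressure) can be sketched as a simple LRU-style pool. This is a hypothetical toy model, not Snowdrop's actual code, but it shows why "allocated" VRAM can sit near capacity while a single frame touches far less:

```python
from collections import OrderedDict

class VramPool:
    """Toy model of an engine-style VRAM cache (hypothetical, not any real
    engine's code): assets stay resident after loading and are evicted,
    least-recently-used first, only once the pool runs out of space."""

    def __init__(self, capacity_mb: int):
        self.capacity = capacity_mb
        self.resident = OrderedDict()  # asset name -> size in MB

    def allocated_mb(self) -> int:
        # What monitoring tools report: everything kept resident.
        return sum(self.resident.values())

    def use_asset(self, name: str, size_mb: int) -> None:
        if name in self.resident:
            self.resident.move_to_end(name)  # mark as recently used
            return
        # Evict least-recently-used assets only under memory pressure.
        while self.allocated_mb() + size_mb > self.capacity and self.resident:
            self.resident.popitem(last=False)
        self.resident[name] = size_mb

    def frame_usage_mb(self, assets_this_frame: list) -> int:
        # What a single frame actually touches: often far below allocation.
        return sum(self.resident.get(a, 0) for a in assets_this_frame)

pool = VramPool(capacity_mb=8000)
for i in range(40):
    pool.use_asset(f"texture_{i}", 300)  # stream in 40 textures over time

print(pool.allocated_mb())   # 7800 -- pool sits near capacity ("VRAM hog")
print(pool.frame_usage_mb(["texture_38", "texture_39"]))  # 600 -- one frame needs much less
```

Same idea as the quote: an overlay tool sees the pool near 8GB, but a card with less VRAM would simply evict earlier, at the cost of some texture pop-in rather than stutter.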
People ramble about VRAM for longevity all the time, partly because AMD wants people to believe it's what matters most. In reality, though, upscaling like DLSS/FSR will matter more for longevity, and DLSS clearly beats FSR.
AMD owners said the 6700 XT would age much better than the 3070 because of its 12GB of VRAM, but even today the 3070 is faster in pretty much all games, including at 4K/UHD. Look at the Avatar link again ->
https://www.techpowerup.com/review/avatar-fop-performance-benchmark/5.html
And DLSS beats FSR in this game as well. TPU tested that too ->
https://www.techpowerup.com/review/avatar-frontiers-of-pandora-dlss-2-vs-fsr-3-comparison/
So yeah, longevity doesn't come down to VRAM alone. In most cases the GPU itself will be the limiting factor, and you won't be able to max out games anyway on a dated, slow GPU, forcing you to lower settings.