You can't trust software VRAM usage readings at all, not even in-game counters (some games show you an xx/xx GB usage figure, for example).
+ Many game engines just allocate a fixed percentage by default. Tons of games allocate more than they need, like 85 or 90% of all available VRAM, yet actually use half of that. See the sketch below.
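If you want to see the allocation-vs-usage distinction yourself, here's a minimal sketch using DXGI's QueryVideoMemoryInfo (Windows, DXGI 1.4+). The 90% pool grab at the end is an illustrative engine-style heuristic, not any specific engine's actual behavior:

```cpp
// Minimal sketch: what OS-level "VRAM usage" counters actually measure.
// CurrentUsage counts allocations, not memory the GPU actually touches.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    if (FAILED(adapter3->QueryVideoMemoryInfo(
            0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info))) return 1;

    // Budget = what the OS will let the process allocate.
    // CurrentUsage = what has been allocated, cold or hot.
    printf("Budget: %llu MB\n", info.Budget / (1024ull * 1024ull));
    printf("Usage:  %llu MB (allocated, not necessarily touched)\n",
           info.CurrentUsage / (1024ull * 1024ull));

    // Engine-style heuristic (illustrative): reserve ~90% of the budget as a
    // streaming pool up front. The usage counter then reads "90%" no matter
    // how little of the pool a frame actually needs.
    unsigned long long pool = info.Budget * 90 / 100;
    printf("Pool reservation: %llu MB\n", pool / (1024ull * 1024ull));
    return 0;
}
```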
Also, Nvidia has several features that reduce VRAM requirements (the much larger L2 cache improving hit rates in the Ada architecture, for example, plus better memory compression; you can read more about this in the architecture deep dives).
Pretty much no game uses more than 12GB at 4K/UHD on ultra settings. With heavy RT you might go above that, though, yet no 12 or 16GB card will manage heavy RT at 4K/UHD anyway, especially not AMD cards, since their RT performance is very weak. Pointless.
Most of the games you're talking about are AMD-sponsored titles and rushed console ports ->
https://www.amd.com/en/gaming/featured-games.html
Properly optimized games use little VRAM. Atomic Heart, completely maxed out at 4K/UHD, uses like 7GB and looks better than 99% of games.
Generally, pretty much no game needs more than 8-10GB at 1440p. Most hover around 4-6GB of usage. 12GB is more than enough; 16GB or more is wasted.
Look here: the 3070 8GB easily outperforms the 6700 XT 12GB in new games at 4K/UHD in terms of minimum fps. Minimum fps would drop very low if VRAM were an issue ->
https://www.techpowerup.com/review/amd-radeon-rx-7800-xt/35.html
All this VRAM talk started because of AMD marketing and a few rushed console ports (AMD-sponsored as well). Very few properly optimized games use a lot of VRAM. Like I said ->
https://www.techpowerup.com/review/atomic-heart-benchmark-test-performance-analysis/5.html
AMD has been caught doing this stuff before, back with the Shadow of Mordor uncompressed texture pack, which did nothing for the end user ->
https://www.pcgamer.com/spot-the-di...rdor-ultra-hd-textures-barely-change-a-thing/
They only did this to make many Nvidia cards drop in performance (though it also affected many AMD users).
The difference between high and ultra textures is often just slightly less compression (sometimes ultra is uncompressed), which you mostly won't notice while actually playing the game. Dropping texture quality to low, and sometimes medium, is easy to spot, but high and ultra mostly look identical, especially without 300% zoom and in motion. See the numbers below.
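To put rough numbers on that, here's a back-of-envelope sketch. The 4096x4096 texture and full mip chain are just example inputs; BC7 at 1 byte per texel and RGBA8 at 4 bytes per texel are the standard format sizes:

```cpp
// Back-of-envelope: VRAM footprint of one 4K texture, block-compressed
// (BC7, 16 bytes per 4x4 block = 1 byte/texel) vs uncompressed RGBA8
// (4 bytes/texel). A full mip chain adds roughly a third on top.
#include <cstdio>

int main() {
    const double texels    = 4096.0 * 4096.0;
    const double mipFactor = 4.0 / 3.0;

    double bc7   = texels * 1.0 * mipFactor; // typical "high" setting
    double rgba8 = texels * 4.0 * mipFactor; // uncompressed "ultra" pack

    printf("BC7 compressed:     %.1f MiB\n", bc7   / (1024.0 * 1024.0));
    printf("Uncompressed RGBA8: %.1f MiB\n", rgba8 / (1024.0 * 1024.0));
    // Same source art, 4x the VRAM. BC7 is perceptually near-lossless for
    // most materials, which is why high vs ultra usually looks identical.
    return 0;
}
```

That's the whole trick behind an "uncompressed ultra" pack: 4x the footprint for a difference you need a zoomed screenshot to find.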
I have a 4090, and 24GB is absolutely wasted for gaming. Outside of pure allocation, it's simply not needed.
By the time 24GB is actually needed in demanding games maxed out, the 4090 will belong in the trash bin; the GPU will be too slow to run games maxed out anyway.
Some people think a lot of VRAM will matter eventually; they just don't account for the GPU itself, which will be too weak to max out games by then, meaning less VRAM is required.
You have to be logical. Game developers know that the majority of PC gamers don't have more than 8GB.
PS5 and XSX have 16GB of shared RAM for the entire system, meaning OS and BACKGROUND TASKS, GAME and GRAPHICS, with a 4K/UHD resolution target (dynamic res, though). Rough math below.
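A quick sketch of that math; the ~2.5GB OS reserve is an approximate, commonly reported figure (not an official spec), and the 50/50 split between CPU-side data and graphics is purely illustrative and varies per title:

```cpp
// What's realistically left for graphics on a 16GB shared-memory console.
#include <cstdio>

int main() {
    const double totalGiB  = 16.0; // PS5 / XSX shared pool
    const double osReserve = 2.5;  // ~2-3 GiB reportedly kept by the OS
    const double gameBudget = totalGiB - osReserve;

    // Code, CPU-side game data and GPU assets all share this budget.
    const double graphicsShare = gameBudget * 0.5; // illustrative split

    printf("Game budget:    ~%.1f GiB\n", gameBudget);
    printf("Graphics share: ~%.1f GiB\n", graphicsShare);
    // Lands in the same ballpark as an 8-12GB PC card, which is the point:
    // console ports aren't built around 16GB of dedicated VRAM.
    return 0;
}
```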