Some points are valid, like:
- games being poorly optimized - so the GPU has to work harder while using more VRAM (for example: rendering far-away textures at high quality - which is mostly a waste of extra VRAM on details you barely notice or miss entirely; see the rough numbers right after this list)
- poorly optimized ports from consoles
- downgrading VRAM on the latest GPUs (lowering both memory bus width and capacity) - so that later they can "refresh" them with the normal VRAM amounts from older generations - thus giving some people a reason to upgrade - without actually bringing anything new to the table
- ray tracing is still not worth it (offering very little in terms of visuals "for its requirements")
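To put some rough numbers on that first point - here's a minimal back-of-the-envelope sketch (my own assumptions: uncompressed RGBA8 textures with a full mip chain; real engines use compressed formats like BC7, so actual figures are lower) showing how fast texture memory grows with resolution, and why keeping max-detail textures resident for far-away objects is mostly wasted VRAM:

```python
def texture_vram_mb(width: int, height: int, bytes_per_pixel: int = 4,
                    with_mipmaps: bool = True) -> float:
    """Uncompressed texture size in MB, optionally with a full mip chain (~+1/3)."""
    size = width * height * bytes_per_pixel
    if with_mipmaps:
        size *= 4 / 3  # the full mip chain adds roughly one third on top
    return size / (1024 * 1024)

for side in (512, 1024, 2048, 4096):
    print(f"{side}x{side}: ~{texture_vram_mb(side, side):.1f} MB")

# 512x512   -> ~1.3 MB
# 1024x1024 -> ~5.3 MB
# 2048x2048 -> ~21.3 MB
# 4096x4096 -> ~85.3 MB
```

A distant rock drawn at a few dozen pixels looks identical whether it samples the 512 or the 4096 version - but the VRAM cost is wildly different.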
But then there's... "the elephant in the room", as in... the obsession with ULTRA HIGH SETTINGS at 2K or 4K resolutions - even while owning a mid-range GPU. Not to mention, the game used as an example (Guild Wars 2) shined more on an artistic level - not in terms of having the highest level of detail for its time (which is not that surprising for an MMO of that scale). Even so:
ON ULTRA SETTINGS + 4K, GW2 can get close to 5 GB of VRAM in some regions - roughly 16x the OP's 300 MB (GW2 also switched from DX9 to DX11, which actually improved FPS and stuttering significantly, once past the buggy beta-testing phase):
...while in other regions it gets closer to 3.6 GB of VRAM:
Yet the majority of GW2 players use a mid-range system (at best) - and it runs just fine "AT LOWER SETTINGS!" Yes, shocking, I know: PC games have multiple graphical profiles (Very Low, Low, Medium, High, Ultra, Ultra+ and everything in between) - so that a larger audience can enjoy a well made game (a toy illustration of why those presets matter follows below). I might be too old school - but there was a time when gameplay + storyline were the main priority, and as for visuals, even the artwork was deemed more important than textures offering the highest level of detail for a given time period. It's quite ironic that the game used as an example might need a $2,000 video card (plus a CPU that can feed it without bottlenecking) to run at 4K+ and the highest settings - more than a decade after release (especially if we're also taking "modern refresh rates" into account).
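Here's that toy illustration - every number below is invented purely for the sake of the example (no relation to how GW2 or any specific engine actually budgets memory), but it shows why dropping a preset or the render resolution makes such a big difference on a mid-range card:

```python
# Hypothetical presets: the profile mostly decides the texture pool size and
# the render resolution, and those two dominate the VRAM footprint.

PROFILES = {
    #             render resolution   texture pool (MB)
    "Low":       ((1920, 1080),        512),
    "Medium":    ((1920, 1080),       1024),
    "High":      ((2560, 1440),       2048),
    "Ultra 4K":  ((3840, 2160),       4096),
}

def estimate_vram_mb(resolution, texture_pool_mb, buffers=6, bytes_per_pixel=4):
    """Very rough total: a handful of full-resolution render targets + texture pool."""
    w, h = resolution
    render_targets_mb = w * h * bytes_per_pixel * buffers / (1024 * 1024)
    return render_targets_mb + texture_pool_mb

for name, (res, pool) in PROFILES.items():
    print(f"{name:>8}: ~{estimate_vram_mb(res, pool):.0f} MB")

# Low      -> ~560 MB
# Ultra 4K -> ~4300 MB (roughly the same ballpark as the GW2 figures above)
```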
Last but not least - this is far from being a new thing (some games having higher requirements than the current tech can handle). Maybe some of you are too young (or at least were back in 2007) - but there's an old meme about such a game catching people by surprise:
Yes, but can it run Crysis?! Another way to put it (if you're indeed too young to get this meme): despite being released in 2007, played in 8K at the highest settings it can go over 6 GB of VRAM:
That being said - it's only natural for VRAM requirements to get higher and higher. On the bright side, most developers won't shoot themselves in the foot by implementing a level of detail which only a very small minority of machines can handle (after all, profit is the top priority).