Out of the roughly 70,000 games made for PC going back 40 years, tens of thousands are playable on even 2GB cards / APUs. I think a healthy "reality check" for those getting wrapped up in the "But muh Call of Duty 725 console port needs 64GB VRAM on my 16k rig" number chasing that saturates most enthusiast-oriented tech forums is that:-
2GB GPUs will run plenty of games like: Amnesia: The Dark Descent (0.6GB) / Bioshock 1-2 (0.7-1.6GB) / DARQ (1.3GB) / Deus Ex: Human Revolution (0.8GB) / Dishonored (0.9GB) / Divinity: Original Sin (1.7GB) / Don't Starve (0.5GB) / Dragon Age: Origins (1.1GB) / Dusk (0.9GB) / Fallout 3 (1.4GB) / Half-Life 1-2 (0.8GB) / INSIDE (1.9GB) / Morrowind (0.5GB) / Oblivion (0.8GB) / Portal 1-2 (0.7-1.4GB) / Skyrim Legendary (1.8GB), with many old-school games obviously using well under 1GB VRAM (even in new source ports).
4GB GPUs extend that to better-optimised 2010-2017 era titles like: Bioshock Infinite (2.4GB) / Prey 2017 (3.7GB) / Skyrim Special Edition (2.8GB) / SOMA (2.6GB) / The Talos Principle (3.7GB) / The Witness (2.6GB), etc., as well as thousands of modern indies that are often only just over the 2GB threshold but well under 4GB.
Higher resolutions? Well, The Vanishing of Ethan Carter (2014) uses just 1.5GB VRAM at 3440x1440 Ultrawide whilst still looking gorgeous.
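And if you're wondering why resolution alone doesn't blow the VRAM budget, here's a back-of-the-envelope sketch (my assumptions, not from any specific engine: a double-buffered RGBA8 swap chain plus one 32-bit depth buffer; real engines add G-buffers, shadow maps and post-FX targets on top, but the ballpark holds):

```cpp
#include <cstdint>
#include <cstdio>

int main() {
    // Assumed 3440x1440 ultrawide target, RGBA8 = 4 bytes per pixel.
    const uint64_t width  = 3440;
    const uint64_t height = 1440;
    const uint64_t bytesPerPixel = 4;

    const uint64_t colourBuffer = width * height * bytesPerPixel; // one colour target
    const uint64_t depthBuffer  = width * height * 4;             // one D32 depth target

    // Double-buffered swap chain + depth: a rough lower bound only,
    // ignoring engine-specific render targets.
    const uint64_t total = 2 * colourBuffer + depthBuffer;

    printf("Colour buffer: %.1f MiB\n", colourBuffer / (1024.0 * 1024.0));
    printf("Rough total:   %.1f MiB\n", total / (1024.0 * 1024.0));
    // ~18.9 MiB per buffer, ~56.7 MiB total.
}
```

In other words, even at Ultrawide the framebuffers themselves are a few tens of MB. The gigabytes go on textures and asset streaming, which is exactly where good or bad optimisation shows.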
"Do I have enough VRAM for my game, if not that's 100% my GPU's fault" equation is that some games developers are significantly better at optimising than others, and it's kinda sad that half the "Real Gamer (tm)" enthusiasts who tend to populate tech forums, swearing blind you need 6GB VRAM for Pacman for MS-DOS have ended up so habituated into accepting an endless stream of sh*tty console ports where the VRAM bloat is often artificial (devs can't be bothered to port the "flat" unified memory structure of consoles to PC's with separate RAM / VRAM properly) as the norm and well optimised games as the rare exception when industry expectations should have always been the other way around...