It's indisputable that game requirements will only go up over time. They always have. Since their inception, video games have driven the development of new technologies and advances in hardware. With the medium's transition to 3D in the mid-90s, gamers have come to expect more realistic, more detailed graphics, richer environments, and spectacular visual effects -- all at increasingly higher resolutions.
Major studios have always developed their games on cutting-edge hardware. A dev's workstation is usually far more powerful than what Joe the gamer has at home. Unfortunately, as of late, we have seen an increasing number of titles which run poorly not just on an average PC, but even on a high-end config. It seems that optimizing their latest release so that it runs acceptably on weaker components isn't a priority for many software houses today. This issue is worth a separate discussion, but it's an industry-wide problem that lies with the developers, studio/project managers, publishers, video card manufacturers, and the gamers themselves.
And let's not forget that games are developed for the current generation of consoles first and foremost. Both the XSX and the PS5 sport eight-core CPUs accompanied by 16 GB of unified memory shared between the CPU and GPU, and most of those resources are available to the developer. Major games are created to run comfortably on console-equivalent hardware, not on Johnny's "gaming" laptop with a mobile GTX 1660 or RTX 3050. Future consoles will have even more memory and more powerful GPUs -- and we're already halfway through the current gen.
As for VRAM requirements, the need for a larger frame buffer is undeniable with modern titles. While many games can still be played on an 8 GB card at 1080p, maxing out the details, especially at higher resolutions, will often call for 12 GB or more. I regularly see 14-16 GB of maximum consumption in 4K with titles that are a couple of years old at this point. Shadow of the Tomb Raider, a 2018 game, will show over 15 GB of dedicated VRAM in use, as reported per process. Even good old GTA5 -- an eight-year-old game -- will peak at over 9 GB with everything cranked up.
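For anyone who wants to check numbers like these on their own machine, here is a minimal sketch of one way to read total and per-process dedicated VRAM. It assumes an NVIDIA card and the nvidia-ml-py package (imported as pynvml); owners of AMD cards like mine will need a different tool, such as the dedicated GPU memory column in Task Manager's Details tab.

```python
# Minimal sketch: log dedicated VRAM use per process on an NVIDIA GPU.
# Assumes the nvidia-ml-py package (pip install nvidia-ml-py), which
# exposes the NVML API as the pynvml module.
import pynvml

pynvml.nvmlInit()
try:
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"Total VRAM: {mem.total / 2**30:.1f} GiB, in use: {mem.used / 2**30:.1f} GiB")

    # Per-process dedicated memory for graphics workloads (games, etc.)
    for proc in pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle):
        used = proc.usedGpuMemory
        if used is not None:  # may be unavailable without sufficient privileges
            print(f"PID {proc.pid}: {used / 2**30:.2f} GiB dedicated")
finally:
    pynvml.nvmlShutdown()
```

Run it while a game is loaded and compare the per-process figure against what your overlay of choice reports.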
That said, gaming on PC gives us a great opportunity to experiment, tweak the settings, and find out their impact on performance. IMO we should always try to strike a balance between visual fidelity and fps that we find comfortable. For example, gamers were outraged to learn Immortals of Aveum's requirements when the game launched. Running on UE5.1, it is one of the most demanding titles to release this year.
The game has received a number of patches post-launch, and further updates have been announced. I played it on a secondary PC with a Ryzen 3300X, a CPU well below the minimum 8c/16t AMD requirement. My GPU, a 6600XT, while matching the minimum-spec 5700XT, was supposedly only good for 720p60 (1080p upscaled with the quality preset) with everything on low.
Despite the excessive official requirements, everything played very smoothly and felt very responsive on this budget config. I ran Immortals at native 1920x1200 with all settings on high, which is the middle preset; only shadows, cinematics DoF, and cinematics motion blur were set to low. The fps hovered around 60 most of the time, with dips into the 40s; the only drops below 40 fps came in some cutscenes and heavy battles with multiple opponents.
All in all, I believe that the people who keep complaining about growing hardware requirements fall into one of three groups:
- those who have unrealistic expectations of their PCs (in no small part thanks to marketing ploys used by hardware manufacturers and game publishers)
- those who are too lazy to get familiar with the game's settings and match them to their PC's specs
- and those who absolutely must play every major release at launch
And honestly, if your system can't handle the latest hyped AAA title, maybe try playing an indie or an older game instead? There are dozens of great games that will run perfectly on a 10-year-old PC.