You just have to ask the right questions:
Do I buy games specifically for graphics? No.
Are games with basic graphics necessarily boring? No.
Is it true that advanced graphics make a game more enjoyable? It depends.
Is it true that a few games require advanced graphics to be fully enjoyable? Yes!
Do you enjoy games with advanced graphics? Hell yes!
Do you like games that depend on such advanced graphics to fully flesh themselves out? YES.
Is it true that the average gamer considers graphical fidelity a generational marker? Absolutely.
The reason this generation feels "underwhelming" is that system requirements are about to shoot up to insane heights thanks to ray and path tracing, very high-resolution textures and assets (which are increasingly unique, with more and more variety), advanced positional audio, and so on. Meanwhile, in real life we're in an economic rut, and people are unwilling to throw money and hardware at the problem. At the same time, game developers have gotten really good at traditional raster-based techniques: games already look more than reasonably realistic, pretty much amazing as it is. This creates an ugly development reality, because a LOT of time and resources go into making games look this good with traditional technology.
Add to all of that the fact that desktop panels haven't really advanced in resolution in the past 10 years: the vast majority of displays are still 1080p/Full HD, with some 1440p at the higher end, so there's little incentive to develop software with UHD/4K and beyond in mind. The result is the illusion that things haven't really advanced despite the increase in requirements.
Enter Alan Wake 2, the game that's triggering all this discussion. It gave the boot to the ancient Nvidia Maxwell and Pascal GPUs as well as AMD's underqualified RDNA 1 hardware, and it requires DX12 Ultimate. People seem to be upset that it dropped down-level hardware, but remember: Nvidia's Turing is 6 years old and offered DX12U support from day one. The upper settings aren't supported on AMD because, again, the game makes extensive use of raytracing, which AMD's hardware and/or drivers are simply balls at. That conflicts with "gamer pride" and the idea that "latest-generation cards must always run ultra-high settings". At the end of the day, yes, the problem is you, the gamer, and not "optimization". Is asking for an RTX 2060 6 years later really too much? IMHO, no, it is not.
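For context on what "DX12 Ultimate" actually gates: it's a feature bundle (DXR 1.1 raytracing, mesh shaders, variable rate shading tier 2, sampler feedback) that Turing and RDNA 2 support in hardware and that Maxwell, Pascal, and RDNA 1 do not. Here's a minimal sketch of how a renderer might probe for it at startup. This is just standard D3D12 feature-query usage, not Remedy's actual code:

```cpp
// Minimal sketch: probe the default D3D12 device for the DX12 Ultimate
// feature set (DXR 1.1, mesh shaders, VRS tier 2, sampler feedback).
// Standard D3D12 API usage; not Remedy's actual startup code.
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

using Microsoft::WRL::ComPtr;

bool SupportsDX12Ultimate(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 o5 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 o6 = {};
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 o7 = {};
    // On an old runtime/driver these queries can fail outright,
    // which already answers the question.
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &o5, sizeof(o5))) ||
        FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &o6, sizeof(o6))) ||
        FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &o7, sizeof(o7))))
        return false;
    return o5.RaytracingTier          >= D3D12_RAYTRACING_TIER_1_1            // DXR 1.1
        && o6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2   // VRS tier 2
        && o7.MeshShaderTier          >= D3D12_MESH_SHADER_TIER_1             // mesh shaders
        && o7.SamplerFeedbackTier     >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;     // sampler feedback
}

int main() {
    ComPtr<ID3D12Device> device;
    // Feature level 11_0 so the device still creates on older GPUs
    // and we can report why they fail the check.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device)))) {
        std::puts("No D3D12 device at all.");
        return 1;
    }
    std::puts(SupportsDX12Ultimate(device.Get())
                  ? "DX12 Ultimate: supported (Turing/RDNA 2 or newer)."
                  : "DX12 Ultimate: NOT supported (e.g. Maxwell, Pascal, RDNA 1).");
    return 0;
}
```

Maxwell and Pascal fail every one of those four checks, which is why requiring DX12 Ultimate cuts them off cleanly rather than leaving them limping along on a fallback path.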
But that's just how I personally see things.