You're not entirely wrong, but I don't completely agree with you either. What you're describing is the current state of AAA game development and the system load of the features present in those games. What I'm saying is that it's about time early development resources were reallocated from inventing new ways of melting your GPU (which has been the key focus for a decade or more) to finding new uses for the abundant CPU power in modern PCs. Sure, CPUs are worse than GPUs at graphics, physics and lighting. Probably at spatial audio too. But is that really all there is? What about improving in-game AI? Making game worlds and NPCs more dynamic in various ways? Making player-to-world interactions more complex, deeper and more meaningful? That's just what I can come up with off the top of my head in two minutes. I'd bet a team of game or engine developers could find quite a lot to spend CPU power on that would tangibly improve single-player game experiences. The headroom is there for the taking; they just need to find interesting things to do with it.
Of course, this runs the risk of breaking the game for people with weak CPUs. Scaling graphics down is easy and generally accepted ("my GPU is crap so the game doesn't look good, but at least I can play"); scaling AI or other non-graphical features is far more challenging. "Sorry, your CPU is too slow, so now the AI is really dumb and all those nifty/cool/fun things no longer work" won't fly with a lot of gamers. I'm willing to bet that's where the focus on improving graphics and little else comes from, and where it will continue to come from for a while yet.