You're right. It was nonsense, but progress / Moore's Law is falling apart everywhere. CPUs are stagnant and need more cores, bigger dies, even chiplets to extract higher performance. GPUs are approaching that point too: the dies are already getting bigger, and with that, cost per die increases faster than the relative performance gained.
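To put some rough numbers on that claim: here's a minimal sketch assuming a Poisson yield model (yield = e^(-D0·A)) with made-up values for defect density and wafer cost, and ignoring edge loss when counting dies per wafer. It's not real foundry data, but it shows why the cost of a working die grows faster than die area: a bigger die means fewer candidates per wafer *and* a higher chance each one catches a defect.

```python
import math

# Hypothetical numbers, chosen only for illustration.
WAFER_DIAMETER_MM = 300    # standard 300 mm wafer
WAFER_COST = 10_000        # assumed wafer cost in dollars
D0 = 0.001                 # assumed defect density, defects per mm^2

wafer_area = math.pi * (WAFER_DIAMETER_MM / 2) ** 2

def cost_per_good_die(die_area_mm2: float) -> float:
    """Cost of one working die under a simple Poisson yield model."""
    dies_per_wafer = wafer_area / die_area_mm2       # crude, ignores edge loss
    yield_fraction = math.exp(-D0 * die_area_mm2)    # Poisson yield model
    return WAFER_COST / (dies_per_wafer * yield_fraction)

for area in (100, 200, 400, 800):
    print(f"{area:>4} mm^2 die: ${cost_per_good_die(area):,.2f} per good die")
```

With these made-up inputs, going from a 100 mm^2 die to an 800 mm^2 die (8x the area) raises the cost per good die roughly 16x, so cost clearly outpaces area, and area already outpaces the performance you actually get back.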
Moore's law is marketing BS, and it always has been.
But yes, the larger trend is that improving performance gets harder and harder.
GPUs still have good performance gains generation to generation, but that will slow down very soon; we only have a couple of node shrinks left before new materials are needed.
CPUs have been improving only slowly since Sandy Bridge, and while higher core counts help with some workloads, they do little for most desktop uses.
These baby steps will happen more often, and perf/dollar may stay exactly where it is. In that sense, you can actually buy in more easily now without feeling screwed the moment you make the purchase. Or maybe you're screwed in a different way: now the purchase itself is too expensive.
There will still be gains from more efficient architectures, but they will be smaller, so don't expect huge improvements in performance per dollar. It shouldn't get worse, though.
Hardware has never been cheaper, and with fairly decent hardware affordable to almost anyone, we shouldn't really complain too much about it.
But I would point out that the biggest problem is software, and unfortunately the larger trend in software is toward more bloat and more layers of abstraction.
Maybe the era when we had very good upgrade paths was the exception to the rule?
I'm not sure what you mean here; please elaborate.