Still, FurMark gives you the maximum, not what a game draws. And not every game behaves the same - that's the point you're missing.
As for comparing a 5nm chip with a 16nm one, what the hell is that? Are you out of your mind? What's next, arguing that the GTX 660's consumption was better?
As gaming loads rise, mid-range cards will gain more computational power than they gain power draw. But assuming for a moment that you can read a graph, you would see that perf per watt always increases.
Second, the fact that NVIDIA is selling small chips at high prices is a secondary issue.
An entire suite of games run at uncapped FPS simply pushes every GPU to its designed board TDP most of the time.
This is how GPUs are designed these days. If a game presents a considerably lower load, the GPU will obviously scale its power down to meet that demand. But that detail is irrelevant to this discussion; it happens in all games in all sorts of ways, sometimes just from game engine, API, or CPU limitations - not any GPU influence.
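If you want to see that scaling yourself rather than argue about it, here's a minimal sketch, assuming the nvidia-ml-py (pynvml) bindings are installed: it samples board power against the enforced limit, so a light load visibly sits below TDP while an uncapped one pins it.

```python
# Minimal power-draw logger - a sketch assuming the nvidia-ml-py
# (pynvml) bindings are installed. Run it alongside any game to watch
# board power track the load rather than sit pinned at TDP.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Enforced board power limit in watts (NVML reports milliwatts)
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000

try:
    while True:
        draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000
        print(f"{draw_w:6.1f} W / {limit_w:.0f} W "
              f"({100 * draw_w / limit_w:4.0f}% of limit)")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```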
I'm not denying that perf per watt always increases. But we have certainly entered a new realm where the TDP of cards has been going steadily up at every tier in the stack. This is not a good development - and it cannot last. Here's the story I'm looking at. You mentioned Kepler: look at GK104's TDP, and the other similarities.
Now, GTX 1080 (GP104): 180W and still around 300mm².
Turing is where things get screwed up. This here is the 2060, a 1080 equivalent in terms of raster performance.
It requires a whopping 1.5x the die size, albeit at a slightly reduced TDP, to get there. 'The cost of RT.' We won 20W off a refined node, but got 50% fatter.
Ampere is hilarious. The 3060: yes, the new reality is that you've gained a full tier of TDP; the x60 now carries the wattage of what used to be an x80. Raster efficiency has effectively stalled since Pascal.
With Ada, it seems things are back in order. Not such a bad comparison after all, once we're back on a good node with some refinements to RT. But that 200W is still for a mere 4070 - not the Ti, not the x80.
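To make the drift concrete, here's a quick sketch of the arithmetic, using the figures above. The GP104 numbers, the 2060's relative size and TDP, and the 4070's 200W come straight from this post; the GTX 680, RTX 3060, and die-size figures for the 2060 and 4070 are the commonly listed board specs, filled in as assumptions.

```python
# Quick arithmetic behind the argument above. GP104 and the 4070's 200W
# are quoted in this post; the other entries are commonly listed board
# specs, filled in here as assumptions.
cards = [
    # (name,               TDP W, die mm^2)
    ("GTX 680  (GK104)",   195,   294),   # assumed: widely quoted spec
    ("GTX 1080 (GP104)",   180,   300),   # "180W and around 300mm2"
    ("RTX 2060 (TU106)",   160,   445),   # 180W - 20W, ~1.5x the die
    ("RTX 3060 (GA106)",   170,   276),   # assumed: x60 near old x80 wattage
    ("RTX 4070 (AD104)",   200,   294),   # "200W is still for a mere 4070"
]

prev = None
for name, tdp, die in cards:
    note = ""
    if prev:
        note = (f"  TDP {tdp - prev[0]:+4d} W, "
                f"die {100 * (die / prev[1] - 1):+5.0f}% vs previous")
    print(f"{name:20s} {tdp:3d} W  {die:3d} mm^2{note}")
    prev = (tdp, die)
```

The chain follows roughly the same raster performance point from generation to generation, which is exactly where the per-tier TDP creep shows up.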
As for the flavorful comments about another's intelligence, let's leave those aside.
I'm trying to get a point across; it's a matter of understanding and perspective, not a contest.