Nobody said that; that's just what you want to make of it so you can live the illusion that all is well here.
The fact is, TDP has virtually doubled to get that 3.5x speedup.
At the same time, your games still run at similar FPS.
One might even debate whether they look better than they did on the 1080, honestly, given the current TAA junk.
Such progress. Have you saved the world yet with your hyper-efficient GPUs? Or is the exact opposite happening: is the reality that every efficiency gain is just a free pass to use ever more power? What's the YoY energy footprint of all GPUs on the globe? An interesting number to place alongside the ooh-aah about the most efficient GPU in existence. It gets even funnier if we factor in the cost and footprint of the fabs and node progress required to obtain that 3.5x. Put it all together and the net sum might be that we've wasted an immense amount of resources just so we can still barely run 4K - not much different from 2017.
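To put rough numbers on the point, here's a back-of-envelope sketch. The wattages are illustrative round figures consistent with "virtually doubled" (the GTX 1080's TDP was 180W; the ~360W figure for a current x80-class card is an assumption for the sake of the math):

```python
# Back-of-envelope: what "3.5x faster at ~2x TDP" actually buys you.
tdp_2016 = 180.0   # watts, GTX 1080 TDP
tdp_2024 = 360.0   # watts, assumed current x80-class draw ("virtually doubled")
speedup  = 3.5     # claimed performance gain

power_ratio = tdp_2024 / tdp_2016
perf_per_watt_gain = speedup / power_ratio

print(f"perf/watt gain:  {perf_per_watt_gain:.2f}x")  # ~1.75x
print(f"absolute power:  {power_ratio:.1f}x")         # 2.0x

# Efficiency went up ~1.75x, but absolute draw still doubled: the
# efficiency gain was spent on more performance, not on less energy.
```

That's the whole argument in two lines of arithmetic: the perf/watt headline and the doubled power draw are both true at the same time.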
Stop bullshitting yourself. It should come as a shock that the x80 has doubled its TDP over 3-4 generations, when in the past it never did that.
Doesn't matter. What matters is whether that single gaming system with a top-end GPU in it uses 300-350W (2016) or over 600W (2024). And that's without touching the OC button, mind you; that's running both systems 'conservatively'. In the end, physics hasn't changed: you're still getting an ever-increasing energy bill and heat production. Efficient or not.
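And since the bill is the point, a quick sketch of what that difference costs per year. The system wattages come from the post above; the hours per day and the price per kWh are assumptions you can swap for your own:

```python
# Yearly energy use and cost of the same "one gaming PC" scenario.
watts_2016, watts_2024 = 325, 600  # full-system draw per the post above
hours_per_day = 3                  # assumed gaming time
price_per_kwh = 0.30               # assumed price, EUR

def yearly(watts):
    # kWh over a year of daily gaming, and the resulting cost
    kwh = watts / 1000 * hours_per_day * 365
    return kwh, kwh * price_per_kwh

for year, w in (("2016", watts_2016), ("2024", watts_2024)):
    kwh, cost = yearly(w)
    print(f"{year}: {kwh:.0f} kWh/year, ~{cost:.0f} EUR/year")
# 2016: ~356 kWh, ~107 EUR/year
# 2024: ~657 kWh, ~197 EUR/year
```

Roughly 300 extra kWh a year per system, before you multiply by however many of these boxes are out there.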