675W for a graphics card is already ridiculously stupid. We need 3nm GPUs as soon as possible.
While many of us would assume that greater efficiency gains translate into lower power usage, this turns out not to be the case in practically any application. There's a well-documented phenomenon called the "rebound effect", and it has basically been the reality of technological efficiency gains: greater efficiency DOESN'T lead to less consumption, but often the opposite. We can see this now with GPUs, which have the means to be more efficient than ever, yet steadily consume more power than ever, and rumors say the next generation will be even worse.
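To put a rough number on it: the rebound is usually quantified as the share of the *expected* power savings that gets "taken back" by increased use, and anything over 100% (consumption actually rising, which is the GPU situation) is called backfire. Here's a minimal sketch of that calculation in Python; the wattage figures are made-up round numbers purely for illustration, not measurements of any real card.

```python
def rebound_effect(old_power, efficiency_gain, new_power):
    """Rebound = share of the expected power savings that is 'taken back'.

    old_power       -- power draw of the old part for a given workload (W)
    efficiency_gain -- perf-per-watt improvement, e.g. 2.0 for a 2x gain
    new_power       -- actual power draw of the new part (W)
    Values over 1.0 mean 'backfire': consumption rose despite the gain.
    """
    expected_power = old_power / efficiency_gain   # same work on the better node
    expected_savings = old_power - expected_power
    actual_savings = old_power - new_power         # negative if power went up
    return (expected_savings - actual_savings) / expected_savings

# Hypothetical numbers: a 250 W card replaced by a part twice as efficient,
# but the replacement ships at 400 W because it's clocked for more performance.
print(rebound_effect(old_power=250, efficiency_gain=2.0, new_power=400))
# -> 2.2, i.e. a 220% rebound: well past backfire territory.
```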
A good everyday illustration of the rebound effect is the work we do as people. Automation, better hardware, and faster computers have made workers more efficient than ever, but does that ACTUALLY translate to working less? No, it never does. This is predominantly attributed to the capitalist model of perpetual economic expansion: instead of maintaining output at its current level and using the efficiency gains to work less, we work the same amount and increase output. When output increases, the economy grows and more output is desired, which in turn means more consumption and more output, so the efficiency gains end up producing increased consumption.
This is why the transition to 3nm is likely to result in even more power-hungry GPUs. The fact that the videocard is a complete product and comes with its own cooling solution only reinforces this trend, because videocard manufacturers can keep fitting ever larger and more elaborate coolers to support ever higher power draw. In fact, I think you could empirically graph the shrink in GPU process nodes against the growth in average videocard thickness and find a clear inverse relationship (roughly sketched below). It wasn't so long ago that 3+ slot GPU coolers were an aberration rather than the norm.

With CPUs it's slightly different, since the manufacturer cannot guarantee a specific level of cooling capability; that's left to the consumer/end user, though this has not hindered Intel from increasing power consumption. AMD, on the other hand, puts forth a greater effort at keeping power consumption steady while increasing performance, but even this still falls short of a NET reduction in power consumption. Under our current economic system, where short-term shareholder returns are paramount over all other considerations, especially long-term ones, a net reduction in power consumption will never be the goal.
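Here's that sketch, using ballpark figures from memory for Nvidia's flagship Founders Edition cards (process node in nm, cooler thickness in slots, rated board power in W). Treat the numbers as illustrative rather than a dataset; exact values vary by source and by partner model, but the direction of the correlation is the point.

```python
from statistics import correlation  # Python 3.10+

# Ballpark flagship figures from memory -- (node nm, cooler slots, board power W).
flagships = {
    "GTX 980 Ti (Maxwell)": (28, 2.0, 250),
    "GTX 1080 Ti (Pascal)": (16, 2.0, 250),
    "RTX 2080 Ti (Turing)": (12, 2.0, 260),
    "RTX 3090 (Ampere)":    (8,  3.0, 350),
    "RTX 4090 (Ada, 4N)":   (5,  3.5, 450),
}

nodes = [v[0] for v in flagships.values()]
slots = [v[1] for v in flagships.values()]
power = [v[2] for v in flagships.values()]

# A strongly negative coefficient = smaller node, thicker cooler / higher power.
print("node vs cooler slots:", round(correlation(nodes, slots), 2))
print("node vs board power :", round(correlation(nodes, power), 2))
```

Both coefficients come out strongly negative, which is exactly the "smaller node, fatter and hungrier card" pattern described above.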
As long as consumers do not care about efficiency as a selling point, more performance at greater power consumption will remain the trend. I honestly believe that the 650W-850W PSU that's typical in the vast majority of DIY builds today will, sooner rather than later, give way to 1000W-1200W.
What's funny is that I distinctly remember when Maxwell was Nvidia's current generation, online Nvidia advocates would ceaselessly brag about its efficiency, to the point that it was one of the main arguments in any online debate about the "best GPUs". With each subsequent generation Nvidia has released, efficiency as a selling point has dwindled into nothingness. Anyway, this is why future node shrinks and the capability for increased efficiency will not actually result in less power consumption, but more. The rebound effect is also why the overwhelming majority of leading thinkers on climate change, the environmental crisis, and the future of our species have concluded that technology by itself and technological "progress" will not solve any of these pressing issues.