During a global energy crunch, and in what will again be the hottest and driest summer on record!? The 3080 had a TDP of 320W (not that far off, but still an increase; the 3080 Ti was 350W), but looking just one generation back, the 2080 Ti used "just" 250W, as did the 1080 Ti - and those were the top cards in the lineup, mind you. Now there's an expectation of an even more absurd 90/90 Ti tier with even higher power draw. Die shrinks and new generations should bring efficiency improvements, not just moar power draw.
People need to stop rationalizing these senseless increases in power draw; it's absurd, pure and simple.
I'm sure they'd be a lot happier with another insane increase in board power, one that lets them sell another PSU upgrade and keep shipping the very high-wattage PSUs that will otherwise sit on shelves now that crypto has gone bust for a bit.
I couldn't agree more. Our culture at large ceaselessly applauds technology and rarely, if ever, criticizes it, and we're asked to believe that technological progress is a purely linear ascent; the truth is far from that. What you're observing - that despite node shrinks and gains in technological efficiency, we are increasingly offered hardware with higher and higher net energy consumption - is called the "rebound effect," and it has been observed for the entirety of the techno-industrial order since it began (circa 1712, when steam power was first used to pump water from an English coal mine), in every technological segment/market, and at the macro level as a whole. Despite having greater capacity for efficiency than ever before, we consume more energy per capita than at any previous time in human history.
While there are several contributors to this phenomenon, the paramount impetus is capitalism, specifically the paradigm of infinite expansion, growth, and production. For example, while workers are literally more productive than ever, we are on average working more than we were in 1970. That's because gains in productivity and production efficiency are NEVER used to lower net consumption or to hold production at current levels (i.e. workers drop from 40 to 32 hours per week while output stays the same); they are used to increase net production, and therefore energy/resource consumption as well (i.e. at best workers keep working the same hours, but more often they either work more hours or are expected to produce more units in the same amount of time).
When it comes to GPUs, the tradition of claiming higher performance every new generation, coupled with marketing that places that one metric above all others (like efficiency), has turned it into a convention that the companies, their marketing departments, and the overwhelming majority of consumers expect and refuse to deviate from. I think we can all agree that if either AMD or Nvidia poured the gains harvested from node shrinks, architectural improvements, etc. into a new generation that offered just a 10% gain in performance but a 100% increase in efficiency, not only would the marketing departments not know what to do with such a product, consumers would largely criticize and reject it. Would an extremely energy-efficient video card that could be cooled with a now "old fashioned" one- or two-slot cooler see widespread market adoption? I think not. But far from blaming consumers, this current reality is also the result of years of constant, incremental conditioning by companies to tolerate and accept ever higher energy consumption and ever larger video cards to manage it. If we showed a 2017 audience the four-slot 3090 Ti cooler that is now basically the default across the GPU lineup regardless of brand, I think they'd laugh at the ridiculousness of it and be mystified at how we arrived at such an endpoint (the reaction would be even more poignant from an audience at the end of 2014, when Nvidia's Maxwell architecture was released and every Nvidia fanboy couldn't cite "efficiency" enough as an advantage over AMD's competing GCN iteration).
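To put rough numbers on that hypothetical (these are illustrative assumptions, not real product specs): if "100% more efficient" means double the performance per watt, then a card 10% faster than a 320W predecessor would only need around 176W. A minimal sketch of that arithmetic:

```python
# Illustrative perf-per-watt arithmetic for the hypothetical above.
# All numbers are assumptions, not actual product specs.

old_power_w = 320.0          # e.g. a 3080-class card's board power
old_perf = 100.0             # arbitrary performance index
new_perf = old_perf * 1.10   # "just a 10% gain in performance"

old_perf_per_watt = old_perf / old_power_w
new_perf_per_watt = old_perf_per_watt * 2.0   # "100% increase in efficiency"

new_power_w = new_perf / new_perf_per_watt
print(f"Hypothetical next-gen board power: {new_power_w:.0f} W")  # ~176 W
```

A roughly 176W flagship is exactly the kind of card that would fit those "old fashioned" one- or two-slot coolers, and exactly the kind of product the current market has been conditioned not to want.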
This has been the issue with CPUs as well. AMD has previously decreased power draw on its CPUs as a whole, while Intel has continuously increased it, to the point that a leak from just the other day showed a Raptor Lake engineering sample consuming nearly 400 watts under a water chiller as the frequency approached 6GHz... that's what a 64-core Epyc Milan consumes! And unfortunately, while there has been sporadic criticism of Intel's high power draw from the enthusiast community, consumers on the whole have largely accepted this trend without opposition. Even more unfortunate is that AMD has undoubtedly taken notice of the market's seemingly infinite patience with ever-increasing power draw and has raised the TDP (or whatever AMD calls it) of its soon-to-be-released 16-core Zen 4 CPU from 105 watts to 170 watts. Intel has been doing this for a while with no market backlash, so AMD has been pushed by both Intel and Nvidia into increasing power draw in both the CPU and dGPU markets to stay competitive. It's a great example of how a company that would, or at least could, prioritize efficiency gets squeezed by competitors with substantially larger market share who dictate the trends and direction of those markets: a company like AMD, which has to not just match but greatly exceed its competition's performance to even be considered by fickle consumers, is forced to abandon such priorities just to gain market share, or even to hold its current position. The result is that any alternative for consumers who want to prioritize efficiency or reduce their overall energy consumption evaporates, which basically forces ALL consumers to increase their net energy consumption whether they will it or not.