Wednesday, June 1st 2022
PC Components with High Power Draw an Issue for an Overwhelming Majority of Users: Survey
In April, we polled our readers with the question "Are components with high power draw an issue for you?" and received 22,040 responses. An overwhelming 85.5 percent of our readers (PC gamers and enthusiasts) take issue with high power draw for PC components. This comes in the wake of key components such as processors and discrete graphics cards rising in power draw generation over generation despite transitioning to ever-smaller silicon fabrication nodes, signaling that Moore's Law isn't able to keep up with advances in performance and capabilities. The 85.5 percent of respondents who voted "yes" did so for very diverse reasons.
Our poll question wasn't a binary yes-or-no; people could vote yes for different reasons: power bill, heat, noise, and the environment. 33.5 percent of respondents felt that the power bill (energy costs) is the biggest reason they voted yes. Heat is the second-biggest factor, with 28.5 percent saying they don't want high power-draw components because power draw has a direct impact on heat, and all of that heat ends up in the room no matter how good the cooling solution is. Third place goes to noise, at 12.2 percent, with bigger cooling requirements having an impact on system noise; even big, fat liquid-cooling solutions ultimately rely on fans. Interestingly, only 11.3 percent voted that they care about the environment and hence take issue with high power-draw components. This figure, by the way, is much lower than the 14.5 percent who voted that they don't care at all about components with high power draw.

Our question took shape as we followed the generation-over-generation power-draw trend of two key components: processors and graphics cards. The GeForce GTX 980 Ti, NVIDIA's fastest consumer graphics card in 2015, drew just 211 W under the gaming workloads of its time, while the GTX 1080 Ti pulled a similar 231 W in our 2017 testing. The RTX 2080 Ti pulled 273 W in the games of 2018, while the current RTX 3090 Ti draws a whopping 445 W in today's titles. These cards were each tested in different system setups, with different driver software and games, which is why we can't put the data on a single graph, but they still illustrate a generationally rising power draw. If Moore's Law held true, a new architecture should deliver a generational increase in performance at a negligible increase in power, as the silicon transitions to a new node with increased transistor density and improved power characteristics. That, however, isn't happening.
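Put as a quick back-of-the-envelope calculation, the generational growth in those gaming power-draw figures looks like this (a minimal sketch using only the numbers above; Python is just for illustration):

# Peak gaming power draw of NVIDIA's flagship cards, using the figures
# quoted above (each measured on the test setup of its era, so the
# numbers are indicative rather than directly comparable).
flagships = [
    ("GTX 980 Ti", 2015, 211),
    ("GTX 1080 Ti", 2017, 231),
    ("RTX 2080 Ti", 2018, 273),
    ("RTX 3090 Ti", 2022, 445),
]

for (prev, _, w_prev), (curr, _, w_curr) in zip(flagships, flagships[1:]):
    print(f"{prev} -> {curr}: {(w_curr - w_prev) / w_prev:+.0%}")

# GTX 980 Ti -> GTX 1080 Ti: +9%
# GTX 1080 Ti -> RTX 2080 Ti: +18%
# RTX 2080 Ti -> RTX 3090 Ti: +63%

The jump in the latest generation dwarfs the two before it combined, which is exactly the trend that prompted our poll.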
150 Comments on PC Components with High Power Draw an Issue for an Overwhelming Majority of Users: Survey
My issue with power draw is everything that's negative about it: increased engineering costs of components, increased heat, increased environmental damage.
If you want to use efficient hardware, you're welcome to do so, but if that means it's just a 3060, then you know why. You simply accept getting "only" 30 FPS at maximum settings instead of 80, but in return you have lower power consumption.
Hobbies cost money, and gaming is a hobby. For office work, there are PCs that are very efficient and require little power, but those are only suitable for gaming to a limited extent.
Personally, I care more about heat than power. If power use is such a big problem for you, get a Celeron setup.
What more did you want? More power draw is good for moar FPS, right? Free heating in the winter?
Whenever I open the PC case, my sound card is always one of the hottest components in it.
I think sound cards, or Windows power options, should have some kind of standby mode that cuts power to the card when there is no sound in the system: no YouTube or similar activity, no active media player.
And internal archive disk drives: Windows somehow triggers these drives randomly at least 20 times a day, even when I don't access them. Each activation takes at least 10 minutes to stop, even with the power options set to a short timeout. Disabling indexing, the virtual drive, virus checks, even disabling the drives altogether, and some other tricks help only a little.
These are just two small things that quickly come to mind.
Instead of adding hundreds of new user-interface options with every new OS, Microsoft should deal with these basic things to improve energy saving.
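For the archive-drive problem, the spin-down timer at least is scriptable; a minimal sketch assuming the stock powercfg utility on Windows, run from an elevated prompt (the 10-minute value is an arbitrary example, and this sets the timeout rather than stopping whatever keeps waking the drives):

import subprocess

# Set the "turn off hard disk after" timer to 10 minutes on AC power.
# powercfg ships with Windows; a value of 0 means "never spin down".
subprocess.run(["powercfg", "/change", "disk-timeout-ac", "10"], check=True)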
As to the topic, I voted 'Yes, environment', though as noted by others above the correct answer would be 'Yes, all of the above'. IMO, the main problem is not so much the absolute power draw of each GPU as the ever-rising ceiling of GPU power draw. As the saying goes, a rising tide lifts all boats, but in this case the boats in question are the power draws of lower-end gaming setups. Anyone remember back when a GTX 960 was a ~110 W GPU? Now the 3060 - which is quite efficient for Ampere! - is 190 W instead. That's getting awfully close to a doubling for that product tier (and yes, we can absolutely discuss the dilution and transformation of GPU product tiers over the past few generations, but that's another issue) - the most popular tier. Couple that with the exploding popularity of PC gaming, and you've got a pretty significant environmental impact, while the higher power draws also force more expensive VRMs and coolers, driving up card prices. Just compare the PCBs and layouts of these two GPUs - same nominal tier, same manufacturer, but the 960 was a "fancy" model (Super Jetstream) while the 3060 is a low-end SKU. And yet ...
Sure, the newer card is more densely packed, but these illustrate quite clearly why GPUs are getting so damn expensive even outside of chip shortage-induced madness.
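Quantifying that tier creep with the two figures quoted above (a trivial sanity check, nothing more):

gtx_960_watts = 110   # the ~110 W GTX 960 mentioned above
rtx_3060_watts = 190  # the 190 W RTX 3060 mentioned above

print(f"x60-tier power grew {rtx_3060_watts / gtx_960_watts:.2f}x")  # ~1.73x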
Year   Card      Die size   Transistors (M)   Peak gaming power   Performance
2015   980 Ti    601 mm²     8,000            253 W               baseline (properly clocked at ~1400 MHz, ~25% faster)
2016   1080 Ti   471 mm²    11,800            267 W               ~+75% at 1440p
2018   2080 Ti   754 mm²    18,600            289 W               ~1.33×
2020   3080 Ti   628 mm²    28,300            359 W               ~1.33×
2022   3090 Ti   628 mm²    28,300            478 W               ~1.40×, delivering just 10% more performance
Replace the preposterously overkill 24 GB of G6X with 12 GB of GDDR6 and total power probably drops to 300 W, just like that.
Moore's Law probably only states double the transistors at the same die size; but the clock is up 33%, and with it the power.
Now, with the next-gen 4090 series, we get double the transistors (60,000 M at 611 mm², the same size) and a 33% higher clock at 600 watts, and it delivers 2.00× the performance.
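For reference, the clock-to-power relationship implied here follows the classic dynamic-power approximation P ≈ C·V²·f: frequency scales power linearly, and any extra voltage needed to reach that frequency compounds quadratically. A minimal sketch with illustrative numbers (the 10% voltage bump is an assumption, not measured data):

def relative_dynamic_power(freq_scale, volt_scale=1.0):
    # P ~ C * V^2 * f, so relative power = freq_scale * volt_scale^2
    return freq_scale * volt_scale ** 2

print(relative_dynamic_power(1.33))        # 33% clock alone: 1.33x power
print(relative_dynamic_power(1.33, 1.10))  # plus a 10% voltage bump: ~1.61x power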
But if I look around my apartment, do you know how many devices just sit there with a light on, in vampire mode? Almost everything. If you want to worry about this, you hook it up to a power strip and you turn the damn switch off. I have a USB flash drive plugged into my router for reasons, and that sucker will burn you.
The main issue is that modern CPUs and GPUs could be considered pre-overclocked and running almost at the limits of the silicon to squeeze more performance. Another is that motherboard and GPU manufacturers also try to one-up each other by using higher power limits and frequencies where possible. This completely overshadows efficiency gains at lower frequencies and saner voltages that every CPU/GPU generation still brings.
But if one can easily take back what used to be the "overclock margin" in old products, then I don't see any real issue except possibly cost for low-end product segments.
What could be improved is perhaps making it easier for everybody to configure components to run efficiently rather than overclocked, and possibly making it the default setting rather than the other way around. On many motherboards the default is no or very high power limits, with power savings mostly disabled to enhance overclockability. Another thing is that reviewers probably could focus more on highlighting performance vs power draw in GPUs and CPUs, and how components perform with saner limits.
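For NVIDIA cards specifically, a saner limit can already be scripted with the stock nvidia-smi CLI; a minimal sketch via Python (the 200 W target is an arbitrary example, setting it needs administrator/root rights, and the driver clamps the value to the card's supported range):

import subprocess

TARGET_WATTS = 200  # arbitrary example target; pick per card and preference

# Show current, default, and maximum power limits (nvidia-smi ships with the driver).
subprocess.run(
    ["nvidia-smi", "--query-gpu=power.limit,power.default_limit,power.max_limit",
     "--format=csv"],
    check=True,
)

# Apply the lower limit; this resets on reboot unless persistence mode is configured.
subprocess.run(["nvidia-smi", "-pl", str(TARGET_WATTS)], check=True)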
I would have bet environment would be top
Guess I'm glad I'm not a gambler :laugh:
I would even support the creation of a regulatory law.
What would you propose?
I've been running my 1080 at 50% power for a while now, trying to maximize every bit of its life while keeping temps reasonable in this 40 °C-ambient-on-a-good-day hellscape.