# Limiting power consumption in desktop builds?



## Deleted member 189968 (Oct 17, 2019)

Hi,

When looking at power consumption charts for newer CPUs in desktop computers, such as the i7/i9 and Ryzen 3000 series,
I'm surprised just how much power they consume these days, even at idle (all cores and threads). And Intel's processors seem to be way over their promised TDP.
It's nice to have a power-efficient laptop for most daily tasks (it feels like a waste to use a power-hungry desktop for browsing the web and writing), but of course if you need a desktop for
heavy video editing or gaming, you would need to build a good one. So my questions are these:
Which CPU would you go for today if you were to build a desktop that could do medium photo editing / 4K video editing: Ryzen or Intel? Which generation and model?
Are there safe methods to limit the TDP or power draw? Are there smart ways to regulate GPU and CPU power draw these days, e.g. at idle?






Link to power consumption thread:

Intel Core i9-9900K 8 Core and 16 Thread 5.0 GHz CPU Review Ft. Z390 AORUS Master Motherboard (wccftech.com)

Intel has released their 9th Generation Core family and today, I'll be taking a look at their Core i9-9900K CPU on the AORUS Z390 Master.


----------



## Vya Domus (Oct 17, 2019)

Most, if not all CPUs have configurable TDPs.


----------



## Bill_Bright (Oct 17, 2019)

interstellar said:


> And Intel's processors seem to be way over their promised TDP:


Says who? 

You realize your chart above is for the entire system, right? So those values for that 9900K, for example, also include the power consumed by the motherboard, the M.2 SSD, 4 x 8GB of RAM, a 1080 Ti graphics card, an H115i GTX cooler, the PSU's own inefficiencies, and whatever case fans were used, too.


----------



## newtekie1 (Oct 17, 2019)

interstellar said:


> Hi,
> 
> When looking at power consumption charts for newer CPUs in desktop computers, such as the i7/i9 and Ryzen 3000 series,
> I'm surprised just how much power they consume these days, even at idle (all cores and threads). And Intel's processors seem to be way over their promised TDP:
> ...



You know that chart isn't CPU power consumption, right?


----------



## EarthDog (Oct 17, 2019)

Outside of the miss on the chart, it has long been known that both Intel and AMD's TDP values are not what they seem to be for CPUs.



interstellar said:


> Which CPU would you go for today if you were to build a desktop that could do medium photo editing / 4K video editing: Ryzen or Intel? Which generation and model?


A simple 3800X and X570 system would work out well...


----------



## spectatorx (Oct 17, 2019)

I will just leave it here:


----------



## TheoneandonlyMrK (Oct 17, 2019)

Plus, at the end of the day, the end user controls power use, and it's easily done. Using Windows' own controls, the CPU's performance can be regulated. I crunch on my 3800X, but not at full clock: I set it to 3.5 GHz by turning down the maximum allowed processor state in the power saver profile, with hibernate disabled, for a nice power saving.
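For reference, the same cap can be applied from the command line with `powercfg`. A minimal sketch (Windows-only; `SCHEME_CURRENT`, `SUB_PROCESSOR`, and `PROCTHROTTLEMAX` are standard `powercfg` aliases, but the script only builds and prints the commands rather than running them, so you can review them first):

```python
# Sketch: build the powercfg commands that cap the "maximum processor
# state" of the active Windows power plan (here to 85% on AC power).
# Printing instead of executing keeps this reviewable; on Windows you
# could pass each command list to subprocess.run().

def throttle_commands(max_state_percent):
    """Return powercfg commands capping the CPU's maximum state (AC power)."""
    if not 1 <= max_state_percent <= 100:
        raise ValueError("max_state_percent must be between 1 and 100")
    return [
        ["powercfg", "/setacvalueindex", "SCHEME_CURRENT",
         "SUB_PROCESSOR", "PROCTHROTTLEMAX", str(max_state_percent)],
        ["powercfg", "/setactive", "SCHEME_CURRENT"],  # re-apply the scheme
    ]

if __name__ == "__main__":
    for cmd in throttle_commands(85):
        print(" ".join(cmd))
```

On many systems, setting the maximum processor state below 100% also disables boost clocks, which is where much of the saving comes from.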


----------



## EarthDog (Oct 17, 2019)

spectatorx said:


> I will just leave it here:


lol, at least this video made it to a thread where it is useful (someone posted it in a GPU thread, lol).


----------



## Voluman (Oct 17, 2019)

TDP is a marketing value you have to read up on, because it's not a direct power consumption figure. Each manufacturer describes how it 'counts' or 'means' it, and as you probably know already, the two sides interpret it differently.

Usually you can set it in your motherboard BIOS (I can on my Gigabyte s1155 boards, and on MSI FM2 too).
Look for a power/current limit or similar setting.


----------



## GorbazTheDragon (Oct 17, 2019)

Most desktop PCs aren't particularly efficient, since they operate at high voltages and frequencies. If you want maximum efficiency, you will be looking towards a lot of the server parts (EPYC and Xeon), preferably with the higher core counts. A larger processor going slower will use less power than a smaller processor going fast, for the same amount of processing done.

This difference between consumer and server parts exists because most consumers don't particularly care about energy-per-calculation efficiency, while in servers that is a large part of your overall cost.

Like others said, basically all consumer/enthusiast platforms have configurable TDPs. In cases where voltages and/or multipliers are unlocked, you can also undervolt and downclock for better efficiency.

As far as hardware goes, the most efficient CPUs at the top end of performance right now are probably the 48- or 64-core EPYC chips.


----------



## ShrimpBrime (Oct 18, 2019)

interstellar said:


> So my question is this:
> 1) Which CPU would you go for today if you were to build a desktop that could do medium photo editing / 4K video editing: Ryzen or Intel?
> Which generation and model?
> 2) Are there safe methods to limit the TDP or power draw?
> 3) Are there smart ways to regulate GPU and CPU power draw these days, e.g. at idle?



1) I have both a Ryzen 2700X and an Intel 8700K - love 'em both.
2) Yes, there are safe methods to limit TDP: manually configure a lower clock and voltage.
3) See answer #2; you can always run a desktop CPU more efficiently. It just so happens most people (enthusiasts/gamers) like to overclock instead...


----------



## Deleted member 189968 (Oct 18, 2019)

GorbazTheDragon said:


> Most desktop PCs aren't particularly efficient, since they operate at high voltages and frequencies. If you want maximum efficiency, you will be looking towards a lot of the server parts (EPYC and Xeon), preferably with the higher core counts. A larger processor going slower will use less power than a smaller processor going fast, for the same amount of processing done.
> 
> This difference between consumer and server parts exists because most consumers don't particularly care about energy-per-calculation efficiency, while in servers that is a large part of your overall cost.
> 
> ...



That's interesting, so the Intel X / Xeon series has lower power consumption than most i5, i7, i9 chips at idle, and has more cores and threads.
The ideal scenario would be a PC that uses 20-40 watts max at idle, and then ramps up to 200-300 watts during video editing or Photoshop.
The Xeons are a bit more expensive, so maybe my best option - for a potential future build - would be to get a used Xeon and a motherboard that uses little power, and perhaps a GPU that goes into a sleep mode during idle, or something like that.

It's nice to know that there are configurable TDP options; I will look into that for the CPU that suits my needs.

*Bill_Bright*: Yes, now I realize it, lol - I was guilty of being a little too fast! Thank you for the clarification


----------



## newtekie1 (Oct 18, 2019)

The fact of the matter is, if you don't build a huge gaming rig with a huge gaming GPU and a bunch of other RGB accessory garbage, you can leave desktop processors pretty much stock and get good power efficiency. They already do pretty much exactly what you want: idle down to 20-40w and go higher when under load.

My i5-9400 rig idles at 8w, yes, 8w. And that's with the WiFi card enabled, but not connected, just scanning for available networks. I could probably cut another 1w off if I just disabled the WiFi card.

Under full load, running Cinebench, it goes up to 84w. Yes, that is higher than the rated TDP, but that is how modern turbo boost works on CPUs and GPUs. Also, TDP isn't how much power the CPU will use under full load; it is an envelope, or range, of how much heat the CPU will produce. From what I've gathered, it is how much heat the CPU will produce at its base clock. But, again, this is full load, something most PCs don't really see that often.
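The turbo behaviour described here is commonly implemented as two power limits: a short-term one (often called PL2) that the package may use while a turbo budget lasts, and a long-term one (PL1) that roughly corresponds to the rated TDP. A toy step model (the numbers are illustrative, not the i5-9400's actual limits):

```python
# Toy model of turbo power limits: under sustained full load the package
# may draw up to PL2 while the turbo window (tau) lasts, then settles to
# PL1, which is roughly the rated TDP. All numbers are illustrative.

def package_power(t_seconds, pl1=65.0, pl2=90.0, tau=28.0):
    """Package power at time t under sustained full load (step model)."""
    return pl2 if t_seconds < tau else pl1

print(package_power(5.0))   # early in the run: above the 65 W "TDP"
print(package_power(60.0))  # after the turbo budget expires: back at PL1
```

This is why a short Cinebench run can show a figure above TDP while a long render eventually settles near it.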

Browsing the web with Chrome and watching a 1080p YouTube video has the power bouncing around between 12w (when just the YouTube video is playing) and 35w when pages are loading in Chrome. It will even spike to 60-80w, but only very briefly.

There is no need to look at a Xeon or anything like that to get power efficiency. In fact, in my experience, they aren't really any more power efficient, except for the ones that don't have an iGPU, because obviously they don't have to power an iGPU. But then you're making up for it by putting in a GPU that likely consumes more power than the iGPU would... The fact is, even desktop processors are power efficient: they'll idle down to almost no power, just like laptop CPUs, and they'll speed up when needed to complete the work asked of them. They don't just sit there sucking down power full blast all the time; both Intel and AMD have done a very good job at power efficiency.

So, the question of which CPU I would pick for medium photo editing / 4K video editing? Well, those are two very different things, aren't they?

Medium photo editing, heck, heavy photo editing, can be done with an i3. You'd be fine with an i3-9100, but at that price point you're also looking at an R7 2700 on the AMD side. The 2700 is hard to pass up in that situation.

On the other hand, 4K video editing is a completely different beast from photo editing; it needs way more CPU horsepower, and a halfway decent GPU helps here too (while a GPU wouldn't really help with photo editing). On top of that you need storage that can support 4K video editing, but that's a different discussion. Let's just focus on the processing part here. For 4K video editing, you'll need the best CPU you can afford. That's all it comes down to. But even if you get something like an i9-9900, i7-9700, or 3900X, they will all idle at very low power usage. I'd bet they all would idle at under 20w - probably not the whole system, especially if you have a GPU, but the processors themselves would all idle below 20w, and heck, probably close to 10w. The next concern is the GPU. If you get something mid-range, that too can idle pretty darn low. Going with something in the 1660 to 2060 range will help accelerate 4K video editing without consuming insane amounts of power. The 1660 idles at like 6w and the 2060 at like 9w. So with the GPU, you're still likely going to be idling in the 20-40w range - closer to 20w than 40w, I would guess.


----------



## Bill_Bright (Oct 18, 2019)

I admit, I did not sit through the whole 38 minutes of that video. Pretty sure it would have just wasted about 37 minutes of my time. 

CPUs rarely sit at maximum utilization for long periods of time. In fact, most of the time, they sit closer to idle than maxed out. And as newtekie1 pointed out, at idle, they burn less than many night-lights.

GPUs are often the most power hungry devices in our computers. If not the most, certainly the 2nd most. And they are not maxed out most of the time either.

RAM, when not idle, can get hungry. 

Drives, fans, motherboards, USB devices all consume power. Even the least efficient devices in our computers, the power supply, eats up power - often lots of it. 

So even if you game hard for 8 hours straight, for most of that 8 hours, power consumption is not anywhere near maximum. 

If wasting energy is a concern (and it should be) make sure you turn off the lights and TV when you leave the room. Know what you want in the refrigerator BEFORE opening the door. Set the thermostat a couple degrees higher in the summer and lower in the winter. Caulk around your windows and doors. Keep the garage door closed.


----------



## kapone32 (Oct 18, 2019)

Bill_Bright said:


> I admit, I did not sit through the whole 38 minutes of that video. Pretty sure it would have just wasted about 37 minutes of my time.
> 
> CPUs rarely sit at maximum utilization for long periods of time. In fact, most of the time, they sit closer to idle than maxed out. And as newtekie1 pointed at, at idle, they burn less than many night-lights.
> 
> ...



And if you don't have LED lights, get some, as lighting is among the biggest energy users in our homes. One 100-watt light bulb uses 1 kWh (1000 watt-hours) of energy every 10 hours. Now imagine if you have 25 of them around your home.
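The bulb arithmetic works out like this (the electricity rate below is an assumption purely for illustration):

```python
# Energy (kWh) = watts / 1000 * hours. A 100 W bulb run for 10 hours
# uses 1 kWh; 25 such bulbs at a few hours a day add up quickly.

RATE_PER_KWH = 0.13  # assumed electricity price in $/kWh, for illustration

def energy_kwh(watts, hours):
    return watts / 1000.0 * hours

one_bulb = energy_kwh(100, 10)    # one 100 W bulb for 10 hours: 1.0 kWh
daily = 25 * energy_kwh(100, 4)   # 25 bulbs averaging 4 h/day: 10.0 kWh
print(one_bulb, daily, round(daily * RATE_PER_KWH, 2))
```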


----------



## generaleramon (Oct 18, 2019)

I'll say...
1 - ITX build with an AMD 3400G APU (+ undervolt), 3466+ MHz RAM, all solid-state storage, 80 Plus Gold or better rated PSU (450w max).
You get 8 threads and a good Vega GPU for light gaming and editing; very, very good perf/watt. I really, really like this config, I must say.

2 - ITX build with an AMD 3700X CPU, an efficient midrange GPU (something like a 1660 or 2060), all solid-state storage, 80 Plus Gold or better rated PSU (450w max).
16 threads and a dedicated GPU; way more powerful, but consuming a bit more.

I have an undervolted (using a negative offset) Ryzen 1700, 16GB G.Skill 3333MHz @ CL14, Biostar X370 GTN, SATA SSD + SATA HDD, reference Vega 64 (1425 core, 1075 HBM2), 80 Plus Gold 450w PSU, and 3x 120mm fans. Idle power is 45-47w at the wall. Amazing, I must say.


----------



## EarthDog (Oct 18, 2019)

newtekie1 said:


> My i5-9400 rig idles at 8w, yes 8w.


Is that just the CPU? Wondering how the motherboard and such manages this while idle but not 'sleep'. What is showing you this information?


----------



## newtekie1 (Oct 18, 2019)

EarthDog said:


> Is that just the CPU? Wondering how the motherboard and such manages this while idle but not 'sleep'. What is showing you this information?



That is the entire computer.  Measured with a Kill-A-Watt and confirmed with my UPS software.  This is power consumption at the wall, and the power supply isn't even 80+ rated AFAIK.


----------



## Deleted member 189968 (Oct 18, 2019)

generaleramon said:


> I'll say...
> 1 - ITX build with an AMD 3400G APU (+ undervolt), 3466+ MHz RAM, all solid-state storage, 80 Plus Gold or better rated PSU (450w max).
> You get 8 threads and a good Vega GPU for light gaming and editing; very, very good perf/watt. I really, really like this config, I must say.
> 
> ...



Thank you for sharing these setups, it's nice to see some actual examples of what a power-efficient machine would look like!
Does using a *400W power supply draw less power than using an 800W power supply?* What if your GPU and CPU need more power than the power supply can supply?

*Bill_Bright*: If RAM consumes much power when under load, would it then not be smarter
to have more RAM than necessary, e.g. 16GB instead of 8GB? Then it doesn't get as "strained"

*So, to save power on a computer:*
1. Get an ITX motherboard
2. Platinum 80+ PSU
3. Undervolt your CPU / configure the TDP (it's not dangerous? What if you run a heavy program while the CPU is undervolted? What would happen?)
4. Get an SSD
5. Good cooling (efficient coolers)
6. Connect a power plug to your neighbor's AC and run the wire to your own home (this saves 100% of your power)

Anything else?


----------



## John Naylor (Oct 18, 2019)

Real data is not too hard to find... and from reliable sources. For example:









AMD Ryzen 9 3900X Review (www.techpowerup.com)

The flagship of AMD's new Ryzen 3000 lineup is the Ryzen 9 3900X, which is a 12-core, 24-thread monster. Never before have we seen such power on a desktop platform. Priced at $500, this processor is very strong competition for Intel's Core i9-9900, which only has eight cores.




If concerned about power, pick a PSU substantially larger than your power needs. PSUs hit peak efficiency at 50% of rated load. So yes, if you are using 400-425 watts, an 850-watter would be a better option than a 450-watter. But low power and best efficiency are two very different things.
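The 50%-of-rated-load heuristic amounts to this (a rule of thumb from the post above, not a specification; as discussed later in the thread, good units have fairly flat efficiency curves anyway):

```python
# Pick a PSU whose rating puts the expected full load near the middle of
# its range, where many 80 Plus efficiency curves peak (~50% load).

def recommended_rating(expected_load_w, target_fraction=0.5):
    """PSU rating placing expected_load_w at target_fraction of capacity."""
    return expected_load_w / target_fraction

print(recommended_rating(425))  # 850.0 - matching the 850 W suggestion
```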

Better yet, use a laptop ...  and a K-tor Power Box 20 Watt Pedal Generator for you and a few friends.


----------



## ShrimpBrime (Oct 18, 2019)

interstellar said:


> Thank you for sharing these setups, it's nice to see some actual examples of what a power-efficient machine would look like!
> Does using a *400W power supply draw less power than using an 800W power supply?* What if your GPU and CPU need more power than the power supply can supply?



No, a 400w PSU does not draw less power than an 800w one. It depends on the demands of the system and also the efficiency of the PSU itself.

And no, it's not 400w sustained; that's a peak-wattage advertisement.
80% of 400w, loaded for a period of time, is what you should base your PSU purchase around.

So on a 400w peak rating, an 80% load is only 320w, which you could safely sustain.
You don't want to run a PSU at its peak for a sustained period of time.
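That 80% guideline, as arithmetic (a rule of thumb from this post, not a manufacturer spec):

```python
# Treat only ~80% of a PSU's peak rating as safely sustainable load.

def sustainable_watts(peak_rating_w, derate=0.8):
    return peak_rating_w * derate

print(sustainable_watts(400))  # 320.0 - as in the example above
```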

If your PSU is too small, one of two things happens:
1. The system won't turn on.
2. The system shuts off under load.

Yeah, I get shunned for running an 850w PSU. "It's way overkill," they say. "You wasted money," they say.
Well, I've had it for 9 years on many, many systems now. It still spanks 550w (peak) Corsair PSUs all day, lol.


----------



## authorized (Oct 18, 2019)

John Naylor said:


> If concerned about power, pick a PSU substantially large than your power needs.  PSUs hit peak efficiency at 50% of rated load.  So yes, if you are using 400 ~ 425 watts,  an 850 watter would be a better option than a 450 watter.  But low power and best efficiency are two very different things.


Good power supplies have fairly flat efficiency curves. You'll end up spending more money buying that oversized PSU than you will ever save from a potential few-percent increase in efficiency.
It's time for this 50%-load golden zone to go away.
Not to mention that most PCs spend most of their time at idle, or at least in a low-power state.

People are too obsessed with power saving these days, to the point they pay premium for the privilege of doing it.


----------



## ShrimpBrime (Oct 18, 2019)

Most users in forums like this are gamers and overclockers.
These types of people will always ask about load-based situations, because they tend to actually load the system.

Click @newtekie1's system specs above. The build was done right:
8700K, 1080 Ti, liquid cooling, fans, the whole ball of wax. He's smart - 850w PSU. He can load that to 600w sustained and easily run SLI.

PSU and power saving? Buy an Athlon 220GE, use the iGPU, and you can use a nice small PSU.

Gaming and OC... go big or stay home.

Will a 650w peak suffice for most single-card gaming rigs? Sure, why not? But would you build the system to fully pull 650w and still call it good? I think not.

People aren't "saving power" building gaming rigs.
Save power with a gaming rig? Game less, lol.


----------



## oobymach (Oct 18, 2019)

80 Plus Gold is fine, you don't need Platinum. Do your research first: some PSUs are better than others even from the same manufacturer, so buyer beware; always check reviews. I recommend buying at least 250 watts more than you need, for future upgrades. Just because you buy an 800-watt PSU doesn't mean it'll be pulling 800 watts all the time (unless it's a crap PSU).

Also, underclocking doesn't necessarily save money.


----------



## 64K (Oct 18, 2019)

The most common misunderstandings about PSUs:

A Gold rated PSU is always better quality than a Bronze rated PSU.

Using PSU wattage calculators on the net or recommendations from card manufacturers is the way to calculate what wattage you need for your rig.

A gold rated 600 watt PSU at maximum efficiency can only deliver 520 watts because it's only 87% efficient at full load.

Spending extra for a Platinum or Titanium PSU will pay for itself for everyone in the long run from saved electricity costs on the utility bill.


----------



## generaleramon (Oct 19, 2019)

Regarding RAM power consumption: around 2W per DDR4 RDIMM.

16x 8GB: 232W
4x 8GB: 205W
Savings: 27W

That's on a server; on a desktop PC with at most 4 DIMMs it's not going to make a big difference. Most ITX boards only have 2 DIMM slots anyway.

References:
https://www.servethehome.com/ddr4-dimms-system-power-consumption-tested/
https://www.tomshardware.com/reviews/intel-core-i7-5960x-haswell-e-cpu,3918-13.html
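As a sanity check, the ~2 W/DIMM estimate lands close to the measured difference above:

```python
# At roughly 2 W per DDR4 RDIMM, dropping from 16 DIMMs to 4 should save
# about 24 W - in the same ballpark as the measured 27 W (232 W vs 205 W).

WATTS_PER_DIMM = 2.0  # rough per-module figure from the linked tests

def ram_power_estimate(dimm_count):
    return dimm_count * WATTS_PER_DIMM

print(ram_power_estimate(16) - ram_power_estimate(4))  # 24.0
```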



64K said:


> The most common misunderstandings about PSUs:
> 
> A Gold rated PSU is always better quality than a Bronze rated PSU.
> 
> ...


It's the other way around: a 600w PSU with 87% efficiency is going to pull 689W from the wall at full load. That's why using a more efficient PSU reduces the total system power usage.

References:
Wikipedia: 80Plus Technical overview


> ...For instance, a 600-watt power supply with 60% efficiency running at full load would draw 1000 W from the mains and would therefore waste 400 W as heat. On the other hand, a 600-watt power supply with 80% efficiency running at full load would draw 750 W from the mains and would therefore waste only 150 W as heat...
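The arithmetic behind both examples: a PSU's wattage rating is its DC output, wall draw is output divided by efficiency, and the difference is waste heat:

```python
# Wall draw = DC load / efficiency; the remainder becomes heat.

def wall_draw(dc_load_w, efficiency):
    return dc_load_w / efficiency

def waste_heat(dc_load_w, efficiency):
    return wall_draw(dc_load_w, efficiency) - dc_load_w

print(round(wall_draw(600, 0.87)))  # ~690 W from the wall (689.7)
print(wall_draw(600, 0.60))         # 1000.0 W, as in the Wikipedia example
print(waste_heat(600, 0.80))        # 150.0 W wasted as heat
```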


----------



## 64K (Oct 19, 2019)

generaleramon said:


> It's the other way around: a 600w PSU with 87% efficiency is going to pull 689W from the wall at full load. That's why using a more efficient PSU reduces the total system power usage.



That could be why I posted that as one example of misunderstandings and you misunderstood that. This thread is getting comical at this point.


----------



## generaleramon (Oct 19, 2019)

64K said:


> That could be why I posted that as one example of misunderstandings and you misunderstood that. This thread is getting comical at this point.


S*hit, you are right... Yeah, the situation is pretty comical.


----------



## newtekie1 (Oct 19, 2019)

ShrimpBrime said:


> 8700K, 1080 Ti, liquid cooling, fans, the whole ball of wax. He's smart - 850w PSU. He can load that to 600w sustained and easily run SLI.
> 
> PSU and power saving? Buy an Athlon 220GE, use the iGPU, and you can use a nice small PSU.



Oh man, I'm glad you reminded me, I need to update that. I actually dropped a 1000w in there a few months back because the 850w started acting flaky.

I have a 220GE system too; it has a 450w power supply in it. Still overkill for that system!



64K said:


> Spending extra for a Platinum or Titanium PSU will pay for itself for everyone in the long run from saved electricity costs on the utility bill.



I wrote a big long thing on this exact topic. You're correct, they almost never pay for themselves in my experience.  A cheap 80+ Gold on sale is the way to go over a Platinum/Titanium that likely will cost twice as much.


----------



## Aquinus (Oct 19, 2019)

interstellar said:


> And Intel's processors seem to be way over their promised TDP:


Yeah, because you don't seem to understand what TDP means for Intel CPUs:


> The upper point of the thermal profile consists of the Thermal Design Power (*TDP*) and the associated Tcase value. Thermal Design Power (*TDP*) should be used for processor thermal solution design targets. *TDP* is not the maximum power that the processor can dissipate.


https://www.intel.com/content/dam/doc/white-paper/resources-xeon-measuring-processor-power-paper.pdf


----------



## ShrimpBrime (Oct 19, 2019)

Aquinus said:


> Yeah, because you don't seem to understand what TDP means for Intel CPUs:
> 
> https://www.intel.com/content/dam/doc/white-paper/resources-xeon-measuring-processor-power-paper.pdf



Must bold and underline the T for THERMAL design point.....


----------



## Aquinus (Oct 19, 2019)

ShrimpBrime said:


> Must bold and underline the T for THERMAL design point.....


The important part is:


> *TDP is not the maximum power that the processor can dissipate.*


I made it bold, italics, and underlined just for you.


----------



## ShrimpBrime (Oct 19, 2019)

Aquinus said:


> The important part is:
> 
> I made it bold, italics, and underlined just for you.



You the man. 
However, a CPU doesn't dissipate power, it consumes it.
You dissipate heat, which is the product of not being so efficient...

The CPU that releases the least amount of heat (BTU) per cycle would be the most efficient.


----------



## Bill_Bright (Oct 19, 2019)

interstellar said:


> *Bill_Bright*: If RAM consumes much power when under load, would it then not be smarter
> to have more RAM than necessary, e.g. 16GB instead of 8GB? Then it doesn't get as "strained"


It is smarter to have more RAM, but not for that reason. You are suggesting RAM under load is being "strained". It's not. The fact is, two sticks consume more power than one, and four sticks consume more than two. But regardless, a stick of DDR3 only consumes about 3W (regardless of its density), and DDR4 consumes even less because it uses a lower operating voltage.


----------



## Aquinus (Oct 19, 2019)

ShrimpBrime said:


> However, a CPU doesn't dissipate power, it consumes it.


Power, not electricity. And when you say "consume", you mean that CPUs convert electric potential energy into heat energy, which is expressed as power.


			
What is Power? said:
			
		

> In physics, power is the rate of doing work or of transferring heat, i.e. the amount of energy transferred or converted per unit time.


https://en.wikipedia.org/wiki/Power_(physics)


----------



## Bill_Bright (Oct 19, 2019)

Aquinus said:


> CPUs convert electric potential energy into heat energy, which is represented as power.


True, except heat generated by a CPU is wasted energy. And that heat is dissipated by the IHS and heatsink.

CPUs convert electrical energy into work. And in physics (of which electronics is a part) work is the movement or displacement of an object. And within the CPU, those objects are the billions and billions of transistor gates that are flip-flopping back and forth to represent 1s and 0s. 

Ideally, CPUs would convert 100% of the energy they consume into work. But CPUs are not 100% efficient. So some of that energy is simply wasted in the form of heat.


----------



## Aquinus (Oct 19, 2019)

Bill_Bright said:


> True, except heat generated by a CPU is wasted energy. And that heat is dissipated by the IHS and heatsink.
> 
> CPUs convert electrical energy into work. And in physics (of which electronics is a part) work is the movement or displacement of an object. And within the CPU, those objects are the billions and billions of transistor gates that are flip-flopping back and forth to represent 1s and 0s.
> 
> Ideally, CPUs would convert 100% of the energy they consume into work. But CPUs are not 100% efficient. So some of that energy is simply wasted in the form of heat.


Well, if we're really going to get technical about it, the CPU really isn't doing any "work" as the only type of energy that the electric potential is being converted to is heat. "Work" would suggest that the CPU is converting that energy into a force through some sort of mechanism. Ideally, the CPU wouldn't convert any energy to heat at all, but that would suggest zero resistance in all states of the circuit.

You, however, can do work (with your body) by throwing it.


----------



## Bill_Bright (Oct 19, 2019)

Aquinus said:


> Well, if we're really going to get technical about it, the CPU really isn't doing any "work" as the only type of energy that the electric potential is being converted to is heat. "Work" would suggest that the CPU is converting that energy into a force through some sort of mechanism.


It is too doing work. Those gates are made of electrons that are physically moving - as in a mechanical movement or mechanism!  They physically flip open ("0" or "low") to block current flow through the gate and close ("1" or "high") to allow current flow.

That's exactly why wires get hot when too much current is flowing - it is due to the "friction" of too many electrons banging into the atoms of the conductor.


Aquinus said:


> Ideally, the CPU wouldn't convert any energy to heat at all, but that would suggest zero resistance in all states of the circuit.


Right - as I said, ideally the CPU would convert 100% of the energy into work - that is, into flip-flopping those gates. But due to friction (resistance), some of that energy is wasted in form of heat.


----------



## Aquinus (Oct 19, 2019)

Bill_Bright said:


> It is too doing work. Those gates are made of electrons that are physically moving - as in a mechanical movement or mechanism!  They physically flip open ("0" or "low") to block current flow through the gate and close ("1" or "high") to allow current flow.
> 
> That's exactly why wires get hot when too much current is flowing - it is due to the "friction" of too many electrons banging into the atoms of the conductor.
> Right - as I said, ideally the CPU would convert 100% of the energy into work - that is, into flip-flopping those gates. But due to friction (resistance), some of that energy is wasted in form of heat.


If you're observing a single electron, sure, there is work, but the system as a whole does no work. The change in kinetic energy for the entire system is zero, so no work.


----------



## TheoneandonlyMrK (Oct 19, 2019)

authorized said:


> Good power supplies have fairly flat efficiency curves. You'll end up spending more money buying this powerful psu than you will ever save having a potential few % increase in efficiency.
> It's time for this 50% load golden zone to go away.
> Not to mention that most PCs spend most of their time in idle or at least in low energy consumption mode.
> 
> People are too obsessed with power saving these days, to the point they pay premium for the privilege of doing it.


There are other benefits to a big PSU that's lightly loaded: they tend to have long warranties, they last a long time anyway, and they suit most if not all scenarios, reasonable or not. I would buy a suitably sized supply for a work or gaming PC, but for enthusiasts swapping components and expectations on a whim, a decent-sized PSU can be a reasonable purchase.
For 98% of use cases, a 50-75 watt headroom gap is advisable to prolong life expectancy and allow for the occasional crazy power spikes that can happen, IMHO.
@the rest:
And I really think some people get a bit too involved in trying to prove themselves right in threads, usually on tangential points with marginal relevance.


----------



## ShrimpBrime (Oct 19, 2019)

Aquinus said:


> Power, not electricity. and when you say "consume" you mean that CPUs convert electric potential energy into heat energy, which is represented as power.
> 
> https://en.wikipedia.org/wiki/Power_(physics)



The "power" a CPU uses is in the form of electricity.
You convert average electrical wattage to BTU dissipation, which is a result of poor efficiency.

Since it takes power to operate a transistor, which opens and closes, heat is created because the power isn't all entirely consumed (it leaks); transistors create heat from being on, and during operation they have no lubricant, which also creates heat.

However it's looked at, heat is created and dissipated in the form of BTU.

Just like TDP is not directly related to power usage, but to the heat that gets dissipated per hour.

So yeah, we can say a CPU dissipates power. It's just converted to heat dissipation, again due to the lack of efficiency.
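The watts-to-BTU conversion mentioned above is a fixed factor: one watt sustained is about 3.412 BTU per hour.

```python
# 1 W of continuous dissipation = ~3.412 BTU/h, so a CPU shedding 95 W
# puts out roughly 324 BTU per hour.

BTU_PER_HOUR_PER_WATT = 3.412

def btu_per_hour(watts):
    return watts * BTU_PER_HOUR_PER_WATT

print(round(btu_per_hour(95)))  # 324
```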



theoneandonlymrk said:


> @the rest
> And I really think some people get a bit too involved in trying to prove themselves right in threads , usually on tangential points with marginal relevance.



Just sayin


----------



## Aquinus (Oct 19, 2019)

ShrimpBrime said:


> and during operation have no lubricant which also creates a heat.


This isn't an engine or a relay switch; transistors have no moving parts. A CPU converts energy to heat because circuits have resistance (not friction), nothing more, nothing less; the heat comes from electrons striking other atoms instead of traveling in a straight line unimpeded.


----------



## ShrimpBrime (Oct 19, 2019)

Aquinus said:


> This isn't an engine or a relay switch; transistors have no moving parts. A CPU converts energy to heat because circuits have resistance (not friction), nothing more, nothing less; the heat comes from electrons striking other atoms instead of traveling in a straight line unimpeded.


Every action has an equal and opposite reaction.
There is no such thing as friction without resistance. I'm not sure where exactly you're taking that??
Electrons being resisted creates friction, and that's how heat is created, no??
So you're dissipating the energy of electrons that are being resisted, causing friction, dissipated as BTU, correct??


----------



## Aquinus (Oct 19, 2019)

ShrimpBrime said:


> Every action has an equal and opposite reaction.
> There is no such thing as friction without resistance. I'm not sure where exactly you're going with that??
> Electrons being resisted creates friction, and that's how heat is created, no??
> So you're dissipating the non-efficient electrons that are being resisted, causing friction, dissipated as BTU, correct??


Just because friction and resistance both have the same outcome - converting some form of energy into heat energy - doesn't mean they're achieved through the same mechanisms. To quote Wikipedia:


> Electrical resistance shares some conceptual parallels with the notion of mechanical friction.


and


> A voltage difference between two points of a conductor creates an electric field that accelerates charge carriers in the direction of the electric field, giving them kinetic energy. When the charged particles collide with ions in the conductor, the particles are scattered; their direction of motion becomes random rather than aligned with the electric field, which constitutes thermal motion. Thus, energy from the electrical field is converted into thermal energy.[3]


Whereas friction is:


> *Friction* is the force resisting the relative motion of solid surfaces, fluid layers, and material elements sliding against each other.
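The resistive-heating mechanism described above can be put in numbers with Joule's law (P = I²R, equivalently P = V²/R); a small sketch with illustrative values, not CPU-specific measurements:

```python
# Joule heating: power dissipated in a resistance, two equivalent forms.
def power_from_current(current_a, resistance_ohm):
    """P = I^2 * R, in watts."""
    return current_a ** 2 * resistance_ohm

def power_from_voltage(voltage_v, resistance_ohm):
    """P = V^2 / R, in watts."""
    return voltage_v ** 2 / resistance_ohm

# Illustrative numbers: 1.2 V across 0.012 ohm draws I = V/R = 100 A
# and dissipates 120 W -- the same answer from either form.
i = 1.2 / 0.012
print(round(power_from_voltage(1.2, 0.012), 3))  # 120.0
print(round(power_from_current(i, 0.012), 3))    # 120.0
```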


----------



## Bill_Bright (Oct 20, 2019)

theoneandonlymrk said:


> There are other benefits of a big PSU that's loaded light, they tend to have long warranties, last a long time anyway and suit any if not all reasonable or not scenarios.



This is all true - though lasting a long time is not a result of light loading, assuming the demand on the PSU is not above its capacity. A "quality" PSU can run at near capacity for just as long - as long as it is properly cooled.

Your first point is an irritant for me that we see with all sorts of products: the bigger models have more features and often better warranties.

I just saw this with snow blowers - as a silly but applicable example. Go for a big 28" wide two-stage model and you can find them with multiple speeds forward and 2 or 3 speeds in reverse (or even variable speeds both ways). You can get them with lights, remote chute turners, even hand warmers and more. But try to find all those nice-to-have features on a 22" wide two-stage (that will fit next to my truck in my narrow garage) and you are lucky to get 2 speeds forward, and that's it. And it is not that there is no room on the handles or that the motor is too weak.

Regardless, assuming the same "quality" of parts, as long as the amount of snow does not exceed the 22" blower's capacity, there is nothing to suggest the 28" blower would have a longer life expectancy.



theoneandonlymrk said:


> For 98% of use cases a 50-75 watt headroom gap is advisable to prolong life expectancy and allow the occasional crazy power spikes that can occasionally happen, IMHO.


Pretty sure no one is recommending users buy a PSU that exactly matches the demand. I typically recommend an extra 100 W to allow for future unforeseen upgrades and perhaps quieter operation, and I feel that makes sense. What I don't see making sense is buying a 1.2 kW supply when a 600 W supply is more than enough.

Sadly, with budget factory computer builds, we often see included PSUs that are barely adequate for the hardware they come with. Simply adding another hard drive may cause one to consider a bigger PSU. Upgrading the graphics solution - even from integrated to a modest card - likely would require a bigger supply. 
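The sizing rule above (sum the expected component draw, add roughly 100 W of headroom) is easy to sketch; the component wattages below are made-up illustrative estimates, not measurements of any particular build:

```python
# PSU sizing sketch: sum the estimated component draws, add headroom.
def recommend_psu_watts(component_draws_w, headroom_w=100.0):
    """Total estimated load plus upgrade/spike headroom, in watts."""
    return sum(component_draws_w.values()) + headroom_w

# Hypothetical mid-range build (numbers are illustrative only).
build = {
    "cpu_load": 95,
    "gpu_load": 250,
    "board_ram_drives_fans": 60,
}
print(recommend_psu_watts(build))  # 505.0 -> shop for a quality ~550-650 W unit
```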



Aquinus said:


> The change in kinetic energy for the entire system is zero, so no work.


No. You could say that of each rest state between clock cycles. Kinetic energy - the energy stored in a "moving" object - is not the only requirement for "work" in digital electronics. Part of the work done was the flipping of the gates, not the energy stored in the moving gate. But work is also done in simply "crunching the data". It takes "work" just to refresh a static display - even if that only involves verifying nothing changed.

You are suggesting the lighting of a lightbulb filament involves no "work" because there is no stored kinetic energy. By your definition, a CPU would still be doing no work even if it were rendering a rotating, transparent 3D image of the USS Gerald R. Ford.

And I am not talking about a single electron. I am talking about billions of electrons in 64-bit wide chunks, billions of times a second.

And I note a CPU is really never doing nothing - unless powered off.



ShrimpBrime said:


> You convert average electrical wattage to BTU dissipation which is a result of poor effeciency.


This is true when you are talking about HVAC systems. We are not. You are right that heat (BTU) "generation" (not dissipation, BTW - that's another subject) is the result of poor efficiency. But heat from a CPU is a "by-product", and it is created by just a small percentage of the total wattage consumed.

*******

Getting back on-topic - limiting power consumption in desktop builds starts before you buy your components. If the desire is to limit consumption, you wouldn't be buying an i9-9900K or a Ryzen 7 1800X with a 1080 Ti graphics card. There are many less hungry solutions out there. Careful and thorough homework will get you more efficient PSUs, motherboards, drives, fans, and monitors too. Using headphones instead of powered speakers can help.

But frankly, if I am building a gaming rig - something for entertainment - I'm looking for the most bang for my (or the client's) money. Energy efficiency is not something I worry about except in the PSU. And even then, I'm generally happy with "Gold". The same goes for systems used primarily for CAD/CAE or graphics editing work. If the best card or CPU for the money is a power hog, oh well.

On the other hand, if building an "office" computer for a SOHO client, a computer to be used for Word, Excel and PowerPoint, email, updating Facebook and the occasional game of solitaire, saving energy may be a higher priority in selecting each component.
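For an already-built system, the configurable TDP mentioned earlier in the thread is an option too: on Linux, the intel_rapl powercap driver exposes package power limits in microwatts under sysfs. The exact path varies per machine, so treat this as a sketch and verify the path under /sys/class/powercap/ before writing to it:

```python
# Sketch: capping CPU package power via the Linux powercap (RAPL) interface.
# The sysfs path below is typical for Intel chips but is an assumption --
# check /sys/class/powercap/ on your own machine first.
RAPL_PL1 = "/sys/class/powercap/intel-rapl:0/constraint_0_power_limit_uw"

def watts_to_microwatts(watts):
    """The powercap interface takes limits in microwatts."""
    return int(watts * 1_000_000)

# e.g. cap the package near 65 W instead of letting it boost far past TDP
limit_uw = watts_to_microwatts(65)
print(f"echo {limit_uw} | sudo tee {RAPL_PL1}")
```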


----------



## oobymach (Oct 20, 2019)

Bill_Bright said:


> True, except heat generated by a CPU is wasted energy. And that heat is dissipated by the IHS and heatsink.
> 
> CPUs convert electrical energy into work. And in physics (of which electronics is a part) work is the movement or displacement of an object. And within the CPU, those objects are the billions and billions of transistor gates that are flip-flopping back and forth to represent 1s and 0s.
> 
> Ideally, CPUs would convert 100% of the energy they consume into work. But CPUs are not 100% efficient. So some of that energy is simply wasted in the form of heat.


Yes, but it's not wasted if you live in a cold climate. My cpu heats my room to summer temperatures if I run it overclocked without the need for additional heating.


----------



## TheoneandonlyMrK (Oct 20, 2019)

oobymach said:


> Yes, but it's not wasted if you live in a cold climate. My cpu heats my room to summer temperatures if I run it overclocked without the need for additional heating.


Same here, crunching has its direct benefits in winter, but I've tuned my cooling and lowered clocks to save power and output a nice ambient temperature within the thermal limits of each component. My CPU hits 71 max, the GPU 56, pulling about 338 watts average with it fully loaded crunching and folding.
Not bad with a Vega 64 loaded up.


----------



## oobymach (Oct 20, 2019)

theoneandonlymrk said:


> Same here, crunching has its direct benefits in winter, but I've tuned my cooling and lowered clocks to save power and output a nice ambient temperature within the thermal limits of each component. My CPU hits 71 max, the GPU 56, pulling about 338 watts average with it fully loaded crunching and folding.
> Not bad with a Vega 64 loaded up.


Ran my 8370 during the summer underclocked to 3.9 GHz (I ran it @ 4.7 most times) to stop it from making my room an oven. I have a window fan to control my room temp, but it can only do so much when the temp outside soars. I'm playing with 3.9 GHz on my 3600X atm, because it runs 10 degrees cooler than 4.2.


----------



## Bill_Bright (Oct 20, 2019)

oobymach said:


> Yes, but it's not wasted if you live in a cold climate.


lol

Well, there's some truth to that. The winters here are bitterly cold too, and when both computers and all 4 monitors in this room get going, this room can get rather toasty. But that presents a problem: when I then leave this room, the rest of the house "feels" like it is freezing. Not good. And in the summer time, it actually can cause the AC to work harder. So in that respect, it means a lot of wasted energy.


----------



## ShrimpBrime (Oct 20, 2019)

Bill_Bright said:


> lol
> 
> Well, there's some truth to that. The winters here are bitterly cold too, and when both computers and all 4 monitors in this room get going, this room can get rather toasty. But that presents a problem: when I then leave this room, the rest of the house "feels" like it is freezing. Not good. And in the summer time, it actually can cause the AC to work harder. So in that respect, it means a lot of wasted energy.



2700X, 28x multi at 0.948 V, 16 threads; 30x multi at 1.050 V, 16 threads. These are my "passive cooling" numbers.
Max package wattage under 50 W, cores about 26 W. Under load in F@H at 2.8 GHz, under a volt. Under 75 W at 3.0/3.1 GHz.

Not bad huh?
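Package wattage numbers like these can be sanity-checked on Linux by sampling the RAPL energy counter (`energy_uj`, cumulative microjoules) twice and dividing by the interval; the arithmetic is sketched below, with the platform-dependent sysfs read left as a comment:

```python
# Average package power from two RAPL energy-counter samples.
# energy_uj readings are cumulative microjoules; watts = delta-J / seconds.
# (On a real system, read e.g. /sys/class/powercap/intel-rapl:0/energy_uj
# twice -- path is an assumption, check your machine.)
def average_watts(energy_uj_start, energy_uj_end, seconds):
    """Average power over the sampling interval, in watts."""
    return (energy_uj_end - energy_uj_start) / seconds / 1_000_000

# Illustrative: counter advanced 50,000,000 uJ over 1.0 s -> 50 W.
print(average_watts(0, 50_000_000, 1.0))  # 50.0
```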


----------

