# Going from 65W TDP to 125W TDP - Impact on electricity bills?



## xfxrising (Apr 10, 2013)

Building a new PC and I was gonna get a Phenom 960T and try to unlock it (I'm on a strict budget and I can get this for £70, which is great price/performance). I'm 15, so ya know, living with parents and all, and because we now have a 4th child in the family, money is a little tighter than before xD So I was wondering how much of an impact it will have on electricity costs compared to what I'm using at the moment.

I do a lot of video rendering, so I'm spending 4+ hours a week at full load. However, the 960T will probably finish these renders in an hour and a half or less, so I'm spending less time at load, which should = less electricity being used over time, right? And I'll probably keep it at 4 cores during general use and unlock the other 2 when I'm editing/rendering so that I'm being more conservative on energy. I may even see if I can underclock it if I need to.
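The "more watts but far fewer hours" reasoning can be sketched with a quick back-of-the-envelope script. This is only a sketch: it assumes TDP approximates average full-load draw, and the 65 W / 125 W and 4 h / 1.5 h figures are taken from the post, not measured:

```python
# Weekly rendering energy: current 65 W chip vs. unlocked 960T at 125 W.
# TDP is used as a rough stand-in for full-load draw (an assumption).
OLD_WATTS, OLD_HOURS_PER_WEEK = 65, 4.0    # current CPU, ~4 h of renders
NEW_WATTS, NEW_HOURS_PER_WEEK = 125, 1.5   # 960T, same renders done faster

old_wh = OLD_WATTS * OLD_HOURS_PER_WEEK    # Wh used per week now
new_wh = NEW_WATTS * NEW_HOURS_PER_WEEK    # Wh used per week after upgrade

print(f"now: {old_wh} Wh/week, after: {new_wh} Wh/week")
```

With these assumed numbers the faster chip actually uses less energy per batch of renders (187.5 Wh vs 260 Wh), which supports the intuition in the post.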

So I'm guessing my real question is how much it would add to the electricity cost per month using the 960T compared to what I'm using now? I'm guessing it won't affect it too much, but I wanted to ask anyway just to make sure.

*Additional question: Would I be able to unlock the cores on a Gigabyte GA-78LMT-S2P mobo? It supports EC AOD-ACC, but in the description it only mentions overclocking and nothing about unlocking.*

Sorry for the crappy writing and rambling aha


----------



## EarthDog (Apr 10, 2013)

Negligible... do the math... Find out how much your parents pay per kWh and work it out. Chances are it's certainly not anything your parents will notice. I can't imagine more than $1/month, if anything.


----------



## xfxrising (Apr 10, 2013)

EarthDog said:


> Negligible... do the math... Find out how much your parents pay per kWh and work it out. Chances are it's certainly not anything your parents will notice. I can't imagine more than $1/month, if anything.



pretty much what i was thinking, thanks


----------



## ruff0r (Apr 10, 2013)

EarthDog said:


> Negligible... do the math... Find out how much your parents pay per kWh and work it out. Chances are it's certainly not anything your parents will notice. I can't imagine more than $1/month, if anything.



Probably ~$25 more over 3 years.

Price per kWh: $0.128

Difference is 60 W

60 W x 3 hours = 0.18 kWh x $0.128 = $0.02304 per day.


$0.02304 per day = $0.6912 per month = $8.2944 a year
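The arithmetic above, redone as a small script so the tariff and hours can be swapped for local values. The $0.128/kWh rate, 60 W difference, and 3 h/day of load are carried over from the post as assumptions; note the script uses a 365-day year, so the annual figure comes out slightly higher than the post's:

```python
# Cost of an extra 60 W of CPU draw at a given electricity tariff.
RATE_PER_KWH = 0.128    # $/kWh (assumed; substitute the local tariff)
EXTRA_WATTS = 60        # load-power difference between the two CPUs
HOURS_PER_DAY = 3       # assumed daily time at full load

daily_kwh = EXTRA_WATTS * HOURS_PER_DAY / 1000    # 0.18 kWh/day
daily_cost = daily_kwh * RATE_PER_KWH             # ~ $0.023/day
monthly_cost = daily_cost * 30
yearly_cost = daily_cost * 365

print(f"${daily_cost:.5f}/day  ${monthly_cost:.2f}/month  ${yearly_cost:.2f}/year")
```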


----------



## Xenturion (Apr 10, 2013)

Joking aside, as was said, the change in price each month should be negligible. If you were planning on having your rig crunch 24/7, that might be a little bit different. Even then, I wouldn't expect it to be more than a couple of dollars.


----------



## Aquinus (Apr 10, 2013)

TDP is the measure of how much heat must be taken away from the CPU to keep it cool. TDP is *cooling power*. The actual usage of the CPU is going to be higher than the TDP because not all power is lost to heat. If you start overclocking, that number changes, and high-TDP CPUs tend to eat power faster when overclocked than lower-TDP CPUs do.

All in all, if you're not crunching I doubt it will change much. If the computer is under constant load all the time, I can see it easily adding up depending on the CPU. You won't know unless you put the current rig and the new one on a Kill A Watt, but in general higher-TDP CPUs use more power under load. Idle usage doesn't vary a whole lot between similar CPUs (like SB-E vs SB; though it does a little, because SB-E has more components in the CPU), but loaded usage does.

Anything that uses a reasonable amount of power constantly over time adds up long term.


----------



## EarthDog (Apr 10, 2013)

> The actual usage of the CPU is going to be higher than the TDP because not all power is lost to heat.



Actual power use is usually less... at least it is on GPUs?

For example, look at the 650 Ti Boost. Its TDP is 140W; power use is 115W/127W (o/c). In the marketing materials they state it will hit 140W only using TDP apps (Furmark/Kombustor).

A bit confused by that statement I quoted... but it's likely me, as I have been confused most of today at minimum.


----------



## Tatty_One (Apr 10, 2013)

Probably a chocolate bar a month


----------



## Sasqui (Apr 10, 2013)

There was a great article a few years back (and of course I have no idea where I read it, lol) ... about how *a faster processor that uses more wattage will save you money compared to a slower processor that uses less wattage*.

The electric power per compute cycle is typically less for a newer/faster CPU. Meaning a few things:


- You'll be running at full speed less of the time (tasks are completed faster while using less overall power for the same task)
- You'll probably have the computer on less, meaning the loss due to inefficiency will be decreased.

Voila!


----------



## Xenturion (Apr 10, 2013)

Depending on what 65W processor you're referring to, the Phenom II might have dramatically improved power gating and throttling that will reduce power consumption when the system is idle. Older processors weren't quite as effective at reducing power.

As far as the motherboard is concerned, it does support Core Unlocking. Every Gigabyte motherboard from that era had the "Core Unlocker" logo on the box. I'd imagine that since the transition to the FX series, they decided to remove the logo to reduce confusion, as it only applies to Phenom IIs. If you download the manual from Gigabyte's site, there is a section in the 'Advanced BIOS Features' submenu labeled "CPU Unlock". In the picture it's disabled because they've got an FX-8000 CPU installed.


----------



## xfxrising (Apr 10, 2013)

Xenturion said:


> Depending on what 65W processor you're referring to, the Phenom II might have dramatically improved power gating and throttling that will reduce power consumption when the system is idle. Older processors weren't quite as effective at reducing power.
> 
> As far as the motherboard is concerned, it does support Core Unlocking. Every Gigabyte motherboard from that era had the "Core Unlocker" logo on the box. I'd imagine that since the transition to the FX series, they decided to remove the logo to reduce confusion, as it only applies to Phenom IIs. If you download the manual from Gigabyte's site, there is a section in the 'Advanced BIOS Features' submenu labeled "CPU Unlock". In the picture it's disabled because they've got an FX-8000 CPU installed.



Thanks! And thanks to everyone else too, of course. Best forum ever lol


----------



## Aquinus (Apr 10, 2013)

EarthDog said:


> Actual power use is usually less... at least it is on GPUs?
> 
> For example, look at the 650 Ti Boost. Its TDP is 140W; power use is 115W/127W (o/c). In the marketing materials they state it will hit 140W only using TDP apps (Furmark/Kombustor).
> 
> A bit confused by that statement I quoted... but it's likely me, as I have been confused most of today at minimum.



Every company calculates TDP differently. nVidia is probably giving you peak consumption, not TDP. A circuit can't generate more heat than the power it takes in; that's physics. So unless it's a space heater, I doubt it will be using >95% of its power as heat.

TDP is thermal design power. It represents how much cooling power would be required in loaded operation to take heat away from the GPU/CPU to maintain a safe usable temperature. Heat flow in this case is measured in watts, a unit of energy per unit time, as opposed to joules, which measure a quantity of energy.


----------



## EarthDog (Apr 11, 2013)

Regurgitating the definition isn't answering my question... 

It's more than common for a TDP, no matter if it's peak or average use, to be more than actual use. The 3770K is 77W TDP and at stock, iGPU or not, pulls less than that outside of IBT.


----------



## Aquinus (Apr 11, 2013)

EarthDog said:


> Regurgitating the definition isn't answering what I quoted... No worries though.



Learn to read. 



Aquinus said:


> Every company calculates TDP differently. nVidia is probably giving you peak consumption, not TDP. A circuit can't generate more heat than the power it takes in, that's physics.


----------



## EarthDog (Apr 11, 2013)

Edited... 

Still doesn't answer shit. I haven't seen a TDP close to actual usage. Again, I know the definition.


----------



## Aquinus (Apr 11, 2013)

EarthDog said:


> Still doesn't answer shit. I haven't seen a TDP close to actual usage. Again, I know the definition.



That's because TDP doesn't represent actual usage. If you understood any of that it would make sense. TDP is also a spec number, not a real number.

http://www.intel.com/content/dam/doc/white-paper/resources-xeon-measuring-processor-power-paper.pdf

You say you know the definition, but I still don't think you get it. Here, read a little more. 






...and once again TDP:



> TDP (Thermal Design Power)
> Intel defines TDP as follows: The upper point of the thermal profile consists of the Thermal Design Power (TDP) and the associated Tcase value. Thermal Design Power (TDP) should be used for processor thermal solution design targets. TDP is not the maximum power that the processor can
> dissipate. TDP is measured at maximum TCASE.1. The thermal profile must be adhered to to ensure Intel’s reliability requirements are met. Note: Different processors SKU’s have different TDP’s. At the time of this writing, Intel® Xeon® processors for 2 socket servers (5600 series) are available with a TDP specification from 40W up to 130W depending on the particular SKU1.


----------



## EarthDog (Apr 11, 2013)

You said power use of said part will be more than the tdp... That's the only point I am confused about. Not the definition... Not anything else. Just confused about what I quoted you on above. 

We know that isn't true... look at your reviews that measure CPU power use. It's less than the TDP, right?


----------



## Aquinus (Apr 11, 2013)

EarthDog said:


> We know that isn't true... look at your reviews that measure CPU power use. It's less than the TDP, right?



Not all tasks that put the CPU at 100% cause it to consume the maximum amount of power possible.

In most reviews I see, the total system power is usually well over the TDP, but that is because most reviews give you the power consumption of the entire rig.
http://www.guru3d.com/articles_pages/core_i7_3770k_and_3750_review_with_z77,12.html

I think Cadaveca might have done some review measuring the current off the 12v EPS on motherboard reviews, but I could be wrong.


----------



## EarthDog (Apr 11, 2013)

Yes. Testing that is done right, like Dave's... clearly system power is not the power consumption of the CPU only and can be over a CPU's TDP.


----------



## silkstone (Apr 11, 2013)

Aquinus said:


> TDP is the measure of how much heat must be taken away from the CPU to keep it cool. TDP is *cooling power*. The actual usage of the CPU is going to be higher than the TDP because not all power is lost to heat.



Where else is that energy going to go, if not heat? Light? Sound? Movement? I am pretty sure that all the energy used by a CPU is converted to heat.


----------



## phanbuey (Apr 11, 2013)

silkstone said:


> Where else is that energy going to go, if not heat? Light? Sound? Movement? I am pretty sure that all the energy used by a CPU is converted to heat.



I wouldn't think so, as it takes impedance / bad conductivity to generate heat from electricity... I'm pretty sure that the transistors consume most of the electricity in their operation, and eventually it flows out of the CPU into the mainboard to communicate with other system components / is dissipated some other way.

I don't know but I would bet that CPU makers rely on designs that ensure as little energy as possible is converted to heat.

Either way, older CPUs tend to be much less power friendly, as was said, so the OP may actually save money lol.


----------



## silkstone (Apr 11, 2013)

phanbuey said:


> I wouldn't think so, as it takes impedance / bad conductivity to generate heat from electricity... I'm pretty sure that the transistors consume most of the electricity in their operation, and eventually it flows out of the CPU into the mainboard to communicate with other system components / is dissipated some other way.
> 
> I don't know, but I would bet that CPU makers rely on designs that ensure as little energy as possible is converted to heat.



It is all converted to heat one way or another. You are pushing electrons down a 22 nm "wire"; I would say the impedance would be quite high even though it is a semiconductor.

The energy doesn't just flow out of the CPU. I'm not sure how they calculate the power requirements of a CPU, but I'm fairly certain that the motherboard has its own power supply, and so the energy usage of the CPU would be pretty independent of the mainboard.

http://en.wikipedia.org/wiki/Forms_of_energy


----------



## Aquinus (Apr 11, 2013)

silkstone said:


> It is all converted to heat one way or another. You are pushing electrons down a 22 nm "wire"; I would say the impedance would be quite high even though it is a semiconductor.
> 
> The energy doesn't just flow out of the CPU. I'm not sure how they calculate the power requirements of a CPU, but I'm fairly certain that the motherboard has its own power supply, and so the energy usage of the CPU would be pretty independent of the mainboard.
> 
> http://en.wikipedia.org/wiki/Forms_of_energy



Yeah, but circuits don't all lose all their energy to heat when they start moving electrons. For all of the power entering to be released as heat would make it a space heater.

Let's assume for a moment that 77 watts is the TDP for the 3770K. The formula for heat is that Q is directly proportional to the resistance and the square of the current. You make it sound like the power that enters the motherboard stays there and does something until the system can "eat it." That is not the case.

As electricity powers a circuit, *some power is lost due to impedance as the electron travels;* however, by the time that electron is done doing work on any given circuit, it returns to ground and gets put back on the power line in the opposite direction.

So no, a computer is not a perfect space heater and if it were, you would want a new computer. Unless you really like leakage, in that case you might have a good LN2 chip, but in general less leakage is better and not all CPUs have the maximum amount of leakage possible in any circuit.

I'm pretty sure when my computer is running full power that if it was converting it all to heat, 450-watts under load would warm up the room within minutes. It does not.

Every CPU has a different amount of leakage, so claiming that all of it is used as heat is inaccurate and wrong since different CPUs generate more or less heat during operation, which by itself means that it is not consistent.


----------



## silkstone (Apr 11, 2013)

Aquinus said:


> Yeah, but circuits don't all lose all their energy to heat when they start moving electrons. For all of the power entering to be released as heat would make it a space heater.
> 
> Let's assume for a moment that 77 watts is the TDP for the 3770K. The formula for heat is that Q is directly proportional to the resistance and the square of the current. You make it sound like the power that enters the motherboard stays there and does something until the system can "eat it." That is not the case.
> 
> ...



Power is measured in joules per second. It is the rate at which work is done. Work is done when energy is transformed. If a system is using 100 W of electrical power, it is literally changing 100 J of energy per second into a different form. That form must be electrical to heat + some magnetic & radiated energy.

There is no 'leakage'; energy doesn't just disappear. It is either transformed or it isn't.

I'm also not exactly sure what you mean by space heater. They transform electrical energy primarily into radiated heat in the infrared spectrum (80%), whereas a CPU will be converting it into conductive (and to some extent convective) heat. You are mistaken that it would heat a room within minutes. Radiated heat and conducted heat act differently.

Also, semiconductors act differently from standard materials. Some further reading: http://nopr.niscair.res.in/bitstream/123456789/8335/1/IJPAP 44(7) 543-547.pdf
Semiconductors are non-ohmic conductors, so you cannot simply apply Q = I²R. But even if you could, remember that power is also equal to I²R (P = I²R).
P = Q = I²R: electrical energy is transformed into heat energy at an efficiency of 100% for an ohmic conductor.
http://en.wikipedia.org/wiki/Joule_heating
http://en.wikipedia.org/wiki/Power_(physics)

I imagine that you are right in that not 100% of the energy is transformed into heat. But, your reasoning doesn't make sense. There will be some radiant energy produced by the transformation, but I am guessing that this would be minimal in comparison to the amount of heat energy.
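The point above, that for a resistive (ohmic) element the electrical power delivered equals the heat dissipated, can be illustrated with made-up numbers. The 1.1 V and 70 A below are purely illustrative, not measurements of any real CPU:

```python
# Joule heating sketch: P = V * I for a resistive element.
# In steady state, essentially all of this power leaves as heat.
V_SUPPLY = 1.1    # volts (illustrative)
I_DRAW = 70.0     # amps (illustrative)

power_in = V_SUPPLY * I_DRAW   # ~77 W of electrical power delivered
heat_out = power_in            # ohmic dissipation: all of it becomes heat

print(f"power in: {power_in:.1f} W, heat out: {heat_out:.1f} W")
```

The energy balance is the point: a resistive element has nowhere else to send the energy, so input power and heat output match.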


----------



## RCoon (Apr 11, 2013)

EarthDog said:


> You said power use of said part will be more than the tdp... That's the only point I am confused about. Not the definition... Not anything else. Just confused about what I quoted you on above.
> 
> We know that isn't true.. look at you reviews that measure CPU power use. Its less than the tdp right?



The laws of physics dictate that in order for a certain amount of energy to come out (in this case TDP (HEAT)), an equal or greater amount of energy must come in (ELECTRICITY). That means the TDP will change depending on how much power is being drawn by the CPU in order to perform cycles. So if the CPU is generating 95 watts of TDP, the CPU MUST be drawing AT LEAST 95 watts of energy, but of course 100% of the energy isn't converted to heat in a processor's case, so it may actually be drawing 110 watts of power, and the process of the CPU using the energy has a byproduct of heat. More often than not, at full load and at full TDP (i.e. maximum heat coming from the processor), the processor is likely drawing way more than the TDP wattage.


----------



## silkstone (Apr 11, 2013)

RCoon said:


> So if the CPU is generating 95 watts of TDP, the CPU MUST be drawing AT LEAST 95 watts of energy, but of course 100% of the energy isn't converted to heat in a processor's case, so it may actually be drawing 110 watts of power, and the process of the CPU using the energy has a byproduct of heat.



Dare I ask, what other forms of energy transformation would be involved within a CPU?


----------



## RCoon (Apr 11, 2013)

silkstone said:


> Dare I ask, what other forms of energy transformation would be involved within a CPU?



Most energy is lost as heat created by the operation of the CPU; the rest is lost through kinetic energy from the switches (transistors). TDP, or heat generated by the CPU, makes up most of the electricity used. The same way that coal stations generate 90% of the energy as heat; the rest is light, etc.


----------



## qubit (Apr 11, 2013)

RCoon said:


> Most energy is lost as heat created by the operation of the CPU, *the rest is lost through kinetic energy from the switches (transistors).* TDP, or heat generated by the CPU, makes up most of the electricity used. The same way that coal stations generate 90% of the energy as heat; the rest is light, etc.



A CPU has no moving parts, so there's no kinetic energy involved. Moving electrons don't count.

If a CPU is using 95W, then it's using 95W, not 95W and a bit.


----------



## silkstone (Apr 11, 2013)

RCoon said:


> The same way that coal stations generate 90% of the energy as heat, the rest is light etc



I am pretty sure that 100% of the energy from combustion is converted to heat. Either as Chemical > heat or to a much lesser extent, Chemical > Light > Heat.

You might be thinking of the fact that moisture in the coal undergoes a state change from liquid to gas which as we know is just hot water. This is pretty much equivalent to a direct chemical > heat energy transformation.

There would also be energy 'lost' from the change in state of the coal itself from a solid into a gas through the reaction of combustion, but I don't see that it could be considered equivalent to the processes that occur within a microprocessor.

The only thing I am not sure about is how the processors are rated. For example, if a processor is rated as having a maximum rated power of 100 W, does that mean just the processor or the processor + components of the motherboard on the same circuit? 

I would imagine that it is for the CPU alone as the manufacturer of the cpu would have no control over the remaining components in the circuit.

To get back to the question: the TDP would be the maximum power that a CPU would be able to dissipate as heat and would be related to the maximum energy draw of a CPU, by design. You can probably imagine it as the maximum theoretical power draw that the CPU can cope with before melting. In reality, the maximum power draw of the CPU would be less than this, as when the power usage approaches this level you would encounter other issues, such as electromigration, causing the CPU to fail.


----------



## Aquinus (Apr 11, 2013)

silkstone said:


> I am pretty sure that 100% of the energy from combustion is converted to heat. Either as Chemical > heat or to a much lesser extent, Chemical > Light > Heat.
> 
> You might be thinking of the fact that moisture in the coal undergoes a state change from liquid to gas which as we know is just hot water. This is pretty much equivalent to a direct chemical > heat energy transformation.
> 
> ...



You know what happens to the rest of that energy that isn't converted to heat? It leaves the circuit and the electrons flow back to the street. If you measure the amount of energy that enters the circuit and compare it to the amount of energy that comes out of the circuit, that difference is the amount of energy converted to heat.

Resistance/impedance and current is what generates heat. If a circuit has no resistance it generates no heat and electricity can flow as much as it wants without any issue. So you're telling me, that a circuit that eats thousands of watts but has practically no resistance is going to explode because all that energy is converted to heat? 

I think you really need to learn what you're talking about.

You're right that the amount of energy must remain in equilibrium, but you don't seem to understand that power flows out of the computer and back to the street after it has been used by the device.

So if you're putting P1 power into a circuit, Q is lost to heat, and P2 is the resultant power of the part of the circuit that leaves the device, you will find that:

P1 = P2 + Q, where Q is heat, P1 is the incoming power, and P2 is the outgoing power.

So just to use some easy numbers: if you pull 5A at 120V RMS AC and the current coming off the circuit is 4A, you can conclude that 120V × 1A = 120W "lost" from the circuit is the amount of power lost to heat, light, or kinetic energy (less likely).

Obviously I made up the numbers but it's supposed to be an example. In reality, I bet you that a computer doesn't release more than 50-75% of its consumed power as heat.


----------



## silkstone (Apr 11, 2013)

Aquinus said:


> You know what happens to the rest of that energy that isn't converted to heat? It leaves the circuit and the electrons flow back to the street. If you measure the amount of energy that enters the circuit and compare it to the amount of energy that comes out of the circuit, that difference is the amount of energy converted to heat.
> 
> Resistance/impedance and current is what generates heat. If a circuit has no resistance it generates no heat and electricity can flow as much as it wants without any issue. So you're telling me, that a circuit that eats thousands of watts but has practically no resistance is going to explode because all that energy is converted to heat?
> 
> ...



You really need to understand what energy, power, voltage (or more relevantly, potential difference) and current actually are.

Power is the rate at which energy is transformed (used). If it were "flowing back to the street" it wouldn't be power.

It's confusing, I know. But what you think is happening in the situation, isn't.

I am editing my post as I go to try to be polite. I just want to assure you that I do know what I am talking about.


----------



## Aquinus (Apr 11, 2013)

silkstone said:


> You really need to understand what energy, power, voltage (or more relevantly, potential difference) and current actually are.
> 
> Power is the rate at which energy is transformed (used). If it were "flowing back to the street" it wouldn't be power.
> 
> ...



Power is just work done over time. The only "work" a circuit does is generate heat or light. So the electricity that is converted to heat and light would be the only power that the circuit generates.

However we're talking about consumption. Some of that consumption generates power in the form of heat, but the rest of the electricity is not used in the sense of converted from one form of energy to another. *It does not mean that electricity that passes through the circuit always generates all of its potential energy as heat.* It's a false statement and you have to stop making it. It's a clear indication that you don't know what you're talking about.

So you're right that the electricity that flows through the circuit and is not converted to another form isn't power... but that doesn't mean that the entire consumption is heat or that this consumption doesn't exist.

You're describing energy lost as heat, not power consumption. A portion of that power consumption is heat, yes, but not all of it and the rest of that electricity exits/enters the circuit to neutral depending on the current phase of AC.

If you were to take the 4/8-pin EPS power off your motherboard and measure the current from +12v to the board and again from the board to the ground, you would find that the current entering the CPU is greater than the current exiting the CPU... and there is current leaving the CPU. That change in current with a stable voltage indicates a loss of electric potential energy that is converted to heat energy, which is what I'm describing.

What you're describing would convert all the electricity to heat, electrons wouldn't flow to ground. It would all get used up before then and that is definitely not the case. It's not how circuits work.

You can keep on claiming that you're knowledgeable, but you've yet to prove that. You just keep claiming the same thing and spitting out the same old rhetoric. It's not proving your point, and just saying I'm wrong doesn't make you right.


----------



## silkstone (Apr 11, 2013)

TBH, I don't really want to teach high school physics to you. I should really have PM'd you a while ago to explain; I'm sorry. But I do recommend grabbing a book on this stuff and reading up a little.

Your terminology is all off for a start, and I understand your concept of electricity, but it is just plain wrong.

"Power is just work done over time. The only "work" a circuit does is generate heat or light. So the electricity that is converted to heat and light would be the only power that the circuit generates."

Let me be clear. Almost all of the energy 'used' by the cpu is converted to heat.
Talking about energy being lost as heat? What is the useful energy in the circuit? You are confusing the classical model of efficiency with what is used when talking about computers (MHz/TDP W.) Mhz is not a standard unit of measurement for any kind of energy afaik.


"It does not mean that electricity that passes through the circuit always generates all of its potential energy as heat."

The electric power, in watts, associated with a complete electric circuit or a circuit component represents the rate at which energy is converted from the electrical energy of the moving charges to some other form, e.g., heat, mechanical energy, or energy stored in electric fields or magnetic fields. -http://hyperphysics.phy-astr.gsu.edu/hbase/electric/elepow.html

The current entering the circuit is greater than the current exiting the circuit? Really? You have just violated one of the fundamental laws of Physics right there.

I was trying to point out your mistakes in previous posts rather than insult you, so please, if you want to understand how it works, admit your ignorance rather than trying to insult me. If you like, I could post my credentials to 'prove' that I am more knowledgeable, but I would rather not rub it in.

If i were you, I would keep a record of your posts for if in future you decide to file a lawsuit against your high-school Physics teachers, they would prove your case quite well.

[Edit] - Your avatar is a little ironic btw


----------



## Aquinus (Apr 11, 2013)

silkstone said:


> Your terminology is all off for a start, and I understand your concept of electricity, but it is just plain wrong.



I apologize, my use of the terminology is a little rusty. It's been a while since I've had to do any physics so some of what I'm trying to convey might not be right.



silkstone said:


> Let me be clear. Almost all of the energy 'used' by the cpu is converted to heat.



Pretty sure I said that already...



Aquinus said:


> Power is just work done over time. The only "work" a circuit does is generate heat or light. So the electricity that is converted to heat and light would be the only power that the circuit generates.





silkstone said:


> The current entering the circuit is greater than the current exiting the circuit? Really? You have just violated one of the fundamental laws of Physics right there.



http://en.wikipedia.org/wiki/Kirchhoff's_circuit_laws
Yup. You got me there; the reduced voltage potential is where the heat comes from, not reduced current after being impeded. I remembered it wrong. Like I said, I'm rusty. I'm not an engineer, so I don't use it often, and most of what I program doesn't calculate physics. I do own up to this one, which renders some of my other statements wrong.

The real point I'm making (despite my poor memory of physics,) is that not all power that flows through the processor will become heat in the processor. If there is a single pin off the CPU where the voltage potential is above zero and has current flowing out of the CPU, you have energy leaving the CPU that could be used elsewhere and the CPU signals plenty of things that aren't on the CPU like DRAM, PCI-E, DMI, etc. 

Anyway, this is digressing, in addition to me being wrong. This was about TDP, and none of this actually matters for that case. A CPU that makes more heat might also execute instructions faster than another CPU, so TDP isn't an indication of whether it will use more electricity overall.


----------



## silkstone (Apr 12, 2013)

Aquinus said:


> I apologize, my use of the terminology is a little rusty. It's been a while since I've had to do any physics so some of what I'm trying to convey might not be right.
> 
> 
> 
> ...




I think I understand what you are trying to explain, so let me try to clarify what is happening.

Electrons carry kinetic energy around a circuit. When the electrons reach the positive terminal, they are still carrying some of that 'kinetic energy,' but with DC, at least, I think it is incorrect to say that the residual kinetic energy is 'used.' 

What actually happens is they are re-energized and pushed back onto the negative terminal of the power supply. The only energy usage would be in the internal resistance of the power supply, which is of course independent of the CPU.

So when we talk about power usage of a CPU, we are only talking about the energy dissipated by the CPU; we don't worry about the power dissipated by the internal resistance of the power supply.

The TDP of a CPU can be considered to be the maximum (theoretical) rate of energy usage of the CPU.


----------



## Aquinus (Apr 12, 2013)

silkstone said:


> Electrons carry kinetic energy around a circuit. When the electrons reach the positive terminal, they are still carrying some of that 'kinetic energy,' but with DC, at least, I think it is incorrect to say that the residual kinetic energy is 'used.'



Kind of, yeah. It's not kinetic energy until it's converted, though. If it's still electrons moving around a circuit, it's electric potential energy until it's converted, I thought. The electrons themselves have a little bit of kinetic energy, but not much.



silkstone said:


> The TDP of a CPU can be considered to be the maximum (theoretical) rate of energy usage of the CPU.



Yes... Kind of? As I said before, TDP is a spec number.

I'll just put this here again, because Intel's definition of TDP is a little peculiar. TDP isn't the maximum power draw, but rather the maximum the cooler would have to handle to keep the CPU safe under all loaded conditions. I think it's over-estimated for stock speeds and voltages, but I can understand why Intel would do that as well.



> TDP (Thermal Design Power)
> Intel defines TDP as follows: The upper point of the thermal profile consists of the Thermal Design Power (TDP) and the associated Tcase value. Thermal Design Power (TDP) should be used for processor thermal solution design targets. TDP is not the maximum power that the processor can dissipate. TDP is measured at maximum TCASE.1. The thermal profile must be adhered to to ensure Intel’s reliability requirements are met. Note: Different processors SKU’s have different TDP’s. At the time of this writing, Intel® Xeon® processors for 2 socket servers (5600 series) are available with a TDP specification from 40W up to 130W depending on the particular SKU1.


----------



## Depth (Apr 12, 2013)

A 90% efficient PSU will draw ~444 watts at the wall to deliver 400 watts.

By applying some of the logic used here I have concluded that this doesn't happen.
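Sarcasm aside, the PSU overhead really is just division by efficiency. A minimal sketch, assuming a flat 90% efficiency curve (real supplies vary with load):

```python
def wall_draw(dc_load_watts, efficiency):
    """Power drawn at the wall to deliver a given DC load."""
    return dc_load_watts / efficiency

print(f"{wall_draw(400, 0.90):.1f} W at the wall for a 400 W load")
# 400 / 0.90 ≈ 444.4 W; the ~44 W difference is dissipated as heat in the PSU.
```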


----------



## EarthDog (Apr 12, 2013)

Yada.

Actual CPU power consumption at stock speeds is not higher than its TDP. Period. Look at Dave's reviews, which measure from the EPS 12V. Simple. Factual. Over.


----------



## repman244 (Apr 12, 2013)

silkstone said:


> The TDP of a CPU can be considered to be the maximum (theoretical) rate of energy usage of the CPU.



You can't always go with the TDP = power consumption theory, though. These ratings are different for every manufacturer (in this case AMD and Intel); if I remember correctly, AMD designates TDP as the absolute maximum, but Intel is using some average or something like that.

The thing about TDP is that if you have two ratings (let's use 95 W and 130 W since they are quite common) and you have a CPU which in reality draws 96 W, it will fall into the 130 W TDP category.
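That binning behaviour can be sketched as picking the smallest rated class that covers the measured draw (the 95 W and 130 W classes are the ones from the post; real product lines have more):

```python
def tdp_class(measured_watts, classes=(95, 130)):
    """Return the smallest TDP class that covers the measured power draw."""
    for c in sorted(classes):
        if measured_watts <= c:
            return c
    raise ValueError("measured power exceeds every TDP class")

print(tdp_class(96))   # a 96 W part lands in the 130 W class
print(tdp_class(95))   # a 95 W part just fits the 95 W class
```

So a chip rated 130 W may in reality sit barely above the next class down, which is one more reason TDP is a poor proxy for actual consumption.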


----------



## silkstone (Apr 12, 2013)

Aquinus said:


> Kind of, yeah. It's not kinetic energy until it's converted, though. If it's still electrons moving around a circuit, it's electric potential energy until it's converted, I thought. The electrons themselves have a little bit of kinetic energy, but not much.
> 
> 
> 
> ...



It is kinetic energy, so long as the electron is acting as a particle.
Electric potential energy is the energy that can be converted to kinetic energy.

You can think of it like this: it is similar to elastic potential energy. Electrons on the negative terminal are squashed together. The more work you do on the electrons, the closer they get together. They exert a force on each other (the Coulomb force) due to the fact that like charges repel. When the circuit is open, these electrons have nowhere to go.

When you close the circuit, they now have somewhere to go. They will tend towards the positive terminal, which is also a collection of electrons, spaced further apart. The force between the electrons is proportional to the speed at which the electrons 'flow' down the wire towards the positive terminal. The difference in energy levels is called the potential difference and is measured in volts (joules per coulomb).

I'm a little bit drunk atm, it being a Friday night, so I don't know if I am explaining it fully. However, this is more or less how I explain it to my students, albeit with visuals and more time spent on the concepts.

An example of how a DC light bulb might work is: electric potential energy > kinetic > heat.
The electrons are squashed together and fly apart (potential > kinetic); as they move along the wire, they collide with the atomic structure of the conductor, transferring their kinetic energy and causing the atomic structure of the conductor to vibrate (heat). Conductive heat can also be considered a type of kinetic energy.
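The 'joules per coulomb' picture above maps directly onto P = V × I: volts are joules per coulomb and amps are coulombs per second, so their product is joules per second, i.e. watts. The vcore and current figures below are made-up but plausible numbers, just to show the units working out:

```python
# Volts are joules per coulomb; amps are coulombs per second,
# so volts * amps = joules per second = watts.
vcore = 1.40      # assumed core voltage in volts (illustrative)
current = 90.0    # assumed current in amps (illustrative)
power = vcore * current
print(f"{power:.0f} W dissipated in the chip")  # 126 W
```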

I'll explain the second part after the quote below.



repman244 said:


> You can't always go with TDP = power consumption theory tho. These ratings are different for every manufacturer (in this case AMD and Intel), if I remember correctly AMD designates the TDP is the absolute maximum but Intel is using some average or something like that.
> 
> The thing about TDP is that if you have 2 ratings (let's use 95W and 130W since they are quite common) and if you have a CPU which in reality is 96W it will fall in the 130W TDP category.



As far as I understand TDP, it is the maximum theoretical rate at which energy can be dissipated. So, yes, it is usually lower than real-world usage.

Wiki definition: http://en.wikipedia.org/wiki/Thermal_design_power


----------



## Aquinus (Apr 12, 2013)

repman244 said:


> You can't always go with the TDP = power consumption theory, though. These ratings are different for every manufacturer (in this case AMD and Intel); if I remember correctly, AMD designates TDP as the absolute maximum, but Intel is using some average or something like that.
> 
> The thing about TDP is that if you have two ratings (let's use 95 W and 130 W since they are quite common) and you have a CPU which in reality draws 96 W, it will fall into the 130 W TDP category.



You have that backwards. AMD used to use average power where Intel specifies a max. I don't know if AMD still measures TDP that way though.


----------



## silkstone (Apr 12, 2013)

Aquinus said:


> You have that backwards. AMD used to use average power where Intel specifies a max. I don't know if AMD still measures TDP that way though.



I know they are rated differently by different manufacturers, but in general, I was going from the wiki article:

"The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of power the cooling system in a computer is required to dissipate. The TDP is typically not the most power the chip could ever draw, such as by a power virus, but rather the maximum power that it would draw when running "real applications". This ensures the computer will be able to handle essentially all applications without exceeding its thermal envelope, or requiring a cooling system for the maximum theoretical power (which would cost more but in favor of extra headroom for processing power)."

Edit - I see my mistake.

So to be clear, it is the maximum energy that the CPU can dissipate before melting, as I said before, but the power draw can be higher. If the power draw exceeds the TDP, the CPU will melt without adequate cooling.

Edit 2 - in my previous post, I should have said "higher than average usage".


----------



## repman244 (Apr 12, 2013)

Aquinus said:


> You have that backwards. AMD used to use average power where Intel specifies a max. I don't know if AMD still measures TDP that way though.



Are you sure about that? Because all of the sources (and even AMD) say that AMD uses the maximum and Intel uses an average.



> TDP or Thermal Design Power is defined by AMD as (from AMD Family 10h Server and Workstation Processor Power and Thermal Data Sheet, Rev 3.04 - June 2009):
> 
> "The maximum power a processor draws for a thermally significant period while running commercially useful software. The constraining conditions for TDP are specified in the notes in the thermal and power tables."





> AMD's TDP is close to the electrical maximum a CPU can draw (when it is operating at its maximum voltage).
> Intel's TDP is a "round up" average of power measurements of processor intensive benchmarks.



I think what's confusing here is the ACP (which, AFAIK, AMD uses for Opterons since it's more relevant than TDP and more accurate at indicating power consumption).



> AMD's ACP uses a "round down" average of power measurements performed with industry standard benchmarks (usually running at 100% CPU load, with the exception of Stream).
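That "round down" average AMD describes for ACP can be sketched in a few lines; the benchmark wattages below are invented, purely to illustrate the flooring:

```python
import math

def acp_estimate(benchmark_watts):
    """Round-down average of per-benchmark power measurements,
    as AMD's ACP description above suggests (a sketch, not AMD's method)."""
    return math.floor(sum(benchmark_watts) / len(benchmark_watts))

print(acp_estimate([91.2, 88.7, 94.1, 90.3]))  # floors the 91.075 W mean to 91
```

A round-down average will always read lower than a round-up one over the same measurements, which is part of why ACP figures look friendlier than TDP figures.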








silkstone said:


> I know they are rated differently by different manufacturers, but in general, I was going from the wiki article:
> 
> "The thermal design power (TDP), sometimes called thermal design point, refers to the maximum amount of power the cooling system in a computer is required to dissipate. The TDP is typically not the most power the chip could ever draw, such as by a power virus, but rather the maximum power that it would draw when running "real applications". This ensures the computer will be able to handle essentially all applications without exceeding its thermal envelope, or requiring a cooling system for the maximum theoretical power (which would cost more but in favor of extra headroom for processing power)."
> 
> ...



It doesn't apply to all manufacturers, though. AMD says it's the electrical maximum, so in the real world it should probably always be lower and never higher than the specified TDP.


----------

