
Intel lying about their CPUs' TDP: who's not surprised?

So in other words, you have nothing.
That's not it at all. There are methods of determining power usage in a given circuit and tools to help in such an effort. I don't like the attitude you displayed toward me and as such I'm not inclined to do your research for you. Have fun.
 
That's not it at all. There are methods of determining power usage in a given circuit and tools to help in such an effort. I don't like the attitude you displayed toward me and as such I'm not inclined to do your research for you. Have fun.

What attitude? I simply asked for you to share the type of device you recommended to measure CPU power usage since you stated it was more accurate than software tools like HWiNFO or Ryzen Master.
 
What attitude? I simply asked for you to share the type of device you recommended to measure CPU power usage
It was the way it was stated. Kinda came off aggressive and condescending.

it was more accurate than software tools like HWiNFO or Ryzen Master.
No software tool can be anything but a ballpark estimate (and most of the time, not all that close to actual usage) because of all the variations motherboard makers put into their products. More precise methods for measuring power usage are available, a simple one being a Kill A Watt meter, which is an amazingly accurate device for how simple it is. Testing methods are simple. Measure power at idle, then induce a CPU load and measure again. That's your CPU usage under load. Prime95 is exceptional at this task, as you can configure it to run on the CPU (within L2/L3 cache) alone.

Devices for more in-depth and precise measurements are available, but they generally cost more than the average user would want to spend for such a task.
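
To make that concrete, here's a minimal sketch of the subtraction method in Python; the wall readings below are placeholder numbers, not real measurements:

```python
# Minimal sketch of the idle-vs-load subtraction method.
# The wall readings are placeholder numbers, not real measurements.
idle_wall_w = 55.0    # whole-system draw at the wall, sitting at the desktop
load_wall_w = 180.0   # whole-system draw with a Prime95 small-FFT CPU load

# Only the CPU load changed between the two readings, so the delta
# is attributed to the CPU (plus the VRM losses feeding it).
cpu_load_w = load_wall_w - idle_wall_w
print(f"Estimated CPU load power: {cpu_load_w:.0f} W")  # 125 W
```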
 
No, but if you know your baseline power usage, calculating the draw from the CPU is trivial.
the CPU draw is not above 30 W
you can say the margin of error is 40 W
in the event the APU is on, usage goes from 40-50 W
probably why it's a 65 W part
 
Turbo was not governed completely by temperature. There have been power limits in place for a long while. Power limits simply were not hit or were not hit in a significant way.
A stock 8700K will not boost to 4.6 GHz on all cores, not even with the fudged power settings. The frequency table tops out at 4.3 GHz for max all-core turbo. If yours does, it's MCE or an equivalent setting in the motherboard BIOS.

I know a stock 8700k won't all core boost to 4.6GHz. Like I said, that is the default behavior in THE Z390 board I put it in. The default behavior of the 8700k is what I see in a B365 board. But what that shows is how different motherboards change the behavior of the CPU.
 
the CPU draw is not above 30 W
you can say the margin of error is 40 W
in the event the APU is on, usage goes from 40-50 W
probably why it's a 65 W part
Sorry, that's not the way an APU works. When you induce a CPU load, only the CPU portion of the APU is used. The GPU side stands mostly idle.
 
It was the way it was stated. Kinda came off aggressive and condescending.


No software tool can be anything but a ballpark estimate (and most of the time, not all that close to actual usage) because of all the variations motherboard makers put into their products. More precise methods for measuring power usage are available, a simple one being a Kill A Watt meter, which is an amazingly accurate device for how simple it is. Testing methods are simple. Measure power at idle, then induce a CPU load and measure again. That's your CPU usage under load. Prime95 is exceptional at this task, as you can configure it to run on the CPU (within L2/L3 cache) alone.

Devices for more in-depth and precise measurements are available, but they generally cost more than the average user would want to spend for such a task.

Ok, we're back where we started.

The method you describe doesn't take into consideration the efficiency of the power supply. Maybe the power supply is only 80% efficient at idle, but 90% efficient when the full CPU load is induced. The amount of power measured at the Kill A Watt only reflects what the PSU is drawing, and we can't determine that the power difference between idle and load isn't affected by power supply efficiency.
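
To put numbers on that objection, here's a sketch with assumed efficiency figures (real curves come from the PSU's 80 PLUS test report, not from this example):

```python
# Same placeholder wall readings as before, now with assumed PSU
# efficiencies at each operating point.
idle_wall_w, load_wall_w = 55.0, 180.0
idle_eff, load_eff = 0.80, 0.90   # assumption: less efficient at light load

idle_dc_w = idle_wall_w * idle_eff    # 44 W actually delivered at idle
load_dc_w = load_wall_w * load_eff    # 162 W actually delivered under load

naive_delta_w = load_wall_w - idle_wall_w   # 125 W measured at the wall
true_delta_w = load_dc_w - idle_dc_w        # 118 W on the DC side
print(f"Wall delta: {naive_delta_w:.0f} W, DC-side delta: {true_delta_w:.0f} W")
```

With these assumed numbers the wall-side delta overstates the CPU's actual draw by about 7 W, which is exactly the uncertainty being described.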
 
Sorry, that's not the way an APU works. When you induce a CPU load, only the CPU portion of the APU is used. The GPU side stands mostly idle.
when I say the APU is enabled
it kinda means it's being used
'cause it's a CPU and GPU load
 
I have NEVER once, not even in 24 years of building, considered TDP as a build/buy point.
I have read through this thread and have also come to this pondering.
I am, however, an AMD fanboy and can surely say the FX chips SUCK power and ASS! You need a f'ing 850W PSU just to power an FX8300, and HOLY crap, the POWER-crazy CHIP can heat a double-wide in Alaska!
SO yeah, never really gave a crap about TDP...
As someone who lives in a hot climate and needs to run aircon for a good part of the year if I want to be comfortable when running the PC hard for long stretches, I very much do consider how much power the system draws, and I'm always happy with more efficient hardware; tbh AMD is doing better at that than Intel at stock settings. The power draw of the old FX series processors made sure I never even looked that way when I set up my first rig, and the same goes for the R9 GPUs.
 
The method you describe doesn't take into consideration the efficiency of the power supply.
You're right, it doesn't. No method is perfect. However, since we are measuring actual power used, it is very accurate.

when I say the APU is enabled
it kinda means it's being used
'cause it's a CPU and GPU load
No, it isn't. When you induce a CPU load, only the CPU is used. The GPU stands idle. Contrariwise when you induce a GPU load, the CPU more or less stands idle.
 
No, it isn't. When you induce a CPU load, only the CPU is used. The GPU stands idle. Contrariwise when you induce a GPU load, the CPU more or less stands idle.
you can load both up at the same time
both are at 100%
 
As someone who lives in a hot climate and needs to run aircon for a good part of the year if I want to be comfortable when running the PC hard for long stretches, I very much do consider how much power the system draws, and I'm always happy with more efficient hardware; tbh AMD is doing better at that than Intel at stock settings. The power draw of the old FX series processors made sure I never even looked that way when I set up my first rig, and the same goes for the R9 GPUs.
Well that's one.
Most users, even heavy overclockers, do not use TDP in determining their choices.
I think it comes into play when determining the HSF or cooling I will be using, for sure, but nothing past that.
 
Of course it does. Performance determines how much "work" can be accomplished in a given amount of time with a given amount of energy.

What??? Do you think those gates are just flipping and flopping back and forth for fun, or for no reason? NOOOOO! They are doing "work"! Crunching numbers. Processing data.

I go back to my previous statement. You keep dismissing, ignoring, or just plain not understanding that the amount of work being accomplished cannot simply be omitted from the equation when determining a processor's (or any machine's) efficiency. Work must be factored in too!

For the purpose of this thread in relation to Intel's definition of TDP, that value is used to determine how much cooling is required. It is NOT meant as a means to compare that Intel CPU to an AMD CPU. That's why if you go to that Intel CPU's ARK page again (see here) and click on the "?" next to TDP, you will see that it directs readers to the datasheet for "thermal solution requirements". It does not mention efficiency or work accomplished. Work load, yes, but that is not the same as work accomplished.

Work in this context doesn't have anything to do with CPU performance. Work in a physical system is all about the conversion of energy. What you're considering work here is only work in the conceptual sense, the number of calculations per second, which isn't a physical quantity. We start with electrical energy (joules) that is applied over time (joules/second = watts). The forms of energy are kinetic, potential, chemical, radiant, and thermal. Electricity is a form of potential energy, which we then call on to flip transistors for our logic gates (and run some fans and probably RGB lights these days). Work happens when that potential energy takes another form. Inside a CPU, the only conversion that can happen is to thermal. Unless you manage to set it on fire (chemical/radiant), blow the lid off (kinetic), or something equally dramatic. Simply put, energy in equals energy out. The only energy in is potential/electrical, so unless I'm missing something big, all the energy out is thermal.
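
As a toy illustration of that energy balance, assuming a steady 100 W draw (an arbitrary figure):

```python
# Toy energy balance for a CPU drawing a steady 100 W.
power_w = 100.0       # 100 joules of electrical energy per second
duration_s = 3600.0   # run it for one hour

energy_in_j = power_w * duration_s   # 360,000 J of electrical (potential) energy
# With no chemical, kinetic, or radiant output in normal operation,
# conservation of energy says it all leaves the package as heat.
heat_out_j = energy_in_j
print(f"Heat dissipated: {heat_out_j / 1000:.0f} kJ over one hour")  # 360 kJ
```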
 
There is one thing I am getting from all this.
Intel is NOT lying!
 
True. But when you are testing power draw loads, you only want to load one or the other, not both.
but I want to measure the power being drawn by the CPU
that includes the APU
there is zero point in saying my CPU draws 40 W
if it draws 70 W while gaming owing to the APU kicking in
 
The only energy in is potential/electrical, so unless I'm missing something big, all the energy out is thermal.
I'm glad that at least one person remembers the law of conservation of energy.
 
but I want to measure the power being drawn by the CPU
that includes the APU
there is zero point in saying my CPU draws 40 W
if it draws 70 W while gaming owing to the APU kicking in
Holy CRAP why? What is the POINT? Pay your power bill and figure it out!
 
it's called I don't trust my PSU
Okay WHAT?
Well, that is because you got yourself a POS. You need to get yourself a real PSU, a Corsair TX or CX Gold! You've got the PSU jitters is all, and Corsair will take them all away for good! ;):)
 
Work in this context doesn't have anything to do with CPU performance. Work in a physical system is all about the conversion of energy. What you're considering work here is only work in the conceptual sense, the number of calculations per second, which isn't a physical quantity. We start with electrical energy (joules) that is applied over time (joules/second = watts). The forms of energy are kinetic, potential, chemical, radiant, and thermal. Electricity is a form of potential energy, which we then call on to flip transistors for our logic gates (and run some fans and probably RGB lights these days). Work happens when that potential energy takes another form. Inside a CPU, the only conversion that can happen is to thermal. Unless you manage to set it on fire (chemical/radiant), blow the lid off (kinetic), or something equally dramatic. Simply put, energy in equals energy out. The only energy in is potential/electrical, so unless I'm missing something big, all the energy out is thermal.
Couldn’t agree more. Bottom line is that all electric power is transformed into heat inside the CPU. The thing is that Intel states as TDP the power of only one CPU power state, the lowest one, while AMD states the portion of the overall CPU heat that it believes (or measures) will be dissipated from the CPU to the cooler under max power draw.

As stated in post #96.
 
Testing methods are simple. Measure power at idle, then induce a CPU load and measure again. That's your CPU usage under load.

And you're just assuming CPU power draw is 0 when idle? Without knowing the idle CPU power draw, this method gives no usable information regarding CPU power draw under load, and isn't any more accurate than software methods for reading CPU power. In fact, it's probably less accurate than software readings.

The best way to get CPU power draw is to put a multimeter in line with the 12 V CPU plug and directly measure the amps going to the CPU. But even that isn't perfect, as it won't account for the inefficiency of the VRMs.
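
A rough sketch of what that in-line measurement would give you; the readings and the VRM efficiency below are assumed figures, not specs:

```python
# Hypothetical readings from a meter in line with the 8-pin EPS (CPU) plug.
v_rail = 12.1   # measured volts on the 12 V EPS rail
amps = 11.5     # measured current into the CPU connector

eps_input_w = v_rail * amps          # ~139 W delivered to the motherboard VRM
vrm_eff = 0.90                       # assumed VRM efficiency, per the caveat above
cpu_power_w = eps_input_w * vrm_eff  # ~125 W actually reaching the CPU
print(f"EPS input: {eps_input_w:.0f} W, estimated CPU power: {cpu_power_w:.0f} W")
```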

True. But when you are testing power draw loads, you only want to load one or the other, not both.

This is a gray area. The TDP rating of the CPU includes the iGPU as well. It's a package rating. So technically, if you are trying to compare it to the rating on the box, the iGPU is included. But on the other hand, when will you ever see both the CPU cores and the iGPU fully loaded? Even during gaming that isn't going to happen.
 
And you're just assuming CPU power draw is 0 when idle?
Not at all. I have done enough testing to know that regardless of the system being tested, the idle power usage is always very low, most of the time less than 50 watts. Therefore, testing power usage for one component or another is simple after a baseline is established.
The best way to get CPU power draw is to put a multimeter in line with the 12 V CPU plug and directly measure the amps going to the CPU.
That is also a valid method, but is a bit more technical and involves some risk.
 
I'm so confused... AMD and Intel have been doing the same thing FOREVER... the Phenoms were rated for 94W but sucked down over 200W... Zen 3, while extremely efficient, also consumes over its rated TDP... OP is posting on a chip rated for a TDP of 88W that at stock config will eat over 150W. Thermal Design Power (TDP) != power consumption.

What exactly is the problem? Is it that motherboards are yolo-boosting to the moon because they can? Is it because Intel can't get its sh*t together and is still on 14nm? I guess I am missing the part where we decided this was Intel's fault for lying...
The problem with Intel is this: they claim that their product is faster than, or competitive against, competitors' products, but they don't tell you the full story about the power consumption needed to get there. Their marketing material always takes aim at AMD, but reveals nothing about the power inefficiencies. I agree that TDP is now used to indicate the guaranteed base clockspeed, but the base that Intel offers is very low and nowhere near competitive, even compared with an AMD part with the same number of cores and TDP. You are right to say that the power required for boost will be higher, but take the Ryzen 5900X as an example: it has a TDP of 105W, with a max power limit of 142W at stock settings. Compared to a 65/95W Intel processor that actually guzzles down 250W or more, the latter is significantly more misleading. While this may not seem like a problem, if someone who is not aware of it buys a cheap motherboard, power supply, and budget cooler, you can imagine the problems ahead.
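
For reference, AMD's stock package power limit (PPT) on AM4 is commonly cited as 1.35× TDP, which lines up with the 105W/142W figures above; Intel's PL2 has no such fixed ratio, and boards often leave it effectively uncapped. A quick sanity check:

```python
# AM4 rule of thumb widely cited in reviews: stock PPT ≈ 1.35 × TDP.
def amd_ppt(tdp_w: float) -> float:
    """Approximate stock package power limit for an AM4 Ryzen CPU."""
    return tdp_w * 1.35

print(amd_ppt(105))   # 141.75 -> the ~142 W stock limit of the 5900X
print(amd_ppt(65))    # 87.75  -> the ~88 W limit of 65 W Ryzen parts
```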

please use your logic to explain how the 65W 10700 uses more power than the 125W 10700k
Logic and proof are here.


It could be a poorer-binned chip being tested, but I believe Intel will keep the better-binned chips for the K-series, since they are meant to run at higher clockspeeds and be "overclockable" too. Whereas the non-K version runs at a lower clockspeed and is locked out of overclocking.
 
Out of curiosity, do the PSU calculator websites take this kind of power draw into account?
 