
Intel lying about their CPUs' TDP: who's not surprised?

Seems Ryzen Master lies by quite a bit too then, eh? I'll look into it.
My 3800X with PBO at PPT 225 / TDC 150 / EDC 125 says the cores are pulling 85 watts at 4.2 GHz while crunching here. I think maybe AMD is quoting the wattage the actual cores will use, not the whole chip, but I have not seen Ryzen Master report higher than the TDP wattage used by the cores; same with HWiNFO. I do have a Kill A Watt, but obviously it can only measure the whole system.
And your whole-system assumption is that the memory and subsystems also ramp with load and need accounting for, plus PSU losses.
And you don't comment on PL1 and PL2 pulling two to three times the rated wattage, so I can see where you're at. I'll still be leaving you to it.

The other components don't really consume much more power when under load. RAM doesn't have an idle state, so its power consumption under load is only a watt or two more than in its idle state. The rest of the system is the same deal.

And the PSU actually gets more efficient at the higher loads so that just makes things worse, or you can consider it basically cancelling out the minor extra power consumption from the other subsystems being under load. Either way, the fact remains, AMD processors definitely exceed their rated TDP too. And there is nothing wrong with it.
 
The other components don't really consume much more power when under load. RAM doesn't have an idle state, so its power consumption under load is only a watt or two more than in its idle state. The rest of the system is the same deal.

And the PSU actually gets more efficient at the higher loads so that just makes things worse, or you can consider it basically cancelling out the minor extra power consumption from the other subsystems being under load. Either way, the fact remains, AMD processors definitely exceed their rated TDP too. And there is nothing wrong with it.
I disagree with most of your points. For one thing of many, RAM has power-down enabled by default these days.
But regardless, I'm out; as I've said three times now, we will just have to disagree.
 
Looking at the CPU PPT sensor on my 95W 3600XT, it shows a max of 120W under a hard load.

At the wall this system pulls about 15W less than my highly clocked 3770K, so about 250W with only a hard CPU load. The 3770K was "only" at 84W according to AIDA64; Core Temp said it was in the 100s of watts at 4700 MHz, 1.35V. My 3600XT is running one clock, one voltage, like my 3770K.

PSU calculator says 9900K requires 30w more than my XT. Everyone is full of shit :D

No you guys, the people working the numbers..
 
Looking at the CPU PPT sensor on my 95W 3600XT, it shows a max of 120W under a hard load.

At the wall this system pulls about 15W less than my highly clocked 3770K, so about 250W with only a hard CPU load. The 3770K was "only" at 84W according to AIDA64; Core Temp said it was in the 100s of watts at 4700 MHz, 1.35V. My 3600XT is running one clock, one voltage, like my 3770K.

PSU calculator says 9900K requires 30w more than my XT. Everyone is full of shit :D

No you guys, the people working the numbers..
The marked difference of a perspective with historical data and practical experience is what that is.

That is the same basis I have and use for saying Intel is exceeding the norms of proper info on spec sheets, and right now, certainly more so than AMD.
 
I disagree with most of your points. For one thing of many, RAM has power-down enabled by default these days.
But regardless, I'm out; as I've said three times now, we will just have to disagree.

Not system RAM. System RAM just runs at the same speed and voltage all the time, meaning it consumes basically the same under load as at idle.
 
Not system RAM. System RAM just runs at the same speed and voltage all the time, meaning it consumes basically the same under load as at idle.
Err, yes, system RAM has power-down enabled by default on every Ryzen system I've tried.
Seems we both have misconceptions then, and still no comment on Intel using up to 3x the power they market, but all's fair. No sir, a very final goodbye to you.
 
Looking at the CPU PPT sensor on my 95W 3600XT, it shows a max of 120W under a hard load.

At the wall this system pulls about 15W less than my highly clocked 3770K, so about 250W with only a hard CPU load. The 3770K was "only" at 84W according to AIDA64; Core Temp said it was in the 100s of watts at 4700 MHz, 1.35V. My 3600XT is running one clock, one voltage, like my 3770K.

PSU calculator says 9900K requires 30w more than my XT. Everyone is full of shit :D

No you guys, the people working the numbers..
Because your CPU has a PPT limit of 125W by default, and TDP does not describe this...
 
Err, yes, system RAM has power-down enabled by default on every Ryzen system I've tried.
Seems we both have misconceptions then, and still no comment on Intel using up to 3x the power they market, but all's fair. No sir, a very final goodbye to you.

And Ryzen RAM Power Down is disabled by default.

Actually I commented on that plenty: Intel processors don't use any more power than they market them as using, while AMD processors do.
 
Because your CPU has a PPT limit of 125W by default, and TDP does not describe this...
I'm still pretty new. Quite amateurish..
 
And Ryzen RAM Power Down is disabled by default.

Actually I commented on that plenty: Intel processors don't use any more power than they market them as using, while AMD processors do.
Nah, just rechecked: it's Auto by default, not disabled or enabled, so depending on the memory it could be on or off.
And we disagree on point 2: the PL1 and PL2 power use is not widely known to those not at an enthusiast level, so that's the point, and the point of this thread.
Not Intel versus AMD.

And regardless of your opinion on it, I still think Intel could do better on disclosure, as do many others.
 
Just leaving some information here...

The only time the Intel rig drew more power was under artificial load like Prime95. Under gaming, single thread load, normal multi-thread load, and idle the 9900K drew less power than the 3700X.

So if your primary use case is running Prime95 AMD is definitely your best bet.


 
Nah, just rechecked: it's Auto by default, not disabled or enabled, so depending on the memory it could be on or off.
And we disagree on point 2: the PL1 and PL2 power use is not widely known to those not at an enthusiast level, so that's the point, and the point of this thread.
Not Intel versus AMD.

And regardless of your opinion on it, I still think Intel could do better on disclosure, as do many others.

All the boards I've used have it off by default; there isn't even an Auto option. And you have to go like 5 menus deep to even find the option. So it likely comes down to a motherboard-by-motherboard basis. I would guess off is the default on most boards, simply because Memory Power Down is known to hurt RAM compatibility, so most motherboard manufacturers would rather just leave it off to avoid the headache. Plus, it isn't like RAM uses that much power to begin with; 4 sticks of DDR4 use like 10W. And the test rig used here at TPU uses an X570 Taichi, which I know for sure from personal experience defaults to having it off.
 
The other components don't really consume much more power when under load. RAM doesn't have an idle state, so its power consumption under load is only a watt or two more than in its idle state. The rest of the system is the same deal.

And the PSU actually gets more efficient at the higher loads so that just makes things worse, or you can consider it basically cancelling out the minor extra power consumption from the other subsystems being under load. Either way, the fact remains, AMD processors definitely exceed their rated TDP too. And there is nothing wrong with it.
Let's consider a point here that we haven't touched on before: GPU power consumption and TDP.
My 3060 Ti has a TDP of 200W given on NVIDIA's website. Even under testing in Unigine Valley and Heaven, it didn't exceed that number by more than 1-2%. Only when I adjusted the power limit of the card to 110% (a single 8-pin connector meant I couldn't push it past 225W anyway) was I able to draw 220W from the card.

In other words, the TDP is something that I could depend on.
When I built my computer in 2016, I wanted to go for a 750W PSU, but ultimately went for a Gold 650W instead of a Silver/Bronze 750W unit. Still, I got a motherboard with SLI compatibility so that one day I could run two 970s instead of one, plus the CPU with a mild overclock.

Now, if the cards were consuming 175W+ each instead of 145W, and the CPU 125W+ instead of its rated 88W, then I'd have regretted depending on these numbers for my PSU choice.

I get that the 3060 Ti is a special case because it's a power-limited card, but still, a piece of hardware should pull close to its rated power consumption; otherwise the whole point of that number is moot.
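For what it's worth, that kind of back-of-the-envelope PSU sizing can be sketched like this. The component wattages come from the post above; the allowance for the rest of the system and the 30% headroom factor are my own illustrative assumptions, and the whole thing rests on the (debated) premise that rated TDP roughly equals max draw:

```python
# Rough PSU sizing sketch, assuming rated TDP ~= max draw.
# Component wattages are from the post; the "rest of system"
# allowance and the headroom percentage are guesses for illustration.
def psu_minimum(component_watts, other_watts=100, headroom=0.30):
    """Sum the component draws, add an allowance for the rest of
    the system, then pad with percentage headroom."""
    load = sum(component_watts) + other_watts
    return load * (1 + headroom)

# Two GTX 970s at 145 W each plus an 88 W CPU:
needed = psu_minimum([145, 145, 88])
print(round(needed))  # ~621 W, so a 650 W unit covers it
```

Which is exactly why the math only works out if the rated numbers are honest in the first place.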
 
I'm still pretty new. Quite amateurish..
Everyone was at some point. Sadly, there are some who forget that fact and assume everyone should know what they have learned. Or worse, they ridicule the newbie for being a newbie and not yet knowing what they have learned. :(
 
Let's consider a point here that we haven't touched on before: GPU power consumption and TDP.
My 3060 Ti has a TDP of 200W given on NVIDIA's website. Even under testing in Unigine Valley and Heaven, it didn't exceed that number by more than 1-2%. Only when I adjusted the power limit of the card to 110% (a single 8-pin connector meant I couldn't push it past 225W anyway) was I able to draw 220W from the card.

In other words, the TDP is something that I could depend on.
When I built my computer in 2016, I wanted to go for a 750W PSU, but ultimately went for a Gold 650W instead of a Silver/Bronze 750W unit. Still, I got a motherboard with SLI compatibility so that one day I could run two 970s instead of one, plus the CPU with a mild overclock.

Now, if the cards were consuming 175W+ each instead of 145W, and the CPU 125W+ instead of its rated 88W, then I'd have regretted depending on these numbers for my PSU choice.

I get that the 3060 Ti is a special case because it's a power-limited card, but still, a piece of hardware should pull close to its rated power consumption; otherwise the whole point of that number is moot.

TDP is not rated max power consumption. That's the problem with doing DIY builds and not understanding what the numbers mean.

If you don't want to have to dig into and understand what the numbers mean, you should probably buy an OEM rig, or else figure on getting an outsized PSU. Alienware for example will not sell you an RTX 3090 without a 1000W PSU:

 
I'm sure people have opinions about me using a Mac as a daily driver. :p
hmmm... on checking, it's not on the approved list. :p
 
Are you surprised, really? This is from the same team that stuck a chiller under the table and pretended to release a new chip (an overclocked one), forgetting to mention it was cooled by said chiller.
 
hmmm... on checking, it's not on the approved list. :p
I don't mind their phones but I wouldn't buy one of their computers :D
 
I don't mind their phones but I wouldn't buy one of their computers :D
Ditto. I bought an iPhone about a year ago when I got sick of the rampant unpatched security holes in Android that the manufacturers just don't care about. Apple isn't perfect, but they at least actively patch vulnerabilities, and for a good long time, too. Believe me, I didn't buy an iPhone because I got starry-eyed about Apple products, but purely because of security issues. Android has more features and is more flexible, and I miss that. At least I'm relatively safe, though.
 
Let's stay on topic, people. This thing is going around and around.
Clearly some feel Intel is lying about their CPUs. Well, I have some exciting news for everyone...
........................ No one (not Intel nor AMD) is lying about their CPUs or the power they use...............................
First off, they use engineering samples and huge equipment to test with. They (Intel/AMD) have specific, precise equipment to gauge and verify the spec sheet settings.
If you think there is someone lying to you, it is in fact the SOFTWARE. I have found software to be very fallible as of late.
Ryzen and all the Lakes HAVE shocked everyone, they simply have, and I can see this in CPU-Z and other software vs what the BIOS even has! It's a joke really!
I see the YouTube reviewers here on TPU utterly shocked, and that is a FACT!
I do not review, nor do I get free stuff to review, nor do I want to.
I do however see things that do NOT add up, and one of them is this thread.
No, Intel is NOT lying; it is the shit software that you use. Sorry, you guys need to step it up on the program side!
 
it is the shit software that you use
Let's not shoot the messenger. Intel CPUs use an energy counter within the CPU. This counter goes up based on CPU load and speed and what type of software is being run. Monitoring software reads this counter every second, finds out how much energy has been consumed, divides that number by the time interval and reports a power consumption number. All software that is working correctly should end up reporting the same thing. This is not measured power consumption. It is estimated power consumption. The formula that Intel uses to determine how rapidly the energy counter counts up is totally up to them.

If Intel was unscrupulous, they could make all software report whatever they wanted it to report. I have not seen any evidence that Intel is doing this.

The 10850K is a power consuming pig when overclocked and running Prime95. At base frequency, where Intel TDP is measured, the 10850K operates well under the 125W TDP rating. That debunks the Intel is lying conspiracy that this thread is based on. New cars do not measure fuel mileage with a brick on the accelerator pedal while going up a big hill and no one complains. Why is everyone so butt hurt that Intel does not document power consumption at full speed while running a torture test?

If you do not like how Intel rates their CPUs, you can always switch teams and buy an AMD CPU.
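To make the counter-versus-measurement point above concrete, here is a minimal sketch of what monitoring software does on Linux, where the package energy counter is exposed through the standard powercap sysfs interface. It reads the counter twice, takes the delta, and divides by the interval; counter wrap-around handling is omitted for brevity:

```python
import time

# Standard Linux powercap location for the RAPL package energy counter,
# reported in microjoules.
RAPL_PATH = "/sys/class/powercap/intel-rapl:0/energy_uj"

def power_watts(e0_uj, e1_uj, interval_s):
    """Convert a delta of the energy counter (microjoules) into watts."""
    return (e1_uj - e0_uj) / 1e6 / interval_s

def sample_package_power(interval_s=1.0):
    """Read the counter twice and report estimated power, exactly as
    described above: energy delta divided by the time interval.
    Counter wrap-around is not handled in this sketch."""
    with open(RAPL_PATH) as f:
        e0 = int(f.read())
    time.sleep(interval_s)
    with open(RAPL_PATH) as f:
        e1 = int(f.read())
    return power_watts(e0, e1, interval_s)
```

Note that this reports whatever the CPU's internal model decided to count up, which is exactly why it is an estimate rather than a measurement.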
 
Let's not shoot the messenger. Intel CPUs use an energy counter within the CPU. This counter goes up based on CPU load and speed and what type of software is being run. Monitoring software reads this counter every second, finds out how much energy has been consumed, divides that number by the time interval and reports a power consumption number. All software that is working correctly should end up reporting the same thing. This is not measured power consumption. It is estimated power consumption. The formula that Intel uses to determine how rapidly the energy counter counts up is totally up to them.

If Intel was unscrupulous, they could make all software report whatever they wanted it to report. I have not seen any evidence that Intel is doing this.

The 10850K is a power consuming pig when overclocked and running Prime95. At base frequency, where Intel TDP is measured, the 10850K operates well under the 125W TDP rating. That debunks the Intel is lying conspiracy that this thread is based on. New cars do not measure fuel mileage with a brick on the accelerator pedal while going up a big hill and no one complains. Why is everyone so butt hurt that Intel does not document power consumption at full speed while running a torture test?

If you do not like how Intel rates their CPUs, you can always switch teams and buy an AMD CPU.
Nah, it's definitely the software :p :D, leg pulled only, not ripped off. :)
 
TDP is not rated max power consumption. That's the problem with doing DIY builds and not understanding what the numbers mean.

If you don't want to have to dig into and understand what the numbers mean, you should probably buy an OEM rig, or else figure on getting an outsized PSU. Alienware for example will not sell you an RTX 3090 without a 1000W PSU:

For GPUs it absolutely is. GPUs have a power limit set at TDP and they will not consume any more power than that; that has been the case for at least the last 4 generations or so.
Especially in the case of the RTX 3090 there are some buts around the short power spikes it produces, and those triggering some power supplies, so Alienware just wants to be really sure.

Let's not shoot the messenger. Intel CPUs use an energy counter within the CPU. This counter goes up based on CPU load and speed and what type of software is being run. Monitoring software reads this counter every second, finds out how much energy has been consumed, divides that number by the time interval and reports a power consumption number. All software that is working correctly should end up reporting the same thing. This is not measured power consumption. It is estimated power consumption. The formula that Intel uses to determine how rapidly the energy counter counts up is totally up to them.
They use the energy counter over a certain period to determine the allowed turbo amount and length. This is based pretty much solely on the power consumed. As a simplified example: for every second it spends using less power than TDP, it can spend another second the same amount over TDP, and then it gets averaged out over a longer period. CPU load, speed, and software have less to do with this; all of that simply ends up as power consumption factors for determining the power limit. Not just Intel; AMD is doing a variation of the same thing. This also happens far, far more frequently than once a second. Software reads and shows the same data, but less frequently.

Of course, when you raise the power limit (or the manufacturer raises it), none of this matters :)
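That simplified averaging example can be sketched as a toy model. To be clear, this is my own illustration of a moving-average power budget, not Intel's or AMD's actual algorithm (Intel's uses an exponentially weighted average with a configurable time constant):

```python
# Toy moving-average power budget: boosting above TDP is allowed only
# while the average over the recent window stays at or below TDP.
# This is an illustration of the principle, not a vendor algorithm.
def may_exceed_tdp(power_history, tdp, window):
    """Return True if the average of the last `window` power samples
    (watts) is at or below TDP, i.e. there is budget left to boost."""
    recent = power_history[-window:]
    return sum(recent) / len(recent) <= tdp

# Ten samples idling at 65 W leave headroom against a 125 W TDP:
print(may_exceed_tdp([65] * 10, tdp=125, window=10))   # True
# Ten samples at 185 W have exhausted the budget:
print(may_exceed_tdp([185] * 10, tdp=125, window=10))  # False
```

The real implementations evaluate this many times a second and weight recent samples more heavily, but the "spend under TDP now, boost over it later" mechanic is the same.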
 
Let's consider a point here that we haven't touched on before: GPU power consumption and TDP.
My 3060 Ti has a TDP of 200W given on NVIDIA's website. Even under testing in Unigine Valley and Heaven, it didn't exceed that number by more than 1-2%. Only when I adjusted the power limit of the card to 110% (a single 8-pin connector meant I couldn't push it past 225W anyway) was I able to draw 220W from the card.

In other words, the TDP is something that I could depend on.
When I built my computer in 2016, I wanted to go for a 750W PSU, but ultimately went for a Gold 650W instead of a Silver/Bronze 750W unit. Still, I got a motherboard with SLI compatibility so that one day I could run two 970s instead of one, plus the CPU with a mild overclock.

Now, if the cards were consuming 175W+ each instead of 145W, and the CPU 125W+ instead of its rated 88W, then I'd have regretted depending on these numbers for my PSU choice.

I get that the 3060 Ti is a special case because it's a power-limited card, but still, a piece of hardware should pull close to its rated power consumption; otherwise the whole point of that number is moot.

You are still failing to understand that TDP is not a power consumption number given by Intel. And NVIDIA doesn't give a TDP number for their current-gen cards to the public. The specs for your 3060 Ti give a Total Board Power number, which is actually a maximum power consumption number. Intel doesn't give power consumption numbers; TDP is not a power consumption number.
 