
Intel lying about their CPUs' TDP: who's not surprised?

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,674 (2.46/day)
Location
Washington, USA
System Name Veral
Processor 7800x3D
Motherboard x670e Asus Crosshair Hero
Cooling Corsair H150i RGB Elite
Memory 2x24 Klevv Cras V RGB
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx, 2x AOC 2425W, AOC I1601FWUX
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
To anyone that might want something to play with..

Intel has a little thing called Power Gadget that shows how much power a CPU is pulling. It'll show my 2680v2s at 95-100W under full load and my 4790K at 60W in normal use. Maybe some of you can give it a try on the 10 series.
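On Linux, similar package-power numbers can be had from the RAPL energy counters under /sys/class/powercap (cumulative microjoules in energy_uj). A rough sketch of the conversion; the wraparound default below is a placeholder, the real rollover value lives in max_energy_range_uj:

```python
def rapl_power_watts(start_uj, end_uj, seconds, wrap_uj=2**32):
    """Average power in watts from two cumulative RAPL energy readings
    (microjoules), tolerating one counter wraparound."""
    delta = end_uj - start_uj
    if delta < 0:          # counter rolled over between the two reads
        delta += wrap_uj
    return delta / seconds / 1_000_000  # uJ -> J, then J/s = W

# 95,000,000 uJ burned in one second is a 95 W package load
print(rapl_power_watts(0, 95_000_000, 1.0))  # 95.0
```

Sampling the counter once a second and feeding consecutive readings through this gives roughly what Power Gadget plots.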
 
Joined
Jan 27, 2015
Messages
1,747 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Doing power consumption comparisons at max turbo all the time is about as logical as testing a car's fuel efficiency at full throttle and max speed for an extended period. Unless you are a track racer, that's meaningless.

TDP is for determining *average* minimum heat dissipation in stock form. Max power consumption under turbo is by default something that only lasts 5-8 seconds, and then drops so that it can maintain that TDP average.

If these sites were interested in coming up with a useful power metric, they would use some standard benchmark representing a typical workload to measure overall real-world power consumption - and in the real world most PCs sit at under 5% CPU usage. For cars there's the EPA, which is why no one gets away with measuring MPG on a race track. We don't have anyone defining that in this space, so people get to make these idiotic hyperbolic arguments.
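The "EPA-style" number being asked for here can be sketched as a duty-cycle-weighted average of idle and load power. The wattages and the 5% duty cycle below are illustrative assumptions, not measurements:

```python
def typical_power_w(idle_w, load_w, load_fraction):
    """Duty-cycle-weighted average power draw: load_fraction is the share
    of time spent under load, the rest is spent idling."""
    return load_w * load_fraction + idle_w * (1.0 - load_fraction)

# Illustrative: 10 W idle, 125 W under load, loaded 5% of the time
print(typical_power_w(10, 125, 0.05))  # 15.75 W "typical" draw
```

With a realistic duty cycle, idle power dominates the "typical" figure, which is the poster's point.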
 
Joined
Jul 25, 2006
Messages
13,411 (1.99/day)
Location
Nebraska, USA
System Name Brightworks Systems BWS-6 E-IV
Processor Intel Core i5-6600 @ 3.9GHz
Motherboard Gigabyte GA-Z170-HD3 Rev 1.0
Cooling Quality case, 2 x Fractal Design 140mm fans, stock CPU HSF
Memory 32GB (4 x 8GB) DDR4 3000 Corsair Vengeance
Video Card(s) EVGA GEForce GTX 1050Ti 4Gb GDDR5
Storage Samsung 850 Pro 256GB SSD, Samsung 860 Evo 500GB SSD
Display(s) Samsung S24E650BW LED x 2
Case Fractal Design Define R4
Power Supply EVGA Supernova 550W G2 Gold
Mouse Logitech M190
Keyboard Microsoft Wireless Comfort 5050
Software W10 Pro 64-bit
The worst part is I know Intel fanboys who rabidly defend these stats and say it's lies
I disagree. The worst part is pi$$-poor journalism misrepresenting the facts with falsehoods, and the opposing fanboys who rabidly pile on to defend the article and its falsehoods, then use them to attack the competition without doing any fact checking to see whether the article is biased or factual!

Note where the article says (my bold underline added),
Extreme Tech said:
On paper, an Intel CPU’s TDP is the maximum power consumed under a sustained workload at base frequency.

Anybody can see in seconds that this is false! That is NOT how Intel defines TDP! Using the same CPU the article did, as seen in the ARK entry for the Core i7-10700K, if you hover over TDP to see Intel's definition, it clearly says (again, my bold underline added):
Intel said:
Thermal Design Power (TDP) represents the average power, in watts, the processor dissipates when operating at Base Frequency with all cores active under an Intel-defined, high-complexity workload.

Come on everyone! Power "consumed" does NOT and never has equaled power "dissipated"! Nothing made by Man is 100% efficient! The CPU would not generate any heat if it was. Nor does maximum equal average. :(

It is pretty clear the purpose of that article is simply to launch another bashing session against big bad Intel, even though AMD's published TDP specs are vague and deceptive too!

The fault does NOT lie with Intel or AMD, but with the entire processor industry - which includes VIA, NVIDIA, Qualcomm, Motorola and others. The industry needs to get together and come up with a standard for these terms and for how such values are measured and published - in a similar way to how they came together years ago to create the ATX form factor standard.
 
Joined
Feb 3, 2017
Messages
3,838 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
@RandallFlagg, "typical" is very difficult to pin down. Intel's big numbers are usually what you get with Prime95; even heavy productivity workloads trail that by a considerable margin. Anything lower - desktop usage, gaming - will probably fit within TDP anyway, and will vary by a large margin.

Come on everyone! Power "consumed" does NOT and never has equaled power "dissipated"!
What percentage of the power that goes into a CPU is used for anything other than radiating heat?
Btw, how this works in reality is just the opposite of what you said. CPUs are incredibly inefficient: they use a small amount of power to do useful work (i.e. compute stuff) and the rest is wasted as heat.
 

freeagent

Moderator
Staff member
Joined
Sep 16, 2018
Messages
9,127 (3.96/day)
Location
Winnipeg, Canada
Processor AMD R7 5800X3D
Motherboard Asus Crosshair VIII Dark Hero
Cooling Thermalright Frozen Edge 360, 3x TL-B12 V2, 2x TL-B12 V1
Memory 2x8 G.Skill Trident Z Royal 3200C14, 2x8GB G.Skill Trident Z Black and White 3200 C14
Video Card(s) Zotac 4070 Ti Trinity OC
Storage WD SN850 1TB, SN850X 2TB, SN770 1TB
Display(s) LG 50UP7100
Case Fractal Torrent Compact
Audio Device(s) JBL Bar 700
Power Supply Seasonic Vertex GX-1000, Monster HDP1800
Mouse Logitech G502 Hero
Keyboard Logitech G213
VR HMD Oculus 3
Software Yes
Benchmark Scores Yes
I know this was a joke but it actually seems to be the other way around. When idle, just showing desktop and running the few background processes I have, i5 was at 6W but R5 is at 30W. Thankfully the B550 board I have is a bit more efficient than my Z370 board (and the B450 I had previously) so the overall difference for the entire computer is ~15W.

Ryzen's IO die seems to consume a good 10-15W, and this has a considerable effect at idle.


For CPUs today? Unfortunately, yes.
In other contexts it is a perfectly valid engineering concept: Thermal Design Power should indicate the maximum amount of heat a component needs to dissipate, so that cooling can be designed properly.
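That sizing use is the one concrete thing a TDP number is good for: the cooler's junction-to-ambient thermal resistance must be low enough for the rated wattage to flow out at an acceptable temperature. A textbook-style sketch with made-up numbers:

```python
def max_thermal_resistance_c_per_w(t_junction_max_c, t_ambient_c, tdp_w):
    """Worst-case heatsink thermal resistance (C/W) that still keeps a part
    dissipating tdp_w watts below its junction temperature limit."""
    return (t_junction_max_c - t_ambient_c) / tdp_w

# Made-up example: 95 W part, 100 C junction limit, 25 C room air
print(max_thermal_resistance_c_per_w(100, 25, 95))  # roughly 0.79 C/W
```

If the chip actually dissipates 125 W instead of the rated 95 W, the same cooler now needs to be oversized, which is exactly the complaint in this thread.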
I was looking at the wall. They are pretty similar. Not bad for 6 vs 4 cores. The quad was getting a buttload of voltage too. Z77 vs B550..
 
Joined
Mar 10, 2010
Messages
11,880 (2.19/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Gskill Trident Z 3900cas18 32Gb in four sticks./16Gb/16GB
Video Card(s) Asus tuf RX7900XT /Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores laptop Timespy 6506
TDP was also never meant, and still isn't meant, to be a measure of power consumption. It is a measure of thermal output, used to determine heatsink size.
And I did say engineers did use it correctly back in the day, didn't I, and that marketing has confused its use dramatically.
So we agree then, yes or no?

Looking back, I might not have expressed my point adequately before :).
 
Joined
Jul 25, 2006
Messages
13,411 (1.99/day)
Location
Nebraska, USA
System Name Brightworks Systems BWS-6 E-IV
Processor Intel Core i5-6600 @ 3.9GHz
Motherboard Gigabyte GA-Z170-HD3 Rev 1.0
Cooling Quality case, 2 x Fractal Design 140mm fans, stock CPU HSF
Memory 32GB (4 x 8GB) DDR4 3000 Corsair Vengeance
Video Card(s) EVGA GEForce GTX 1050Ti 4Gb GDDR5
Storage Samsung 850 Pro 256GB SSD, Samsung 860 Evo 500GB SSD
Display(s) Samsung S24E650BW LED x 2
Case Fractal Design Define R4
Power Supply EVGA Supernova 550W G2 Gold
Mouse Logitech M190
Keyboard Microsoft Wireless Comfort 5050
Software W10 Pro 64-bit
What percentage of the power that goes into a CPU is used for anything other than radiating heat?
That's the problem, isn't it? There is no industry standard dictating how such values must be determined. It is not like a motor, for example, where you can accurately measure the power consumed and compare it to the turning power of the spinning motor.

It is not like a power supply, where you can measure the voltage and current at the wall and compare them to the output voltage and current.

How do you accurately measure a CPU's output power, compare it to the amount consumed, then use that to compare competing processors AND THEN use that data to determine which processor can do more "work" in a given amount of time?
Btw, how this works in reality is just the opposite of what you said. CPUs are incredibly inefficient: they use a small amount of power to do useful work (i.e. compute stuff) and the rest is wasted as heat.
:( NO!!!!!!! I NEVER said anything of the sort! I was pretty clear that CPUs generate heat - that clearly means they are inefficient. I specifically said nothing man-made is 100% efficient. That includes CPUs.
 
Joined
Feb 3, 2017
Messages
3,838 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
:( NO!!!!!!! I NEVER said anything of the sort! I was pretty clear that CPUs generate heat - that clearly means they are inefficient. I specifically said nothing man-made is 100% efficient. That includes CPUs.
Sorry, I misunderstood what you meant :)
 
Joined
Oct 29, 2019
Messages
471 (0.25/day)
Doing power consumption comparisons at max turbo all the time is about as logical as testing a car's fuel efficiency at full throttle and max speed for an extended period. Unless you are a track racer, that's meaningless.

TDP is for determining *average* minimum heat dissipation in stock form. Max power consumption under turbo is by default something that only lasts 5-8 seconds, and then drops so that it can maintain that TDP average.

If these sites were interested in coming up with a useful power metric, they would use some standard benchmark representing a typical workload to measure overall real-world power consumption - and in the real world most PCs sit at under 5% CPU usage. For cars there's the EPA, which is why no one gets away with measuring MPG on a race track. We don't have anyone defining that in this space, so people get to make these idiotic hyperbolic arguments.
Good point. That would explain why TechPowerUp's gaming consumption numbers (averages, not just random spikes) show the 10500 and 5600X as nearly identical. Even the non-K 10700 is only 8 watts higher than the 5600X when they're both at full clocks/strength, yet if you only measured the short spikes the 10700 would be way higher.
 
Joined
Feb 3, 2017
Messages
3,838 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
That's the problem, isn't it? There is no industry standard dictating how such values must be determined. It is not like a motor, for example, where you can accurately measure the power consumed and compare it to the turning power of the spinning motor.
Physics dictates that the power consumed must go somewhere. In an IC, there really are not many places for the energy to go but heat. It might give off some minor RF radiation (hopefully not), but even indirectly all the other conversion chains end up as heat. Anything else is a very minor fraction of a percent, if even that. For our purposes, the same power that goes into a CPU will come out as heat.
Good point. That would explain why TechPowerUp's gaming consumption numbers (averages, not just random spikes) show the 10500 and 5600X as nearly identical. Even the non-K 10700 on max turbo is only 8 watts higher than the 5600X when they're both at full clocks/strength
Gaming is not a heavy load. Even games that we consider to properly load CPU cores and threads are not using large parts of the actual CPU die. What's more, on Intel CPUs, not using AVX2 (which almost no games use) brings power consumption down by a lot.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
For a chip there is no real difference, is there? Practically all the power that goes in comes out as heat.

From a thermal solution standpoint there is. The peak power can be way above the rated TDP and the cooler can still handle it in bursts. Intel has been taking advantage of this since Turbo Boost was invented; TDP has never been an absolute limit on power consumption. It is also why turbo was, until recent generations, governed completely by temperature. Now it is governed by both power and temperature to keep things at least somewhat reasonable. Thermal solutions don't really care about peaks in heat output; they just absorb them and keep going. However, if you have a chip that says it is going to output 95W, it constantly outputs 125W, and you put a heatsink designed for 95W on it, then it is going to have thermal problems. TDP is a rating for heatsinks: it basically says that if you want the advertised performance out of this CPU, your heatsink had better be able to handle this much heat.

For example, my 8700K will boost to 4.6GHz under full load (Cinebench) and consume almost 140W. This is the default behavior of the Z390 motherboard I have it in. The motherboard decides the power limit, because the motherboard manufacturer knows what their board is capable of delivering and for how long. In that Z390 motherboard, the 140W only lasts for about 60 seconds before it starts to dial back, as long as the CPU cooler can keep up (which mine has no problem doing). By the end of a Cinebench run the CPU is running at 4.4GHz and power consumption is back down below 100W. However, if I take that same 8700K and put it in the B365 motherboard that I have, it never goes over 100W and 4.3GHz at full load. And if at any time I put a heatsink on it that can't handle the higher heat output, it will detect the higher temperatures and throttle back to 95W or less as needed.

But the entire point I'm trying to make is that TDP was never an absolute power limit on the CPU, and there is no guarantee that the CPU won't consume more than that. This goes all the way back to Nehalem.
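The behavior described above (a ~140 W burst for roughly a minute, then settling back toward the rated figure) can be sketched as a running-average power budget. This is a toy model, not Intel's actual algorithm, and the PL1/PL2/tau values are illustrative:

```python
def simulate_turbo(demand_w, pl1=95.0, pl2=140.0, tau=60.0, dt=1.0):
    """Toy power-limit model: instantaneous draw is clamped to PL2, and an
    exponential moving average of the draw (time constant tau seconds)
    is held at or below PL1."""
    avg = 0.0
    alpha = dt / tau
    out = []
    for d in demand_w:
        p = min(d, pl2)                      # short-term (PL2) clamp
        projected = avg + alpha * (p - avg)  # what the average would become
        if projected > pl1:
            # grant only as much as keeps the running average at PL1
            p = max((pl1 - (1.0 - alpha) * avg) / alpha, 0.0)
        avg = avg + alpha * (p - avg)
        out.append(p)
    return out

# Five minutes of a workload that always wants 140 W
trace = simulate_turbo([140.0] * 300)
print(round(trace[0]))    # 140 -- full burst at the start
print(round(trace[-1]))   # 95  -- settles at the PL1 average
```

With these numbers the model grants the full 140 W for about a minute before dialing back to 95 W, which is the yo-yo pattern the post describes.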
 
Joined
Feb 3, 2017
Messages
3,838 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Turbo was not governed completely by temperature. There have been power limits in place for a long while; they simply were not hit, or not in any significant way.
A stock 8700K will not boost to 4.6GHz on all cores, not even with the fudged power settings. The frequency table caps max all-core turbo at 4.3GHz. If yours does, it's MCE or an equivalent in the motherboard BIOS.
 
Joined
Jan 27, 2015
Messages
1,747 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
Good point. That would explain why TechPowerUp's gaming consumption numbers (averages, not just random spikes) show the 10500 and 5600X as nearly identical. Even the non-K 10700 is only 8 watts higher than the 5600X when they're both at full clocks/strength, yet if you only measured the short spikes the 10700 would be way higher.

I've run Windows perfmon over several different days for myself, with a 5-second resolution. Example below.

100% usage for >=5s is not a scenario for me. I know 100% sometimes happens during things like file decompression, but it doesn't last long enough to show up here. I'm sure some will come in talking about encoding or some such, but that's red herring crap IMO; it's like someone talking about how often they do tractor pulls with their Hyundai.

The big spikes at the start of the day are from running a VM, which is not a typical workload. At the end of the day, that's gaming. Note it never goes much over 50% on any core for >5s. The rest of the time, while working and doing normal stuff like browsing / listening to iTunes / YouTube and so on, it's pretty damn near zero.


[attached perfmon screenshot]
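The takeaway from a log like that can be boiled down to one number: the fraction of fixed-interval samples that exceed some usage threshold. A sketch with hypothetical sample data:

```python
def busy_fraction(samples_pct, threshold=50.0):
    """Fraction of fixed-interval CPU-usage samples at or above threshold."""
    if not samples_pct:
        return 0.0
    return sum(1 for s in samples_pct if s >= threshold) / len(samples_pct)

# Hypothetical 5-second samples: mostly near-idle, a couple of spikes
day = [3, 2, 5, 48, 52, 4, 1, 90, 2, 3]
print(busy_fraction(day))  # 0.2
```

Run against a real perfmon export, this would quantify the "pretty damn near zero" claim instead of eyeballing the graph.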
 
Joined
Jul 25, 2006
Messages
13,411 (1.99/day)
Location
Nebraska, USA
System Name Brightworks Systems BWS-6 E-IV
Processor Intel Core i5-6600 @ 3.9GHz
Motherboard Gigabyte GA-Z170-HD3 Rev 1.0
Cooling Quality case, 2 x Fractal Design 140mm fans, stock CPU HSF
Memory 32GB (4 x 8GB) DDR4 3000 Corsair Vengeance
Video Card(s) EVGA GEForce GTX 1050Ti 4Gb GDDR5
Storage Samsung 850 Pro 256GB SSD, Samsung 860 Evo 500GB SSD
Display(s) Samsung S24E650BW LED x 2
Case Fractal Design Define R4
Power Supply EVGA Supernova 550W G2 Gold
Mouse Logitech M190
Keyboard Microsoft Wireless Comfort 5050
Software W10 Pro 64-bit
For our purposes, the same power that goes into a CPU will come out as heat.
Oh bullfeathers! That is NOT true at all. Also not true is you speaking for "our" purposes.

You are essentially dismissing all the "work" a CPU does. That's just silly and does NOT accurately reflect the laws of physics you call upon to justify your claims.

A 65W CPU today does a heck of a lot more "work" in the same amount of time while consuming significantly less energy than a 65W CPU from years past. That would be impossible if what you claimed was true.

Practically all the power that goes in comes out as heat.
So what? That is NOT the point - despite how much you want it to be. You keep dismissing, ignoring, or not understanding (I don't know which) the most important point: the amount of work being done with the energy that is NOT going up in heat.

"Machine 1" draws 100W and gives off 95W of it as heat. It moves 10 buckets of water 10 feet in one minute.

"Machine 2" also draws 100W and gives off 95W as heat. But it moves 20 buckets of water 10 feet in that same minute.

See the difference? That's what matters for "our" purposes.
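Taking the post's hypothetical machines at face value, the comparison reduces to useful work per joule of energy (watts times seconds). A sketch restating the same numbers:

```python
def work_per_joule(work_units, power_w, seconds):
    """Useful work delivered per joule of energy consumed."""
    return work_units / (power_w * seconds)

# Both machines draw 100 W for one minute; machine 2 moves twice the buckets
m1 = work_per_joule(10, 100, 60)   # machine 1: 10 buckets
m2 = work_per_joule(20, 100, 60)   # machine 2: 20 buckets
print(m2 / m1)  # 2.0 -- machine 2 does twice the work per joule
```

This is the efficiency argument in one line: identical power draw and heat output, but double the work accomplished.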
 
Joined
Feb 3, 2017
Messages
3,838 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Oh bullfeathers! That is NOT true at all. Also not true is you speaking for "our" purposes.
You are essentially dismissing all the "work" a CPU does. That's just silly and does NOT accurately reflect the laws of physics you call upon to justify your claims.
OK, let me go back to definitions. Work as in what happens inside a CPU: transistors switch, electrons move, and all that. CPU performance does not really come into play at this stage. It could be an arbitrary number of transistors switching back and forth (well, ideally staggered switching, to get even remotely steady consumption over time).

We were talking about TDP, power consumption and resulting heat output, no?

Edit: CPU performance does not play a part in how ICs use power. Unless you are saying that higher CPU performance results in consumed power going somewhere other than heat - I would really like to see a source, or at least the reasoning, for that.

A 65W CPU today does a heck of a lot more "work" in the same amount of time while consuming significantly less energy than a 65W CPU from years past. That would be impossible if what you claimed was true.
Split this into a separate quote. The major factor here is the evolution toward smaller manufacturing processes, which makes transistors smaller and more efficient (less energy per switch).
If you want to nitpick then yes, this is very simplified and does not account for many other factors. The first things that come to mind are the voltages used, along with their efficiency curves, and potential architectural efficiency gains.
 
Last edited:
Joined
Jul 25, 2006
Messages
13,411 (1.99/day)
Location
Nebraska, USA
System Name Brightworks Systems BWS-6 E-IV
Processor Intel Core i5-6600 @ 3.9GHz
Motherboard Gigabyte GA-Z170-HD3 Rev 1.0
Cooling Quality case, 2 x Fractal Design 140mm fans, stock CPU HSF
Memory 32GB (4 x 8GB) DDR4 3000 Corsair Vengeance
Video Card(s) EVGA GEForce GTX 1050Ti 4Gb GDDR5
Storage Samsung 850 Pro 256GB SSD, Samsung 860 Evo 500GB SSD
Display(s) Samsung S24E650BW LED x 2
Case Fractal Design Define R4
Power Supply EVGA Supernova 550W G2 Gold
Mouse Logitech M190
Keyboard Microsoft Wireless Comfort 5050
Software W10 Pro 64-bit
CPU performance does not really come into play at this stage.
Of course it does. Performance determines how much "work" can be accomplished in a given amount of time with a given amount of energy.
For all intents and purposes it could be an arbitrary number of transistors switching back and forth
What??? Do you think those gates are just flipping and flopping back and forth for fun, or for no reason? NOOOOO! They are doing "work"! Crunching numbers. Processing data.

I go back to my previous statement. You keep dismissing, ignoring, or just plain don't understand that the amount of work being accomplished cannot just summarily be omitted from the equation when determining a processor's (or any machine's) efficiency. Work must be factored in too!

For the purposes of this thread and Intel's definition of TDP, that value is used to determine how much cooling is required. It is NOT meant as a means to compare that Intel CPU to an AMD CPU. That's why, if you go to that Intel CPU's ARK again (see here) and click the "?" next to TDP, you will see it directs readers to the datasheet for "thermal solution requirements". It does not mention efficiency or work accomplished. Workload, yes, but that is not the same as work accomplished.
 
Joined
Feb 3, 2017
Messages
3,838 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
OK, my statement that I stand by is that power going into an IC will come out as heat.
This was a response to what you said above:
Power "consumed" does NOT and never has equaled power "dissipated"!
 
Last edited:
Joined
Jul 25, 2006
Messages
13,411 (1.99/day)
Location
Nebraska, USA
System Name Brightworks Systems BWS-6 E-IV
Processor Intel Core i5-6600 @ 3.9GHz
Motherboard Gigabyte GA-Z170-HD3 Rev 1.0
Cooling Quality case, 2 x Fractal Design 140mm fans, stock CPU HSF
Memory 32GB (4 x 8GB) DDR4 3000 Corsair Vengeance
Video Card(s) EVGA GEForce GTX 1050Ti 4Gb GDDR5
Storage Samsung 850 Pro 256GB SSD, Samsung 860 Evo 500GB SSD
Display(s) Samsung S24E650BW LED x 2
Case Fractal Design Define R4
Power Supply EVGA Supernova 550W G2 Gold
Mouse Logitech M190
Keyboard Microsoft Wireless Comfort 5050
Software W10 Pro 64-bit
OK, my statement that I stand by is that power going into an IC will come out as heat.
It is still wrong, or at least incomplete. Why? Because some of the power going in is being consumed to do work (flip gates, crunch numbers, etc.) too.

I don't understand why you can't or refuse to see that.

It is like an incandescent light bulb. No argument (at least I hope not) that "most" of the energy consumed is converted into heat rather than light. But it is still an indisputable fact that some of the energy being consumed, even if a small amount, is indeed being used for "work" - in this case, to create light.
 
Joined
Sep 17, 2014
Messages
22,795 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
TDP was also never meant, and still isn't meant, to be a measure of power consumption. It is a measure of thermal output, used to determine heatsink size.

Bingo... but how do you review a CPU using a random heatsink rated at exactly the TDP they put in the specs?

It will either not perform optimally, or it will brutally exceed TDP. Usually the CPUs do the latter and then start doing the former, yo-yoing to keep up, and if you remove the lock they go all over the place. What used to be a simple vcore adjustment is now a whole bag of tricks to keep thermal headroom and still extract some semblance of an OC.

In the end it's just the same thing. Power = heat.

It is still wrong, or at least incomplete. Why? Because some of the power going in is being consumed to do work (flip gates, crunch numbers, etc.) too.

I don't understand why you can't or refuse to see that.

It is like an incandescent light bulb. No argument (at least I hope not) that "most" of the energy consumed is converted into heat rather than light. But it is still an indisputable fact that some of the energy being consumed, even if a small amount, is indeed being used for "work" - in this case, to create light.

Yes, and then we touch upon the issue of 'efficiency'. Intel has, over the past generations, constantly nudged its processors to clock higher 'when they can' which is an efficiency killer, and a heatwave guarantee. The why behind that is only to look good on spec sheets and in reviews with optimal conditions, while the quality of life of using such a CPU has steadily gone to the shitter. Aggressive temperature cycling doesn't really prolong the lifetime of any component in a system either.

That's a steep price for 5 "Gigahurtz" to look good. And that is why TDP, as it is used now, is a complete lie when combined with the specs they show us - if you don't read the Intel Bible on turbo states, that is.

But if you think this through... the work being done is irrelevant in a discussion about TDP. Performance per watt does not relate to output temperature. The only thing that relates to temps is the actual power going in. After all, in a comparison you're looking at an effectively infinite amount of work: no matter how much the CPU gets done, it will need all the power it can pull through, and that results in the same temperatures.
 
Last edited:
Joined
Feb 1, 2013
Messages
1,270 (0.29/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast

Lying about power consumption numbers to make your products look good is just despicable.
Thank God I have a 4690k which means I don't have to deal with this mess.
Nowhere in that article was the word "lying" ever used. Anybody who buys a K processor with 4+ cores should already know what they have to cool, and shouldn't be surprised at temperature spikes to 99C under inadequate heat dissipation. Overclockers have known this for a decade now. If you intended to cast a blanket net, you've missed quite a few other fish.
 
Joined
Jul 25, 2006
Messages
13,411 (1.99/day)
Location
Nebraska, USA
System Name Brightworks Systems BWS-6 E-IV
Processor Intel Core i5-6600 @ 3.9GHz
Motherboard Gigabyte GA-Z170-HD3 Rev 1.0
Cooling Quality case, 2 x Fractal Design 140mm fans, stock CPU HSF
Memory 32GB (4 x 8GB) DDR4 3000 Corsair Vengeance
Video Card(s) EVGA GEForce GTX 1050Ti 4Gb GDDR5
Storage Samsung 850 Pro 256GB SSD, Samsung 860 Evo 500GB SSD
Display(s) Samsung S24E650BW LED x 2
Case Fractal Design Define R4
Power Supply EVGA Supernova 550W G2 Gold
Mouse Logitech M190
Keyboard Microsoft Wireless Comfort 5050
Software W10 Pro 64-bit
The why behind that is only to look good on spec sheets and in reviews with optimal conditions, while the quality of life of using such a CPU has steadily gone to the shitter. Aggressive temperature cycling doesn't really prolong the lifetime of any component in a system either.
Quality of life? I have seen nothing to suggest Intels have a shorter life expectancy than AMDs. Got a link?

And of course Intel wants their CPUs to look good. That's called marketing. It's why Truck Maker A claims their truck is #1 because it gets better gas mileage, Truck Maker B claims theirs is #1 because it can pull more weight, and Truck Maker C claims theirs is #1 because it has more horsepower - and they all are right!

Aggressive temperature cycling? What does that even mean? EVERY CPU can and does go from cold (ambient) when idle to full temperature when pushed hard, and then back to cold again just as quickly when the load drops back to idle. Temperature cycling is dependent on the load and the cooling.
 
Joined
Sep 17, 2014
Messages
22,795 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Quality of life? I have seen nothing to suggest Intels have a shorter life expectancy than AMDs. Got a link?

And of course Intel wants their CPUs to look good. That's called marketing. It's why Truck Maker A claims their truck is #1 because it gets better gas mileage, Truck Maker B claims theirs is #1 because it can pull more weight, and Truck Maker C claims theirs is #1 because it has more horsepower - and they all are right!

Aggressive temperature cycling? What does that even mean? EVERY CPU can and does go from cold (ambient) when idle to full temperature when pushed hard, and then back to cold again just as quickly when the load drops back to idle. Temperature cycling is dependent on the load and the cooling.
- Quality of life: high temperature peaks mean low quality of life; your fans get noisy and your hands on a laptop get hot. I didn't mean durability/endurance. Laptop CPUs have always run hot, but it's a different experience if they slowly creep to 80C and then even more slowly to 85C, versus boosting straight to 85C and then cooling back to 50C to start it all over again, all the time. The behaviour has changed, and Sandy Bridge was, for Core, in the optimal position. 22nm made a big dent, partly due to increased density. But once Intel started needing those last few hundred megahertz to keep competing, the limits have been stretched further and further. Yes, I do believe devices with aggressively boosting Intel CPUs are likely to last less long than they used to. Time will tell, but the average lifetime of recent laptops is nothing to write home about in general. Is AMD different? I don't think that's the subject, and I think they still have a lot of work to do, especially on mobile CPUs.

- Aggressive temp cycling means what is described above. The limits are moved ever closer to the absolute boundaries of what the chip can do without burning to a crisp. What used to peak briefly at 80C now peaks at 85C or more. At the same time, idle temps have actually dropped, thanks to more efficient power states and because IPC gains mean idle needs lower clocks than it used to.

As always the devil is in the details, and Intel is doing a fine job creating a box of details that cross the line.
 

Deleted member 202104

Guest
- Quality of life: high temperature peaks mean low quality of life; your fans get noisy and your hands on a laptop get hot. I didn't mean durability/endurance. Laptop CPUs have always run hot, but it's a different experience if they slowly creep to 80C and then even more slowly to 85C, versus boosting straight to 85C and then cooling back to 50C to start it all over again, all the time. The behaviour has changed, and Sandy Bridge was, for Core, in the optimal position. 22nm made a big dent, partly due to increased density. But once Intel started needing those last few hundred megahertz to keep competing, the limits have been stretched further and further. Yes, I do believe devices with aggressively boosting Intel CPUs are likely to last less long than they used to. Time will tell, but the average lifetime of recent laptops is nothing to write home about in general. Is AMD different? I don't think that's the subject, and I think they still have a lot of work to do, especially on mobile CPUs.

- Aggressive temp cycling means what is described above. The limits are moved ever closer to the absolute boundaries of what the chip can do without burning to a crisp. What used to peak briefly at 80C now peaks at 85C or more. At the same time, idle temps have actually dropped, thanks to more efficient power states and because IPC gains mean idle needs lower clocks than it used to.

As always the devil is in the details, and Intel is doing a fine job creating a box of details that cross the line.

Quality of life? Really?

One of the most ridiculous things I've ever read on a tech site - and that's saying a lot.
 

Deleted member 205776

Guest
my locked i7-8700 be like 120w while gaming (advertised 65w)

my unlocked 3900x be like 95w while gaming (advertised 105w)

double the cores lol

if Intel had started measuring their rated TDP from boost clocks, it'd be a different story
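For anyone wanting to sanity-check gaming-load wattage like this on Linux without extra tools, here's a rough sketch that reads the kernel's RAPL package energy counter via sysfs. Assumptions: the powercap interface is present at the usual `intel-rapl:0` path (availability varies by CPU and kernel), and you have read permission on `energy_uj` (often root-only).

```python
# Rough sketch: average CPU package power from the Linux RAPL sysfs
# interface. The energy_uj file is a cumulative microjoule counter.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package 0 counter

def read_energy_uj(path=RAPL):
    with open(path) as f:
        return int(f.read())

def package_watts(interval=1.0, reader=read_energy_uj):
    """Average package power (watts) over `interval` seconds."""
    e0 = reader()
    time.sleep(interval)
    e1 = reader()
    # the counter eventually wraps; for a short interval we ignore that
    return (e1 - e0) / 1e6 / interval

# Usage (on a real Intel Linux box, with permission to read energy_uj):
#   print(f"package power: {package_watts():.1f} W")
```

Run it while gaming and you can compare the measured package power against the advertised TDP yourself, which is essentially what Intel's Power Gadget mentioned earlier does.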
 
Joined
Jul 25, 2006
Messages
13,411 (1.99/day)
Location
Nebraska, USA
System Name Brightworks Systems BWS-6 E-IV
Processor Intel Core i5-6600 @ 3.9GHz
Motherboard Gigabyte GA-Z170-HD3 Rev 1.0
Cooling Quality case, 2 x Fractal Design 140mm fans, stock CPU HSF
Memory 32GB (4 x 8GB) DDR4 3000 Corsair Vengeance
Video Card(s) EVGA GEForce GTX 1050Ti 4Gb GDDR5
Storage Samsung 850 Pro 256GB SSD, Samsung 860 Evo 500GB SSD
Display(s) Samsung S24E650BW LED x 2
Case Fractal Design Define R4
Power Supply EVGA Supernova 550W G2 Gold
Mouse Logitech M190
Keyboard Microsoft Wireless Comfort 5050
Software W10 Pro 64-bit
- Quality of life: high temperature peaks mean low quality of life; your fans get noisy and your hands on a laptop get hot. I didn't mean durability/endurance.
Nah! Okay, yes, you are right. Things that "annoy" humans may affect our quality of life. But is that really the criterion you want to use to decide which CPU is better?

Are you really suggesting AMDs don't get hot too?

What you are describing to me is poor design by the laptop maker or PC builder. Poor choice of fans, inadequate case cooling, etc.
but the average lifetime of recent laptops is nothing to write home about in general.
That may be true, but you are suggesting they are failing because the processors are failing, and in particular that those with Intels are failing at a faster rate! Not buying it. Show us evidence.

Frankly, I cannot recall the last time I saw a CPU (Intel or AMD) that just decided to die.
 