
Intel lying about their CPUs' TDP: who's not surprised?

Joined
Sep 17, 2014
Messages
22,808 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Nah! Yes, you are right. Things that "annoy" humans may affect our quality of life. But is that really the criterion you want to use to decide which CPU is better?

Are you really suggesting AMDs don't get hot too?

What you are describing to me is poor design by the laptop maker or PC builder. Poor choice of fans, inadequate case cooling, etc.

That may be true but you are suggesting they are failing because the processors are failing and in particular, that those with Intels are failing at a faster rate! Not buying it. Show us evidence.

Frankly, I cannot recall the last time I saw a CPU (Intel or AMD) that just decided to die.

I'm mostly referring to laptops and mobile devices, which is where TDP matters so much and where it causes issues. You chalk it up to laptop makers; I chalk it up to a combination of them and Intel's current approach to clocking. The line has become VERY thin, and this also spills over into the usability side of a device.

Is AMD different? I'm not saying that (not sure why you keep asking), but I do think they are more honest about advertising their TDPs, and the results generally spell that out too.

As for the CPUs dying. No. Me neither. But aggressive power demands do take a toll on circuitry and power delivery elsewhere, and so does heat. With lots of stuff packed together, this is no improvement. And again, this must be related to Intel's need to produce spec sheets that mean something in terms of marketing. 'Look, we gained another 100 MHz and the TDP is still the same'. Is it, really?
 
Last edited:

tabascosauz

Moderator
Supporter
Staff member
Joined
Jun 24, 2015
Messages
8,194 (2.35/day)
Location
Western Canada
System Name ab┃ob
Processor 7800X3D┃5800X3D
Motherboard B650E PG-ITX┃X570 Impact
Cooling NH-U12A + T30┃AXP120-x67
Memory 64GB 6400CL32┃32GB 3600CL14
Video Card(s) RTX 4070 Ti Eagle┃RTX A2000
Storage 8TB of SSDs┃1TB SN550
Case Caselabs S3┃Lazer3D HT5
- Quality of life: high temperature peaks are low quality of life; your fans get noisy. Your hands on a laptop get hot. I didn't mean durability/endurance. Laptop CPUs did always get hot, but it's a different thing if they slowly creep to 80C and then even more slowly to 85C, or if they boost straight to 85C and then cool back to 50 to start it all over again, all the time. The behaviour has changed, and Sandy Bridge was, for Core, in the optimal position. 22nm made a big dent, partly due to increased density. But when Intel started needing those last few hundred megahertz to keep competing, the limits were stretched further and further. Yes, I do believe devices with Intel CPUs that boost aggressively are likely to last less long than they used to in the past. Time will tell, but the average lifetime of recent laptops is nothing to write home about in general. Is AMD different? I don't think that is the subject, and I think they have a lot of work left to do, especially on mobile CPUs.

- Aggressive temp cycling means what is described above. The limits are moved ever closer to the absolute boundaries of what the chip can do without burning to a crisp. What used to peak briefly at 80C, now peaks to 85C or more. At the same time, idle temps have actually dropped due to more efficient power states, and because idle requires lower clocks than it used to due to IPC gains.

As always the devil is in the details, and Intel is doing a fine job creating a box of details that cross the line.

I'm not going to start pointing fingers and flinging the F-word around here, but the irony is that no x86 laptop CPU boosts harder, more dynamically and more frequently than the Renoir CPUs do. Period. And because all of desktop Zen 2 and Zen 3 behaves the same way, the poor "quality of life" and "aggressive temp cycling" is 95% of what makes any Ryzen a Ryzen from 2019 onwards. In fact, the aggressive boost improves user experience if anything, because it improves the system's response to user input in the fraction of a second (a few milliseconds for CPPC on Ryzen, low double digits for Speedshift on Intel).

I don't see any Matisse, Renoir or Vermeer CPUs dying because of high temperature peaks and temp cycling. Some of the desktop chips die without any reason because AMD still doesn't know how to write firmware or work out the quality control of their N7FF chips, but that's a different story. Neither do I see Kaby-R, Coffee, Comet and Ice Lake CPUs randomly dying because they make aggressive use of their boost envelope.

I don't get where this argument is going. The high temps reflect much more on individual laptop makers' abysmal thermal solutions than on the CPUs themselves. You do realize that PROCHOT is a thing, preventing the CPU from turning into molten slag if you don't spend every waking minute monitoring package temp?

Do they last shorter on a smaller process and higher clocks? Probably. Is that going to make an 8-year-old laptop more desirable than a 12-year-old laptop?
 
Last edited:
Joined
Nov 13, 2007
Messages
10,880 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
This is not new. I get that you had an i5 3xxx that ran cool at 4.3GHz @Vayra86 (you can still buy i5's that do that) -- and I get that Sandy Bridge ran cool since it wasn't pushed to the max. But when you're comparing the top-of-the-line chip to an old i5, I think you're skewing your memory a bit.

My Computers included:
Macbook 15" Pro Retina - Ivy Bridge, idled in the mid 60s, Cinebenched at 100C; still being used by my mother-in-law
Dell XPS (2014) - hot as heck
Alienware 17" 2015 - also insanely hot; had to disable turbo to keep the VRM from throttling

Desktops:
6700K - super hot....
1800X - a 95W chip that sucked down 165W; would crash at 4.1GHz with anything less than a 360mm water setup (a 280mm AIO would crash)
8700K - also sucked down 165W but was faster
7820X - melted my house; a "125W" chip that sucked down 250W, needed a thick 240mm AIO minimum, overloaded air cooling
10850K - 225W in AVX loads at 5.0, but 10 cores and much faster; 85C but requires water.

This is why I think that memory is 'warped', since it's really nothing new at the high end. It's just that 10 cores are hot on 14nm at 5GHz... that's kind of to be expected. They are not hot at 4.8GHz; in fact they are quite chilly. At 3.7 I'm sure they really do sit around 140W.

If you get the 10600K, I think you will find it runs quite cool, is cheap, runs on cheap boards and has no issues whatsoever. If you get a 5800X 8-core, I think you will find it too runs quite hot despite being a 105W chip -- especially on a board that automatically sets the most aggressive PBO settings out of the box.
 
Last edited:
Joined
Feb 1, 2013
Messages
1,274 (0.29/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
The sky isn't falling... it already fell back in the days of the introduction of AVX2/FMA to desktop processors. Even though Sandy Bridge introduced AVX to desktop, it probably wasn't used widely until Haswell appeared and brought AVX2.
 
Joined
Sep 2, 2020
Messages
1,491 (0.94/day)
System Name Chip
Processor Amd 5600X
Motherboard MSI B450M Mortar Max
Cooling Hyper 212
Memory 2x 16g ddr4 3200mz
Video Card(s) RX 6700
Storage 5.5 tb hd 220 g ssd
Display(s) Normal moniter
Case something cheap
VR HMD Vive
I'd suggest you get a better motherboard then, because it's holding back your performance greatly.
It is not. The CPU is actually slightly overperforming compared to benchmarks.
 
Joined
Jul 5, 2013
Messages
28,450 (6.77/day)
Tell that to my 3200G
60W part has never gone past 30W at max load
How do you know this? Have you measured it in some manner?

Frankly, I cannot recall the last time I saw a CPU (Intel or AMD) that just decided to die.
Neither can I. The last time I saw a CPU "die" was because of thermal concerns (clogged heatsink, poor airflow in the case). CPU resiliency has improved greatly over the last 20 years. Perhaps this is why Intel and AMD both do not worry too much about stating certain specs: they do not fear their IC product dying.
 
Joined
Feb 3, 2017
Messages
3,841 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
Neither can I. The last time I saw a CPU "die" was because of thermal concerns(clogged heatsink, poor airflow in the case). CPU resiliency has improved greatly over the last 20 years. Perhaps this is why Intel and AMD both do not worry to much about stating certain specs, because they do not fear their IC product dying.
Controls, limits and their management have come a long way. CPUs throttle when temperature gets too high, they do have current limits so overloading them is difficult, and I believe they have at least detection for voltage spikes as well. All of that is old, tried-and-true and fast enough to make a CPU really, really reliable and resilient.
 
Joined
Jul 16, 2014
Messages
8,223 (2.15/day)
Location
SE Michigan
System Name Dumbass
Processor AMD Ryzen 7800X3D
Motherboard ASUS TUF gaming B650
Cooling Artic Liquid Freezer 2 - 420mm
Memory G.Skill Sniper 32gb DDR5 6000
Video Card(s) GreenTeam 4070 ti super 16gb
Storage Samsung EVO 500gb & 1Tb, 2tb HDD, 500gb WD Black
Display(s) 1x Nixeus NX_EDG27, 2x Dell S2440L (16:9)
Case Phanteks Enthoo Primo w/8 140mm SP Fans
Audio Device(s) onboard (realtek?) - SPKRS:Logitech Z623 200w 2.1
Power Supply Corsair HX1000i
Mouse Steeseries Esports Wireless
Keyboard Corsair K100
Software windows 10 H
Benchmark Scores https://i.imgur.com/aoz3vWY.jpg?2
Hot topic today wow.

I'm in the camp of "I'm not surprised", with the caveat "by anything Intel does to mislead consumers for profit".
 
Joined
Jan 1, 2021
Messages
1,067 (0.73/day)
System Name The Sparing-No-Expense Build
Processor Ryzen 5 5600X
Motherboard Asus ROG Strix X570-E Gaming Wifi II
Cooling Noctua NH-U12S chromax.black
Memory 32GB: 2x16GB Patriot Viper Steel 3600MHz C18
Video Card(s) NVIDIA RTX 3060Ti Founder's Edition
Storage 500GB 970 Evo Plus NVMe, 2TB Crucial MX500
Display(s) AOC C24G1 144Hz 24" 1080p Monitor
Case Lian Li O11 Dynamic EVO White
Power Supply Seasonic X-650 Gold PSU (SS-650KM3)
Software Windows 11 Home 64-bit
In other contexts, it is a perfectly valid engineering concept
It should always be an engineering concept. It should be thermal "design" power, not thermal "what we decide to tell you" power.
How is a person supposed to know if a plain 150W tower cooler is enough for their K series CPU, or they need a proper watercooled setup that can handle 200+W of CPU heat dissipation? Printing 125W on the box and actually pulling 265W should be a banned practice.

To anyone that might want something to play with..

Intel has a little thing called Power Gadget that shows how much power a CPU is pulling. It'll show my 2680v2s at 95-100W full load and my 4790K at 60W in normal use. Maybe some of you guys can give it a try on the 10-series.
It's software?
I use CoreTemp for reading CPU power consumption.
It could be inaccurate, but it may not be, either.
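For what it's worth, on Linux the package-power counters these tools read are exposed through the RAPL powercap sysfs interface. A minimal sketch of reading it; the `intel-rapl:0` path is an assumption (it varies by platform, and recent AMD kernels expose an equivalent node):

```python
import time

# Assumed sysfs path for the package-level RAPL energy counter;
# check /sys/class/powercap/ on your machine for the actual layout.
RAPL_ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"

def watts_from_energy_samples(e1_uj: int, e2_uj: int, dt_s: float) -> float:
    """Convert two cumulative energy readings (microjoules) to average watts."""
    return (e2_uj - e1_uj) / 1e6 / dt_s

def read_package_power(interval_s: float = 1.0) -> float:
    """Sample the energy counter twice and return average package power."""
    with open(RAPL_ENERGY) as f:
        e1 = int(f.read())
    time.sleep(interval_s)
    with open(RAPL_ENERGY) as f:
        e2 = int(f.read())
    return watts_from_energy_samples(e1, e2, interval_s)
```

Same caveat as above applies: these are the CPU's own model-based estimates, not an external measurement.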
 
Joined
Feb 3, 2017
Messages
3,841 (1.33/day)
Processor Ryzen 7800X3D
Motherboard ROG STRIX B650E-F GAMING WIFI
Memory 2x16GB G.Skill Flare X5 DDR5-6000 CL36 (F5-6000J3636F16GX2-FX5)
Video Card(s) INNO3D GeForce RTX™ 4070 Ti SUPER TWIN X2
Storage 2TB Samsung 980 PRO, 4TB WD Black SN850X
Display(s) 42" LG C2 OLED, 27" ASUS PG279Q
Case Thermaltake Core P5
Power Supply Fractal Design Ion+ Platinum 760W
Mouse Corsair Dark Core RGB Pro SE
Keyboard Corsair K100 RGB
VR HMD HTC Vive Cosmos
It should always be an engineering concept. It should be thermal "design" power, not thermal "what we decide to tell you" power.
How is a person supposed to know if a plain 150W tower cooler is enough for their K series CPU, or they need a proper watercooled setup that can handle 200+W of CPU heat dissipation? Printing 125W on the box and actually pulling 265W should be a banned practice.
Absolutely, I wholeheartedly agree.
 
Joined
Sep 2, 2020
Messages
1,491 (0.94/day)
System Name Chip
Processor Amd 5600X
Motherboard MSI B450M Mortar Max
Cooling Hyper 212
Memory 2x 16g ddr4 3200mz
Video Card(s) RX 6700
Storage 5.5 tb hd 220 g ssd
Display(s) Normal moniter
Case something cheap
VR HMD Vive
Joined
Jan 1, 2021
Messages
1,067 (0.73/day)
System Name The Sparing-No-Expense Build
Processor Ryzen 5 5600X
Motherboard Asus ROG Strix X570-E Gaming Wifi II
Cooling Noctua NH-U12S chromax.black
Memory 32GB: 2x16GB Patriot Viper Steel 3600MHz C18
Video Card(s) NVIDIA RTX 3060Ti Founder's Edition
Storage 500GB 970 Evo Plus NVMe, 2TB Crucial MX500
Display(s) AOC C24G1 144Hz 24" 1080p Monitor
Case Lian Li O11 Dynamic EVO White
Power Supply Seasonic X-650 Gold PSU (SS-650KM3)
Software Windows 11 Home 64-bit
Also the idle -- the 4690K doesn't have all the newer power saving tech so it idles at like 56W stock
Who told you that? Mine idles at 9.6W power consumption @800MHz 0.76V.
My CPU pulls 56W at 3.5GHz Prime95.
At 4+GHz it starts to go to 75+W power consumption.
It maxed out at 83W, but I found I could push it to 100W at 4.3 without issue.
Past 4.3 the overclock is not stable.

KillaWatt perhaps
CoreTemp.

So yeah. I think it's pretty clear what happened since Skylake. Base clocks were steadily reduced while turbos were elevated, then Intel rewrote their definition of what turbo should mean, changed some details and added more premium modes of turbo (lmao) so the old ones would seem somehow worse... except now you have a beautiful cocktail of turbos that cannot be sustained even for two seconds, because you'll either burn a hole in your socket or your CPU just runs straight into thermal shutdown.
Is this shitshow why Swan had to step down? Faster CPUs no matter what?
 
D

Deleted member 202104

Guest
Neither of those is reliable enough to be dependable. They are an OK ballpark reference but should not be used as a de facto method of measurement. KillaWatt-type solutions are much more accurate.

To determine CPU power usage? Not even.

That only determines power draw for the entire system at the wall. There's no accurate way to break that down to individual component power draw - the efficiency of the power supply changes based on load. Comparing idle and full load from a kill-a-watt doesn't take into account power supply loss.
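The point about PSU efficiency can be put in numbers. A toy calculation with assumed efficiency figures (87% at light load, 92% at heavy load, plausible for an 80 PLUS Gold unit) shows why the wall-socket delta overstates the component delta:

```python
def dc_power(wall_watts: float, efficiency: float) -> float:
    """DC power actually delivered inside the case for a given wall draw."""
    return wall_watts * efficiency

# Assumed illustrative figures: 60W at the wall idling, 300W under load.
idle_dc = dc_power(60, 0.87)    # ~52.2W of DC at idle
load_dc = dc_power(300, 0.92)   # ~276W of DC under load

naive_cpu_estimate = 300 - 60        # wall delta: 240W
actual_dc_delta = load_dc - idle_dc  # ~223.8W actually drawn by components
```

The gap between the two deltas is power lost in the PSU, and it shifts with load, which is exactly why per-component numbers can't be recovered from a wall meter alone.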
 
Joined
Feb 1, 2013
Messages
1,274 (0.29/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
It should always be an engineering concept. It should be thermal "design" power, not thermal "what we decide to tell you" power.
How is a person supposed to know if a plain 150W tower cooler is enough for their K series CPU, or they need a proper watercooled setup that can handle 200+W of CPU heat dissipation? Printing 125W on the box and actually pulling 265W should be a banned practice.
Intel doesn't send out review samples for nothing. If you have some kind of load that is pushing 265W, it's even more critical to pay attention to the reviews.

There used to be a time when Intel made and sold their own motherboards. Now, if you paired a default Intel motherboard and CPU together, and your typical loads resulted in 2x TDP -- that's definitely a talking point.
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,674 (2.45/day)
Location
Washington, USA
System Name Veral
Processor 7800x3D
Motherboard x670e Asus Crosshair Hero
Cooling Corsair H150i RGB Elite
Memory 2x24 Klevv Cras V RGB
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx, 2x AOC 2425W, AOC I1601FWUX
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
It should always be an engineering concept. It should be thermal "design" power, not thermal "what we decide to tell you" power.
How is a person supposed to know if a plain 150W tower cooler is enough for their K series CPU, or they need a proper watercooled setup that can handle 200+W of CPU heat dissipation? Printing 125W on the box and actually pulling 265W should be a banned practice.


It's software?
I use CoreTemp for reading CPU power consumption.
Could be inaccurate, but may not be either.
Yes, software. I trust Intel's reading of power draw over a third party's, and same for Ryzen Master. HWiNFO is close behind in trust, but everything gets taken with a grain of salt.
 
Joined
Jul 14, 2006
Messages
2,583 (0.38/day)
Location
People's Republic of America
System Name It's just a computer
Processor i9-14900K Direct Die
Motherboard MSI Z790 ACE MAX
Cooling 4X D5T Vario, 2X HK Res, 3X Nemesis GTR560, NF-A14-iPPC3000PWM, NF-A14-iPPC2000PWM, IceMan DD
Memory TEAMGROUP FFXD548G8000HC38EDC01
Video Card(s) MSI 4070 Ti Super w/Alphacool Eisblock Aurora RTX 4070TI Ventus with Backplate :13724
Storage Samsung 990 PRO 1TB M.2
Display(s) LG 32GK650F
Case Custom open frame chassis
Audio Device(s) Auzentech X-Meridian 7.1 2G/Z-5500
Power Supply Seasonic Prime PX-1300
Mouse Logitech MX700
Keyboard Logitech LX700
Software Win11PRO
I read through this, and found myself asking whether any power user/enthusiast building a custom high-performance overclocking rig actually cares about TDP, which is merely a warranty that a processor operated within a given set of parameters will not exceed a specified power consumption level.
 
Joined
Feb 20, 2020
Messages
9,340 (5.24/day)
Location
Louisiana
System Name Ghetto Rigs z490|x99|Acer 17 Nitro 7840hs/ 5600c40-2x16/ 4060/ 1tb acer stock m.2/ 4tb sn850x
Processor 10900k w/Optimus Foundation | 5930k w/Black Noctua D15
Motherboard z490 Maximus XII Apex | x99 Sabertooth
Cooling oCool D5 res-combo/280 GTX/ Optimus Foundation/ gpu water block | Blk D15
Memory Trident-Z Royal 4000c16 2x16gb | Trident-Z 3200c14 4x8gb
Video Card(s) Titan Xp-water | evga 980ti gaming-w/ air
Storage 970evo+500gb & sn850x 4tb | 860 pro 256gb | Acer m.2 1tb/ sn850x 4tb| Many2.5" sata's ssd 3.5hdd's
Display(s) 1-AOC G2460PG 24"G-Sync 144Hz/ 2nd 1-ASUS VG248QE 24"/ 3rd LG 43" series
Case D450 | Cherry Entertainment center on Test bench
Audio Device(s) Built in Realtek x2 with 2-Insignia 2.0 sound bars & 1-LG sound bar
Power Supply EVGA 1000P2 with APC AX1500 | 850P2 with CyberPower-GX1325U
Mouse Redragon 901 Perdition x3
Keyboard G710+x3
Software Win-7 pro x3 and win-10 & 11pro x3
Benchmark Scores Are in the benchmark section
Hi,
lol yeah I'd guess those people would just be wondering how to cool the little devil
 

trickson

OH, I have such a headache
Joined
Dec 5, 2004
Messages
7,595 (1.03/day)
Location
Planet Earth.
System Name Ryzen TUF.
Processor AMD Ryzen7 3700X
Motherboard Asus TUF X570 Gaming Plus
Cooling Noctua
Memory Gskill RipJaws 3466MHz
Video Card(s) Asus TUF 1650 Super Clocked.
Storage CB 1T M.2 Drive.
Display(s) 73" Soney 4K.
Case Antech LanAir Pro.
Audio Device(s) Denon AVR-S750H
Power Supply Corsair TX750
Mouse Optical
Keyboard K120 Logitech
Software Windows 10 64 bit Home OEM
I read through this, and found myself asking if any power user/enthusiast building a custom high-performance overclocking rig actually cares about TDP, which is merely a warranty that a processor will operate within a given set of parameters at which it will not exceed a specified power consumption level.
I have NEVER once, not even in 24 years of building, considered TDP as a build/buy point.
I have read through this thread and have come to the same pondering.
I am however an AMD fanboy, and can surely say the FX chips SUCK power and ASS! You need a fing 850W PSU just to power an FX8300, and HOLY crap, the POWER-crazy CHIP can heat a double-wide in Alaska!
SO yeah, never really gave a crap about TDP...

Hi,
lol yeah I'd guess those people would just be wondering how to cool the little devil
RIGHT!?!?! LMFAO!
I mean I am thinking going back to liquid cooling for the 3700X! LOL.

Oh and no I do not think they are lying at all no one is. The CPU can and does do that TDP if you set it up exactly the way they did. so there is that too.. :slap:
 
Joined
Jul 5, 2013
Messages
28,450 (6.77/day)
That only determines power draw for the entire system at the wall.
And you can determine CPU usage rather easily by isolating measurements taken. You'll note I said "KillaWatt type". There are other forms of power measurement devices.
 
Joined
Sep 3, 2019
Messages
3,641 (1.86/day)
Location
Thessaloniki, Greece
System Name PC on since Aug 2019, 1st CPU R5 3600 + ASUS ROG RX580 8GB >> MSI Gaming X RX5700XT (Jan 2020)
Processor Ryzen 9 5900X (July 2022), 200W PPT limit, 80C temp limit, CO -6-14, +50MHz (up to 5.0GHz)
Motherboard Gigabyte X570 Aorus Pro (Rev1.0), BIOS F39b, AGESA V2 1.2.0.C
Cooling Arctic Liquid Freezer II 420mm Rev7 (Jan 2024) with off-center mount for Ryzen, TIM: Kryonaut
Memory 2x16GB G.Skill Trident Z Neo GTZN (July 2022) 3667MT/s 1.42V CL16-16-16-16-32-48 1T, tRFC:280, B-die
Video Card(s) Sapphire Nitro+ RX 7900XTX (Dec 2023) 314~467W (382W current) PowerLimit, 1060mV, Adrenalin v24.12.1
Storage Samsung NVMe: 980Pro 1TB(OS 2022), 970Pro 512GB(2019) / SATA-III: 850Pro 1TB(2015) 860Evo 1TB(2020)
Display(s) Dell Alienware AW3423DW 34" QD-OLED curved (1800R), 3440x1440 144Hz (max 175Hz) HDR400/1000, VRR on
Case None... naked on desk
Audio Device(s) Astro A50 headset
Power Supply Corsair HX750i, ATX v2.4, 80+ Platinum, 93% (250~700W), modular, single/dual rail (switch)
Mouse Logitech MX Master (Gen1)
Keyboard Logitech G15 (Gen2) w/ LCDSirReal applet
Software Windows 11 Home 64bit (v24H2, OSBuild 26100.2605), upgraded from Win10 to Win11 on Jan 2024
I didn't go through almost 100 posts, so I don't know if and how many have said it already...

Both Intel and AMD are not pulling numbers out of their arse.
The TDP rating means "Thermal Design Power". It won't tell you the max power draw, but the heat output towards the cooler under certain operating conditions -- or so it's meant to. If you want max power draw, look for CPU Package Power or CPU PPT (Package Power Tracking).

For AMD, TDP refers to the heat towards the cooler under certain conditions (ambient temp): a specific Tdelta between CPU and ambient. Not all heat produced by the CPU goes to the cooler; some of it goes through the CPU substrate to the socket and the board and gets dissipated from there.

For example all Ryzen 3000 series are like this:

65W TDP, 88W PPT
95W TDP, 125W PPT
105W TDP, 142W PPT
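As a quick sanity check on those pairs, PPT works out to roughly 1.35x the rated TDP for each of them (a commonly cited rule of thumb for these parts):

```python
# TDP (W) -> PPT (W) pairs for Ryzen 3000, as listed above.
ryzen_3000 = {65: 88, 95: 125, 105: 142}

for tdp, ppt in ryzen_3000.items():
    # Each ratio lands at ~1.35 (the 95W tier rounds a touch lower).
    print(f"{tdp}W TDP -> {ppt}W PPT (ratio {ppt / tdp:.2f})")
```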

Intel, on the other hand, is different. Intel CPUs have two power-level stages, PL1 and PL2.
As TDP they refer to PL1, which is the max sustainable power draw of the CPU. PL2 is much higher than that, but by default only applies for a certain period of time called "Tau"; the PL2/Tau pair is different for every CPU.
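A toy model of that behaviour, assuming illustrative figures for a 125W-class part (PL1=125W, PL2=250W, Tau=56s). Real silicon tracks an exponentially weighted moving average of power rather than a hard time cutoff, so this is a deliberate simplification:

```python
def power_at(t_s: float, pl1: float, pl2: float, tau_s: float, demand: float) -> float:
    """Allowed package power at time t under a simplified PL1/PL2/Tau model:
    the chip may draw up to PL2 while the turbo window lasts, then falls
    back to the sustained PL1 limit."""
    limit = pl2 if t_s < tau_s else pl1
    return min(demand, limit)

# Assumed figures: PL1=125W, PL2=250W, Tau=56s, with a workload that
# would happily draw 300W if nothing limited it.
trace = [power_at(t, 125, 250, 56, demand=300) for t in (0, 30, 55, 56, 120)]
# During Tau the chip is capped at PL2 (250W); afterwards it drops to PL1 (125W).
```

This is why an all-core load on a "125W" chip can show 250W for the first minute of a benchmark and then settle, which matches the review behaviour people describe above.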

[Attached image: Intel PL1/PL2/Tau diagram]
 
D

Deleted member 202104

Guest
And you can determine CPU usage rather easily by isolating measurements taken. You'll note I said "KillaWatt type". There are other forms of power measurement devices.

Please share the "KillaWatt" type device.
 
Joined
Sep 2, 2020
Messages
1,491 (0.94/day)
System Name Chip
Processor Amd 5600X
Motherboard MSI B450M Mortar Max
Cooling Hyper 212
Memory 2x 16g ddr4 3200mz
Video Card(s) RX 6700
Storage 5.5 tb hd 220 g ssd
Display(s) Normal moniter
Case something cheap
VR HMD Vive
Neither of those are reliable enough to be dependable. They are an ok ballpark reference but should not be used as a defacto method of measurement. KillaWatt type solutions are much more accurate.
Yeah, but it's not gonna make a 60W draw show up as 30W.
 