
Intel lying about their CPUs' TDP: who's not surprised?

trickson

OH, I have such a headache
Joined
Dec 5, 2004
Messages
7,595 (1.04/day)
Location
Planet Earth.
System Name Ryzen TUF.
Processor AMD Ryzen 7 3700X
Motherboard Asus TUF X570 Gaming Plus
Cooling Noctua
Memory Gskill RipJaws 3466MHz
Video Card(s) Asus TUF 1650 Super Clocked.
Storage CB 1T M.2 Drive.
Display(s) 73" Sony 4K.
Case Antec LanAir Pro.
Audio Device(s) Denon AVR-S750H
Power Supply Corsair TX750
Mouse Optical
Keyboard K120 Logitech
Software Windows 10 64 bit Home OEM
I think anyone still following this thread, if they haven't already done so, should look at this post by Zach_01 for Intel's own summary of TDP for their chips, as well as watch (as in the whole thing; I know it's long-ish) the GN video on AMD TDP linked a couple posts later.

Things we've learned over the course of this thread (YMMV):
  • TDP does not mean, and has never meant, maximum power consumption
    • It can, however, resemble average power consumption at base frequency, particularly with Intel
  • Intel and AMD calculate TDP differently, in ways that don't necessarily produce a useful value for an end user
  • Modern turbo and boost strategies can push power consumption well past TDP for short periods
    • Certain computational loads (e.g. sustained AVX workloads) can also drive it higher over longer periods
  • Overclocking invalidates TDP entirely as a useful value
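For the curious, the TDP formula AMD describes (as walked through in the GN video linked earlier in the thread) is built from cooler thermal resistance, not measured power draw — which is exactly why it doesn't track real consumption. A minimal sketch, with illustrative numbers similar to those cited for 105 W Ryzen parts (the exact per-SKU values are AMD's, not mine):

```python
def amd_tdp(t_case_max_c, t_ambient_c, theta_ca):
    """TDP (W) = (max case temp - ambient temp) / cooler thermal resistance (degC/W).

    This is the cooler-centric definition reported in the GN video; it says
    what a cooler must dissipate, not what the silicon actually draws.
    """
    return (t_case_max_c - t_ambient_c) / theta_ca

# Illustrative inputs: 61.8 degC max case temp, 42 degC ambient,
# 0.189 degC/W cooler resistance -> roughly 105 W
print(round(amd_tdp(61.8, 42.0, 0.189), 1))  # ~104.8
```

Plug in a different assumed cooler resistance and the "TDP" changes while the chip's actual power draw doesn't, which is the whole problem.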
My takeaway is this: neither manufacturer is lying about TDP, AFAICT. It's more that Thermal Design Power doesn't actually mean what it sounds like it should, and we DIY-ers latch onto it because it's all there is (outside of reviews and such, of course). A more meaningful spec would be nice, but I'm not sure there's a compelling reason for either company to provide one. If they do, it certainly won't be on behalf of a "handful" of enthusiasts on forums. I mean, if even the cooler manufacturers are unhappy with it, and the chipmakers won't provide something better for them, it's probably a lost cause.
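To illustrate the point about boost pushing power past TDP for short periods: Intel's public spec terms are PL1, PL2, and tau, where the chip may draw up to PL2 until a running average of power catches up to PL1 (nominally the TDP), then falls back. This toy simulation assumes a simple exponentially weighted average; the real accounting is firmware- and board-dependent, and many boards ship with the limits raised or removed entirely:

```python
def simulate_boost(requested_watts, pl1, pl2, tau, dt=1.0):
    """Clamp requested power to PL2 while the running average stays under PL1.

    Toy model: exponentially weighted moving average with time constant tau
    (seconds). Returns the power actually drawn at each step.
    """
    avg = 0.0
    drawn = []
    for req in requested_watts:
        cap = pl2 if avg < pl1 else pl1   # budget left -> boost cap, else base cap
        p = min(req, cap)
        avg += (dt / tau) * (p - avg)     # EWMA update
        drawn.append(p)
    return drawn

# A hypothetical 125 W-TDP chip asked for 200 W for two minutes,
# with PL2 = 250 W and tau = 56 s:
trace = simulate_boost([200.0] * 120, pl1=125.0, pl2=250.0, tau=56.0)
print(trace[0], trace[-1])  # draws 200.0 W at first, ends clamped to 125.0 W
```

The takeaway matches the thread: for roughly the first tau seconds the chip sits well above its 125 W "TDP," which is why short benchmarks and sustained loads tell such different power stories.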
Agreed.
What a perfect way to end a thread.
 
Joined
Mar 24, 2011
Messages
2,356 (0.47/day)
Location
VT
Processor Intel i7-10700k
Motherboard Gigabyte Z490 Aorus Ultra
Cooling Corsair H100i RGB
Memory 32GB (4x8GB) Corsair Vengeance DDR4-3200MHz
Video Card(s) MSI Gaming Trio X 3070 LHR
Display(s) ASUS MG278Q / AOC G2590FX
Case Corsair iCUE 4000X
Audio Device(s) Onboard
Power Supply Corsair RM650x 650W Fully Modular
Software Windows 10
What about the 10900K? Why do Z590 boards have VRMs that rival Threadripper boards? Yesterday on The Full Nerd, someone asked whether a 1200 W PSU would be enough to run two 3090s and a 10900K, and the answer was noncommittal. I have run two Vega 64s with a 2920X on my 1200 W PSU with no concern.
Yes, I'm sure it's the CPU and not the fact that the GPUs in those two scenarios are wildly different. According to TPU's own numbers, a 3090 will draw 350-450 W, whereas a Vega 64 will draw 290-310 W. This is why we don't compare completely different setups.
 