
Intel Core i9-12900K

Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
What Intel has achieved is slowly preparing consumers to accept higher power consumption for CPUs across the board, just like NVIDIA did with the RTX 3000 series. I mean, the power consumption of an overclocked 12900K is literally double that of a 5950X in stress tests.
If that's what Intel is going to do, then I will boycott this bullshit. A CPU shouldn't suck more than 100 watts. For a graphics card, my limit is 150 watts, give or take 10.
 
Joined
Apr 29, 2018
Messages
129 (0.05/day)
Dude, chill. We all have our reasons to hold off on upgrades, and the longer you keep your stuff instead of splurging on an upgrade the better it is for your wallet, your psyche, and the environment, so it's a win-win-win. I agree that singling out these CPUs is a bit odd - they're not that much faster than Zen3, for example - but it's not like they've said anything about their reasoning, so we can't really know. A friendlier way of putting this would be asking a question, like "What makes you consider this, but not something earlier like Zen3?"
You should really pay closer attention to context...
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
From W1zzard's explanations it's essentially a normalization of MCE for K-SKU chips, with ignoring the on-paper 125W spec being not only the norm but the expected power programming for motherboards. At least now there should be some modicum of standardization, if nothing else.
At least this time they have the balls to admit that these chips can use nearly 300 watts of power. Not sure about you, but I treat TDP, or base power, as the maximum expected power usage at base clocks without any turbo. But I tested my own i5, and in Prime95 small FFTs it uses less than 65 watts (I think it was up to 50 watts) with turbo off, so I guess any power number that Intel or AMD releases doesn't mean anything.
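For what it's worth, if anyone wants to reproduce that kind of measurement on Linux, the standard powercap/RAPL sysfs counters work for this. A minimal sketch of the idea (the package-0 path below is the usual one, but verify it on your machine; reading it often needs root, and this is not the exact tool I used):

```python
# Rough package-power sampling via the Linux powercap/RAPL sysfs interface.
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package energy, microjoules

def read_energy_uj() -> int:
    with open(RAPL) as f:
        return int(f.read())

start = read_energy_uj()
time.sleep(5)  # run your load (e.g. Prime95 small FFTs) in the meantime
end = read_energy_uj()

# The counter wraps at max_energy_range_uj; ignored here for brevity.
print(f"average package power: {(end - start) / 5 / 1e6:.1f} W")
```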

Yikes at the power consumption and heat output. That's worse than I thought it would be.

Intel made steps forward, matching and slightly beating Ryzen 5000's performance a year later, but at some cost.
In the FX-9590 era we called that desperate; in 2021 we call it excellent: "Editor's Choice" and "Highly Recommended".
 
Joined
May 31, 2016
Messages
4,443 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
If that's what Intel is going to do, then I will boycott this bullshit. A CPU shouldn't suck more than 100 watts. For a graphics card, my limit is 150 watts, give or take 10.
Yeah, I get it. It sucks that this is the case, but Intel has been in a stranglehold by AMD for some time, and man, they are eager to finally claim some benchmarks back from AMD and brag about the performance gains despite the power consumption. Intel had been losing on all fronts, so if Intel could just get the performance crown and climb back up the charts by sacrificing power consumption, they would do it, and that is exactly what they did. Since power was high anyway with the previous gen, it was an easy pick for Intel. What bugs me is the big.LITTLE design and the claims of lower power consumption. I think it will be improved, but I expected more in that department, to be fair.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
At least this time they have the balls to admit that these chips can use nearly 300 watts of power. Not sure about you, but I treat TDP, or base power, as the maximum expected power usage at base clocks without any turbo. But I tested my own i5, and in Prime95 small FFTs it uses less than 65 watts (I think it was up to 50 watts) with turbo off, so I guess any power number that Intel or AMD releases doesn't mean anything.
That's inaccurate. TDP is (was) roughly equivalent to the power draw level at which maintaining base clocks was guaranteed. It could always draw less power at base clocks, just not more - that would be grounds for a replacement under warranty - and it was very often lower, which is why most CPUs historically have boosted noticeably above base clock even when restricted to TDP power draw levels. This is especially true for lower core count CPUs, those with lower peak boost clocks, or both.

You should really pay closer attention to context...
What context? You responded to a post making a statement. I commented on your post, specifically your tone. Am I missing something?
 
Joined
Jul 14, 2018
Messages
473 (0.20/day)
Location
Jakarta, Indonesia
System Name PC-GX1
Processor i9 10900 non K (stock) TDP 65w
Motherboard asrock b560 steel legend | Realtek ALC897
Cooling cooler master hyper 2x12 LED turbo argb | 5x12cm fan rgb intake | 3x12cm fan rgb exhaust
Memory corsair vengeance LPX 2x32gb ddr4 3600mhz
Video Card(s) MSI RTX 3080 10GB Gaming Z Trio LHR TDP 370w| 566.36 WHQL | MSI AB v4.65 | RTSS v7.36
Storage NVME 2+2TB gen3| SSD 4TB sata3 | 1+2TB 7200rpm sata3| 4+4+5TB USB3 (optional)
Display(s) AOC U34P2C (IPS panel, 3440x1440 75hz) + speaker 5W*2 | APC BX1100CI MS (660w)
Case lianli lancool 2 mesh RGB windows - white edition | 1x dvd-RW usb 3.0 (optional)
Audio Device(s) Nakamichi soundstation8w 2.1 100W RMS | Simbadda CST 9000N+ 2.1 352W RMS
Power Supply seasonic focus gx-850w 80+ gold - white edition 2021 | APC BX2200MI MS (1200w)
Mouse steelseries sensei ten | logitech g440
Keyboard steelseries apex 5 | steelseries QCK prism cloth XL | steelseries arctis 5
VR HMD -
Software dvd win 10 home 64bit oem + full update 22H2
Benchmark Scores -

Intel has just launched its new Alder Lake desktop CPUs. However, and as we’ve already reported, these CPUs could have compatibility issues with a number of DRMs. And, as Intel confirmed, there are currently 32 Denuvo games that do not work with it.

This information comes from PCGamer. As Intel told them, it has yet to resolve an issue with Denuvo on Alder Lake for 32 games, which was causing issues playing these games on the platform.

Assassin’s Creed Valhalla is one of the games that appears to have stability issues. PCGamer could not run this game on their system, and Intel told them that they are working with Ubisoft on a fix. Despite that, it appears that the game can run on other Intel Alder Lake systems. For instance, PCGameshardware has benchmarks for both Assassin’s Creed Valhalla and Watch Dogs Legion.

This could be why numerous publishers and developers have been removing Denuvo from their games lately.

Square Enix and Crytek have removed Denuvo from NieR Replicant Remaster & Crysis Remastered. Additionally, 2K Games has removed Denuvo from Mafia: Definitive Edition. Not only that, but Bandai Namco has removed this anti-tamper tech from Tekken 7 and Ace Combat 7: Skies Unknown.

=== wow, it looks like it will be good news for pc gamers.......
 
Joined
Apr 29, 2018
Messages
129 (0.05/day)
What context? You responded to a post making a statement. I commented on your post, specifically your tone. Am I missing something?
My comment to him had nothing to do with looking at Zen or not. My first point was that there have been plenty of CPUs worth upgrading to. My next point was that it seemed silly for him to say that 144 fps was his goal when, yet again, he skipped the numerous worthy upgrades he could have gone with over the years. And he is still talking about waiting until the end of 2022, when there will be other CPUs out by then. Really, his whole comment was just nonsensical. And upgrading costs a lot less when you sell off your old stuff, but his stuff will soon be nearly irrelevant as he keeps waiting and waiting.
 
Joined
Apr 16, 2019
Messages
632 (0.30/day)
If I were you though, I'd hold off until we see how the Zen3 refresh with 3D cache plays out. If their promised 15% average (and up to 25% depending on the application) uplift plays out, that would make those chips notably faster than these. But of course we can't know until we see reviews.
Fair enough advice, I suppose (if you don't need or want a new PC right now, especially in light of ridiculous graphics card prices; though I suspect in this particular case he'll hold on to his 2080 for the time being anyway). However, the problem is that we don't know which SKUs will get the 3D treatment: some indications say only octa-cores and up, some say only the 12- and 16-core parts, and none mention the 6-core, which is just mind-boggling. Not only have they already pretty much abandoned the sub-$300 class, but with the 5600X remaining all they offer there, they'll lose it completely. Even if they drop its price to $200 (unlikely), there still won't be any competition for the 12600K, especially given that motherboard availability and pricing will only improve with time.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
My comment to him had nothing to do with looking at Zen or not. My first point was that there have been plenty of CPUs worth upgrading to. My next point was that it seemed silly for him to say that 144 fps was his goal when, yet again, he skipped the numerous worthy upgrades he could have gone with over the years. And he is still talking about waiting until the end of 2022, when there will be other CPUs out by then. Really, his whole comment was just nonsensical. And upgrading costs a lot less when you sell off your old stuff, but his stuff will soon be nearly irrelevant as he keeps waiting and waiting.
Which is exactly why the reasonable approach would be to ask for the reasoning behind the statement rather than calling it out in the tone you used. How is that constructive or useful? All you're achieving is making them defensive or angry. Also, some of your reasoning here is problematic: just because they can't hit 144fps with their current setup doesn't invalidate that as a desire for a future upgrade. Quite the opposite, I would say. As for there being plenty of CPUs worth upgrading to: sure, but as I said, we all have our reasons not to. So maybe ask, so that a constructive discussion is possible? And yes, I did recommend looking at future reviews myself, didn't I? That argument for upgrades costing less if you sell your parts is also highly variable depending on the used market where you live and a bunch of other factors. It's absolutely possible, but it's a poor general recommendation.

Fair enough advice, I suppose (if you don't need or want a new PC right now, especially in light of ridiculous graphics card prices; though I suspect in this particular case he'll hold on to his 2080 for the time being anyway). However, the problem is that we don't know which SKUs will get the 3D treatment: some indications say only octa-cores and up, some say only the 12- and 16-core parts, and none mention the 6-core, which is just mind-boggling. Not only have they already pretty much abandoned the sub-$300 class, but with the 5600X remaining all they offer there, they'll lose it completely. Even if they drop its price to $200 (unlikely), there still won't be any competition for the 12600K, especially given that motherboard availability and pricing will only improve with time.
That would be a valid response if the decision was between "buy now" or "wait for Zen 3 V-cache reviews". That isn't the scenario here, the scenario is "hopefully I can upgrade before the end of 2022". At that point, all of your questions above will be answered, and then some.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.87/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Yeah because no other cpu has been worth upgrading to over that old cpu. lol get real. And a solid 144 is important to you yet you held on this long to a cpu that cant even maintain 60 fps in some games.
Mind your attitude. :slap::slap:

@Valantar Thanks for the technical explanation, it does sound about right. I knew it was for reasons along these lines and said so in my post.

And yes, it's funny how some people get all personal over a friggin' CPU. :laugh:

And yeah, it's been wondrous for my wallet. Contrary to what that immature child above says, my CPU does well over 60fps in all the games I play, even the latest, but it can't reach the magic 144fps, or even 120fps in many cases, although the experience is still surprisingly smooth. This thing probably has something like an 80-100% performance increase over my aged CPU, so it will have no trouble at all hitting those highs. Can't wait! :cool:
 
Joined
May 31, 2016
Messages
4,443 (1.42/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800Mhz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1v@2400Mhz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Fair enough advice, I suppose (if you don't need or want a new PC right now, especially in light of ridiculous graphics card prices; though I suspect in this particular case he'll hold on to his 2080 for the time being anyway). However, the problem is that we don't know which SKUs will get the 3D treatment: some indications say only octa-cores and up, some say only the 12- and 16-core parts, and none mention the 6-core, which is just mind-boggling. Not only have they already pretty much abandoned the sub-$300 class, but with the 5600X remaining all they offer there, they'll lose it completely. Even if they drop its price to $200 (unlikely), there still won't be any competition for the 12600K, especially given that motherboard availability and pricing will only improve with time.
All of them. It will be a new line of CPUs with improved performance. I'm sure AMD will not keep both versions of the 5000 series, with and without 3D V-Cache. The 3D V-Cache parts are supposed to be a refresh of the lineup, just like Zen+ was.
 
Joined
Mar 18, 2008
Messages
5,444 (0.89/day)
Location
Australia
System Name Night Rider | Mini LAN PC | Workhorse
Processor AMD R7 5800X3D | Ryzen 1600X | i7 970
Motherboard MSi AM4 Pro Carbon | GA- | Gigabyte EX58-UD5
Cooling Noctua U9S Twin Fan | Stock Cooler (Copper Core) | Big shairkan B
Memory 2x8GB DDR4 G.Skill Ripjaws 3600MHz| 2x8GB Corsair 3000 | 6x2GB DDR3 1300 Corsair
Video Card(s) MSI AMD 6750XT | 6500XT | MSI RX 580 8GB
Storage 1TB WD Black NVME / 250GB SSD /2TB WD Black | 500GB SSD WD, 2x1TB, 1x750 | WD 500 SSD/Seagate 320
Display(s) LG 27" 1440P| Samsung 20" S20C300L/DELL 15" | 22" DELL/19"DELL
Case LIAN LI PC-18 | Mini ATX Case (custom) | Atrix C4 9001
Audio Device(s) Onboard | Onbaord | Onboard
Power Supply Silverstone 850 | Silverstone Mini 450W | Corsair CX-750
Mouse Coolermaster Pro | Rapoo V900 | Gigabyte 6850X
Keyboard MAX Keyboard Nighthawk X8 | Creative Fatal1ty eluminx | Some POS Logitech
Software Windows 10 Pro 64 | Windows 10 Pro 64 | Windows 7 Pro 64/Windows 10 Home
Did anyone notice how much actual IPC % gain there is over Zen 3 when clocked at the same speed? I'm not sure if I believe it, but it showed only 1% over Zen 3...
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.87/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Fair enough advice, I suppose (if you don't need or want a new PC right now, especially in light of ridiculous graphics card prices; though I suspect in this particular case he'll hold on to his 2080 for the time being anyway). However, the problem is that we don't know which SKUs will get the 3D treatment: some indications say only octa-cores and up, some say only the 12- and 16-core parts, and none mention the 6-core, which is just mind-boggling. Not only have they already pretty much abandoned the sub-$300 class, but with the 5600X remaining all they offer there, they'll lose it completely. Even if they drop its price to $200 (unlikely), there still won't be any competition for the 12600K, especially given that motherboard availability and pricing will only improve with time.
If you're referring to me and my trusty 2080, then absolutely I'm holding onto it. :)

When I bought it in March 2020, it was supposed to be a "temporary" purchase to tide me over until I got the upcoming 3080.

I'd previously been stuck with my ancient 780 Ti after the failure of two GTX 1080 cards (got refunds), which made it painful to play current, demanding games. Then the market turned to sh* with zero availability and sky-high prices, so, indeed, this "temporary" card has turned out to be rather permanent, lol, and there's no end in sight, either.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
That's inaccurate. TDP is (was) roughly equivalent to the power draw level at which maintaining base clocks was guaranteed. It could always draw less power at base clocks, just not more - that would be grounds for a replacement under warranty - and it was very often lower, which is why most CPUs historically have boosted noticeably above base clock even when restricted to TDP power draw levels. This is especially true for lower core count CPUs, those with lower peak boost clocks, or both.
But if it's significantly lower, then that's also misleading. The crazy thing is that in games my i5 10400F can boost to the max (4GHz all-core, 4.3GHz single) and still remain under 56 watts. Other chips like Celerons and Pentiums consumed nearly two times less power than their TDP stated, meanwhile locked i9s and i7s were not able to reach their upper boost states and just barely maintained base clock. Regarding AMD, they just straight up never worked at the advertised TDP and did nothing to fix that or be more transparent about it. They don't even properly disclose what is out of spec and what is in spec, and they strongly encourage people to keep overclocking chips without even realizing it. If I'm not too cynical, then they do this to inflate their benchmark results. If I'm cynical, then I think that they do this so that most users will degrade their chips faster and then AMD can be dicks about RMAs. Either way, neither chipmaker discloses power usage properly, or cooling requirements for that matter.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Did anyone notice how much actual IPC % gain there is over Zen 3 when clocked at the same speed? I'm not sure if I believe it, but it showed only 1% over Zen 3...
It's not much, that's for sure. In Anandtech's SPEC2017 testing, the differences are as follows (they tested ADL with both DDR4 and DDR5):

Normalizing for clock speed (they are all likely to maintain peak boost in these ST workloads, so 5.2GHz vs. 4.9GHz):
12900K D5: 1.56538 pts/GHz INT, 2.72308 pts/GHz FP
12900K D4: 1.54038 pts/GHz INT, 2.62885 pts/GHz FP
5950X: 1.56122 pts/GHz INT, 2.48776 pts/GHz FP

Which, using the 5950X as a baseline:
ADL D5 IPC: +0.3% INT, +9.5% FP
ADL D4 IPC: -1.3% INT, +5.7% FP

Of course this is just one set of workloads, but at least SPEC is an industry standard. The numbers will obviously be different in different workloads. But it's very close overall.
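If anyone wants to check the arithmetic, here's a quick sketch using those pts/GHz figures exactly as quoted above (not re-measured):

```python
# Clock-normalized SPEC2017 comparison, 5950X as baseline.
scores = {                       # (INT pts/GHz, FP pts/GHz)
    "12900K DDR5": (1.56538, 2.72308),
    "12900K DDR4": (1.54038, 2.62885),
    "5950X":       (1.56122, 2.48776),
}

base_int, base_fp = scores["5950X"]
for chip, (i, f) in scores.items():
    if chip == "5950X":
        continue
    print(f"{chip}: INT {100 * (i / base_int - 1):+.1f}%, "
          f"FP {100 * (f / base_fp - 1):+.1f}%")
# 12900K DDR5: INT +0.3%, FP +9.5%
# 12900K DDR4: INT -1.3%, FP +5.7%
```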
But if it's significantly lower, then that's also misleading. The crazy thing is that in games my i5 10400F can boost to the max (4GHz all-core, 4.3GHz single) and still remain under 56 watts. Other chips like Celerons and Pentiums consumed nearly two times less power than their TDP stated, meanwhile locked i9s and i7s were not able to reach their upper boost states and just barely maintained base clock. Regarding AMD, they just straight up never worked at the advertised TDP and did nothing to fix that or be more transparent about it. They don't even properly disclose what is out of spec and what is in spec, and they strongly encourage people to keep overclocking chips without even realizing it. If I'm not too cynical, then they do this to inflate their benchmark results. If I'm cynical, then I think that they do this so that most users will degrade their chips faster and then AMD can be dicks about RMAs. Either way, neither chipmaker discloses power usage properly, or cooling requirements for that matter.
Yeah, this is where the difference between what TDP actually means vs. what it is understood to mean comes in. After all, TDP is actually a spec for OEMs and the like to say "this is the class of cooler you need for this chip to maintain stock performance", and it is divided into classes for simplicity rather than calculating an accurate TDP for each chip (as that would be a complete mess in terms of coolers). That's why you get those "54W" Celerons running at 30W, and so on. Something similar is true for AMD, though the problem here is that (just like Intel) they've also used TDP numbers as public-facing marketing classes without really explaining what the numbers mean, or even working to make people understand that TDP does not mean peak power consumption. That the relation between TDP and power draw (to the extent that there is one at all) differs between the two manufacturers just makes this into even more of a mess.

I was initially glad that Intel had ditched TDP for their two-tier base/boost power system, though it seems that is effectively worthless for K SKUs, with boost power being the only number to look at. It'll be interesting to see how this plays out across non-K SKUs though, and I would love to see AMD be more transparent as well.

Mind your attitude. :slap::slap:

@Valantar Thanks for the technical explanation, it does sound about right. I knew it was for reasons along these lines and said so in my post.

And yes, it's funny how some people get all personal over a friggin' CPU. :laugh:

And yeah, it's been wondrous for my wallet. Contrary to what that immature child above says, my CPU does well over 60fps in all the games I play, even the latest, but it can't reach the magic 144fps, or even 120fps in many cases, although the experience is still surprisingly smooth. This thing probably has something like an 80-100% performance increase over my aged CPU, so it will have no trouble at all hitting those highs. Can't wait! :cool:
Heh, I kept my Q9450 for nearly a decade (2008-2017!), and it served me well the entire time. If you buy a good CPU to begin with, it can last for ages. I only upgraded my 1600X to a 5800X because I could get it funded through work, otherwise that chip (which now lives a new and better, calmer life running my NAS) would have stayed in my main PC for at least a few generations more.

Still, as I said, I would keep my options open and take a close look at the upcoming Zen3 V-cache chips. I generally dislike Intel (mostly due to their long history of shitty business practices), but obviously make up your own mind from the factors that matter the most to you - both major CPU manufacturers deliver excellent performance and great platforms these days.
 
Joined
Mar 18, 2008
Messages
5,444 (0.89/day)
Location
Australia
System Name Night Rider | Mini LAN PC | Workhorse
Processor AMD R7 5800X3D | Ryzen 1600X | i7 970
Motherboard MSi AM4 Pro Carbon | GA- | Gigabyte EX58-UD5
Cooling Noctua U9S Twin Fan | Stock Cooler (Copper Core) | Big shairkan B
Memory 2x8GB DDR4 G.Skill Ripjaws 3600MHz| 2x8GB Corsair 3000 | 6x2GB DDR3 1300 Corsair
Video Card(s) MSI AMD 6750XT | 6500XT | MSI RX 580 8GB
Storage 1TB WD Black NVME / 250GB SSD /2TB WD Black | 500GB SSD WD, 2x1TB, 1x750 | WD 500 SSD/Seagate 320
Display(s) LG 27" 1440P| Samsung 20" S20C300L/DELL 15" | 22" DELL/19"DELL
Case LIAN LI PC-18 | Mini ATX Case (custom) | Atrix C4 9001
Audio Device(s) Onboard | Onbaord | Onboard
Power Supply Silverstone 850 | Silverstone Mini 450W | Corsair CX-750
Mouse Coolermaster Pro | Rapoo V900 | Gigabyte 6850X
Keyboard MAX Keyboard Nighthawk X8 | Creative Fatal1ty eluminx | Some POS Logitech
Software Windows 10 Pro 64 | Windows 10 Pro 64 | Windows 7 Pro 64/Windows 10 Home
It's not much, that's for sure. In Anandtech's SPEC2017 testing, the differences are as follows (they tested ADL with both DDR4 and DDR5):

Normalizing for clock speed (they are all likely to maintain peak boost in these ST workloads, so 5.2GHz vs. 4.9GHz):
12900K D5: 1.56538 pts/GHz INT, 2.72308 pts/GHz FP
12900K D4: 1.54038 pts/GHz INT, 2.62885 pts/GHz FP
5950X: 1.56122 pts/GHz INT, 2.48776 pts/GHz FP

Which, using the 5950X as a baseline:
ADL D5 IPC: +0.3% INT, +9.5% FP
ADL D4 IPC: -1.3% INT, +5.7% FP

Of course this is just one set of workloads, but at least SPEC is an industry standard. The numbers will obviously be different in different workloads. But it's very close overall.

Yeah, this is where the difference between what TDP actually means vs. what it is understood to mean comes in. After all, TDP is actually a spec for OEMs and the like to say "this is the class of cooler you need for this chip to maintain stock performance", and it is divided into classes for simplicity rather than calculating an accurate TDP for each chip (as that would be a complete mess in terms of coolers). That's why you get those "54W" Celerons running at 30W, and so on. Something similar is true for AMD, though the problem here is that (just like Intel) they've also used TDP numbers as public-facing marketing classes without really explaining what the numbers mean, or even working to make people understand that TDP does not mean peak power consumption. That the relation between TDP and power draw (to the extent that there is one at all) differs between the two manufacturers just makes this into even more of a mess.

I was initially glad that Intel had ditched TDP for their two-tier base/boost power system, though it seems that is effectively worthless for K SKUs, with boost power being the only number to look at. It'll be interesting to see how this plays out across non-K SKUs though, and I would love to see AMD be more transparent as well.


Heh, I kept my Q9450 for nearly a decade (2008-2017!), and it served me well the entire time. If you buy a good CPU to begin with, it can last for ages. I only upgraded my 1600X to a 5800X because I could get it funded through work, otherwise that chip (which now lives a new and better, calmer life running my NAS) would have stayed in my main PC for at least a few generations more.

Still, as I said, I would keep my options open and take a close look at the upcoming Zen3 V-cache chips. I generally dislike Intel (mostly due to their long history of shitty business practices), but obviously make up your own mind from the factors that matter the most to you - both major CPU manufacturers deliver excellent performance and great platforms these days.

Seems to be in line, I guess, with the results they got at Guru3D.
[Attachment: IPC.jpg - Guru3D IPC comparison chart]
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Seems to be in line, I guess, with the results they got at Guru3D.
[Attachment: IPC.jpg]
Yep, seems similar. Using just a single application for IPC calculations is very sketchy though. You need a broader selection (ideally stressing different parts of the core and cache/memory subsystems) to get any kind of representative number.
 

qubit

Overclocked quantum bit
Joined
Dec 6, 2007
Messages
17,865 (2.87/day)
Location
Quantum Well UK
System Name Quantumville™
Processor Intel Core i7-2700K @ 4GHz
Motherboard Asus P8Z68-V PRO/GEN3
Cooling Noctua NH-D14
Memory 16GB (2 x 8GB Corsair Vengeance Black DDR3 PC3-12800 C9 1600MHz)
Video Card(s) MSI RTX 2080 SUPER Gaming X Trio
Storage Samsung 850 Pro 256GB | WD Black 4TB | WD Blue 6TB
Display(s) ASUS ROG Strix XG27UQR (4K, 144Hz, G-SYNC compatible) | Asus MG28UQ (4K, 60Hz, FreeSync compatible)
Case Cooler Master HAF 922
Audio Device(s) Creative Sound Blaster X-Fi Fatal1ty PCIe
Power Supply Corsair AX1600i
Mouse Microsoft Intellimouse Pro - Black Shadow
Keyboard Yes
Software Windows 10 Pro 64-bit
Heh, I kept my Q9450 for nearly a decade (2008-2017!), and it served me well the entire time. If you buy a good CPU to begin with, it can last for ages. I only upgraded my 1600X to a 5800X because I could get it funded through work, otherwise that chip (which now lives a new and better, calmer life running my NAS) would have stayed in my main PC for at least a few generations more.

Still, as I said, I would keep my options open and take a close look at the upcoming Zen3 V-cache chips. I generally dislike Intel (mostly due to their long history of shitty business practices), but obviously make up your own mind from the factors that matter the most to you - both major CPU manufacturers deliver excellent performance and great platforms these days.
Indeed, especially the bold bit.

At the time, the 2500K was all the rage for being the sweet spot between price and performance, and it was indeed pretty good, but it was definitely not as fast as the 2700K. However, I remembered the comparative benchmarks plus the HT capability and figured that the top version had better long-term potential, and so it has proved to be. I think HT has turned out to be more important than it first seemed.

Dodgy business practices aside (sadly true, and looking at you too, NVIDIA), I still tend to prefer Intel over AMD, but not by the margin that I used to. If, when it comes to upgrade time (6 months absolute minimum), AMD has something that looks better than Intel, then I'd go for that instead.

Back in 2005, I had the AMD 64-bit CPUs, single core then dual core (Manchester) and they were so fast for their time! I paired them with an Abit AN8 Ultra socket 939 mobo which was an amazing mobo for its day. I still have the hardware to this day.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Yeah, this is where the difference between what TDP actually means vs. what it is understood to mean comes in. After all, TDP is actually a spec for OEMs and the like to say "this is the class of cooler you need for this chip to maintain stock performance", and it is divided into classes for simplicity rather than calculating an accurate TDP for each chip (as that would be a complete mess in terms of coolers). That's why you get those "54W" Celerons running at 30W, and so on. Something similar is true for AMD, though the problem here is that (just like Intel) they've also used TDP numbers as public-facing marketing classes without really explaining what the numbers mean, or even working to make people understand that TDP does not mean peak power consumption. That the relation between TDP and power draw (to the extent that there is one at all) differs between the two manufacturers just makes this into even more of a mess.
I remember GN dug deep into what AMD's TDP means, and basically the conclusion was that it means nothing, as it was way too complicated to understand and didn't represent anything meaningful. AMD in particular has an awful track record of measuring power and tried to invent their own marketing-friendly numbers (ACP), which meant absolutely nothing for buyers or OEMs. At least Intel is a bit better than AMD, but once this shit with boost started (board makers violating it right and left, people expecting boost to work like base, Intel making a gazillion names and stages for boost), Intel really sunk to AMD's level of "measuring" power usage and heat output.


I was initially glad that Intel had ditched TDP for their two-tier base/boost power system, though it seems that is effectively worthless for K SKUs, with boost power being the only number to look at. It'll be interesting to see how this plays out across non-K SKUs though, and I would love to see AMD be more transparent as well.
I will reply like a grandpa here and say that Intel should just stop all their TDP and TDP-tier bullshit altogether. They should just disclose the maximum achievable power usage at base and boost (also manufacturing variation), then chip-to-heatsink transfer efficiency, and for good measure make this government regulated, because we are all getting screwed over this stuff. You know, it would be nice for once to have some numbers that aren't complete hoopla and to be able to plan heatsink buying decisions properly. Or at the very least give them heavy fines for lying to the public. If they could do it in the Pentium 3 era, they certainly can do it now. I'm not buying the BS that they can't because modern chips are too advanced.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I remember GN dug deep into what AMD's TDP means, and basically the conclusion was that it means nothing, as it was way too complicated to understand and didn't represent anything meaningful. AMD in particular has an awful track record of measuring power and tried to invent their own marketing-friendly numbers (ACP), which meant absolutely nothing for buyers or OEMs. At least Intel is a bit better than AMD, but once this shit with boost started (board makers violating it right and left, people expecting boost to work like base, Intel making a gazillion names and stages for boost), Intel really sunk to AMD's level of "measuring" power usage and heat output.
AMD's TDP value is a reverse engineering of Intel's way of doing this, to provide OEMs with comparable classes of coolers and avoid confusion. That is its only real-world use case. And power doesn't factor into the calculation at all, funnily enough. For consumers, the only value of it is to tell us the other, non-published numbers behind it, such as various power limits; the TDP number itself is useless and (despite its use in marketing, which is a really harebrained idea) has never been meant to be useful for consumers.
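For reference, the formula AMD gave out (as reported in GN's deep dive on Ryzen TDP) boils down to the sketch below; the example numbers are purely illustrative, but notice that CPU power draw appears nowhere in it:

```python
# AMD's TDP definition, per Gamers Nexus' reporting: case temperature limits
# and an assumed cooler thermal resistance - no CPU power consumption at all.
def amd_tdp_watts(t_case_max_c: float, t_ambient_c: float, theta_ca: float) -> float:
    """theta_ca: assumed heatsink-to-ambient thermal resistance in C/W."""
    return (t_case_max_c - t_ambient_c) / theta_ca

# Illustrative values only: 61.8 C case max, 42 C ambient, 0.189 C/W cooler.
print(round(amd_tdp_watts(61.8, 42.0, 0.189)))  # -> 105, i.e. a "105 W TDP"
```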
I will reply like a grandpa here and say that Intel should just stop all their TDP and TDP-tier bullshit altogether. They should just disclose the maximum achievable power usage at base and boost (also manufacturing variation), then chip-to-heatsink transfer efficiency, and for good measure make this government regulated, because we are all getting screwed over this stuff. You know, it would be nice for once to have some numbers that aren't complete hoopla and to be able to plan heatsink buying decisions properly. Or at the very least give them heavy fines for lying to the public. If they could do it in the Pentium 3 era, they certainly can do it now. I'm not buying the BS that they can't because modern chips are too advanced.
I would love for there to be a standardized way of measuring and describing this, though that would inevitably be difficult with different die sizes, IHS sizes (and thicknesses), different internal TIMs, and so on. Still, it would be nice to at least give it a try. You'd still need some sort of tiering system though, as the formula you'd need for something like this wouldn't produce a human-readable output, but just some number. Is a higher or lower number better? By how much? Ultimately it wouldn't be any less complex than TDP - though you might have the benefit of removing the "W" and thus not having people think this number directly represents power draw, which would be good. Standardized tiers could lead to standardized cooler classes, which would simplify things quite a bit (no more of the dubiously-labeled "150W" or "250W" coolers, but, say, "tier 5" coolers for "tier 5" CPUs). The divisions would always be somewhat arbitrary, and no cooler design would fit perfectly within a category, but at least you'd have some guarantee that it would uphold base performance unless you screw up the installation.
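Conceptually, something like this; all tier numbers and product names below are made up purely for illustration:

```python
# Toy sketch of the tier idea: a cooler is adequate for a CPU if its tier
# is at least the CPU's tier. Hypothetical names and boundaries throughout.
CPU_TIER    = {"i5-12400": 2, "i7-12700K": 4, "i9-12900K": 5}
COOLER_TIER = {"stock downdraft": 1, "120mm tower": 3, "280mm AIO": 5}

def adequate(cpu: str, cooler: str) -> bool:
    return COOLER_TIER[cooler] >= CPU_TIER[cpu]

print(adequate("i9-12900K", "280mm AIO"))    # True:  tier 5 cooler, tier 5 CPU
print(adequate("i7-12700K", "120mm tower"))  # False: would need tier 4 or up
```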
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
I would love for there to be a standardized way of measuring and describing this, though that would inevitably be difficult with different die sizes, IHS sizes (and thicknesses), different internal TIMs, and so on. Still, it would be nice to at least give it a try.
The way I described it, that wouldn't be a problem, since I said power usage plus chip-to-IHS heat transfer efficiency (for non-IHS chips, from the die to its outer surface). Thermal paste is completely dependent on the OEM or DIY builder.

You'd still need some sort of tiering system though, as the formula you'd need for something like this wouldn't produce a human-readable output, but just some number. Is a higher or lower number better? By how much?
That's something that Intel or AMD shouldn't care about, as it is solely the OEM's business (or the DIY builder's). Intel's or AMD's only responsibility is to provide useful and accurate data (which they don't do now).

Ultimately it wouldn't be any less complex than TDP - though you might have the benefit of removing the "W" and thus not having people think this number directly represents power draw, which would be good. Standardized tiers could lead to standardized cooler classes, which would simplify things quite a bit (no more of the dubiously-labeled "150W" or "250W" coolers, but, say, "tier 5" coolers for "tier 5" CPUs). The divisions would always be somewhat arbitrary, and no cooler design would fit perfectly within a category, but at least you'd have some guarantee that it would uphold base performance unless you screw up the installation.
Sounds like an interesting idea, but I would rather have watts. Most of those dubious measurements exist mostly because cooler makers are trying to guess what AMD's or Intel's TDP means, and they want to make sure they won't specify too-weak coolers for chips, as that has many negative consequences for them.
 
Joined
May 2, 2017
Messages
7,762 (2.78/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
The way I described it, that wouldn't be a problem, since I said power usage plus chip-to-IHS heat transfer efficiency (for non-IHS chips, from the die to its outer surface). Thermal paste is completely dependent on the OEM or DIY builder.
Sorry, but you're wrong here. First, I said internal TIM, i.e. between the IHS and die, not cooler and IHS. Similarly, heat transfer from the chip to the IHS isn't a simple linear function, but is dependent on the thermal density of the die, the thickness of the diffusion barrier on top of the die, the internal TIM, and the materials and thickness of the IHS, so "measuring" this is really complicated. Remember, heat isn't generated evenly across the die, and thus doesn't transfer evenly through the IHS. The most important job of the IHS is to spread heat outwards from hot spots, which is where the thickness and materials of the IHS come into play (a thicker IHS will have a larger cross section through which to spread heat outwards). And of course the thermal density (and overall power consumption/heat output of the die, though the two aren't directly related) of the die forms the basis for this. So, any kind of standardized way of measuring this would by its very nature privilege certain designs over others, depending on the specifics of the testing methodology. For example, do you calculate a single number for the entire IHS, despite this being a gross oversimplification of real-world temperatures? Do you calculate an "overall" and a "hotspot" number? If so, how do you balance the two? And variables like die size, IHS size, the ratio between the two, and several other factors will affect all of this.
That's something that Intel or AMD shouldn't care about, as it is solely the OEM's business (or the DIY builder's). Intel's or AMD's only responsibility is to provide useful and accurate data (which they don't do now).
Again: no. OEMs want to know what they're buying, and want to know the parameters in which the parts work and the necessary capabilities of ancillary components like coolers. The only party capable of reliably specifying this is the chipmaker - though ideally this would be done in a standardized way. The last thing an OEM in a low-margin commodity market wants to do is have to deal with implementing a dynamic system for specifying coolers across systems.
Sounds like an interesting idea, but I would rather have watts. Most of those dubious measurements exist mostly because cooler makers are trying to guess what AMD's or Intel's TDP means, and they want to make sure they won't specify too-weak coolers for chips, as that has many negative consequences for them.
But then you're entirely missing the point. "Watts" isn't a viable measure for what you need to keep a CPU cool. Two different CPUs with identical 50W power draws can be incredibly easy or essentially impossible to cool - with the same cooler! - depending on factors like thermal density and the capability of the IHS to spread heat out from hotspots. That's why Zen3 CPUs often run hot - they don't consume that much power, but they have extreme thermal density coupled with an off-centre die placement, which leads to very different behaviour at the same wattage as other CPUs. Using "watts" to measure this is why we have a problem to begin with. This is similar to how GPUs are much easier to cool than CPUs thanks to direct-die cooling and much more even thermal loads across the die. For CPUs, a single number for this can only ever be an abstraction calculated from many different metrics, and trying to align this directly with power draw is really problematic.
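To put some toy numbers on that (the thermal resistances here are illustrative, not measured):

```python
# Toy thermal-resistance model: two CPUs drawing the same 50 W land at very
# different die temperatures on the same cooler, because junction-to-case
# resistance depends on thermal density, internal TIM, and the IHS.
T_AMBIENT = 25.0   # C
THETA_CA  = 0.15   # C/W, cooler-to-ambient (same cooler for both CPUs)

def die_temp(power_w: float, theta_jc: float) -> float:
    return T_AMBIENT + power_w * (theta_jc + THETA_CA)

print(die_temp(50, theta_jc=0.3))  # 47.5 C: large, evenly heated die
print(die_temp(50, theta_jc=1.0))  # 82.5 C: dense hotspots, same 50 W
```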

Also, cooler makers don't need to guess anything. They are provided with the formulas for TDP calculation and know precisely what these mean. This is why the term TDP exists at all. The problem lies in the implementation, as well as tiering and the lack of adherence to TDP-like power draw levels - after all, there literally doesn't exist a >125W Intel MSDT TDP, so how do you design a cooler for a ~250W CPU on that platform? Plus, of course, cooler design and performance varies wildly depending on uncontrollable factors like ambient temperature and case airflow. The dubious ratings come from cooler makers either overselling their products, deviating from TDP standards, shooting higher than those allow for, or trying to account for something else in their numbers.
 
Joined
Feb 1, 2019
Messages
3,667 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
wPrime tests not good?

Shame I don't know how it compares to my 9900K, or how it does on Windows 10.

Also, I wonder if we'll see DDR4-based reviews, as we don't know how much of the performance gains come from reduced I/O wait time.

I personally see the lack of overclocking headroom as a good thing: users getting more performance out of the box is a good thing, and it also reduces silicon lottery issues.
 
Joined
May 8, 2021
Messages
1,978 (1.49/day)
Location
Lithuania
System Name Shizuka
Processor Intel Core i5 10400F
Motherboard Gigabyte B460M Aorus Pro
Cooling Scythe Choten
Memory 2x8GB G.Skill Aegis 2666 MHz
Video Card(s) PowerColor Red Dragon V2 RX 580 8GB ~100 watts in Wattman
Storage 512GB WD Blue + 256GB WD Green + 4TB Toshiba X300
Display(s) BenQ BL2420PT
Case Cooler Master Silencio S400
Audio Device(s) Topping D10 + AIWA NSX-V70
Power Supply Chieftec A90 550W (GDP-550C)
Mouse Steel Series Rival 100
Keyboard Hama SL 570
Software Windows 10 Enterprise
Sorry, but you're wrong here. First, I said internal TIM, i.e. between the IHS and die, not cooler and IHS. Similarly, heat transfer from the chip to the IHS isn't a simple linear function, but is dependent on the thermal density of the die, the thickness of the diffusion barrier on top of the die, the internal TIM, and the materials and thickness of the IHS, so "measuring" this is really complicated.
That's why I mentioned heat transfer efficiency.


Remember, heat isn't generated evenly across the die, and thus doesn't transfer evenly through the IHS. The most important job of the IHS is to spread heat outwards from hot spots, which is where the thickness and materials of the IHS come into play (a thicker IHS will have a larger cross section through which to spread heat outwards). And of course the thermal density (and overall power consumption/heat output of the die, though the two aren't directly related) of the die forms the basis for this. So, any kind of standardized way of measuring this would by its very nature privilege certain designs over others, depending on the specifics of the testing methodology. For example, do you calculate a single number for the entire IHS, despite this being a gross oversimplification of real-world temperatures? Do you calculate an "overall" and a "hotspot" number? If so, how do you balance the two? And variables like die size, IHS size, the ratio between the two, and several other factors will affect all of this.
Seems like the average temperature of the whole IHS would be the best way to do that, while also stating the hotspot temp.


Again: no. OEMs want to know what they're buying, and want to know the parameters in which the parts work and the necessary capabilities of ancillary components like coolers. The only party capable of reliably specifying this is the chipmaker - though ideally this would be done in a standardized way. The last thing an OEM in a low-margin commodity market wants to do is have to deal with implementing a dynamic system for specifying coolers across systems.
I would argue that they should do it anyway, as OEMs like Dell, HP, and Acer have a really poor reputation for overheating machines. They cannot keep making crap forever; at some point it will hurt their sales.


But then you're entirely missing the point. "Watts" isn't a viable measure for what you need to keep a CPU cool. Two different CPUs with identical 50W power draws can be incredibly easy or essentially impossible to cool - with the same cooler! - depending on factors like thermal density and the capability of the IHS to spread heat out from hotspots. That's why Zen3 CPUs often run hot - they don't consume that much power, but they have extreme thermal density coupled with an off-centre die placement, which leads to very different behaviour at the same wattage as other CPUs. Using "watts" to measure this is why we have a problem to begin with. This is similar to how GPUs are much easier to cool than CPUs thanks to direct-die cooling and much more even thermal loads across the die. For CPUs, a single number for this can only ever be an abstraction calculated from many different metrics, and trying to align this directly with power draw is really problematic.
Watts aren't a problem. If you measure watts and also chip-to-IHS heat transfer efficiency, what is then left unclear or misleading? That covers odd chips like Ryzens.


Also, cooler makers don't need to guess anything. They are provided with the formulas for TDP calculation and know precisely what these mean. This is why the term TDP exists at all. The problem lies in the implementation, as well as tiering and the lack of adherence to TDP-like power draw levels - after all, there literally doesn't exist a >125W Intel MSDT TDP, so how do you design a cooler for a ~250W CPU on that platform? Plus, of course, cooler design and performance varies wildly depending on uncontrollable factors like ambient temperature and case airflow. The dubious ratings come from cooler makers either overselling their products, deviating from TDP standards, shooting higher than those allow for, or trying to account for something else in their numbers.
GN did a video about AMD's TDP and asked Cooler Master about it; they said that TDP doesn't mean much and that it's certainly not clear how capable the coolers they design should be.
 
Joined
Jan 27, 2015
Messages
1,746 (0.48/day)
System Name Legion
Processor i7-12700KF
Motherboard Asus Z690-Plus TUF Gaming WiFi D5
Cooling Arctic Liquid Freezer 2 240mm AIO
Memory PNY MAKO DDR5-6000 C36-36-36-76
Video Card(s) PowerColor Hellhound 6700 XT 12GB
Storage WD SN770 512GB m.2, Samsung 980 Pro m.2 2TB
Display(s) Acer K272HUL 1440p / 34" MSI MAG341CQ 3440x1440
Case Montech Air X
Power Supply Corsair CX750M
Mouse Logitech MX Anywhere 25
Keyboard Logitech MX Keys
Software Lots
wPrime tests not good?

Shame I don't know how it compares to my 9900K, or how it does on Windows 10.

Also, I wonder if we'll see DDR4-based reviews, as we don't know how much of the performance gains come from reduced I/O wait time.

I personally see the lack of overclocking headroom as a good thing: users getting more performance out of the box is a good thing, and it also reduces silicon lottery issues.

Just keep in mind the DDR5 used here is 6000.

Computerbase.de shows DDR4-3200 C12 and DDR4-3800 walking all over DDR5-4400. This is not surprising, but I doubt early adopters are going to be running DDR5-4400.

Tom's used DDR5-4800 C36 and DDR4-3200 on Alder Lake and the older platforms, which is a bit more realistic, but they didn't specify the DDR4 settings that I saw. The kit was DDR5-6000, but they set it to one of the more normal speeds.

So far my take is that 'good' DDR4 is faster than DDR5, at least the obtainable DDR5-5200. I think that is not necessarily true of DDR5-6000+ running at full speed, but you can't actually buy that stuff.
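One way to see why: absolute first-word latency is 2000 * CL / (speed in MT/s), so with typical retail timings (not necessarily the exact kits the reviewers used):

```python
# First-word CAS latency in nanoseconds: ns = 2000 * CL / (rate in MT/s).
def cas_ns(mt_s: int, cl: int) -> float:
    return 2000 * cl / mt_s

for name, mt_s, cl in [
    ("DDR4-3200 C14", 3200, 14),   # ~8.75 ns
    ("DDR4-3800 C16", 3800, 16),   # ~8.42 ns
    ("DDR5-4800 C36", 4800, 36),   # ~15.00 ns
    ("DDR5-6000 C36", 6000, 36),   # ~12.00 ns
]:
    print(f"{name}: {cas_ns(mt_s, cl):.2f} ns")
```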
 