# Intel Core i5-12600K



## W1zzard (Nov 4, 2021)

The Core i5-12600K is the price/performance king of the Intel Alder Lake lineup. At a competitive $300, it's a clear winner against AMD's Ryzen 5 5600X, and it's faster than even the 5800X in many applications and games. This is the gaming CPU you want.



----------



## sepheronx (Nov 4, 2021)

It's good. But the price of the components like DDR5 and motherboards are beyond what I'm willing to spend for a few FPS more.


----------



## bug (Nov 4, 2021)

Nice little beast, despite not being the most efficient.
Sadly, if Win11 can't get scheduling right, there's probably zero chance these will work right on Win10 for a while. I will also have to check the support on Linux.


----------



## mechtech (Nov 4, 2021)

Nice chip. New motherboard is a bummer. It'd be nice if there were more new GPUs available at MSRP. Not much point buying one to pair with an RX 480 lol

@W1zzard 

any plans to do testing on windows 10???


----------



## W1zzard (Nov 4, 2021)

mechtech said:


> any plans to do testing on windows 10???


Yes, next week https://www.techpowerup.com/forums/threads/intel-core-i9-12900k.288509/post-4642317


----------



## bug (Nov 4, 2021)

sepheronx said:


> It's good. But the price of the components like DDR5 and motherboards are beyond what I'm willing to spend for a few FPS more.


You don't have to pay for DDR5, most motherboards are built for DDR4. The test system is using DDR5 because it has to be forward-looking, but expect a DDR5 vs DDR4 comparison for Alder Lake in a few days at most.


----------



## P4-630 (Nov 4, 2021)

Interesting little CPU...
Would be fine with my 2070 Super.


----------



## Fourstaff (Nov 4, 2021)

Intel definitely did expectation management properly this time. Fast? Check. Power hungry? Check.

It's nice to see real competition again, not just on price, but in absolute performance.


----------



## phanbuey (Nov 4, 2021)

bug said:


> Nice little beast, despite not being the most efficient.
> Sadly, if Win11 can't get scheduling right, there's probably zero chance these will work right on Win10 for a while. I will also have to check the support on Linux.



Gamers Nexus ran the 12900K on Windows 10 with Gear 2 RAM lol, it was still awesome.


----------



## Agent_D (Nov 4, 2021)

"Beats 5600x conclusively". Well, sure, in pure numbers with no other consideration; a 14% overall performance increase in CPU tests for a 50% increase in multi-core power consumption (126 W vs 189 W). I'm sure each individual test varies, but it's not that impressive relative to the power increase.

Nice reviews on all the new CPUs regardless!


----------



## W1zzard (Nov 4, 2021)

Agent_D said:


> "Beats 5600x conclusively". Well, sure, in pure numbers with no other consideration; a 14% overall performance increase in CPU tests for a 50% increase in multi-core power consumption (126 W vs 189 W). I'm sure each individual test varies, but it's not that impressive relative to the power increase.


MT power test is Cinebench, so 126 W vs 189 W = ~+50%, and 11225 CB vs 17699 CB = ~+57%
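A quick way to sanity-check those deltas (the 126 W / 189 W and 11225 / 17699 figures are from this post; the `pct_increase` helper is just for illustration):

```python
# Sanity check of the percentages above, using the Cinebench MT figures
# from this post (Ryzen 5 5600X vs Core i5-12600K).
def pct_increase(old: float, new: float) -> float:
    """Relative increase of `new` over `old`, in percent."""
    return (new - old) / old * 100.0

power = pct_increase(126, 189)      # package power: 126 W -> 189 W
score = pct_increase(11225, 17699)  # CB score: 11225 -> 17699

print(f"power: +{power:.1f}%")  # +50.0%
print(f"score: +{score:.1f}%")  # +57.7%
```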


----------



## mechtech (Nov 4, 2021)

Actually, with the efficiency cores I was expecting better idle power. Is this due to scheduling in Windows, or the process node, or both, or something else?

edit: https://www.newegg.ca/intel-core-i5-12600k-core-i5-12th-gen/p/N82E16819118347
At least the CAD price is in line with the US price. I wonder if any of AMD's CPUs will get a price adjustment?


----------



## sepheronx (Nov 4, 2021)

bug said:


> You don't have to pay for DDR5, most motherboards are built for DDR4. The test system is using DDR5 because it has to be forward-looking, but expect a DDR5 vs DDR4 comparison for Alder Lake in a few days at most.


It's also the motherboard prices.


----------



## Hossein Almet (Nov 4, 2021)

Sure, it beats the 5600X conclusively by consuming 90 W more power. Sorry, the 5600X still gets my money.


----------



## nguyen (Nov 4, 2021)

Looks like Intel is back on the menu boys 

Though for 4K gaming, I doubt the 12900K makes any difference over the 9900K


----------



## lexluthermiester (Nov 4, 2021)

While this will be a good mid-range CPU, I think the 12700K is still the better bang-for-buck value for many use cases.


----------



## Agent_D (Nov 4, 2021)

W1zzard said:


> MT power test is Cinebench, so 126 W vs 189 W = ~+50%, and 11225 CB vs 17699 CB = ~+57%


Just now realized that the power consumption test said Cinebench; thanks for replying on that!

I know it's not really feasible (time constraints), but it would be interesting to see power consumption comparisons on other things like gaming and real-world application use. I say this because of the huge disparity between something like Cinebench and the Prime95 power charts. It's probably irrelevant anyway, as most people don't particularly care about the difference in power consumption as long as the performance is higher.


----------



## yeeeeman (Nov 4, 2021)

Linus Tech Tips' leaked slides on YouTube show ADL consuming less power than Zen 3 parts in gaming.


----------



## bug (Nov 4, 2021)

Hossein Almet said:


> Sure, it beats the 5600X conclusively by consuming 90 W more power. Sorry, the 5600X still gets my money.


That's whole-system power, not just CPU. And the difference is 63 W, unless you stress your system around the clock.

@W1zzard What's up with testing Java8? Java is at version 17 now.


----------



## W1zzard (Nov 4, 2021)

bug said:


> What's up with testing Java8? Java is at version 17 now.


Thanks! Noted for the next rebench (early 2022). Weren't there some compatibility issues introduced with newer versions? Or was it a licensing change?


----------



## tussinman (Nov 4, 2021)

Hossein Almet said:


> Sure, it beats the 5600X conclusively by consuming 90 W more power. Sorry, the 5600X still gets my money.


To be fair, you could just wait for the non-E-core versions to come out (12400/12500). They should still be faster than the 5600X but significantly cheaper, with lower power


----------



## ncrs (Nov 4, 2021)

W1zzard said:


> Thanks! Noted for the next rebench (early 2022). Weren't there some compatibility issues introduced with newer versions? Or was it a licensing change?


Both: JavaFX was moved out, and licensing for 11 was changed in the Oracle version. They relented because their bait-and-switch didn't really pan out and 17 is back at "normal" licensing. A bit too late, because most people moved to some variant of OpenJDK.


----------



## 80-watt Hamster (Nov 4, 2021)

Hossein Almet said:


> Sure, it beats the 5600X conclusively by consuming 90 W more power. Sorry, the 5600X still gets my money.



From these results, it doesn't look to me like AL has quite caught up to Zen 3 in efficiency. The gap isn't huge, though. Comparing the 12600K to the 5600X makes notional sense because they share core count (ignoring E-cores) and projected street price, but the test results show that the Zen 3 part it's most on par with is the 5800X. With that in mind, and singling out Cinebench as it does a good job loading all cores:

(charts: Power consumption / Energy usage / Score)

I'm honestly impressed, and was not expecting AL to perform this well. For my part, the decision of 12600K vs. 5600X is easy, unless I really felt like giving Intel the metaphorical finger. Of course, the cost of Z690 boards and the need for more robust cooling tempers that enthusiasm a bit. It'll be really interesting to see where the 12400, 12600 and mainstream boards come in on pricing.
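That efficiency comparison can be sketched as Cinebench points per watt; a minimal sketch using the MT figures W1zzard quoted earlier in the thread (the thread doesn't quote 5800X numbers, so only the 12600K and 5600X appear):

```python
# Cinebench MT points per watt: score divided by package power during
# the MT run, using the figures quoted earlier in the thread.
chips = {
    "Core i5-12600K": (17699, 189),  # (CB score, power in W)
    "Ryzen 5 5600X":  (11225, 126),
}

for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.1f} pts/W")

# The 12600K lands around 93.6 pts/W vs ~89.1 pts/W for the 5600X:
# close, with a small edge to Alder Lake in this fully threaded workload.
```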


----------



## bug (Nov 4, 2021)

W1zzard said:


> Thanks! Noted for the next rebench (early 2022). Weren't there some compatibility issues introduced with newer versions? Or was it a licensing change?


Compatibility - not exactly. Some internal APIs which were exposed up until Java 9 were properly locked away. Problems have mostly been taken care of in the meantime.
Licensing - yes. But only for whoever installs Java in production environments.

There were, however, internal changes that may produce very different results from Java 8.

Thanks for the hard work, now go catch some sleep (till Sunday or so).


----------



## kruk (Nov 4, 2021)

The leaks overhyped Alder Lake to the moon, but in reality it's just a normal generational jump with disappointing power consumption, possible backwards compatibility issues and high platform costs ...


----------



## Tsukiyomi91 (Nov 4, 2021)

i5-12600K dethroned the 5600X in most benchmarks while maintaining a small gap with the 5800X, 5900X and 5950X. AMD is feeling the pinch now for being complacent.


----------



## mechtech (Nov 4, 2021)

W1zzard said:


> Yes, next week https://www.techpowerup.com/forums/threads/intel-core-i9-12900k.288509/post-4642317



awesome.  Thanks

Will you also add the amd 5600g and 5700g to all the charts??


----------



## dicktracy (Nov 4, 2021)

Basically no reason to get a 5600X or 5800X when both are overpriced right out of the gate. Intel dominates every single price point, and you don't need to participate in their mandatory AGESA beta-testing program.


----------



## ncrs (Nov 4, 2021)

Tsukiyomi91 said:


> i5-12600K dethroned the 5600X in most benchmarks while maintaining a small gap with the 5800X, 5900X and 5950X. AMD is feeling the pinch now for being complacent.


You are comparing a completely new platform with an old one. I don't see how you can say AMD is being complacent here. They can always slash prices, and we're still going to get the increased-cache variants.
Intel winning here is the least they could do; to be honest, I was expecting more (especially in the power department).


----------



## W1zzard (Nov 4, 2021)

mechtech said:


> Will you also add the amd 5600g and 5700g to all the charts??


AMD has borrowed my samples "for 3-4 weeks", and that was on October 6th. But yeah, eventually I'll add them


----------



## RandallFlagg (Nov 4, 2021)

Tsukiyomi91 said:


> i5-12600K dethroned the 5600X in most benchmarks while maintaining a small gap with the 5800X, 5900X and 5950X. AMD is feeling the pinch now for being complacent.



Actually, it dethroned the 5800X in most benchmarks, and it beats every Zen 3 at all three resolutions tested in games. From other review sites, it runs quite well on DDR4 too, even DDR4-3200.

AMD is going to need to make the 5600X $200 and the 5800X $300 for them to make any kind of sense at all for a new build or significant upgrade.  They would only make sense at current prices for someone who is just doing a CPU swap on a compatible motherboard.


----------



## bobsled (Nov 4, 2021)

Tsukiyomi91 said:


> i5-12600K dethroned the 5600X in most benchmarks while maintaining a small gap with the 5800X, 5900X and 5950X. AMD is feeling the pinch now for being complacent.


I don't think AMD is being complacent; if anything, they're doing extremely well considering they're fighting for fab space and have a much smaller budget than Intel.

Alder Lake was under design/testing long before the new CEO took the helm, too - he'll no doubt take the credit regardless.


----------



## HenrySomeone (Nov 4, 2021)

Agent_D said:


> "Beats 5600x conclusively". Well, sure, in pure numbers with no other consideration; a 14% overall performance increase in CPU tests for a 50% increase in multi-core power consumption (126 W vs 189 W). I'm sure each individual test varies, but it's not that impressive relative to the power increase.
> 
> Nice reviews on all the new CPUs regardless!


It also beats the 5800X, at the same power efficiency. All that is left is to wait for more affordable motherboards to arrive (DDR4 ones for now) and it will be the undisputed champion of the mid-range (and better in single thread and gaming than any Ryzen, hehe)


----------



## Dristun (Nov 4, 2021)

HWUnboxed said that the DDR5 vs DDR4 difference is insignificant even against that ludicrous G.Skill kit, so it's possible to count it out of the equation in value comparisons. That leaves the motherboards, and the $100 saving against the 5800X handily covers the difference imo - wouldn't you rather be faster at the same price anyway? Can't wait to see how AMD responds with pricing and where Zen3D lands.


----------



## HD64G (Nov 4, 2021)

So much for the revolution that AL would bring...

To sum up, when you build a new AL system with a 12600K on DDR5 with a decent board, you will spend more than building an AM4 system with a 5800X that will use less energy, gain on average ~3% in CPU tasks and lose on average ~3% in gaming at 1440p when using a high-end GPU. Where is the benefit to customers with the arrival of AL? I hope AMD feels threatened enough (I wouldn't) to lower Zen 3 CPU prices by $50 for the 5600X, 5800X and 5900X and by $100 for the 5950X.

Does anyone remember the Intel test that showed them winning by a much bigger margin in gaming vs the 5950X, using the Win11 version that was broken for Ryzen CPUs?

Let's wait for Zen3D now to see if it will take back the gaming crown.


----------



## RandallFlagg (Nov 4, 2021)

Dristun said:


> HWUnboxed said that the DDR5 vs DDR4 difference is insignificant even against that ludicrous G.Skill kit, so it's possible to count it out of the equation in value comparisons. That leaves the motherboards, and the $100 saving against the 5800X handily covers the difference imo - wouldn't you rather be faster at the same price anyway? Can't wait to see how AMD responds with pricing and where Zen3D lands.



It's really not even $100. The problem is there are no midrange socket 1700 boards yet, only Z690. If you compare the lower-end Z690 prices to X570, you're probably talking $50, and for that you get PCIe 5.0 plus more unshared PCIe 4.0 lanes. It'll be a few months before we see those lower-end boards.

What it boils down to is a limited launch of higher-end enthusiast parts. That is a negative, but only if you are not looking for higher-end enthusiast parts.


----------



## bug (Nov 4, 2021)

HD64G said:


> So much for the revolution that AL would bring...
> 
> To sum up, when you build a new AL system with a 12600K on DDR5 with a decent board, you will spend more than building an AM4 system with a 5800X that will use less energy, gain on average ~3% in CPU tasks and lose on average ~3% in gaming at 1440p when using a high-end GPU. Where is the benefit to customers with the arrival of AL? I hope AMD feels threatened enough (I wouldn't) to lower Zen 3 CPU prices by $50 for the 5600X, 5800X and 5900X and by $100 for the 5950X.
> 
> Let's wait for Zen3D now to see if it will take back the gaming crown.


The 12600K isn't a high-end chip. Why would you put it in a DDR5 system?
The article is out already: you can pair this with a $250 DDR4 mobo even today. Though the ideal time to build a mainstream system is when the mainstream chipsets launch, too.


----------



## londiste (Nov 4, 2021)

@W1zzard, Single Thread Energy Usage based on SuperPi is an interesting point in this review - this is a single test result that seems to be very different from all the other single-thread test results when it comes to performance.


----------



## thegnome (Nov 4, 2021)

Not worth a Z690 board and 12600K compared to the 5600X with a much cheaper motherboard. It will obviously be better with a B-series board, but especially if AMD drops prices it's barely worth it for gaming. For people doing more than just gaming, I suppose it's a much better deal compared to the 5600X.


----------



## HenrySomeone (Nov 4, 2021)

HD64G said:


> So much for the revolution that AL would bring...
> 
> To sum up, when you build a new AL system with a 12600K on DDR5 with a decent board, you will spend more than building an AM4 system *with a 5800X that will use less energy*, gain on average ~3% in CPU tasks and lose on average ~3% in gaming at 1440p when using a high-end GPU. Where is the benefit to customers with the arrival of AL? I hope AMD feels threatened enough (I wouldn't) to lower Zen 3 CPU prices by $50 for the 5600X, 5800X and 5900X and by $100 for the 5950X.
> 
> Let's wait for Zen3D now to see if it will take back the gaming crown.


This is a straight out LIE!


----------



## kruk (Nov 4, 2021)

HenrySomeone said:


> It also beats the 5800x, at the same power efficiency.



But how? If you read the review:

- It has worse single-thread efficiency than the 5800X
- It has the same multi-thread efficiency as the 5800X
- It's barely slower in CPU tests than the 5800X
- It's barely faster in gaming tests than the 5800X

It might have slightly better energy efficiency in fully threaded games and apps, but in everything else this is not true.


----------



## bug (Nov 4, 2021)

londiste said:


> @W1zzard, Single Thread Energy Usage based on SuperPi is an interesting point in this review - this is a single test result that seems to be very different from all the other single-thread test results when it comes to performance.


Probably one of the cases where the workload went to the E core


----------



## HD64G (Nov 4, 2021)

bug said:


> 12600k isn't a high-end chip. Why would you put it in a DDR5 system?
> The article is out already: you can pair this with a $250 DDR4 mobo even today. Though the ideal time to build a mainstream system is when the mainstream chipsets are launched, too.


I said that it will end up being more expensive than the AM4 platform. Even with a DDR4 board it will be more expensive. The truth of my post about the price difference doesn't change either way.


----------



## R0H1T (Nov 4, 2021)

Tsukiyomi91 said:


> i5-12600K dethroned the 5600X in most benchmarks while maintaining a small gap with the 5800X, 5900X and 5950X. AMD is feeling the pinch now for being complacent.


AMD's biggest issue has been capacity at TSMC for at least the last 2 if not 3 years, GPUs included. Complacency has little to do with it, and no, they're not feeling any pinch right now!


----------



## Chrispy_ (Nov 4, 2021)

Having read your reviews here and watched a couple of youtube reviews, it's clear that Intel have the lead in IPC when using DDR5, but can only achieve the wins at hideous power limits.

This 12600K is clearly where Intel shines this generation. Competitive power consumption and better performance across the board than the 5600X for a similar CPU-only price.

What I'm waiting to see is the performance of the 12600K (and budget offerings, presumably 12500F etc) using DDR4 on B660 boards. As good as the 12600K is in this review, it costs as much as a 5900X, B550, and good quality DDR4-3600 kit just because the Z690 and DDR5 carry such a high premium that they utterly decimate the value proposition.


----------



## Richards (Nov 4, 2021)

Slaps the 5600X & 5800X around and it's cheaper... I'm sure it makes 5600X & 5800X owners insecure about cores and performance (he he)


----------



## W1zzard (Nov 4, 2021)

londiste said:


> @W1zzard, Single Thread Energy Usage based on SuperPi is an interesting point in this review - this is a single test result that seems to be very different from all the other single-thread test results when it comes to performance.





bug said:


> Probably one of the cases where the workload went to the E core


I just checked for you: it gets scheduled on P cores, but bounces between two of them. I rather suspect that the new architecture isn't the best fit for the old x86 floating-point instruction mix of SuperPi.

Maybe for the 2022 bench I can replace it with another 1T workload. Any suggestions? I'd rather not Cinebench, but something else


----------



## ncrs (Nov 4, 2021)

W1zzard said:


> I just checked for you: it gets scheduled on P cores, but bounces between two of them. I rather suspect that the new architecture isn't the best fit for the old x86 floating-point instruction mix of SuperPi.
> 
> Maybe for the 2022 bench I can replace it with another 1T workload. Any suggestions? I'd rather not Cinebench, but something else


You are probably already aware of it, but the open-source Phoronix Test Suite also runs on Windows and has some ST-focused tests like simdjson and Ngspice. It would be nice to see some more Linux benchmarks on TPU as well


----------



## W1zzard (Nov 4, 2021)

ncrs said:


> You are probably already aware of it, but the open-source Phoronix Test Suite also runs on Windows and has some ST-focused tests like simdjson and Ngspice. It would be nice to see some more Linux benchmarks on TPU as well


No plans for Linux and no plans for exotic tests that nobody has heard of


----------



## ncrs (Nov 4, 2021)

W1zzard said:


> No plans for Linux and no plans for exotic tests that nobody has heard of


Understandable, none of them are "mainstream", but Linux might, maybe, finally, perhaps, be having a shot at that with the Steam Deck.


----------



## bug (Nov 4, 2021)

HD64G said:


> I said that it will end up being more expensive than the AM4 platform. Even with a DDR4 board it will be more expensive. The truth of my post about the price difference doesn't change either way.


I like that you deal in absolutes. So, if I can get 5800X performance for a lot less, same RAM, a mobo for the same $250, how exactly is the AM4 platform cheaper? Enlighten me, please.


----------



## Dristun (Nov 4, 2021)

Well, just ordered a 12600K and Gigabyte's Gaming X DDR4! Might be stuck with the boxes for a bit because the shop didn't have Scythe's mounting kits or any new coolers in stock, though


----------



## bug (Nov 4, 2021)

W1zzard said:


> I just checked for you: it gets scheduled on P cores, but bounces between two of them. I rather suspect that the new architecture isn't the best fit for the old x86 floating-point instruction mix of SuperPi.
> 
> Maybe for the 2022 bench I can replace it with another 1T workload. Any suggestions? I'd rather not Cinebench, but something else


Leave it be. Don't change the benchmark because the scheduler can't deal with the code.


----------



## HenrySomeone (Nov 4, 2021)

bug said:


> I like that you deal in absolutes. So, if I can get 5800X performance for a lot less, same RAM, a mobo for the same $250, how exactly is the AM4 platform cheaper? Enlighten me, please.


You'll be hard pressed to get an evidence-backed reply from an obvious team red member...


----------



## mechtech (Nov 4, 2021)

W1zzard said:


> AMD has borrowed my samples "for 3-4 weeks", and that was on October 6th. But yeah, eventually I'll add them


“Borrowed”


----------



## W1zzard (Nov 4, 2021)

mechtech said:


> “Borrowed”


Yes really, to send them to another reviewer


----------



## londiste (Nov 4, 2021)

W1zzard said:


> I just checked for you: it gets scheduled on P cores, but bounces between two of them. I rather suspect that the new architecture isn't the best fit for the old x86 floating-point instruction mix of SuperPi.
> 
> Maybe for the 2022 bench I can replace it with another 1T workload. Any suggestions? I'd rather not Cinebench, but something else


I would say the SuperPi test is fine; it is a good data point either way.
What nags me a little is that this is what efficiency is based on, and this result does seem to be an outlier.


----------



## W1zzard (Nov 4, 2021)

londiste said:


> outlier


Not seeing it?


----------



## mechtech (Nov 4, 2021)



W1zzard said:


> Yes really, to send them to another reviewer


well hopefully they are back in your hands soon!!


----------



## LifeOnMars (Nov 4, 2021)

Nowhere near 2500K/2600K levels of excitement.


----------



## W1zzard (Nov 4, 2021)

mechtech said:


> well hopefully they are back in your hands soon!!


and hopefully not totally fucked up like that 5900X that another reviewer borrowed. It can't even run 1600 MHz Infinity Fabric now. When asked about replacement my AMD contacts said "we can't help", and my emails to another AMD contact got ignored


----------



## dicktracy (Nov 4, 2021)

ADL is very power efficient during gaming, which most people will be using these chips for.


----------



## HD64G (Nov 4, 2021)

bug said:


> I like that you deal in absolutes. So, if I can get 5800X performance for a lot less, same RAM, a mobo for the same $250, how exactly is the AM4 platform cheaper? Enlighten me, please.


AM4 boards can be bought for much less than Z690. Why buy a $250 one? Intel reduced CPU prices to battle the much higher board cost (and DDR5, for whoever goes for it). With a lower total platform cost on AM4, why bother with AL?


----------



## The red spirit (Nov 4, 2021)

This thing only exists as a placeholder for the 12400(F), which will actually deliver some value. But I wonder if that will happen; IPC gains are small and locked chips have lower clocks.


----------



## bug (Nov 4, 2021)

HD64G said:


> AM4 boards can be bought for much less than Z690. Why buy a $250 one? Intel reduced CPU prices to battle the much higher board cost (and DDR5, for whoever goes for it). With a lower total platform cost on AM4, why bother with AL?


There is no "much higher board cost", just cheaper CPUs.
Z690 boards cost roughly the same as X570 boards. AMD made us pay the X570 tax for about a year before releasing B550, Intel will follow up on Z690 in about 3 months.


----------



## cst1992 (Nov 4, 2021)

bug said:


> You don't have to pay for DDR5, most motherboards are built for DDR4


Does that mean that the sockets are the same 288-pin ones?


----------



## Deleted member 215115 (Nov 4, 2021)

bug said:


> There is no "much higher board cost", just cheaper CPUs.
> Z690 boards cost roughly the same as X570 boards. AMD made us pay the X570 tax for about a year before releasing B550, Intel will follow up on Z690 in about 3 months.


Dude, you're wrong. Z boards have always been more expensive than X boards.

Asus X570-P: $160
Asus Z690-P: $230

Strix X570-F: $320
Strix Z690 F: $400


----------



## RandallFlagg (Nov 4, 2021)

rares495 said:


> Dude, you're wrong. Numbers don't lie.
> 
> Asus X570-P: $160
> Asus Z690-P: $230
> ...



It's a $45 to $75 difference to get Z690 vs X570 same model line - today, on launch day for the Z690.   Yes I've looked.

For that you typically get a board with demonstrably more features and capabilities - PCIe 5.0, more PCIe 4.0 lanes typically for more 4.0/4x m.2, Thunderbolt 4 in many cases. 

It is a negative that at this moment there are no mid-range boards for AL, but there will be in a few months. 

There are some other consequences, by virtue of only being higher end Z690, there's a very small selection for mATX and ITX.   In fact the only mATX I can find in stock is Asus Prime, at $179.   The only other mATX I've seen offered at all is $360.  ITX is more available it seems but only the higher end models.

So none of this is real surprising on launch day.


----------



## The King (Nov 4, 2021)

Launch Price for the 12600K in South Africa R6499 (427USD) 

Current Price for a 5600X R5299 (348USD) 

So much for price vs performance.


----------



## cst1992 (Nov 4, 2021)

W1zzard said:


> and hopefully not totally fucked up like that 5900X that another reviewer borrowed. It can't even run 1600 MHz Infinity Fabric now. When asked about replacement my AMD contacts said "we can't help", and my emails to another AMD contact got ignored


What'd they do, beat it with a hammer?


----------



## bug (Nov 4, 2021)

The King said:


> Launch Price for the 12600K in South Africa R6499 (427USD)
> 
> Current Price for a 5600X R5299 (348USD)
> 
> So much for price vs performance.


It still comes down to whether you want 5800X performance for 12600k money or for 5800X money


----------



## cst1992 (Nov 4, 2021)

The King said:


> Launch Price for the 12600K in South Africa R6499 (427USD)
> 
> Current Price for a 5600X R5299 (348USD)
> 
> So much for price vs performance.


What about 5800x? What I saw in this review is that the 12600k is on par with 5800x on many counts.


----------



## Deleted member 215115 (Nov 4, 2021)

RandallFlagg said:


> It's a $45 to $75 difference to get Z690 vs X570 same model line - today, on launch day for the Z690.   Yes I've looked.
> 
> For that you typically get a board with demonstrably more features and capabilities - PCIe 5.0, more PCIe 4.0 lanes typically for more 4.0/4x m.2, Thunderbolt 4 in many cases.
> 
> ...


mATX is dying because it's a weird form factor. Fewer and fewer SKUs each year.

PCI-E 5.0 is even more useless than PCI-E 4.0. A few minutes ago we were complaining about board prices, and now we're thinking of adding more NVMe storage? Really? Thunderbolt 4 is cool if you need it. 95% of people don't.

Then there's B550 which is unlocked and way cheaper...


----------



## The King (Nov 4, 2021)

bug said:


> It still comes down to whether you want 5800X performance for 12600k money or for 5800X money


Not if you factor in the cost of a new LGA 1700 motherboard and going for DDR5 RAM. Cost goes out the window.
B450 boards can run the 5800X and 5600X for cheap.


----------



## cst1992 (Nov 4, 2021)

The King said:


> going for DDR5 RAM


Based on what I've read, DDR4 modules are compatible (with LGA1700).


----------



## R0H1T (Nov 4, 2021)

cst1992 said:


> What I saw in this review is that the 12600k is on par with 5800x on many counts.


So is the 5600x, depends a lot on what you're measuring though.



cst1992 said:


> Based on what I've read, DDR4 modules are compatible.


Not on the same board, unless I've missed some that support dual (types of) memory.


----------



## Deleted member 24505 (Nov 4, 2021)

Nice chip. Think I will be having me a 12600K setup very soon.


----------



## wheresmycar (Nov 4, 2021)

I don't know why I get sucked into all that pre-launch hype every time. I was expecting something more: faster, with lower power consumption. Seems like an ordinary next-gen jump to push past Zen 3. AMD will reply with the same... and so on. Both AMD and Intel are taking the pee pee with pricing, hence the competition doesn't help the consumer.

Intel's AL mobo pricing... YUCK!

Dunno how fat all your pockets are, but mine are saying stay well away from these daylight robbers. For 1440p gaming, I can't see anything special enough to consider an upgrade over my current 7700K setup. Certainly not worth forking out $500+ for (unless specific game/core requirements suggest otherwise).

EDIT: just venting!!!!! I wanna/needa/shoulda upgradee!


----------



## cst1992 (Nov 4, 2021)

R0H1T said:


> Not on the same board, unless I've missed some that support dual (types of) memory.


No, not on the same board. The Z690 Hero that TPU uses supports only DDR5, whereas the Prime Z690-P D4 supports only DDR4. Both support LGA1700.


----------



## The King (Nov 4, 2021)

cst1992 said:


> Based on what I've read, DDR4 modules are compatible.


Yes, but you have to go for either a DDR4 board or a DDR5 board, and going to a new platform while sticking with DDR4 seems like a step backwards.
Reviews of the DDR4 boards will answer this question. Still, the cost of LGA 1700 boards has to be factored in with the cost of the 12th-gen CPUs,
so it adds to the overall cost at the end of the day.


----------



## R0H1T (Nov 4, 2021)

Similar prices? I'm assuming DDR4 boards could be cheaper?


cst1992 said:


> No, not on the same board. The Z690 Hero that TPU uses supports only DDR5, whereas the Prime Z690-P D4 supports only DDR4. Both support LGA1700.


Also with unlocked processors & basically unlimited turbo you better get a good board now!


----------



## cst1992 (Nov 4, 2021)

The King said:


> Yes, but you have to go for either a DDR4 board or a DDR5 board, and going to a new platform while sticking with DDR4 seems like a step backwards.


Apart from DDR4 support, the boards support most other features of 12th gen like PCIe 5.0. So for those willing to wait for matured DDR5 it seems like a good stepping stone.
But if you're willing to wait, best to wait until Alder Lake launches proper.


----------



## dir_d (Nov 4, 2021)

If I were buying today it would be ADL on a DDR4 board, but I mainly game, and this 5600X works just fine at 4K.


----------



## prtskg (Nov 4, 2021)

About the power consumption method, @W1zzard: why not use an average across the single-threaded tests and the multi-threaded tests? It would be a better, more rounded value than using any particular test. Thanks for all the tests though.


----------



## RandallFlagg (Nov 4, 2021)

The King said:


> Yes, but you have to go for either a DDR4 board or a DDR5 board. Moving to a new platform and sticking with DDR4 seems like a step backwards.
> Review for DDR4 boards will answer this question. Still cost of the LGA 1700 boards have to be factored in with the cost of the 12XX CPUS.
> So it adds to overall cost at the end of the day.



If you already own a Zen 3 capable motherboard and aren't interested in PCIe 5.0 and have no need of additional 4.0 lanes for storage, then the case for doing anything more than buying a Zen 3 for an upgrade isn't very good.

However for anyone on older hardware where they will need a new motherboard + CPU, I think it would be dumb to not do an Alder Lake upgrade given these results and current pricing.  This would pretty much apply to anyone on Intel who is planning to upgrade, anyone on Zen 1 or 1.5, and many on Zen 2 if their motherboard does not support Zen 3.  

That is a pretty narrow set who might upgrade to Zen 3 based on empirical results and not just feelings.  It really just leaves people who have Zen 2 and compatible motherboards and don't intend to do more than a CPU upgrade, maybe those who have a 5600X and want to go to a 5900 or 5950X, where Zen 3 makes sense.


----------



## Dyatlov A (Nov 4, 2021)

Awesome processor; this well-priced i5 beats all AMD processors in single-threaded use and in gaming. I want one!


----------



## mechtech (Nov 4, 2021)

W1zzard said:


> and hopefully not totally fucked up like that 5900X that another reviewer borrowed. It can't even run 1600 MHz Infinity Fabric now. When asked about replacement my AMD contacts said "we can't help", and my emails to another AMD contact got ignored


Wow, that's too bad. I guess I can understand AMD's point: if people are going to ruin chips, there's not much incentive to give them away for nothing. But it should have been the other reviewer, not TPU.


----------



## HD64G (Nov 4, 2021)

bug said:


> There is no "much higher board cost", just cheaper CPUs.
> Z690 boards cost roughly the same as X570 boards. AMD made us pay the X570 tax for about a year before releasing B550, Intel will follow up on Z690 in about 3 months.


For now, B550 offers max performance for Zen 3 and costs at least $100 or €100 less than Z690, though. In 3 months, Zen3D might be in front again and this discussion could be irrelevant.



cst1992 said:


> What'd they do, beat it with a hammer?


They must have heavily OCed the SoC to reach and stabilise a very high RAM frequency. That rapidly degrades the memory controller in the CPU.


----------



## cst1992 (Nov 4, 2021)

HD64G said:


> In 3 months, Zen3D might get in the front again and this discussion could be irrelevant.


Personally, I'm looking to get a 5600x but now I'm thinking I should wait until Vermeer-X3D reviews surface.


----------



## R0H1T (Nov 4, 2021)

HD64G said:


> Zen3D might get in the front again and this discussion could be irrelevant.


It will gain a lot in gaming, that's for sure but (other) applications would be even more interesting.


----------



## The King (Nov 4, 2021)

RandallFlagg said:


> If you already own a Zen 3 capable motherboard and aren't interested in PCIe 5.0 and have no need of additional 4.0 lanes for storage, then the case for doing anything more than buying a Zen 3 for an upgrade isn't very good.
> 
> However for anyone on older hardware where they will need a new motherboard + CPU, I think it would be dumb to not do an Alder Lake upgrade given these results and current pricing.  This would pretty much apply to anyone on Intel who is planning to upgrade, anyone on Zen 1 or 1.5, and many on Zen 2 if their motherboard does not support Zen 3.
> 
> That is a pretty narrow set who might upgrade to Zen 3 based on empirical results and not just feelings.  It really just leaves people who have Zen 2 and compatible motherboards and don't intend to do more than a CPU upgrade, maybe those who have a 5600X and want to go to a 5900 or 5950X, where Zen 3 makes sense.


If Zen 1 taught us anything, it's that first revisions are always buggy, and most issues are ironed out in the next revision or two with better performance and efficiency.
ADL has its fair share of issues, along with high power consumption under full load. Windows 11 is also still buggy with ADL.

An upgrade makes sense if your GPU is bottlenecked by your CPU, or if you've moved from a 1080p 60 Hz monitor to a 1440p 240 Hz or 360 Hz one, which is a very niche market according to Steam. The vast majority of gamers are still on 1080p 60 Hz, with others on 1440p 144 Hz, at least the ones that I know.

Then you get those who suffer from CUD.


----------



## RedelZaVedno (Nov 4, 2021)

Great to see 5600X and 5800X eat dust due to AMD's pricing policy. AMD will have to lower prices now. I'm waiting on 5600X/5800X to come down to sub $170/250 space before buying. IF AMD stays greedy, I'll have to opt for 12400F or 12600KF. I love good competition.


----------



## Turmania (Nov 4, 2021)

cst1992 said:


> Apart from DDR4 support, the boards support most other features of 12th gen like PCIe 5.0. So for those willing to wait for matured DDR5 it seems like a good stepping stone.
> But if you're willing to wait, best to wait until Alder Lake launches proper.


----------



## Bruno_O (Nov 4, 2021)

RedelZaVedno said:


> Great to see 5600X and 5800X eat dust due to AMD's pricing policy. AMD will have to lower prices now. I'm waiting on 5600X/5800X to come down to sub $170/250 space before buying. IF AMD stays greedy, I'll have to opt for 12400F or 12600KF. I love good competition.


Then pair a 12600KF with a Z690 motherboard and you end up at the same platform cost as a 5800X + B550 (using DDR4 on both).
~5% better 1080p gaming and PCIe 5.0 make it worth it, for now.
But as someone said, if we do get +15% on the 5800X3D in 3 months as AMD promised, you'd be losing ~10% gaming performance, again at the same cost.

Personally, I will wait for Zen 4 and whatever Lake comes after ADL. And I did expect more of ADL given the "impressive" leaks... 9% better than a 5600X while using 50 W more at 720p is not great.


----------



## Meanhx (Nov 4, 2021)

The King said:


> If Zen 1 taught us anything, it's that first revisions are always buggy, and most issues are ironed out in the next revision or two with better performance and efficiency.
> ADL has its fair share of issues, along with high power consumption under full load. Windows 11 is also still buggy with ADL.



I agree; if you can wait, I think you should. In a year or so, when 13th gen launches, it will be worth upgrading: by then the BIOS, OS, and other software will have matured enough to take advantage of the "big.LITTLE" design and work well with it almost always. We will have more cheap motherboards on the market, and DDR5 will have matured a bit.


----------



## looniam (Nov 4, 2021)

On a side note:
having had, albeit briefly, a B560 mobo, it's damned silly to buy a Z chipset anymore for most stuff, i.e. gaming, unless you need the PCIe/NVMe expansion (x4/x8 DMI lanes).
But hey, throw away $100+, I guess.


----------



## WhoDecidedThat (Nov 4, 2021)

In Igor's Lab review (<- linked here), they measure CPU power consumption when gaming, and watts consumed per FPS.

Just putting it out there as an additional data point to consider.


----------



## Agent_D (Nov 4, 2021)

blanarahul said:


> According to Igor Lab's review (<- linked here) where they measure CPU power consumption when gaming -
> 
> 
> 
> ...


Stuff like this is why I like to see multiple measurement points for power consumption. Having multiple measurements gives a much broader audience the information they need to decide if they want whatever the product is.


----------



## RedelZaVedno (Nov 4, 2021)

Bruno_O said:


> Then pair a 12600KF with a Z690 motherboard and you end up at the same platform cost as a 5800X + B550 (using DDR4 on both).
> ~5% better 1080p gaming and PCIe 5.0 make it worth it, for now.
> But as someone said, if we do get +15% on the 5800X3D in 3 months as AMD promised, you'd be losing ~10% gaming performance, again at the same cost.
> 
> Personally, I will wait for Zen 4 and whatever Lake comes after ADL. And I did expect more of ADL given the "impressive" leaks... 9% better than a 5600X while using 50 W more at 720p is not great.


I'd never buy an overpriced Z MB. I'll wait for B or even A MBs... These should be priced between $100 and $150. I'm upgrading to whatever gives me the most bang for the buck in February/March, be it AMD or Intel. Value is all that matters to me, and I'll finally have a lot of very compelling options to choose from in 2022. I hope Intel brings the same heat to the dGPU market too.


----------



## W1zzard (Nov 4, 2021)

prtskg said:


> About the power consumption method, @W1zzard: why not use an average across the single-threaded tests and the multi-threaded tests? It would be a better, more rounded value than using any particular test. Thanks for all the tests though.


What do you mean? Somehow measure the several-hour mix of all my benchmarks? What about the idle time between tests? Bench startup time? Also, many of my tests are neither "single-threaded" nor "fully multi-threaded". As it stands, you can interpolate your scenario because you know 1T = x W and nT = y W.
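The interpolation W1zzard describes can be sketched like this; a minimal illustration assuming power grows roughly linearly with active thread count between the two measured points (real chips deviate, e.g. due to turbo bins, so the numbers here are hypothetical):

```python
def estimate_power(threads, total_threads, power_1t, power_nt):
    """Linearly interpolate package power for a partially threaded load.

    Assumes power grows roughly linearly between the measured
    single-thread (1T) and fully loaded (nT) operating points.
    """
    if not 1 <= threads <= total_threads:
        raise ValueError("threads must be between 1 and total_threads")
    fraction = (threads - 1) / (total_threads - 1)
    return power_1t + (power_nt - power_1t) * fraction

# Made-up numbers: 1T = 50 W, 16T = 150 W; estimate an 8-thread load.
print(estimate_power(8, 16, 50.0, 150.0))  # ~96.7 W
```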


----------



## cst1992 (Nov 4, 2021)

@W1zzard What do you think about using a separate PSU when doing CPU tests to see how much only the CPU is consuming?

LTT did it in this video but I'm not sure if it's an effective method.


----------



## RandallFlagg (Nov 4, 2021)

blanarahul said:


> According to Igor Lab's review (<- linked here) where they measure CPU power consumption when gaming -
> 
> 
> 
> ...




This mirrors what I've seen at some other sites, including AnandTech. Peak-load benchmarks have really always just been troll bait.


----------



## Turmania (Nov 4, 2021)

Overall it's a great improvement from Intel, definitely on the right track, and taking the performance crown as well. I would not call it a Ryzen destroyer though. This is why I love healthy competition. It's just that with all this ruckus about manufacturing availability, everyone has raised prices on every component, which makes it really hard to adjust to these new world economics so soon.


----------



## Gameslove (Nov 4, 2021)

4K gaming Intel Core i5-12600K vs Ryzen 5 2600?


----------



## NuCore (Nov 4, 2021)

TPU, did you get a lot of money from Intel for advertising and favoritism? You tested Alder Lake on 2x 16 GB DDR5-6000 36-36-36-76 1T memory, which is as common on the market as virgins in a brothel.


----------



## RandallFlagg (Nov 4, 2021)

12600K peak power draw in 720p testing on multiple games.

No 5600X here, but the 12600K is by far the most efficient of the higher-performance gaming CPUs:


----------



## seth1911 (Nov 4, 2021)

They are too expensive, but the 12600K took the crown back to Intel against the 5950X/5900X in games.

If I were buying new, it would be a 12600K and not those poor 12 or 16 cores from AMD.


----------



## HenrySomeone (Nov 4, 2021)

Turmania said:


> Overall it's a great improvement from Intel, definitely on the right track, and taking the performance crown as well. I would not call it a Ryzen destroyer though. This is why I love healthy competition. It's just that with all this ruckus about manufacturing availability, everyone has raised prices on every component, which makes it really hard to adjust to these new world economics so soon.


Not completely, but it does destroy their mid-high range (5600X - 5900X), while in the low-to-mid end they've been successfully taken out by... themselves, by not offering anything worth buying under $300. The only AMD chip with some advantages left is therefore the 5950X, and that's mostly if you do lots and lots of CPU renders.


----------



## cst1992 (Nov 4, 2021)

NuCore said:


> tested Alder Lake on 2 x 16GB DDR5-6000 36-36-36-76 1T memories


So? Nowhere in the review does the reviewer say that the user has to have the exact same config as the review. They only say that to replicate the results you have to have the same setup.

The RAM has to be the fastest possible so that the only bottleneck is the CPU. Honestly, you should read more reviews.


----------



## R0H1T (Nov 4, 2021)

RandallFlagg said:


> No 5600X here but 12600K is by far the most efficient of the higher performance gaming CPUs :


You do realize you're quoting efficiency numbers for the CPU in what's generally a GPU-heavy task, i.e. gaming?

I assume you also had the GPU locked at a certain frequency and normalized the results, with other variables taken care of?


----------



## NuCore (Nov 4, 2021)

TPU, why do you censor statements by changing the sense of metaphors used in the comments?

Where is the announced AMD pogrom? Just under 5% at 1440p is a joke considering the extremely high temperatures on such a large cooling system, not to mention the enormous power consumption.

Intel's marketing did a good job of driving the hype around Alder Lake. The 12th gen of Intel Core is like a Pentium 4 revival (temperatures and power consumption).

Intel will collect a nice beating from Zen3+ :D


----------



## seth1911 (Nov 4, 2021)

HenrySomeone said:


> Not completely, but it does destroy their mid-high range (5600X - 5900X), while in the low-to-mid end they've been successfully taken out by... themselves, by not offering anything worth buying under $300. The only AMD chip with some advantages left is therefore the 5950X, and that's mostly if you do lots and lots of CPU renders.


Yeah, they took themselves out; there's no competition for the 10400 or 11400.

At the same prices, AMD only has a 1200 (12 nm, €130 like the 10400) or a 3600 (€169 like the 11400).


----------



## HenrySomeone (Nov 4, 2021)

R0H1T said:


> You do realize you're quoting efficiency numbers for the CPU in what's generally a GPU heavy task i.e. gaming
> 
> I assume you also have the GPU locked at a certain frequency & normalized the results with other variables taken care of?


720p is a GPU-heavy task? It's great that he found that table, because hopefully it will lay to rest at least some of the blatantly false statements about Intel also using a lot more power in gaming, just like the one ultra-fanboy comment below yours. But knowing how hard-headed the redsters are, probably not many will be convinced...


----------



## R0H1T (Nov 4, 2021)

HenrySomeone said:


> 720p is a GPU heavy task?   It's great that he found that table, because hopefully it will lay to rest at least some of the blatantly false statements about Intel also using up a lot more power in gaming, just like the one ultra-fanboy comment below yours. But knowing how hard-headed the redsters are, probably not many...


It is a "gaming" task, so yeah, I would say it's generally GPU-heavy. I'm not comparing it to 4K, if that's what you're thinking! BTW, I'm waiting for those GPU power numbers for 720p gaming; do you have those?


----------



## RandallFlagg (Nov 4, 2021)

FPS per watt in gaming :

Edit: So, I did not notice this before, but the DDR5 12900K is notably more efficient than the DDR4-based variants: about 15% more efficient than DDR4 with the same 241/241 PL1/PL2 power settings.
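For reference, the metric being argued over here is just average FPS divided by average CPU package power. A minimal sketch with invented numbers (not measured values) showing how the relative-efficiency percentage falls out:

```python
def fps_per_watt(avg_fps, avg_cpu_watts):
    """Gaming efficiency metric: frames delivered per watt of CPU power."""
    return avg_fps / avg_cpu_watts

# Hypothetical figures for illustration only (not measured values):
ddr5 = fps_per_watt(160.0, 80.0)   # 2.0 FPS/W
ddr4 = fps_per_watt(150.0, 86.0)   # ~1.74 FPS/W
print(f"DDR5 is {ddr5 / ddr4 - 1:.0%} more efficient")
```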


----------



## R0H1T (Nov 4, 2021)

Which clearly shows the huge variability in results: is the DDR4-based test consuming more watts because that *IMC* is less efficient than the DDR5 one, or is it something else? When the GPU itself is probably consuming more in a task than the CPU, it's not really a "CPU efficiency" test, but carry on.


----------



## HenrySomeone (Nov 4, 2021)

It IS a CPU efficiency test for gaming! Denying this just shows you are one of the above described as well.


----------



## ncrs (Nov 4, 2021)

RandallFlagg said:


> FPS per watt in gaming :
> 
> Edit :  So, I did not notice this before but the DDR5 12900K is notably more efficient than the DDR4 based variants.  About 15% more efficient than the DDR4 with the same 241/241 PL1/PL2 power settings.


I wish there was a 5600X and 5800X in both your sources. I wonder how much difference that second chiplet in Ryzens makes.


----------



## R0H1T (Nov 4, 2021)

You can't do an "efficiency" test with half the power being consumed omitted from your charts; WTH are you on about? Still waiting for those GPU power numbers.


HenrySomeone said:


> It IS a CPU efficiency test for gaming! Denying this just shows you are one of the above described as well.


I had someone on AT use the same BS numbers for "gaming IPC"; suffice to say that's another BS metric for what can basically be termed "horse manure" in polite terms!


----------



## Agent_D (Nov 4, 2021)

R0H1T said:


> You can't do an "efficiency" test with half the power being consumed omitted from your charts, WTH are you on about? Still waiting for those GPU power numbers


You're literally not understanding the concept here. The efficiency is only from the CPU perspective of the test, on purpose, to show the efficiency of the CPU power usage during said testing... It's not that hard to comprehend, come on.


----------



## HenrySomeone (Nov 4, 2021)

You'll keep on waiting then, I certainly won't use a single minute trying to find them for someone who flat out denies the category in the first place.


----------



## RandallFlagg (Nov 4, 2021)

ncrs said:


> I wish there was a 5600X and 5800X in both your sources. I wonder how much difference that second chiplet in Ryzens makes.



Yes, that is missing. I think those would be pretty efficient as well, since the 5900X has higher clocks, and high clocks = low efficiency. Nevertheless, the 5900X and 5950X are losing from an efficiency/power perspective in gaming, and based on the PCWorld graph, in productivity too.

Another thing to note is that the 12900K is locked to a max power of 241 W. It will beat the 5900X/5950X in games handily at that setting, but technically it can go to higher power limits and be a little bit faster too. The choice to use that extra power for a couple percent is of course up to the user.


----------



## R0H1T (Nov 4, 2021)

Agent_D said:


> to show the efficiency of the CPU power usage during said testing


And again, that's a huge variable in and of itself! You do realize that with unlimited turbo, the power consumption numbers can vary greatly from board to board, depending largely on the clocks? If it were really that useful, or consistent, more outlets like AT or TPU would test it.


----------



## WhoDecidedThat (Nov 4, 2021)

RandallFlagg said:


> 12600K peak power draw in 720p testing on multiple games.
> 
> No 5600X here, but the 12600K is by far the most efficient of the higher-performance gaming CPUs:
> 
> View attachment 223688


Source?


----------



## Why_Me (Nov 4, 2021)

It's nice to see Team Blue back on top of the mountain again but tbh I'm looking forward to the locked cpu's w/B660 boards and DDR4.


----------



## WhoDecidedThat (Nov 4, 2021)

cst1992 said:


> @W1zzard What do you think about using a separate PSU when doing CPU tests to see how much only the CPU is consuming?
> 
> LTT did it in this video but I'm not sure if it's an effective method.



Do it! This sounds awesome!


----------



## phanbuey (Nov 4, 2021)

12600K and MSI mobo on the way... can't wait to tune some Gear 1 RAM settings and run some benches.

I have a disease...


----------



## The red spirit (Nov 4, 2021)

Why_Me said:


> It's nice to see Team Blue back on top of the mountain again but tbh I'm looking forward to the locked cpu's w/B660 boards and DDR4.


Only in power consumption is it on top of the mountain.


----------



## Why_Me (Nov 4, 2021)

The red spirit said:


> Only in power consumption is it on top of the mountain.


Obviously you haven't looked at the benchmarks.


----------



## The red spirit (Nov 4, 2021)

Why_Me said:


> Obviously you haven't looked at the benchmarks.


I have: horrendous power consumption with a tiny edge over the competition. It's literally the same as overclocking an FX-9590 and saying it's faster than an i7-2600K while ignoring the horrendous power consumption and heat output.


----------



## Why_Me (Nov 4, 2021)

The red spirit said:


> I have: horrendous power consumption with a tiny edge over the competition. It's literally the same as overclocking an FX-9590 and saying it's faster than an i7-2600K while ignoring the horrendous power consumption and heat output.


Thankfully I live in a country with electricity. With that said, Alder Lake is at the top of the mountain atm, and anybody who says otherwise either didn't look at the benches or is straight-up lying.


----------



## B-Real (Nov 4, 2021)

"Intel Core i5-12600K 47% Faster Than Ryzen 5 5600X in Leaked CPU-Z Benchmark"

Real-life circumstances: 16% faster, with 4 extra cores.


----------



## The red spirit (Nov 4, 2021)

Why_Me said:


> Thankfully I live in a country with electricity.  With that said Alder Lake is at the top of the mountain atm and anybody who says otherwise either didn't look at the benches or is straight up lying.


Benches say: STAY AWAY, IT'S GONNA BURN DOWN YOUR HOUSE.


----------



## B-Real (Nov 4, 2021)

Why_Me said:


> Thankfully I live in a country with electricity.  With that said Alder Lake is at the top of the mountain atm and anybody who says otherwise either didn't look at the benches or is straight up lying.


Yes, it is undoubtedly faster than a 1-year-old CPU. *With 4 extra cores*... What an achievement, really!


----------



## Why_Me (Nov 4, 2021)

B-Real said:


> Yes, it is undoubtedly faster than a 1-year-old CPU. *With 4 extra cores*... What an achievement, really!


What line-up of AMD cpu's was Alder Lake supposed to be compared to?


----------



## R-T-B (Nov 4, 2021)

bug said:


> I will also have to check the support on Linux.


Thread Director presently has zero Linux support.


----------



## HenrySomeone (Nov 4, 2021)

The red spirit said:


> I have: horrendous power consumption with a tiny edge over the competition. It's literally the same as overclocking an FX-9590 and saying it's faster than an i7-2600K while ignoring the horrendous power consumption and heat output.


Yeah... no. You'd burn through your motherboard OCing it, and it still wouldn't match an even slightly OCed 2600K at stock voltage; going the other way, downclocking it to Sandy Bridge levels of power use got you about Q6600 levels of performance. Alder Lake, on the other hand, while indeed consuming a fair bit of juice under all-core loads, can outdo even the (more expensive) 5950X, and when tuned down a bit will still healthily beat the 5900X at the same power, or match it at about the 5800X's. So, no comparison at all. Oh, and just to drive the point from the first sentence home:
MSI AMD 970 Krait with a 9590 on fire - YouTube


----------



## B-Real (Nov 4, 2021)

Why_Me said:


> What line-up of AMD cpu's was Alder Lake supposed to be compared to?



Intel was preparing for this new architecture like crazy. What they achieved is that out of the 3 models, only 1 is able to beat its Zen 3 rival. The other 2 are 4-5% slower than the Zen 3 models with the same amount of cores. The i5 has 67% more cores and can outperform the 5600X by 16%. And this is with DDR5 memory compared to DDR4, so it's possible there will be a bigger loss for the i7 and i9 and a smaller win for the i5. Not to mention that all 3 Intel models have worse efficiency than the Zen 3 models. And Zen 4 is yet to come. I have no idea why you are that optimistic. And BTW, your "With that said Alder Lake is at the top of the mountain atm" is basically NOT TRUE. Regarding CPU power, the 5950X beats the i9 while the 5900X beats the i7. So in what terms is ADL at the top of the mountain? Because it's 7% faster in FHD with an RTX 3080, which no one will use in real-life circumstances?


----------



## Turmania (Nov 4, 2021)

Isn't this wonderful? Now it's your turn, AMD; the consumer wins. This is what I want to see: them outdoing each other every release...


----------



## HenrySomeone (Nov 4, 2021)

B-Real said:


> "Intel Core i5-12600K 47% Faster Than Ryzen 5 5600X in Leaked CPU-Z Benchmark"
> 
> Real life circumstances: 16% faster with 4 extra cores.


You dirty little liar! Real-life circumstances: 58% faster, and that's in AMD's favorite benchmark of them all, Cinebench multi-thread!


----------



## B-Real (Nov 4, 2021)

HenrySomeone said:


> You dirty little liar! Real life circumstances - 58% faster and that's in AMD's favorite benchmark of them all - Cinebench multi thread!



What I see is an overall 16% lead over the 5600X with all the results averaged. What do you see?







HenrySomeone said:


> and when downtuned a bit, will still healthily beat 5900x at the same power or match it at about 5800x's


This is really funny, tbh. I hope you were writing the same regarding Vega 56.


----------



## HenrySomeone (Nov 4, 2021)

The leaked CPU-Z benchmark (which you of course quoted) was multi-thread, and in that regard it actually outdoes it by another 10%. And if you can't understand (or rather pretend you don't, because it doesn't suit your red agenda) why an aggregate of all tests can never be that high (not even the 5950X is 50% ahead of the 5600X), we can have no further discussion.
And oh, Vega was crap that couldn't even compete against the 1080, never mind the 1080 Ti, both of which came out before (way before, in the case of the former). You could tune the 56 to about 1070 power and performance, but that one was much cheaper, so what was the point? Either way, the market had its say, and the fact that they never achieved any penetration says enough.


----------



## The red spirit (Nov 4, 2021)

HenrySomeone said:


> Yeah....no. You'd burn through your motherboard OC-ing it and it still wouldn't match a perhaps slightly OCed 2600k at stock voltage and going the other way, downclocking it to Sandy Bridge level of power use, you got about q6600 level of performance. Alder Lake on the other hand, while indeed consuming a fair bit of juice while under all-core loads, can outdo even the (more expensive) 5950x and when downtuned a bit, will still healthily beat 5900x at the same power or match it at about 5800x's, so no comparison at all. Oh, and just to drive the point from the first sentence home:
> MSI AMD 970 Krait with a 9590 on fire - YouTube


It wouldn't beat any Ryzen in power-normalized benches. It manages to get a 10% advantage over Ryzen at over 300 watts, while Ryzen's turbo (for X parts) is a little over 100 watts. Sure, power usage scales non-linearly with clock speed, but still, the i9 (and the i5) would need a lot of detuning, and as a result, performance would suffer.
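The non-linearity being argued about can be illustrated with the usual first-order CMOS model, P ≈ C·f·V²: near the top of the V/f curve, voltage must rise roughly with frequency, so power scales approximately with the cube of frequency. A rough sketch; the exponent and the wattage/clock figures are assumptions for illustration, not measurements:

```python
def scaled_power(base_watts, base_ghz, target_ghz, exponent=3.0):
    """Estimate package power after a clock change.

    Uses P ~ f^exponent; exponent ~3 approximates P = C*f*V^2 with
    voltage scaling roughly linearly with frequency near the top
    of the voltage/frequency curve.
    """
    return base_watts * (target_ghz / base_ghz) ** exponent

# Hypothetical: a 240 W all-core load downclocked from 4.9 to 3.9 GHz.
print(round(scaled_power(240.0, 4.9, 3.9)))  # ~121 W
```

This is why a modest downclock cuts power far more than it cuts performance, and why the argument hinges on where on the curve each chip sits.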


----------



## HenrySomeone (Nov 4, 2021)

It (handily!) beats 5950x! It will easily outdo the 5900x at the same power!


----------



## regs (Nov 4, 2021)

It would also be interesting to see Gracemont core performance in comparison to Skylake: single-thread, multi-thread, and single-thread at the same clock. Is it possible to switch off the P-cores and use only the E-cores, so they can be tested?


----------



## W1zzard (Nov 4, 2021)

regs said:


> Is it possible to switch off the P-cores and use only the E-cores, so they can be tested?


The official answer is "at least 1 P-core must be active". Working around that is on my list for next week.


----------



## The red spirit (Nov 4, 2021)

HenrySomeone said:


> It (handily!) beats 5950x! It will easily outdo the 5900x at the same power!


That's just a 2000-point difference. The IPC of Alder Lake sucks. You would need a 1 GHz reduction at minimum to start looking at power consumption similar to the 5950X, and that would mean performance way worse than the 5950X.


----------



## HenrySomeone (Nov 4, 2021)

Read again what I said, carefully this time.


----------



## The red spirit (Nov 4, 2021)

HenrySomeone said:


> Read again what I said, carefully this time.


I read everything properly last time. I simply doubt what you say, that's all.


----------



## Why_Me (Nov 5, 2021)

B-Real said:


> Intel was preparing for this new architecture like crazy. *What they achieved is that out of the 3 models, only 1 is able to beat its Zen 3 rival.* The other 2 are 4-5% slower than the Zen 3 models with the same amount of cores. The i5 has 67% more cores and can outperform the 5600X by 16%. And this is with DDR5 memory compared to DDR4, so it's possible there will be a bigger loss for the i7 and i9 and a smaller win for the i5. Not to mention that all 3 Intel models have worse efficiency than the Zen 3 models. And Zen 4 is yet to come. I have no idea why you are that optimistic. And BTW, your "With that said Alder Lake is at the top of the mountain atm" is basically NOT TRUE. Regarding CPU power, the 5950X beats the i9 while the 5900X beats the i7. So in what terms is ADL at the top of the mountain? Because it's 7% faster in FHD with an RTX 3080, which no one will use in real-life circumstances?


----------



## wolf (Nov 5, 2021)

as a 5900X owner I am extremely pleased Intel took back the gaming crown, your move AMD! this is great news for all.

Also a very interesting and passionate thread, some great arguments to be made for either really. A 12600K DDR4 combo should certainly at least be priced and considered for any gaming build from today onward.


----------



## Mistral (Nov 5, 2021)

Very happy for Intel. This should do them nicely until the next Ryzen generation.


----------



## arni-gx (Nov 5, 2021)

i5 12600k = a new hope....... the sith luke skywalker is still newbie on the dark side of the force.....


----------



## The King (Nov 5, 2021)

So you own a 5600X, and the 12600K is overall 6.6% faster at 1080p, with a 4.9% difference at 1440p (PBO should reduce that).

CPU workloads are definitely where ADL shines. Gaming-wise, it's really not that impressive compared to a 1-year-old CPU from AMD.
It's definitely not for those who have boards capable of running AMD 5000-series CPUs, considering the gaming performance and the cost of upgrading your mobo as well.


----------



## Hyderz (Nov 5, 2021)

impressive performance for its price as well


----------



## DanglingPointer (Nov 5, 2021)

For those who don't care about Window$ and have moved to the Light Side of the OS Force...

That's 146 Tests!
All results here... 
Review here...














----------



## MarsM4N (Nov 5, 2021)

The red spirit said:


> I have, horrendous power consumption with tiny edge over competition. It's literally the same as overclocking FX 9590 and saying that it's faster than i7 2600K, while ignoring horrendous power consumption and heat output.



According to *igorsLAB's review*, in gaming the i7 consumes *1.2 W less* and the i5 *3.9 W more* than a *5600X* (while delivering more FPS than a *5950X*).

He hasn't tested workload power consumption yet, but judging by W1zzard's review, I guess the picture will look different there. To me it looks like the Intels are very efficient when not pushed, as in games (especially for 10 nm). When pushed, the power will spike up, but with the extra horsepower they'll get the job done faster, so in the end it's still efficient. How efficient, we will see in the coming reviews.
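The "more horsepower gets the job done faster" argument is really about energy rather than power: energy per task = average power × time to finish. A quick sketch, with invented numbers, of how a faster, hungrier chip can still use comparable energy per job:

```python
def task_energy_wh(avg_watts, seconds):
    """Energy consumed for one fixed task, in watt-hours."""
    return avg_watts * seconds / 3600.0

# Invented example: chip A draws 180 W for 100 s; chip B draws 120 W for 150 s.
a = task_energy_wh(180.0, 100.0)  # 5.0 Wh
b = task_energy_wh(120.0, 150.0)  # 5.0 Wh
print(a == b)  # True: same energy per task despite very different power draw
```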


----------



## The King (Nov 5, 2021)

MarsM4N said:


> According to *igorsLAB's review*, in gaming the i7 consumes *1.2 W less* and the i5 *3.9 W more* than a *5600X* (while delivering more FPS than a *5950X*).
> 
> He hasn't tested workload power consumption yet, but judging by W1zzard's review, I guess the picture will look different there. To me it looks like the Intels are very efficient when not pushed, as in games (especially for 10 nm). When pushed, the power will spike up, but with the extra horsepower they'll get the job done faster, so in the end it's still efficient. How efficient, we will see in the coming reviews.


Different reviewers are reporting different power consumption; it seems ADL silicon quality is not the same for all reviewers.
I would be concerned about certain chips performing much better for some reviewers than others. Usually reviewers are given higher-binned chips that retail consumers almost never see.



> What many of you are missing is that the power draw figures right now are all over the place. At Guru3D this is the score they got with an all core *5.3GHz overclock and a 469W CPU Power draw figure*





> This is my score, that's higher but at *240W and only 5.1GHz*. My CPU won't do 5.3GHz all core.








						Intel Alder Lake early adopters thread | Tech Other
					

Oh, I'll also bet that 99% of people complaining about heat are never going to use the CPU in such a way - during gaming it's on par with AMD. That said, Blender uses AVX512 which everyone knows puts out a ton of heat/draws a lot of power. AMD doesn't support AVX512, hence the massive power...




					carbonite.co.za


----------



## Turmania (Nov 5, 2021)

Someone in another place described it perfectly,
"This is Intel's Zen1 moment."


----------



## Why_Me (Nov 5, 2021)

Turmania said:


> Someone in another place described it perfectly,
> "This is Intel's Zen1 moment."


Not a C2D moment for sure but still good.


----------



## MarsM4N (Nov 5, 2021)

The King said:


> Different reviewers are reporting different Power consumption seems ADL silicon quality is not the same for all reviewers.
> I would be concerned of certain chip performing much better for some reviewers than others. Usually reviewers are given higher binned chips that retail consumers almost never see.
> 
> 
> ...



igorsLAB didn't receive a "Golden Sample" from Intel, he didn't overclock & he did bench multiple games, not just one game like in your chart. 

Here's the footnote of his review (last page):

_"The test samples are retail CPUs and were not provided by Intel. A voluntary signing of the NDA for the CPUs was not accepted by Intel due to lack of relevance, so I am purely voluntarily adhering to the CPU embargo periods (keyword collegiality). Motherboard and memory come from the respective manufacturers and were only provided on the condition that the embargo periods for these products were adhered to."_

Ohh, just noticed he uploaded his *Review Part 2* where he tested the workload performance & workload power consumption. Have to check it out now.


----------



## GURU7OF9 (Nov 5, 2021)

But can you actually buy one right now, and is it at $300? Unfortunately, I think not on both counts!

But it def looks best value by far! The power limit seems to do nothing for gaming.

If only you could find a way to crank it way harder for gaming! Have to be overclocking, I guess?


----------



## Why_Me (Nov 5, 2021)

GURU7OF9 said:


> But can you actually buy one right now,  and is it at $300?    unfortunately I think not on both counts !
> 
> but it def looks best value by far! power limit seems to do nothing for gaming .
> 
> if only you could find a way to crank it way harder for gaming !      have to be overclocking i guess?


$320 USD at Newegg









						Intel Core i5-12600K - Core i5 12th Gen Alder Lake 10-Core (6P+4E) 3.7 GHz LGA 1700 125W Intel UHD Graphics 770 Desktop Processor - BX8071512600K - Newegg.com
					

Buy Intel Core i5-12600K - Core i5 12th Gen Alder Lake 10-Core (6P+4E) 3.7 GHz LGA 1700 125W Intel UHD Graphics 770 Desktop Processor - BX8071512600K with fast shipping and top-rated customer service. Once you know, you Newegg!




					www.newegg.com


----------



## Hossein Almet (Nov 5, 2021)

dicktracy said:


> Basically no reason to get a 5600x or 5800x when both are overpriced right out of the gate. Intel dominates every single price point and you don’t need to participate in their mandatory AGESA beta-testing program .


Right now in Australia the 5600X is A$100 cheaper, and the X570s are also significantly cheaper than the Z690s.


----------



## The King (Nov 5, 2021)

Pricing in India 5600X Rs21999 (296USD)


			https://www.amazon.in/AMD-Ryzen-5600X-Processor-100-100000065BOX/dp/B08166SLDF
		


Est. retail price from online retailers - 12600K Rs31335 (421USD)


----------



## lexluthermiester (Nov 5, 2021)

The King said:


> So you own a 5600X and the 12600K is overall 6.6% faster @1080p and difference @1440p 4.9%.  (PBO should reduced that )


You need to review and think that over for a moment;








						Intel Core i5-12600K Review - Winning Price/Performance
					

The Core i5-12600K is the price/performance king in the Intel Alder Lake lineup. With its competitive pricing of $300, it's a clear winner against AMD's Ryzen 5 5600X and faster than even the 5800X in many applications and games. This is the gaming CPU you want.




					www.techpowerup.com
				



Not to mention that the 5600X is $10 more.

Let's stop with the disinformation, it's not winning any opinion awards.


----------



## The King (Nov 5, 2021)

lexluthermiester said:


> You need to review and think that over for a moment;
> 
> 
> 
> ...


Not in India and in South Africa. There are international users on this forum.

@W1zzard should change the title so that you don't get confused, and realize the USA is not the only market in the world:
Winning Price/Performance in the USA


----------



## Pepamami (Nov 5, 2021)

I dunno, DDR5 costs ~2x more than current DDR4, and new LGA1700 motherboards are also costly.
I think I'm gonna wait for AMD's cache refresh and price updates for now.
The 12600K looks cheap only on paper right now.


----------



## lexluthermiester (Nov 5, 2021)

The King said:


> Not in India and in South Africa. There are international users on this forum.


Ok, so the price is different where you live. It's still going to be close and that doesn't change your blatantly incorrect statement about performance.


----------



## The red spirit (Nov 5, 2021)

MarsM4N said:


> According to *igorsLAB's Review* the i7 consumes in gaming *1,2w less* & the i5 *3,9w more* than a *5600x* (while delivering more fps than a *5950x*).
> 
> He hasn't tested workload power consumption yet, but judging by W1zzard's review, I guess the picture will there look different. To me it looks like the Intel's are, when not pushed, like in games, very efficent (esp. for 10nm). When pushed they power will spike up, but with the more horsepower the'll get the job faster done. So in the end it's still efficent. How efficent we will see in the comming reviews.


First of all, what Intel calls 10 nm is not really 10 nm; there's a lot of secrecy about the actual nanometre measurements and where they could be found, but the same goes for AMD with 7 nm. It's impossible to know which process is actually smaller and better.

That aside, I looked at LTT's gaming benches, and power consumption while gaming was a lot more sane. That's great; it used to be worse by a bit previously. So Intel improved compared to Intel, meanwhile AMD has already had reasonably efficient stuff for years. And to add insult to injury, Intel's own i5 10400(F) and i5 11400(F) chips were exceptionally efficient, far more so than Ryzens or other Intels. So what actually happened is that Intel merely matched the competitor, and that's like a 20-30% reduction from the previous gen. Despite that, the chip runs obscenely hot and performance is only 10-20% higher than Ryzen, while also matching Ryzen's prices.

Sorry mate, but I don't see how Alder Lake brings anything truly meaningful. The biggest thing is the two types of cores, which may or may not be utilized somewhat better with Windows 11 later, but so far Alder Lake is not great. The i5 is okay, but the i9 is not. With the i9, you can't just look at gaming and say that thermals and power usage are okay; with something like an i9, you are expected to push it to the max sometimes, maybe even every day, and since it has serious flaws (power usage, heat output), I don't think it could be recommended to anyone in good conscience. At that point, you may as well just get a Comet Lake i9, overclock it and unlock the power limit; performance and power usage will be the same.


----------



## The King (Nov 5, 2021)

lexluthermiester said:


> Ok, so the price is different where you live. It's still going to be close and that doesn't change your blatantly incorrect statement about performance.


The figures I quoted for 1080p gaming performance are from this very article. Please explain which part of my statement was incorrect.

So you own a 5600X and the 12600K is overall 6.6% faster @1080p and difference @1440p 4.9%.


----------



## The red spirit (Nov 5, 2021)

lexluthermiester said:


> You need to review and think that over for a moment;
> 
> 
> 
> ...


But the 5600X was an awful chip in terms of price when chips like the i5 11400F, i5 10400F or 10600K existed. Even AMD's own 3600 is much better value than the 5600X. The only reason the 5600X didn't become a joke is the positive media coverage, which mostly ignored its pricing and how it sits in the market. You can't just create whatever, claim that it is the value champ, and then see it at the bottom of the value charts, but it seems that AMD thinks they can.



Turmania said:


> Isn't this wonderfull, now it's your turn AMD, consumers wins. . This is what I want to see, they outdoing each release...


AMD already won this battle


----------



## Vayra86 (Nov 5, 2021)

sepheronx said:


> It's good. But the price of the components like DDR5 and motherboards are beyond what I'm willing to spend for a few FPS more.



This summarizes it perfectly. ADL looks decent, and it is finally some real progress: an IPC boost of more than a few percent after all these years, bravo Intel. With DDR5.

In other words, you can totally forget about this gen and wait for its successor. For the market, it's great; there will be price/perf parity and thus competition. CPUs are also not in major shortages like everything else. But yeah, if you have anything half decent from 2018 or later... window shopping is all you should do.

The best thing about ADL is that it's not good enough to really kick AMD off the throne. They just change seats for a little while, both squarely in the range of diminishing returns performance-wise.


----------



## Chrispy_ (Nov 5, 2021)

The red spirit said:


> I have, horrendous power consumption with tiny edge over competition. It's literally the same as overclocking FX 9590 and saying that it's faster than i7 2600K, while ignoring horrendous power consumption and heat output.


That's for 8 Performance cores and 8 Atoms against 16 Performance cores though. The 12900K is dumb because it's marketed as a 24-thread part trying to beat a 32-thread part and the only way it can do this is by pushing the clocks and power envelope well beyond what is reasonable. 350W for a CPU is obscene outside of LN2 overclocking IMO.

What's a more useful comparison is the 12600K with E-cores disabled vs the 5600X. Both use similar amounts of power and operate in their sweet-spot clock/voltage ranges. Intel simply have the IPC advantage with Alder Lake compared to Zen3 right now. Whether that's architecture or DDR5 is unknown at this stage and what I (and a lot of other people) am waiting for is a like-for-like battle using the same speed DDR4.


----------



## Dyatlov A (Nov 5, 2021)

Why_Me said:


>



Is the 12600K with high-end DDR4 maybe faster than with DDR5?


----------



## Luminescent (Nov 5, 2021)

The press coverage of these new Intel CPUs sounds like Tom's Hardware's "Just buy it" coverage of Nvidia RTX.
I fear that if AMD gets access to TSMC 5nm, we could see Intel consuming triple the power for the same performance.


----------



## The red spirit (Nov 5, 2021)

Chrispy_ said:


> That's for 8 Performance cores and 8 Atoms against 16 Performance cores though. The 12900K is dumb because it's marketed as a 24-thread part trying to beat a 32-thread part and the only way it can do this is by pushing the clocks and power envelope well beyond what is reasonable. 350W for a CPU is obscene outside of LN2 overclocking IMO.
> 
> What's a more useful comparison is the 12600K with E-cores disabled vs the 5600X. Both use similar amounts of power and operate in their sweet-spot clock/voltage ranges. Intel simply have the IPC advantage with Alder Lake compared to Zen3 right now. Whether that's architecture or DDR5 is unknown at this stage and what I (and a lot of other people) am waiting for is a like-for-like battle using the same speed DDR4.


I think this whole situation is really dumb and Alder Lake is just not great. They have a small edge over AMD, but at horrendous power use and heat output. You know, it's not like they are the only ones releasing stuff. If AMD released their own version of a big.LITTLE design, I think they would have an edge, and importantly, that power consumption and heat output will soon look prehistoric.


----------



## The King (Nov 5, 2021)

Chrispy_ said:


> That's for 8 Performance cores and 8 Atoms against 16 Performance cores though. The 12900K is dumb because it's marketed as a 24-thread part trying to beat a 32-thread part and the only way it can do this is by pushing the clocks and power envelope well beyond what is reasonable. 350W for a CPU is obscene outside of LN2 overclocking IMO.
> 
> What's a more useful comparison is the 12600K with E-cores disabled vs the 5600X. Both use similar amounts of power and operate in their sweet-spot clock/voltage ranges. Intel simply have the IPC advantage with Alder Lake compared to Zen3 right now. Whether that's architecture or DDR5 is unknown at this stage and what I (and a lot of other people) am waiting for is a like-for-like battle using the same speed DDR4.


This type of comparison is incorrect. AMD and Intel have different architectures, for one.

When the 6C/12T 5600X Zen 3 CPU was released, it beat AMD's own previous-gen Zen 2 3800X (8C/16T) in both CPU and gaming tests.
Comparing two different gens and saying one is better because its core count is X is not a valid comparison when one is new and one is old.

Let's not forget the 5800X with its 8C/16T laying a smackdown on Intel's 10C/20T 10900K.

Saying Alder Lake has 24 threads and should not be compared to a previous-gen 32-thread CPU is not a valid point.


----------



## The red spirit (Nov 5, 2021)

Luminescent said:


> The press coverage of these new Intel cpu's sounds like Tomshardware "Just buy it" coverage of Nvidia RTX.


To some extent, all tech media lies. GoodOldGamer on YT disclosed that in his video about what it means to be a techtuber. That's just how they stay afloat. Some sell out more, some sell out less, but nearly all of them sell out, and those that don't have very limited content coverage or don't last long.


----------



## lightning70 (Nov 5, 2021)

The 12600K is the most efficient processor of the series in terms of power consumption. Frankly, I prefer the 12600K over the 5600X. This model will be the processor that I intend to buy after this introduction. More efficient than the 12900K. It's superior to the 5800X in terms of power, and the power consumption difference at full load isn't excessive: 175 W vs. 190 W in the Cinebench test.


----------



## Chrispy_ (Nov 5, 2021)

The red spirit said:


> Alder Lake is just not great. They have a small edge over AMD, but at horrendous power use and heat output.


That's plainly incorrect! Energy used for a task directly translates to power draw and heat produced:

Stock 12600K = 10 kJ
Stock 5600X = 9.6 kJ

Stock 12700K = 9.7 kJ
Stock 5800X = 10 kJ

That seems pretty competitive to me. I've already agreed that the 12900K is pushed too far, but that's the exception to the rule and the other chips in the Alder Lake family and likely the locked SKUs coming later like the 12400F etc will also have competitive power efficiency. You can also make Zen3 use 250W if you force enough voltage through it with an aggressive overclock or sloppy PBO+ setting - that's not representative of the Zen3 lineup as a whole though.
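A minimal sketch of the task-energy arithmetic behind those kJ figures (the average-power and runtime numbers below are invented to land near the charted values, not measurements): energy is average power times run time, so a chip that draws more watts can still use less energy if it finishes sooner.

```python
# Task energy in kilojoules: average power (W) x run time (s) / 1000.
# Hypothetical numbers; a faster chip can draw more yet consume less.

def task_energy_kj(avg_power_w: float, runtime_s: float) -> float:
    """Energy consumed for one task, in kilojoules."""
    return avg_power_w * runtime_s / 1000.0

chip_a = task_energy_kj(avg_power_w=125.0, runtime_s=80.0)   # 10.0 kJ
chip_b = task_energy_kj(avg_power_w=76.0,  runtime_s=126.0)  # ~9.6 kJ

print(f"chip A: {chip_a:.1f} kJ, chip B: {chip_b:.1f} kJ")
```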



The King said:


> Saying Alder lake has 24threads and it should not be compared to a previous Gen 32 thread CPU is not a valid point.


It's a valid point because it's _undisputed fact_ that a 12900K has 24 threads and a 5950X has 32 threads.
The fact that in many tests a 12900K beats a 5950X despite a thread disadvantage only serves to reinforce my original point that Intel has the IPC advantage with Alder Lake. 

Intel have officially priced the 12900K higher than the 12-month old original MSRP of the 5900X and close to the current street price of the 5950X so the comparison is not one I'm making, but one that Intel and the real-world performance results are making.

I'm not even trying to say that the two architectures or generations are the same, because they're obviously not. What kind of idiot would even try and claim that?


----------



## The King (Nov 5, 2021)

Chrispy_ said:


> It's a valid point because it's _undisputed fact _that a 12900K has 24 threads and a 5950X has 32 threads.
> The fact that in many tests a 12900K beats a 5950X despite a thread disadvantage only serves to reinforce my original point that Intel has the IPC advantage with Alder Lake.
> 
> Intel have officially priced the 12900K higher than the 12-month old original MSRP of the 5900X and close to the current street price of the 5950X so the comparison is not one I'm making, but one that Intel and the real-world performance results are making.
> ...


If you want to get technical, the 12900K has 8+8 cores (16 total) at 125 W TDP, and the 5950X has 16 physical cores at 105 W TDP. How each architecture is configured is down to design: HT or no HT.
The higher-TDP 16-core CPU wins. That's amazing.


----------



## Chrispy_ (Nov 5, 2021)

The King said:


> If you want to get technical the 12900K has 8+8cores total 16cores TDP 125W and the 5950X 16 physical cores. TDP 105W. How each architecture is configured is down to design. HT or no HT.
> The higher TDP 16 physical core CPU wins. That's amazing.


I've said in a pre-launch thread that I suspect two E-cores do a better job than one P-core using SMT. I've had plenty of experience with Tremont laptops and Atom servers and those cores are genuinely capable.

The software running has no concept of a physical core, only a logical core (thread), so talking about threads instead of cores _does_ matter, but not as much as the overall performance/Watt and IPC for the CPU as a whole. A new CPU could be a hypothetical 64x E-core monster or a mad single-core CPU with otherworldly IPC, and the configuration of that CPU simply wouldn't matter; all that actually matters is how well it performs against competing products on the market and how energy-efficient it is when doing so. The i5 and i7 are competitive with the Ryzens in performance/Watt and have a performance advantage. The only place where Alder Lake comes into question is when comparing the 12900K to the 5950X instead of the 5900X, since the 5950X is a much bigger step up from the 5800X that the 12700K is compared to.

The 12900K absolutely nails the 5900X to the wall when it comes to performance and it does so at sensible power consumption that is competitive when looking at task energy. The fact that Intel have chosen to overclock and overvolt the snot out of it to chase down the 5950X is purely marketing; Just because they can doesn't mean that they should and anyone buying a 12900K who isn't concerned about those last couple of percent should just reduce the TDP to ~225W and enjoy the massive efficiency gains of not boosting at 350W.
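The logical-vs-physical distinction above is easy to see from software. A Linux-only sketch (it parses /proc/cpuinfo, so the counts depend entirely on the host machine; a 12900K would show 24 vs 16, a 5950X 32 vs 16):

```python
# Software normally sees only logical CPUs (threads); physical cores can
# be recovered on Linux by counting unique (physical id, core id) pairs
# in /proc/cpuinfo. Output depends on the machine this runs on.
import os

def physical_core_count(path="/proc/cpuinfo"):
    cores = set()
    phys = None
    with open(path) as f:
        for line in f:
            if line.startswith("physical id"):
                phys = line.split(":")[1].strip()
            elif line.startswith("core id"):
                cores.add((phys, line.split(":")[1].strip()))
    # Fall back to the logical count if the fields are absent (VMs, ARM)
    return len(cores) or os.cpu_count()

logical = os.cpu_count()
physical = physical_core_count()
print(f"logical CPUs: {logical}, physical cores: {physical}")
```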


----------



## R0H1T (Nov 5, 2021)

Chrispy_ said:


> That seems pretty competitive to me. I've already agreed that the 12900K is pushed too far, but that's the exception to the rule and the *other chips in the Alder Lake family*


I think you'll find a lot depends on the boards too; the unlimited turbo is good at winning benches, but in hotter tropical climates, like for the 2 billion+ population in Asia, or anywhere without AC, it'll run into a thermal wall without adequate cooling! I still think it will get better with Win11 updates, but as *ratirt* opined, that may push the performance down ~ maybe a lot?

Yes I know this applies to AMD as well.


----------



## chrcoluk (Nov 5, 2021)

Scheduling issues not unexpected, my experience on Android phones is similar.

Your review stated foreground apps get P cores, background apps E cores; that explains your mysql observation, as the mysql server runs in the background, non-interactive. Does the thread director allow overrides to be configured?

wPrime however, what's going on there?


----------



## Agent_D (Nov 5, 2021)

chrcoluk said:


> Scheduling issues not unexpected, my experience on Android phones is similar.
> 
> Your review stated foreground apps get P cores, background E cores, that explains your mysql observation as mysql server runs in the background non interactive.  Does the thread director allow overrides to be configured?
> 
> wPrime however, what's going on there?


Yea, if you look at the 12900k bench; with e-cores disabled, the SQL stuff jumps hugely in performance. Same with wPrime.


----------



## W1zzard (Nov 5, 2021)

chrcoluk said:


> Scheduling issues not unexpected, my experience on Android phones is similar.
> 
> Your review stated foreground apps get P cores, background E cores, that explains your mysql observation as mysql server runs in the background non interactive.  Does the thread director allow overrides to be configured?
> 
> wPrime however, what's going on there?


Thread director is not configurable, it also has no API. However, you can always manually set affinity for processes in task manager (or through your own code), which overrides whatever Thread Director and Windows Scheduler decide.

If it was just "foreground" vs "background", then we wouldn't need Thread Director. Windows can make such a decision on its own. Just the fact that Thread Director exists is clear confirmation that Intel's vision goes beyond that.
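Setting affinity "through your own code", as mentioned above, can be sketched like this (Linux stdlib shown; on Windows the equivalent Win32 call is SetProcessAffinityMask, which is what Task Manager uses; the choice of logical CPU 0 is arbitrary, and on Alder Lake the P-core threads are reportedly enumerated before the E-cores, so low indices land on P-cores):

```python
# Minimal sketch (Linux): override whatever Thread Director and the OS
# scheduler decided for the current process. os.sched_setaffinity takes
# a set of logical CPU indices; the process's threads will only ever run
# on CPUs in that set.
import os

before = os.sched_getaffinity(0)   # 0 = the calling process
os.sched_setaffinity(0, {0})       # pin to logical CPU 0 only
pinned = os.sched_getaffinity(0)
os.sched_setaffinity(0, before)    # restore the original mask

print(f"before: {sorted(before)}, while pinned: {sorted(pinned)}")
```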


----------



## ncrs (Nov 5, 2021)

W1zzard said:


> Thread director is not configurable, it also has no API. However, you can always manually set affinity for processes in task manager (or through your own code), which overrides whatever Thread Director and Windows Scheduler decide.
> 
> If was just "foreground" vs "background", then we wouldn't need Thread Director. Windows can make such a decision on its own. Just the fact that Thread Director exists is clear confirmation that Intel's vision goes beyond that.


Well... it seems that we have a modern Turbo button, according to AnandTech:



> [...] There’s an option in the BIOS that, when enabled, means the Scroll Lock can be used to disable/park the E-cores, meaning nothing will be scheduled on them when the Scroll Lock is active. [...]



So at least there seems to be a workaround, but I don't know how common this BIOS option is.


----------



## Why_Me (Nov 5, 2021)

Bring on the B660 boards.









						Best buy incoming: sub-US$200 Intel Core i5-12400F beats the Ryzen 5 5600X
					

For now, the i5-12600K is the value king, but, if you can wait for the i5-12400F releasing in two months, you could get roughly the same performance for sub-US$200.




					www.notebookcheck.net


----------



## fevgatos (Nov 5, 2021)

R0H1T said:


> You do realize you're quoting efficiency numbers for the CPU in what's generally a GPU heavy task i.e. gaming
> 
> I assume you also have the GPU locked at a certain frequency & normalized the results with other variables taken care of?


It's been known, at least to me, for some time now that Intel are extremely efficient in gaming (the 11th gen notwithstanding, those are trash).  But for some reason, the internet is full of comments like "only 5% more performance in 1440p while consuming 100% watts", completely neglecting the fact that the cpus dont actually consume that much in gaming


----------



## Mussels (Nov 6, 2021)

fevgatos said:


> It's been known, at least to me, for some time now that Intel are extremely efficient in gaming (the 11th gen notwithstanding, those are trash).  But for some reason, the internet is full of comments like "only 5% more performance in 1440p while consuming 100% watts", completely neglecting the fact that the cpus dont actually consume that much in gaming


It's because some games *do* use all that power.

Rimworld won't max out my 5800X, but DX12 titles sure make a good go at it


----------



## Scrizz (Nov 6, 2021)

This is an interesting chip for sure.


----------



## HD64G (Nov 6, 2021)

So, for budget gamers, no difference at all between 12600K and 5600X, or even 12700K and 5800X in performance. Just on power draw and platform's cost. As for the power users or pros, those CPUs are irrelevant.


----------



## The King (Nov 6, 2021)

HD64G said:


> So, for budget gamers, no difference at all between 12600K and 5600X, or even 12700K and 5800X in performance. Just on power draw and platform's cost. As for the power users or pros, those CPUs are irrelevant.


Just shows what an awesome gaming CPU AMD made with Zen 3 over a year ago: within ~95% of ADL's gaming performance at 1080p, and the gap will be smaller @1440p.
I was never impressed with ADL's gaming performance, to be honest.

@HD64G Please post source article


----------



## Kissamies (Nov 6, 2021)

This looks to be the best one of these when thinking about price/performance and the power consumption.

Though we'll see what AMD has up its sleeve with Zen 4, as it's not a surprise that this beats a year-old AMD lineup.


----------



## lexluthermiester (Nov 6, 2021)

HD64G said:


> So, for budget gamers, no difference at all between 12600K and 5600X, or even 12700K and 5800X in performance. Just on power draw and platform's cost. As for the power users or pros, those CPUs are irrelevant.
> View attachment 223964


Please give this a watch.









And no.


----------



## The King (Nov 6, 2021)

lexluthermiester said:


> Please give this a watch.
> 
> 
> 
> ...


----------



## fevgatos (Nov 6, 2021)

HD64G said:


> So, for budget gamers, no difference at all between 12600K and 5600X, or even 12700K and 5800X in performance. Just on power draw and platform's cost. As for the power users or pros, those CPUs are irrelevant.
> View attachment 223964


There is no difference in power draw in game. Stop making shit up


----------



## The King (Nov 6, 2021)

fevgatos said:


> There is no difference in power draw in game. Stop making shit up


Single threaded testing shows a different story on TPU. Maybe ask @W1zzard to stop making things up.


----------



## HenrySomeone (Nov 6, 2021)

fevgatos said:


> There is no difference in power draw in game. Stop making shit up


Admins should really start stepping in because the number of posts perpetuating this bullshit (using gaming performance delta with stress test power consumption) is getting out of hand. I can sort of see now how a not-so-well-informed, casual reader can then quickly get the impression that Intels are indeed power hogs in all circumstances.


----------



## R0H1T (Nov 6, 2021)

fevgatos said:


> It's been known, at least to me, for some time now that Intel are extremely efficient in gaming (the 11th gen notwithstanding, those are trash).  But for some reason, the internet is full of comments like "only 5% more performance in 1440p while consuming 100% watts", *completely neglecting the fact that the cpus dont actually consume that much in gaming*


Right & that goes for AMD as well ~ because CPUs actually don't consume that much in gaming anyway, especially wrt the total system power consumption. So even if Intel CPUs are 50% more efficient in gaming, the numbers aren't that much of an issue unless you're OCing or overvolting your CPUs into insane territory.

Let me put it another way ~ when talking about gaming, any "gaming" really, you should pay attention to the *GPU numbers*!


----------



## fevgatos (Nov 6, 2021)

The King said:


> Single threaded testing shows a different story on TPU. Maybe ask @W1zzard to stop making things up.
> View attachment 223975


That's not gaming though, and the efficiency difference here is minuscule.

There are a bunch of gaming efficiency reviews around, and the 12600K smokes everything, so why do you have to resort to lying? I don't get it, what's the point? Who are you trying to convince? People that don't know? Why? Are you trying to mislead them into the wrong purchasing decision? What do you gain out of that? I'm sorry, but it blows my mind how people can just lie like that in a public forum about freely available data, and it always happens in favor of one particular company... what the heck is going on?


----------



## The King (Nov 6, 2021)

fevgatos said:


> That's not gaming though, and the efficiency difference here is minuscule.
> 
> There are a bunch of gaming efficiency reviews around, and the 12600K smokes everything, so why do you have to resort to lying? I don't get it, what's the point? Who are you trying to convince? People that don't know? Why? Are you trying to mislead them into the wrong purchasing decision? What do you gain out of that? I'm sorry, but it blows my mind how people can just lie like that in a public forum about freely available data, and it always happens in favor of one particular company... what the heck is going on?



New LGA1700 motherboards required
Some workloads get scheduled onto wrong cores
*Energy efficiency worse than AMD Zen 3*
No CPU cooler included









						Intel Core i5-12600K Review - Winning Price/Performance
					

The Core i5-12600K is the price/performance king in the Intel Alder Lake lineup. With its competitive pricing of $300, it's a clear winner against AMD's Ryzen 5 5600X and faster than even the 5800X in many applications and games. This is the gaming CPU you want.




					www.techpowerup.com


----------



## fevgatos (Nov 6, 2021)

R0H1T said:


> Right & that goes for AMD as well ~ because CPU's actually don't consume that much in gaming anyway, especially wrt to the total system power consumption. So even if Intel CPU's are 50% more efficient in gaming, the numbers aren't that much of an issue unless you're OCing or overvolting your CPUs into insane territory.
> 
> Let me put it in another way ~ when talking about gaming, any "gaming" really, you should pay attention to the* GPU numbers*!


Well, according to igorslab testing, alder lake is 

1) WAY (and I mean WAY) faster in a HUGE number of productivity applications, like autocad / inventor / premiere / photoshop / transcoding / exporting / solidworks and a number of scientific applications. 

2) It is WAY more efficient in all of the above. Again, huge efficiency difference.

3) It is faster in gaming

4) It's also WAY more efficient in gaming

5) They only lose, in terms of both performance and efficiency, in rendering mainly. 

So if you are working for Disney and the like, Zen 3 is still the way to go. For everything else, the choice is pretty obvious, and I have no idea why people are making shit up to convince unaware buyers into a wrong decision.



The King said:


> New LGA1700 motherboards required
> Some workloads get scheduled onto wrong cores
> *Energy efficiency worse than AMD Zen 3*
> No CPU cooler included
> ...


Did you look at the actual results? Here you go

That 0.2 kJ makes all the difference, doesn't it?


----------



## HD64G (Nov 6, 2021)

fevgatos said:


> There is no difference in power draw in game. Stop making shit up


Nice find! As for the platform cost, I can feel the silence from your side...


----------



## The King (Nov 6, 2021)

fevgatos said:


> Did you look at the actual results? Here you go
> 
> That 0.2 kJ makes all the difference, doesn't it?


That is a heavily MT benchmark, Cinebench. Its results have no relevance to gaming power usage.

Single-threaded power consumption results will have a bigger impact on games, hence I posted the Super Pi single-thread chart.
Hope it all makes sense to you now.

The statement that Alder Lake is less efficient is true, even if it's by 0.2 kJ. Yet the user that posted it was abused by other users on this forum as spreading lies.
I have reported the matter to the Mods. Take care.


----------



## fevgatos (Nov 6, 2021)

HD64G said:


> Nice find! As for the platform cost I can feel a silence from your side...


No point addressing 15 different things at the same time. Are we done with the efficiency argument? Do you agree that you were wrong and Alder Lake is as efficient in gaming and more efficient in productivity than Zen 3? If yes, okay, I'll move on to the platform.

Yes, the total cost of ownership is higher for a 12600K than a 5600X, but they got released literally 2 days ago, with no B660 mobos out yet. The heck do you expect? Do you remember the platform cost of Zen 3 when it released? The 5600X alone was frequently above 400!! And what is the actual cost of ownership, even with today's prices? A very capable DDR4 Z690 costs 200€. So how much are you saving on the mobo? 50€? 70€ if you're stretching? And what does the 12600K offer you for that 50-70€? Way higher longevity, both in gaming and in productivity. I mean, it IS a way better CPU; paying 70€ for that is worth it, no?



The King said:


> That is a heavily multi-threaded benchmark, Cinebench. Its results have no relevance to gaming power usage.
> 
> Single-threaded power consumption results will have a bigger impact on games, hence I posted the Super Pi chart, a single-thread test.
> Hope it all makes sense to you now. Take care.
> ...


If his statement about power usage was true, then his statement that there is no difference in performance was wrong, since there was a difference in the very picture he himself posted: the 12600K was actually faster in gaming. I'm sorry, you CAN'T have your cake and eat it too.

Does this have any relevance to gaming power usage?


----------



## Flanker (Nov 6, 2021)

lol I feel kinda dumb to realize it this late, but looks like for playing at 60Hz VSync, any i5/Ryzen 5 will do, the rest goes to GPU money when the prices are no longer crazy


----------



## fevgatos (Nov 6, 2021)

Flanker said:


> lol I feel kinda dumb to realize it this late, but looks like for playing at 60Hz VSync, any i5/Ryzen 5 will do, the rest goes to GPU money when the prices are no longer crazy


If you activate RT and want a minimum of 60 fps under every circumstance, an 11600K barely gets there. It hovers around 60-65 fps in the very busy scenes of the city, running with 3333C12 RAM. With looser RAM it can't actually hold a minimum of 60.


----------



## Pumper (Nov 6, 2021)

RandallFlagg said:


> Edit :  So, I did not notice this before but the DDR5 12900K is notably more efficient than the DDR4 based variants.  About 15% more efficient than the DDR4 with the same 241/241 PL1/PL2 power settings.


No it's not; it's completely the opposite. The chart is FPS/W, not W/FPS.
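To make the direction of the two metrics concrete (purely illustrative numbers, not the review's data; the function name is just for this sketch):

```python
def efficiency_metrics(fps, watts):
    """Return (FPS per watt, watts per FPS) for one benchmark run."""
    return fps / watts, watts / fps

# Made-up example: a DDR5 rig that is both faster and lower-power.
ddr5_fps_per_w, ddr5_w_per_fps = efficiency_metrics(120.0, 100.0)
ddr4_fps_per_w, ddr4_w_per_fps = efficiency_metrics(115.0, 110.0)

# The two metrics are reciprocals: the chip with the HIGHER FPS/W
# necessarily has the LOWER W/FPS, so misreading the chart's axis
# inverts the conclusion about which system is more efficient.
```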



Hossein Almet said:


> Right now in Australia the 5600X is A$100 cheaper, and the X570s are also significantly cheaper than the Z690s.


Let's not pretend that Zen 3 launched at MSRP, ok? The 5600X's lowest retail price was over 600€ at one point in the EU, went down to sub-350€ only 6-7 weeks after launch, and reached the 310€ MSRP only half a year after release.


----------



## kmetek (Nov 6, 2021)

here 400€


----------



## RandallFlagg (Nov 6, 2021)

Flanker said:


> lol I feel kinda dumb to realize it this late, but looks like for playing at 60Hz VSync, any i5/Ryzen 5 will do, the rest goes to GPU money when the prices are no longer crazy




Anything above that red line is too slow to keep up with my monitor's refresh rate at 1440p, and this is just the average, in a not particularly new game either.

You simply picked a game that is mostly GPU bound, with the 3080 that TPU uses.

However, you're correct that this doesn't yet mean much to those who have lesser GPUs. If one keeps their CPU for more than a year or two though, it's probably going to mean a lot when next-gen GPUs come out.






What's more amazing to me are charts like this one. This is a 12600K keeping pace with a 5950X on a code recompile.

A 12900K smokes the 5950X by more than 15%, at least in Visual Studio (which I use).


----------



## HD64G (Nov 6, 2021)

fevgatos said:


> No point addressing 15 different things at the same time. Are we done with the efficiency argument? Do you agree that you were wrong and Alder Lake is as efficient in gaming and more efficient in productivity than Zen 3? If yes, okay, I'll move on to the platform.
> 
> Yes, the total cost of ownership is higher for a 12600K than a 5600X, but they got released literally 2 days ago, with no B660 mobos out yet. The heck do you expect? Do you remember the platform cost of Zen 3 when it released? The 5600X alone was frequently above 400!! And what is the actual cost of ownership, even with today's prices? A very capable DDR4 Z690 costs 200€. So how much are you saving on the mobo? 50€? 70€ if you're stretching? And what does the 12600K offer you for that 50-70€? Way higher longevity, both in gaming and in productivity. I mean, it IS a way better CPU; paying 70€ for that is worth it, no?


I didn't write about efficiency in productivity at all. Those CPUs aren't bought as productivity parts, and no matter what, the 5600X is ultra-efficient in all workloads. So, for gamers, the difference in 1080p performance with an over-$1000 GPU is maybe 10%? For budget gamers with a $500 GPU it will most probably drop to less than 5%. I wouldn't buy a platform that costs more, plus a CPU that needs a separate cooler ($30 at least), which puts the build-cost difference between those two at over $100 for less than 5% gaming performance. Try to argue with that now.


----------



## fevgatos (Nov 6, 2021)

HD64G said:


> I didn't write about efficiency in productivity at all. Those CPUs aren't bought as productivity parts, and no matter what, the 5600X is ultra-efficient in all workloads. So, for gamers, the difference in 1080p performance with an over-$1000 GPU is maybe 10%? For budget gamers with a $500 GPU it will most probably drop to less than 5%. I wouldn't buy a platform that costs more, plus a CPU that needs a separate cooler ($30 at least), which puts the build-cost difference between those two at over $100 for less than 5% gaming performance. Try to argue with that now.


Your argument would also apply to the 10400F. Why the heck would you buy a 5600X when the 10400F costs half as much and has similar performance with a $500 GPU (what GPU is that today, a 3060?)? The answer is simple: future-proofing. While the 10400F would perform similarly with a 3060, 1 or 2 years down the road you might upgrade that 3060, and your CPU is now the bottleneck, meaning it needs an upgrade. It's the same thing now with the 5600X. It will be cheaper today, but you'll pay more down the line because you'll need an upgrade sooner.


----------



## RandallFlagg (Nov 6, 2021)

This gets stupid.

If your #1 use is gaming, let us assume for example you game 8 hours a day, 365 days a year, and you pick something that uses an extra 20 W of power.

That's 0.16 kWh of power per day, or about 58 kWh per year.

The average cost of electricity in the USA is currently 12.55¢/kWh.

That's about $7.33 per year for the average US household, if you game 8 hours constantly, 365 days a year. I game way too much, I think, but I believe it's more like 2 hours a day, not 8.

Just stupid waste of time to discuss it.
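The arithmetic above, spelled out (using the quoted 20 W, 8 h/day, and 12.55¢/kWh figures; the function name is just for illustration):

```python
def annual_cost_usd(extra_watts, hours_per_day, cents_per_kwh):
    """Yearly electricity cost of an extra power draw, in US dollars."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * cents_per_kwh / 100

# 20 W extra for 8 h/day: ~58.4 kWh/year, about $7.33 at 12.55 c/kWh.
cost = annual_cost_usd(20, 8, 12.55)
```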


----------



## Turmania (Nov 6, 2021)

I think there should be another review looking at power consumption during gaming, as well as a boost-frequency chart like there was with the Ryzen CPU reviews.


----------



## Vayra86 (Nov 6, 2021)

Dyatlov A said:


> Is 12600K with a high end DDR4 maybe faster than with DDR5?



I think it's a non-issue in real-life use. By the time fast, affordable DDR5 is mainstream, you're a generation further in time.

ADL is a nice first shot at big.LITTLE, proving the technology can do more than what Intel used to do. It opens new ways to direct power where it's needed most and adds flexibility. But it's also an early-adopter gen, both for the new DDR and for the arch. It won't post its best numbers in this gen.



RandallFlagg said:


> This gets stupid.
> 
> If your #1 use is gaming, let us assume for example you game 8 hours a day 365 days a year.  And you pick something that uses an extra 20W of power.
> 
> ...



The efficiency AND the gaming performance are a strange beast. Personally, I don't think anyone is even half serious about the _cost_ of electricity wrt a CPU's TDP/load wattage. What I rather think, and I say this because it is also my personal experience over all these years, is that *temperature is a performance limiter, and it rapidly adds noise + expense for extra cooling to your build.*

The fact is, ANY application, at ANY given time, can present a type of load to a CPU that puts it in maximum gear. With the strong improvements to multi-threading and scheduling, this is only more likely to happen as time progresses. We see it, too. It's historical fact as well. I know I ran a 3570K a few years ago and it would readily use more power than it did when I bought it, even under the same OC. Why? Simple: applications utilize CPU cycles better, and they need more of them, as GPUs get faster.

So yes, I completely, totally understand people are ready and willing to mix up the worst-case energy metrics with their personal take on how the CPU could possibly run their games or applications. It's what I would do. You just WANT to be able to run the CPU in the red and not smell burnt electronics.

And power is temperature. We know these CPUs can push 241 W, and we know what kind of heat they produce in that case. The argument also applies universally: AMD's Ryzen gets hot too, but it's not quite as toasty, nor is it so horribly inefficient at the top end.

When AMD produced hotter GPUs, I stayed away too. Now that Nvidia produces Samsung 8nm Ampere GPUs, I'm staying away as well, and if any GPU became cost-effective in some way, I'd prefer 7nm TSMC any day of the week; yes, I'd even sacrifice RTX for it without blinking twice. The fact remains that on hard efficiency, Intel still loses node-wise to AMD/TSMC 7nm, a node already in refinement too. And these things do count. The silicon is the key differentiator, especially when competing architectures share the majority of perks.



fevgatos said:


> Your argument would also apply to the 10400F. Why the heck would you buy a 5600X when the 10400F costs half as much and has similar performance with a $500 GPU (what GPU is that today, a 3060?)? The answer is simple: future-proofing. While the 10400F would perform similarly with a 3060, 1 or 2 years down the road you might upgrade that 3060, and your CPU is now the bottleneck, meaning it needs an upgrade. It's the same thing now with the 5600X. It will be cheaper today, but you'll pay more down the line because you'll need an upgrade sooner.



Unlikely. GPUs may get faster, but games' demands on the GPU also increase, while CPU demands tend to remain stagnant until the consoles push up the mainstream again.

I have yet to see a single core limitation for gaming on my 8700K. And that's with an efficiency-oriented OC; I run 4.6 GHz... Sure, I can fire up CS:GO and maybe I'll have a few dozen more frames above 600, but who cares? ALL CPUs, even from yesteryear, are more than capable for gaming. They have the core count and the IPC, and the current console crop won't be changing soon either.


----------



## HD64G (Nov 6, 2021)

fevgatos said:


> Your argument would also apply to the 10400F. Why the heck would you buy a 5600X when the 10400F costs half as much and has similar performance with a $500 GPU (what GPU is that today, a 3060?)? The answer is simple: future-proofing. While the 10400F would perform similarly with a 3060, 1 or 2 years down the road you might upgrade that 3060, and your CPU is now the bottleneck, meaning it needs an upgrade. It's the same thing now with the 5600X. It will be cheaper today, but you'll pay more down the line because you'll need an upgrade sooner.


I see you've picked another CPU, one not yet released, to make an argument about future buyers. So, for now, I am correct I suppose?

First, wait for the 10400F and its pricing; meanwhile AMD can lower their prices too and make the AM4 platform even less expensive. Moreover, anyone on a budget will never buy a top GPU or CPU, so your argument is invalid.

The ones in need of a top-tier CPU or GPU will buy the i7/i9 or R7/R9. And there we get into a totally different conversation about pros and cons.


----------



## fevgatos (Nov 6, 2021)

HD64G said:


> I see you've picked another CPU, one not yet released, to make an argument about future buyers. So, for now, I am correct I suppose?
> 
> First, wait for the 10400F and its pricing; meanwhile AMD can lower their prices too and make the AM4 platform even less expensive. Moreover, anyone on a budget will never buy a top GPU or CPU, so your argument is invalid.
> 
> The ones in need of a top-tier CPU or GPU will buy the i7/i9 or R7/R9. And there we get into a totally different conversation about pros and cons.


What? The 10400F has been around for 2 years at ~140€. What are you talking about?






Intel® Core™ i5-10400F, processor

The Intel® Core™ i5-10400F processor (codename "Comet Lake-S") is a 6-core CPU for socket 1200, based on the Comet Lake generation.

www.alternate.de
				






Vayra86 said:


> Unlikely. GPUs may get faster, but demands in games on that GPU also increase, while the CPU demands tend to remain stagnant until the consoles push up the mainstream again.
> 
> I have yet to see a single core limitation for gaming on my 8700K. And that's with an efficiency oriented OC, I run 4.6 Ghz... Sure I can fire up CS GO and maybe I'll have a few dozen more frames above 600... but who cares? ALL CPUs even from yesteryear are more than capable for gaming, they have the core count and the IPC, and the current console crop won't be changing soon either.


Are you saying that a CPU will run any GPU regardless? No, at some point it will start bottlenecking, no matter when that time is. So a CPU that is, let's say, 15% faster can support a 15% faster GPU than the one that bottlenecks your 8700K.

I did have the 8700K, and yes, it's still a beast, but there are games out right now where it bottlenecks a high-end card, unless you heavily tune it and OC the memory like crazy.


----------



## TheoneandonlyMrK (Nov 6, 2021)

fevgatos said:


> Your argument would also apply to the 10400F. Why the heck would you buy a 5600X when the 10400F costs half as much and has similar performance with a $500 GPU (what GPU is that today, a 3060?)? The answer is simple: future-proofing. While the 10400F would perform similarly with a 3060, 1 or 2 years down the road you might upgrade that 3060, and your CPU is now the bottleneck, meaning it needs an upgrade. It's the same thing now with the 5600X. It will be cheaper today, but you'll pay more down the line because you'll need an upgrade sooner.


So we should only buy i9s? Are you a sales rep?

Different people have different needs and priorities. I know people happily gaming on FX-8350s and Dells; to assume everyone just surfs and casual-games is not right, as is thinking you know what everyone else should buy.

Have you looked at the CPU charts above 1080p? The CPU is just not that important, and at 4K, well.

That upgrade itch is enthusiasts-only; everyone else just buys when it's broke or nicked.

I'm finding the comedy "liar" comments a bit offensive. People can't see that, despite the facts, ADL just isn't a clear and easy winner, not for everyone.


----------



## sepheronx (Nov 6, 2021)

TheoneandonlyMrK said:


> So we should only buy i9s? Are you a sales rep?
> 
> Different people have different needs and priorities. I know people happily gaming on FX-8350s and Dells; to assume everyone just surfs and casual-games is not right, as is thinking you know what everyone else should buy.
> 
> ...



I play games, I mostly read on the net, or I attend to my mining rigs. I am more than happy to play video games on an RX 580 and/or GTX 970; I have played on them for so long. I even connect one to my 4K TV to play retro games on the big screen. It looks good and plays good, and is it all top-of-the-line equipment? Nope. Older V2 or V3 Xeons that I can pick up for mid-double-digit prices. ECC RAM, which is cheaper to get. Does the job just fine.

This is purely an enthusiast thing. People were editing 4K videos before these processors came out. Actually, that reminds me: aren't video rendering programs now taking more advantage of the GPU than the CPU?


----------



## TheoneandonlyMrK (Nov 6, 2021)

sepheronx said:


> I play games, I mostly read on the net, or I attend to my mining rigs. I am more than happy to play video games on an RX 580 and/or GTX 970; I have played on them for so long. I even connect one to my 4K TV to play retro games on the big screen. It looks good and plays good, and is it all top-of-the-line equipment? Nope. Older V2 or V3 Xeons that I can pick up for mid-double-digit prices. ECC RAM, which is cheaper to get. Does the job just fine.
> 
> This is purely an enthusiast thing. People were editing 4K videos before these processors came out. Actually, that reminds me: aren't video rendering programs now taking more advantage of the GPU than the CPU?


That's where Intel and AMD might come unstuck soon, because if it's working, most won't be buying. My laptop is still my go-to, and it's a 1080p 2060/8750H, so not epic, but it's not getting swapped for a while yet.


----------



## Why_Me (Nov 6, 2021)

TheoneandonlyMrK said:


> So we should only buy i9s? Are you a sales rep?
> 
> Different people have different needs and priorities. I know people happily gaming on FX-8350s and Dells; to assume everyone just surfs and casual-games is not right, as is thinking you know what everyone else should buy.
> 
> ...


This cpu or an i7-12700F paired with a B660 board and DDR4 should do fine for gaming.









Best buy incoming: sub-US$200 Intel Core i5-12400F beats the Ryzen 5 5600X

For now, the i5-12600K is the value king, but, if you can wait for the i5-12400F releasing in two months, you could get roughly the same performance for sub-US$200.

www.notebookcheck.net


----------



## fevgatos (Nov 6, 2021)

TheoneandonlyMrK said:


> So we should only buy i9s? Are you a sales rep?
> 
> Different people have different needs and priorities. I know people happily gaming on FX-8350s and Dells; to assume everyone just surfs and casual-games is not right, as is thinking you know what everyone else should buy.
> 
> ...


I never said or suggested you should buy an i9.

Yeah, your friend might happily game on an FX-8350, but at SOME point he won't happily game on it anymore. That point comes sooner for the FX-8350 than for, let's say, a 4790K.


----------



## TheoneandonlyMrK (Nov 6, 2021)

fevgatos said:


> I never said or suggested you should buy an i9.
> 
> Yeah, your friend might happily game on an FX-8350, but at SOME point he won't happily game on it anymore. That point comes sooner for the FX-8350 than for, let's say, a 4790K.


The 4790k is two generations newer, so it should.

@Why_Me I agree, I just don't then agree that it's the only CPU worth buying.

Obviously if the price is right, I wouldn't pay scalpers prices for example.


----------



## RandallFlagg (Nov 6, 2021)

I think people are forgetting that these are K-series chips. Those and AMD's X chips are the enthusiast parts: unlocked, customizable.

You can turn a K into a non-K if you so desire.

You can clock it till you can't cool it. Same on the AMD side: PBO2 is power-unlocked, as far as you can clock it and cool it.

Saw a Geekbench run for a 6.2 GHz 5950X this morning. Betcha it pulls way upwards of 500 W. I hear it's not unusual for them to pull 250-300 W when overclocking.

Again, not sure why people are so worried about what these chips pull when stock and power-limited.


----------



## Mussels (Nov 6, 2021)

fevgatos said:


> There is no difference in power draw in game. Stop making shit up


Power draw absolutely matters.

Heat too high?
VRMs too hot?
PL2 active for too long?
Didn't buy that $500 Z-series motherboard? (Every single prebuilt/OEM owner goes here.)

Then you don't reach that review performance.

Don't look at it as "well, it's not ALWAYS there, so it's fine."
Look at it as "what if it wants that, and can't get it?"




A 5600X and 12600K have my tick of approval, because no matter what, they should be able to reach their max performance with standard cooling, on a standard motherboard, in a standard system.

Even the 5950X can run off an air cooler (the Noctua NH-U14S is a popular example, but the U12S works too, if you don't max out PBO) on the majority of B450/B550 motherboards (barring a few really bad examples like MSI, who have boards that are pure garbage for both brands).

Edit: We have an entire section of the forum dedicated to the poor sods with OEM machines and laptops who are fighting against all the arbitrary limits and locks set by Intel. 25 W CPU, 120 W power brick, low temps, all's good? Nope, 15 W power limit for you. And then after 6 months it's at 100°C and thermal throttling because the cooling solution can't handle the heat long-term, or in summer.


----------



## fevgatos (Nov 6, 2021)

TheoneandonlyMrK said:


> The 4790k is two generations newer, so it should.


You could buy an FX-8350 brand new in 2014. So if your friend chose the FX-8350 because it was cheaper (it was around 150€ at the time), then he spent less money, but he lost longevity. The same applies today to the 5600X vs the 12600K.


----------



## TheoneandonlyMrK (Nov 6, 2021)

RandallFlagg said:


> Think people are forgetting that these are K series chips.  Those and AMD X are the enthusiast chips, unlocked, customizable.
> 
> You can turn a K into a non-K if you so desire.
> 
> ...


I'm not sure about that. If all you have is 11 on the power dial to get this leap in performance with all that new technology, hopefully Raptor Lake is more on point.

@fevgatos Yeah, exactly not: the 5600X is only one year older than the 12600K, not two.

Surprise, the competitor decided to show up at last. Now, about those GPUs.


----------



## fevgatos (Nov 7, 2021)

Mussels said:


> Power draw absolutely matters.
> 
> Heat too high?
> VRMs too hot?
> ...


I never said power draw doesn't matter. I'm saying that during gaming, or 99% of any other task, Alder Lake chips are extremely efficient, way more efficient than Zen 3 in most tasks. The only thing they are not efficient at is basically rendering, and that's because of the insanely high stock clock speeds. It's obvious that nobody should or would run a 20-hour-long Blender render at 5+ GHz; you either power-limit it or downclock it. I mean, Igor's Lab has some interesting numbers: performance-normalized (basically matching a 5900X), the 12900K completely evaporates it in terms of efficiency. The 5900X needs 40% more power to finish the same rendering workload.

So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock. If you work for Disney and do rendering 24/7, then power-limiting them to 150 W would do wonders, if you care about efficiency. You'll lose 5% performance but decrease the power draw by a truckload.
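For what it's worth, "performance-normalized" efficiency just means energy to finish a fixed job, power times time (the numbers below are made up for illustration, not Igor's results):

```python
def task_energy_kj(avg_watts, seconds):
    """Energy used to finish one fixed workload, in kilojoules."""
    return avg_watts * seconds / 1000

# A chip can draw MORE power yet use LESS total energy
# if it finishes the same workload sooner.
fast_chip = task_energy_kj(200, 300)  # 200 W for 5 minutes
slow_chip = task_energy_kj(150, 450)  # 150 W for 7.5 minutes
```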



TheoneandonlyMrK said:


> @fevgatos Yeah, exactly not: the 5600X is only one year older than the 12600K, not two.
> 
> Surprise, the competitor decided to show up at last. Now, about those GPUs.


It doesn't matter how new or old it is. My point is a faster CPU will last you longer, so comparing price to performance only in today's games with today's graphics cards in 1440p resolution is just the wrong way of doing it.  Back in 2017 my R5 1600 had the same performance as the 8700k with a 1080ti @ 1440p. Fast forward to today, you can easily use a modern graphics like a 3080 on an 8700k. You can't do that on an R5 1600, even at 4k you will get bottlenecked in some games. So spending 150€ to get the 8700k instead of the R5 1600 would be a better choice.


----------



## Mussels (Nov 7, 2021)

fevgatos said:


> I never said power draw doesn't matter. I'm saying that during gaming, or 99% of any other task, Alder Lake chips are extremely efficient, way more efficient than Zen 3 in most tasks. The only thing they are not efficient at is basically rendering, and that's because of the insanely high stock clock speeds. It's obvious that nobody should or would run a 20-hour-long Blender render at 5+ GHz; you either power-limit it or downclock it. I mean, Igor's Lab has some interesting numbers: performance-normalized (basically matching a 5900X), the 12900K completely evaporates it in terms of efficiency. The 5900X needs 40% more power to finish the same rendering workload.
> 
> So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock. If you work for Disney and do rendering 24/7, then power-limiting them to 150 W would do wonders, if you care about efficiency. You'll lose 5% performance but decrease the power draw by a truckload.
> 
> ...


It does matter, because it peaks to those values, and if it can't do so, the performance is lower.

Zen 2 began AMD's trend of polling 1000 times a second, every 1 ms.
How much performance loss would you expect from one of these 300 W monsters if any of the performance criteria isn't met?


----------



## TheoneandonlyMrK (Nov 7, 2021)

fevgatos said:


> I never said power draw doesn't matter. I'm saying that during gaming, or 99% of any other task, Alder Lake chips are extremely efficient, way more efficient than Zen 3 in most tasks. The only thing they are not efficient at is basically rendering, and that's because of the insanely high stock clock speeds. It's obvious that nobody should or would run a 20-hour-long Blender render at 5+ GHz; you either power-limit it or downclock it. I mean, Igor's Lab has some interesting numbers: performance-normalized (basically matching a 5900X), the 12900K completely evaporates it in terms of efficiency. The 5900X needs 40% more power to finish the same rendering workload.
> 
> So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock. If you work for Disney and do rendering 24/7, then power-limiting them to 150 W would do wonders, if you care about efficiency. You'll lose 5% performance but decrease the power draw by a truckload.
> 
> ...


There's no wrong way, except every way other than yours?!

If you buy what you need and can afford in CPU terms, it has to last ten years, because I said so!


----------



## Mussels (Nov 7, 2021)

"So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock"


... only vs Intel 11th gen, famously known as a waste of sand. Higher is *bad* here. The 12600K in particular? Middle of the pack at best.
The 5600X is energy-efficient; ADL is not.


----------



## RandallFlagg (Nov 7, 2021)

Mussels said:


> "So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock"
> 
> 
> ... only vs Intel 11th gen, famously known as a waste of sand. Higher is *bad* here. The 12600K in particular? Middle of the pack at best.
> ...



There isn't anything in the Alder Lake lineup yet that competes with the 5600X. In fact, the 5800X isn't really competitive vs the 12600K except in multi-core; it gets demolished in single-core.

So the proper comparison here would be the 12600K vs the 5800X and 5900X.

And the 12600K matches the 5800X on one chart and lands in between the 5800X and 5900X on the other.

Those charts really don't tell you what you want them to.

Let's redo that comparison when we have a 12400.


----------



## fevgatos (Nov 7, 2021)

Mussels said:


> "So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock"
> 
> 
> ... only vs Intel 11th gen, famously known as a waste of sand. Higher is *bad* here. The 12600K in particular? Middle of the pack at best.
> ...


Because that's so many different tasks. Oh wait, it's just one: Cinebench. The amount of misinformation going around in this forum is sad.

Here you go, a bunch of different tasks, including gaming.


----------



## dicobalt (Nov 7, 2021)

I disagree. The 12400F is priced at $200 (6 P-cores, 0 E-cores), and as far as price/performance goes, it's the real winner of this generation. Modern CPU prices have really slid way too high, and according to the Steam hardware survey, over 90% of people are still gaming at 1440p and below, perfect for a 6-core CPU.

Price citation: https://wccftech.com/canadian-retailer-lists-alder-lake-intel-core-i5-12400f-for-249-cad-200-usd/


----------



## lexluthermiester (Nov 7, 2021)

dicobalt said:


> I disagree. The 12400F is priced at $200 (6 P-cores, 0 E-cores), and as far as price/performance goes, it's the real winner of this generation. Modern CPU prices have really slid way too high, and according to the Steam hardware survey, over 90% of people are still gaming at 1440p and below, perfect for a 6-core CPU.
> 
> Price citation: https://wccftech.com/canadian-retailer-lists-alder-lake-intel-core-i5-12400f-for-249-cad-200-usd/


While the price is attractive, we don't have the performance numbers for it yet. You can't call something a winner when it's not in the race.


----------



## RandallFlagg (Nov 7, 2021)

These are starting to show up in major OEM rigs now.  That means there's some volume behind Alder Lake.


----------



## Mussels (Nov 7, 2021)

Oh and for relevance to pricing:

AU launch prices say "shit no, stay Zen 3."
At least the top-tier models price-match for the first time in a while: a 5950X and a 12900K are only $50 apart (at $1050 and $1099).


----------



## Why_Me (Nov 7, 2021)

lexluthermiester said:


> While the price is attractive, we don't have the performance numbers for it yet. You can't call something a winner when it's not in the race.


Although this isn't a good indicator to use, it looks hopeful.









Best buy incoming: sub-US$200 Intel Core i5-12400F beats the Ryzen 5 5600X

For now, the i5-12600K is the value king, but, if you can wait for the i5-12400F releasing in two months, you could get roughly the same performance for sub-US$200.

www.notebookcheck.net


----------



## lexluthermiester (Nov 7, 2021)

RandallFlagg said:


> These are starting to show up in major OEM rigs now.  That means there's some volume behind Alder Lake.
> 
> View attachment 224091


That was to be expected. It's nice to see them using the "KF" models.


----------



## Fasola (Nov 7, 2021)

fevgatos said:


> Because that's so many different tasks. Oh wait, it's just one: Cinebench. The amount of misinformation going around in this forum is sad.
> 
> Here you go, a bunch of different tasks, including gaming.


Is there another review that corroborates Igor's Lab's results? It's the only one I've seen quoted as proof of ADL's low power draw.


----------



## lexluthermiester (Nov 7, 2021)

Fasola said:


> Is there another review that corroborates Igor's Lab's results? It's the only one I've seen quoted as proof of ADL's low power draw.


Jay did a power draw test which should answer your question.


----------



## Deleted member 215115 (Nov 7, 2021)

Flanker said:


> lol I feel kinda dumb to realize it this late, but looks like for playing at 60Hz VSync, any i5/Ryzen 5 will do, the rest goes to GPU money when the prices are no longer crazy


But why would you game at 60 fps in 2021? That's something we did 20 years ago.

Oh yeah, the human eye can't see more than 4 GB of RAM anyway.


----------



## Mussels (Nov 7, 2021)

Fasola said:


> Is there another review to corroborate igor's LAB's result, as it's the only one I've seen quoted as proof of AL's low power draw?


I absolutely trust Igor, but yes, it's odd to see disagreeing information.
Perhaps he tested something differently.


fevgatos said:


> Cause that's so many different tasks. Oh wait, it's just 1, cinebench. The amount of misinformation going around in this forum is sad.
> 
> Here you go, a bunch of different tasks including gaming.


Look, here's a screenshot. It's of you. I want you to see the "less is better" part and then see the first thing you linked, with the Intels at the top with the biggest numbers...


----------



## Pumper (Nov 7, 2021)

rares495 said:


> But why would you game at 60 fps in 2021? That's something we did 20 years ago.
> 
> Oh yeah, the human eye can't see more than 4 GB of RAM anyway.



What a dumb question. Because most people only have 60 Hz monitors, and gaming at >60 FPS requires exponentially more expensive hardware.



Mussels said:


> I absolutely trust Igor, but yes it's odd to see disagreeing information
> Perhaps he tested something differently.
> 
> Look, heres a screenshot. It's of you. I want you to see the "less is better" part and then see the first thing you linked, with the Intels at the top with the biggest numbers...



Something is not right with that chart, because right above it he shows that Intel is using less power during AutoCAD testing:






It looks like that particular power efficiency chart has the wrong CPUs listed. He even says that Intel is doing better:

_"Once again, you can put the score in relation to the power consumption in order to map the efficiency. The Core i9-12900KF is even 71 percentage points more efficient than the Ryzen 9 5950X. I’d rather not even write anything about the Core i5-12600K."_

Here's his other chart with correct CPU names listed next to the scores:


----------



## londiste (Nov 7, 2021)

Mussels said:


> I absolutely trust Igor, but yes it's odd to see disagreeing information
> Perhaps he tested something differently.


His results do not seem to disagree with others. AutoCAD is rather specific in its CPU usage. It does multithread but generally poorly and relies on single-thread performance which is where Alder Lake excels. It is worth noting that games are quite similar to that usage pattern in many ways and Alder Lake's quite good gaming efficiency is corroborated by a bunch of different sources.

While AutoCAD one was focused upon, Igor also has the Blender power efficiency chart in that review:








						Core i9-12900KF, Core i7-12700K and Core i5-12600 in a workstation test with amazing results and an old weakness | Part 2 | Page 9 | igor'sLAB
					

So today I'll get serious and show you where Alder Lake S can really score aside from colorful gaming pixels. Gaming what? Completely overrated if you look at at least some of today's results.




					www.igorslab.de
				




Edit:


Mussels said:


> "So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock"
> ... only vs Intel 11th gen, famously known as a waste of sand. Higher is *bad* here. The 12600K in particular? Middle of the pack at best.
> The 5600x is energy efficient, AL is not.
> View attachment 224068View attachment 224070


As I noted before in the thread, Alder Lake does very badly in SuperPi. Using that as the benchmark for power efficiency is probably quite misleading. I'm not sure what the power consumption in the other tests was, but assuming similar power draw at noticeably better performance than the others, it would land in a different place in the efficiency chart.

For example, SuperPi vs CB R23 ST:


----------



## HD64G (Nov 7, 2021)

fevgatos said:


> What? The 10400f has been around for 2 years at ~140€. What are youtalking about?
> 
> 
> 
> ...


Sorry for the misunderstanding. I thought you wrote about the upcoming 12400F, which could cost less than the 5600X. As for the 10400F, its upgrade path tops out at the 10900, whereas the AM4 platform will be compatible even with the upcoming Zen3D. Big difference, methinks.

As for the efficiency topic: for heavy apps using all cores and threads, check the power draw below from GN. Double the power draw of the competition's same performance-tier CPU isn't something to argue about for so long, methinks. Gaming never consumes much from the CPU, as we've all known for years, but good cooling is needed when you need the CPU to always perform at its max, and that costs more for AL CPUs.


----------



## Deleted member 215115 (Nov 7, 2021)

Pumper said:


> What a dumb question. Because most people only have 60Hz monitors and gaming at >60FPS requires exponentially more expensive hardware.


What a dumb reply. I obviously know that most people still have 60 Hz monitors. What I meant was, how can people still use 60 Hz monitors? I can't even stand moving my cursor on the desktop at 60 Hz anymore. It physically hurts me. It is not smooth.

And no, gaming at 75 Hz or 120 Hz doesn't require exponentially more expensive hardware, but it does require careful planning and selection of parts before building a PC, something most people don't know how to do. 1080p 120 Hz is easily doable on a GTX 1070 or RTX 2060, cards that aren't even mid-range anymore. Yes, the shortage does change the equation a bit, but you shouldn't be building a PC right now anyway unless you already have a graphics card.


----------



## Mussels (Nov 7, 2021)

rares495 said:


> What a dumb reply. I obviously know that most people still have 60 Hz monitors. What I meant was, how can people still use 60 Hz monitors? I can't even stand moving my cursor on the desktop at 60 Hz anymore. It physically hurts me. It is not smooth.
> 
> And no, gaming at 75Hz or 120 Hz doesn't require exponentially more expensive hardware but requires a careful planning and selection of parts before building a PC, something most people don't know how to do. 1080p 120 Hz is easily doable on a GTX 1070 or RTX 2060, cards that aren't even mid-range anymore. Yes, the shortage does change the equation a bit but you shouldn't be building a PC right now anyway unless you already have a graphics card.


I was using 72Hz-90Hz in the CRT days. LCD sent us backwards for a bit there.

As much as it may hurt some people who are used to 1080p60, 4K60 and 4K120 are literally the new standards, thanks to TVs progressing and the new consoles.


----------



## The King (Nov 7, 2021)

Mussels said:


> I was using 72Hz-90Hz in the CRT days. LCD sent us backwards for a bit there.
> 
> As much as it may hurt some people who are used to 1080p60, 4k60 and 4k120 are the literal new standards thanks to TV's progressing, and the new consoles.


Steam hardware OCT 21 survey says 1080p is still the standard, with a small number gaming at 1440p.
*1920 x 1080* *66.50%*
2560 x 1440  8.71%
3840 x 2160 2.39%


			Steam Hardware & Software Survey


----------



## fevgatos (Nov 7, 2021)

Mussels said:


> I absolutely trust Igor, but yes it's odd to see disagreeing information
> Perhaps he tested something differently.
> 
> Look, heres a screenshot. It's of you. I want you to see the "less is better" part and then see the first thing you linked, with the Intels at the top with the biggest numbers...
> View attachment 224110


The translation is wrong, obviously. It's score per WATT; higher is obviously better. It's pretty freaking obvious just looking at the numbers: do you think a 12900K at 241 W is more efficient than a 12900K at 125 W? Lol
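The score-per-watt arithmetic is easy to sanity-check yourself. A quick sketch with made-up score figures (the numbers below are purely illustrative, not Igor's data):

```python
# Hypothetical score/power pairs, purely to illustrate the metric.
# These are NOT Igor's measured numbers.
results = {
    "12900K @ 241 W": (27_000, 241),  # (benchmark score, package watts)
    "12900K @ 125 W": (23_000, 125),
    "5950X  @ 142 W": (26_000, 142),
}

for name, (score, watts) in results.items():
    print(f"{name}: {score / watts:.1f} points/W")

# Points per watt is a HIGHER-is-better metric: here the 125 W config
# comes out most efficient even though its raw score is the lowest.
```

Whatever the chart's caption says, a points-per-watt column can only be read as higher-is-better.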


----------



## Mussels (Nov 7, 2021)

Yeah i'm done.

You've made your mind up, and you'll see what you want to see.


----------



## fevgatos (Nov 7, 2021)

Mussels said:


> Yeah i'm done.
> 
> You've made your mind up, and you'll see what you want to see.


Is it that hard to just admit you made a mistake, when it is clearly obvious you did? It's a German site and lots of the translations are wrong. For example, the graph says watts per hour, which is also wrong; it actually measures watts consumed. Anyway, if you are not willing to admit your mistake, you are indeed done.


----------



## londiste (Nov 7, 2021)

HD64G said:


> As for the efficiency topic for heavy apps using all cores and threads, check that power draw below from GN. Double the power draw from the competition's same performance-tier CPU isn't something to argue about for so long me thinks. Gaming never comsumes much from the CPU as we all know for years now but good cooling is needed when you need the CPU to always perform to its max and that costs more for AL CPUs.
> View attachment 224137


The thing with that chart is that 12600K outperformed 5800X in that particular test...


----------



## looniam (Nov 7, 2021)

Turmania said:


> I think there should be another review in regards to power consumption during gaming... as well as boost frequencies chart like there was with ryzen cpu reviews.


not a "review" but something that might be relevant:


----------



## lexluthermiester (Nov 7, 2021)

Mussels said:


> As much as it may hurt some people who are used to 1080p60, 4k60 and 4k120 are the literal new standards thanks to TV's progressing, and the new consoles.


This. While I'm happy with my 1440p displays on my gaming system, it's getting time to go 4k120 and I'm currently shopping for displays that are the right size, and specs.



fevgatos said:


> Is it that hard to just admit you made a mistake, when it is clearly obvious you did?


Mussels didn't make the mistake. You are missing some context. Just throwing it out there...



fevgatos said:


> Anyways, if you are not willing to admit your mistake, you are indeed done


Please look in a mirror while saying that.


----------



## Pumper (Nov 7, 2021)

lexluthermiester said:


> Mussels didn't make the mistake. You are missing some context. Just throwing it out there...


The mistake is pretending that the chart is correct while it clearly shows the wrong data (unless you really think the 11900K is twice as efficient as the 12600K).

It says the numbers are "score points per watt", meaning higher is better, but the chart has a mistake and says "less is better", because originally it was supposed to display "watts per hour".


----------



## lexluthermiester (Nov 7, 2021)

Pumper said:


> The mistake is pretending that the chart is correct, while it clearly shows the wrong data (unless you really think that the 11900K is twice as efficient than the 12600K).


The charts posted here?








						Intel Core i5-12600K
					

Think people are forgetting that these are K series chips.  Those and AMD X are the enthusiast chips, unlocked, customizable.  You can turn a K into a non-K if you so desire.  You can clock it till you can't cool it.  Same on AMD side, PBO2 is power unlocked, as far as you can clock it and cool...




					www.techpowerup.com
				



Those don't show what you just suggested. Go look them over again, because you are missing some context.



Pumper said:


> It says that the numbers are "score points per Watt", meaning higher is better, but the chart has a mistake and says "less is better", because originally it was supposed to display "watts per hours".


They don't say that. Go look again.


----------



## Pumper (Nov 7, 2021)

lexluthermiester said:


> The charts posted here?
> 
> 
> 
> ...



Nope, these: https://www.techpowerup.com/forums/threads/intel-core-i5-12600k.288576/post-4644180

The first chart has a typo where it says "lower is better" when it should be "higher is better", but Mussels just ignored it and replied: _"Look, heres a screenshot. It's of you. I want you to see the "less is better" part and then see the first thing you linked, with the Intels at the top with the biggest numbers._..".

*To summarize:*
fevgatos posted some charts from Igor's Lab with a typo in the first one,
Mussels replied that the first chart says "less is better" but AMD has lower scores, therefore Intel is losing,
fevgatos pointed out the typo,
Mussels replied that he's just in denial,
then you came in talking about fevgatos missing context while you are not even referring to the same charts.


----------



## fevgatos (Nov 7, 2021)

lexluthermiester said:


> This. While I'm happy with my 1440p displays on my gaming system, it's getting time to go 4k120 and I'm currently shopping for displays that are the right size, and specs.
> 
> 
> Mussels didn't make the mistake. You are missing some context. Just throwing it out there...
> ...


Ok bro, whatever


----------



## lexluthermiester (Nov 7, 2021)

Pumper said:


> The first chart has a typo where is says "lower is better" when it should be "higher is better", but Mussels just ignored it and replied: _"Look, heres a screenshot. It's of you. I want you to see the "less is better" part and then see the first thing you linked, with the Intels at the top with the biggest numbers._..".


I know this is going to seem insulting, but I have to ask: Are you deliberately trolling? Because if you're not, your reading comprehension needs improvement.



fevgatos said:


> Ok bro, whatever


Like this, for example. This is an obvious dismissive comment from someone who is either deliberately trolling or refuses to accept the reality staring them in the face.


----------



## fevgatos (Nov 7, 2021)

lexluthermiester said:


> Like this. This is an obvious dismissive comment from someone who is either deliberately trolling or refuses to accept the reality staring them in the face..


And you keep embarrassing yourself.

I posted a graph from igor'sLAB. It had a typo in it, saying lower is better instead of higher is better. Mussels didn't notice the typo, so he thought, mistakenly, that Zen 3 is more efficient when it is not. Then he said I'm in denial, because he didn't realize it's a typo. Then you came along, not even realizing what we were talking about, spouting nonsense. Keep it up.


----------



## Pumper (Nov 7, 2021)

lexluthermiester said:


> I know this is going to seem insulting, but I have to ask: Are you deliberately trolling? Because if you're not you're reading comprehension needs improvement.
> 
> 
> Like this for example. This is an obvious dismissive comment from someone who is either deliberately trolling or refuses to accept the reality staring them in the face..


Keep ignoring reality if you like. You are not even looking at the chart both Mussels and fevgatos are talking about.


----------



## lexluthermiester (Nov 7, 2021)

fevgatos said:


> I posted a graph from igorslab. It had a typo in it, saying lower is better instead of higher is better. Mussels didnt realize the typo so he thought , mistakenly, that zen 3 are more efficient when they are not. Then he said im in denial cause he didn't understand it's a typo. Then you came along not even realizing what we are talking about sprouting your nonsense pretty much. Keep it up


No, that's not what happened. 


fevgatos said:


> And you keep embarrassing yourself.


And that's not happening either.


Pumper said:


> Keep ignoring reality if you like. You are not even looking at the chart both Mussels and fevgatos are talking about.


Sure thing.

It's time for the bickering and off-topic back & forth to stop.


----------



## fevgatos (Nov 7, 2021)

lexluthermiester said:


> No, that's not what happened.
> 
> And that's not happening either.
> 
> ...


Cause it's really freaking hard to go to page 10 and see it for yourself... and you keep embarrassing yourself...


----------



## Pumper (Nov 7, 2021)

lexluthermiester said:


> It's time for the bickering and off-topic back & forth to stop.


Yes, because it's too hard to admit you are wrong, so let's just move along and forget all about it.


----------



## 95Viper (Nov 7, 2021)

Your points were noted by all.
Stop the bickering.
Now, move on.


----------



## TheoneandonlyMrK (Nov 7, 2021)

The King said:


> Steam hardware OCT 21 survey says 1080p is still the standard, with a small number gaming at 1440p.
> *1920 x 1080* *66.50%*
> 2560 x 1440  8.71%
> 3840 x 2160 2.39%
> ...


Most people drive petrol and diesel cars, but the future is?! Not them.


----------



## The King (Nov 7, 2021)

TheoneandonlyMrK said:


> Most people drive petral and diesel cars, but the future is?! Not them.


Context is always important when replying to a post.
I replied to a user who said 4K/60fps/120fps is the new standard today.
I merely replied that that does not seem to be true according to Steam, which shows 1080p is still the most popular standard among gamers.

4K/60fps/120fps may be the future standard for gaming, but that has no impact on what I posted.


----------



## TheoneandonlyMrK (Nov 7, 2021)

The King said:


> Context is always import when replying to a post.
> I replied to a user that said 4K/60fps/120fps is the new standard today.
> I merely replied that does not seem to be true according to Steam which shows 1080p is still the most popular standard among Gamers.
> 
> 4K/60fps/120fps maybe the future standard for gaming, but that has no impact on what I posted.


I know. Is Steam universally accepted as the go-to indicator of what people are buying to game on in the future?! No, it's at best an indicator of what they have now.

Do people buy new tech just to continue with what they're doing now, or do a major proportion of new buyers buy parts expecting an increase in gaming performance?!

I already know the answers to these too.


----------



## The King (Nov 7, 2021)

TheoneandonlyMrK said:


> I know, is steam defined universally as the goto definition of what people are buying to game on in the future ?! No it's at best an indicator of what they have now.
> 
> Do people buy new tech just to continue with what they're doing now, or do a major proportion of new buyers buy parts expecting an improvement or increase in gaming performance?!.
> 
> I already know the answers to these too.


Let's hope the prices of GPUs that can run 4K/60fps/120fps come down by a lot, so this can become the future standard.
At the moment things are not looking good, with the chip shortage said to continue well into next year.


----------



## TheoneandonlyMrK (Nov 7, 2021)

The King said:


> Lets hope the price of GPUs that can run 4K/60fps/120fps come down by alot so this can become the future standard.
> At the current moment things are not looking good with the chip shortage said to continue well into next year.


It already is. Progress takes time, but DLSS and FSR have helped; I can do 4K60 easily on my Vega 64 or RTX 2060.

And we know that there are millions of consoles at sub 600£ that can do it.


----------



## RandallFlagg (Nov 7, 2021)

lexluthermiester said:


> Jay did a power draw test which should answer your question.



The really relevant part starts around 12:30. He's using a 5900X vs the 12900K.

Summary: under full load Intel draws more, under gaming load they're close to the same, and under idle/small loads (opening a file browser and such) AMD pulls a lot more power.

I don't know about other people here but I spend 90% of my time on PC under near idle loads. 

Starting CPU test (TimeSpy CPU)  - Intel running AMD not yet running it :






Intel finished the CPU test before the AMD rig could even start, here's how the CPU test looked on AMD :





Test result :




Idle :



Opening up file explorer and using the desktop a bit on the Intel system :




Same thing on AMD this time, opening file browser and a file :


----------



## lexluthermiester (Nov 7, 2021)

RandallFlagg said:


> The really relevant part starts around 12:30  He's using 5900X vs 12900K
> 
> Summary:  Under full load Intel draws more, under gaming load they're close to the same, under idle \ small loads (opening a file browser and such) AMD pulls a lot more power.
> 
> ...


Yeah, he seemed like he was being careful to be fair.


----------



## Pumper (Nov 7, 2021)

RandallFlagg said:


> Idle :
> View attachment 224176



Jay should delete all the bloatware from his AMD system the next time he's doing idle power draw tests:






And these are accurate results. My own system idles at ~48W.


----------



## RandallFlagg (Nov 7, 2021)

lexluthermiester said:


> Yeah, he seemed like he was being careful to be fair.



He spent the first 12:30 of the video on that, rofl.

It's not a totally apples-to-apples demonstration, but those idle and low-load numbers tell me everything I need to know about AMD efficiency.

I probably hit 100% CPU a dozen times a day for about one second each, and sit at 1-5% for 95% of my time. So for basically 12 seconds a day the max-load benchmarks are relevant to me. The other 86,388 seconds of the day they are irrelevant.

Even in real gaming, spikes above 20% are rare. I know because I ran perfmon many times over multiple days to see what my workload was like. And my workload is actually heavier than a normal user's.
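To put rough numbers on that duty cycle (the wattage figures below are assumptions for illustration, not measured values from any review):

```python
# Back-of-envelope daily energy for the duty cycle described above.
# The package-power figures are assumed, not measured from any review.
idle_w, load_w = 25.0, 240.0      # assumed idle / full-load package power
load_s = 12                       # ~12 seconds/day at 100% load, per the post
idle_s = 86_400 - load_s          # the remaining seconds near idle

total_wh = (idle_w * idle_s + load_w * load_s) / 3600
load_wh = load_w * load_s / 3600
print(f"~{total_wh:.0f} Wh/day, of which full load contributes {load_wh:.2f} Wh")
# -> roughly 601 Wh/day, with full load contributing under 1 Wh:
#    at this duty cycle, idle draw dominates the energy bill.
```

Under those assumptions, near-idle draw accounts for essentially all of the daily energy, which is the point.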


----------



## Vayra86 (Nov 7, 2021)

fevgatos said:


> What? The 10400f has been around for 2 years at ~140€. What are youtalking about?
> 
> 
> 
> ...



Got examples and numbers?

Every PC has a bottleneck. You always have one. Perfect balance doesn't exist.

It's irrelevant the moment you exceed your monitor's refresh rate on the GPU, or when the GPU is the limiting factor. And with VRR modes it gets even more irrelevant.

I'm always ready to be convinced. So please give it your best shot 

The gist being, of course: what is the value of this upgrade? For gaming, it's going to be so minimal, yet have so many drawbacks, because you're early-adopting into everything you shouldn't - fresh Windows, DDR5, new arch and scheduling... the works.



fevgatos said:


> The translation is wrong, obviously. It's score per WATT, higher is obviously better. It's pretty freaking obvious just looking at the numbers, do you think a 12900k at 241w is more efficient than a 12900k at 125w? Lolk





fevgatos said:


> Is it that hard to just admit you made a mistake, when it is clearly obvious you did? It's a german site and lots of translations are wrong. For example in the graph it says watts per hour which is also wrong, instead it measures watts consumed. Anyways, if you are not willing to admit your mistake, you are indeed done



Well spotted! It's true, it has to be points per watt.


----------



## The King (Nov 7, 2021)

lexluthermiester said:


> Jay did a power draw test which should answer your question.


AMD system:
"Hey guys, this system here has 10x RGB fans, a waterblock, etc. So we know this system over here is using more wattage, hence the high idle, but we're not comparing idle in this video."

Intel system: has far less connected to it.

Conclusion at the end of the video: AMD has very bad idle power consumption.

I am truly lost for words.


----------



## Vayra86 (Nov 7, 2021)

The King said:


> AMD system
> Hey guys this system here has
> 10 X RGB FANS
> Has waterblock  etc
> ...



YouTuber Intelligence. It never ceases to amaze, nor does the flock of sheep clicking on them.

There are only good reasons to read instead of watch, and to write instead of being a TV personality. It defines the content, and nobody escapes that reality. Reading is healthy; getting spoonfed content breeds intellectual laziness.


----------



## RandallFlagg (Nov 7, 2021)

Vayra86 said:


> Youtuber Intelligence. It never ceases to amaze, nor does the flock of sheep clicking them.
> 
> There are only good reasons to read instead of watch. And write instead of be a TV personality. It defines the content and nobody escapes that reality. Reading is healthy, getting spoonfed with content breeds intellectual laziness.



Wow, you're so smart.

Or alternately you might choose to listen to the first 12 1/2 minutes of that video.

I'm assuming you did that before passing judgement, correct?  

Probably I shouldn't have posted the relevant time on the video.


----------



## Vayra86 (Nov 7, 2021)

RandallFlagg said:


> Wow, you're so smart.
> 
> Or alternately you might choose to listen to the first 12 1/2 minutes of that video.
> 
> ...



There is nothing in it that refutes what @The King stated. Jay's statements on idle draw are pretty strange, because they also don't match what's shown in the reviews on TPU.


----------



## The King (Nov 7, 2021)

Vayra86 said:


> There is nothing in it that refutes what @The King stated. Jays statements on idle draw are pretty strange, because they also don't match what's shown in the reviews on TPU.










The guy is a walking contradiction. At 3 min 50 s:

He literally says you can't compare these two systems, because the Intel system would need 10 fans etc. for a fair idle comparison.
Then he says he'll remove two sticks of RAM from the Ryzen system, and now it's as apples-to-apples as we can get.
But Jay! What about the 10 fans and the waterblock loop connected to the GPU on the Ryzen 5800X system?

You just said you can't compare them, and now it's apples to apples after removing two sticks of RAM?

If anything it proves that, even with 10 fans and a waterblock connected to it, the Ryzen 5800X system still used far less overall power in the load tests compared to the 12900K.


----------



## Vayra86 (Nov 7, 2021)

The King said:


> The guy is a walking contradiction. @ 3mins 50 seconds
> 
> He literally says you can't compare these two systems because the Intel system will need to have 10 fans etc to be a fair idle comparison.
> Then says I will remove two sticks of RAM from the Ryzen system and now its an apples to apples as we can get.
> ...



And the tests are not even synced. Who knows what's what.


----------



## Fasola (Nov 7, 2021)

I remember watching this PC repair challenge and Jay was having a hard time even though Linus was trying to feed him the solutions. There's a recap on GN's channel but I don't think it illustrates just how bad Jay was at troubleshooting.


----------



## MarsM4N (Nov 7, 2021)

lexluthermiester said:


> Jay did a power draw test which should answer your question.



I like Jay, but that's the most *un*scientific test I have seen in a while.

Just lazy. If he continues like that, he'll soon be doing reviews for *Man Bras*.


----------



## Why_Me (Nov 7, 2021)

Fasola said:


> I remember watching this PC repair challenge and Jay was having a hard time even though Linus was trying to feed him the solutions. There's a recap on GN's channel but I don't think it illustrates just how bad Jay was at troubleshooting.


Anything is better than those two clowns on Hardware Unboxed.


----------



## 95Viper (Nov 7, 2021)

Thread is about "Intel Core i5-12600K".
It is not about other testers/reviewers/etc.
Let's stay on topic.

Thank You.


----------



## fevgatos (Nov 7, 2021)

The King said:


> AMD system
> Hey guys this system here has
> 10 X RGB FANS
> Has waterblock  etc
> ...


10 fans and an AIO don't need more than 2 amps to work. It's pretty obvious from the fact that you can connect 4-5 fans to a single 4-pin header on your mobo. That 4-pin usually supports up to 1 amp.


----------



## tfdsaf (Nov 7, 2021)

Great value: you're essentially getting 12 cores for $300 and it beats the 5600X easily, but nothing earth-shattering in terms of performance. It beats the 8-core 5800X, ends up losing to the 12-core 5900X, and power consumption in applications seems to be through the roof, especially for a supposed mid-range CPU, but it's still solid considering it's only $300.

The issue is the high cost of motherboards, and of course you want to buy DDR5 with this - it would be silly not to get the forward-looking memory - but the cost is very restrictive. I can see people with Ryzen 1000/2000 or Intel 6th-9th gen who want to upgrade and plan on reusing their DDR4 modules jumping on board and going for the 12600K. I think it's a tough sell for 10th/11th gen Intel users, especially since performance between all of them is very similar and both are very recent CPUs; if you shelled out $300 on a 10600K less than two years ago, you wouldn't want to spend another $300 so soon, even if performance is significantly increased. Again, with the cost of a new motherboard and cooler, you are probably looking at around $550 just for those three parts.


----------



## Mussels (Nov 8, 2021)

The King said:


> AMD system
> Hey guys this system here has
> 10 X RGB FANS
> Has waterblock  etc
> ...



At 2:50 he specifically states "the AMD system has more connected, this is all about load wattages".

But how much load wattage needs to be subtracted from the AMD numbers for all that crap?




fevgatos said:


> 10 fans and an AIO don't need more than 2AMPS to work. It's pretty obvious by the fact that you can actually connect 4-5 fans to a single 4pin of your mobo. That 4pin usually supports up to 1 amp.


Yes and no.
An NF-A12x25 uses <2 W (0.14 A) per fan, so yes, you'd be right on that with standard fans.
Corsair's LL120 ARGB fans, however, are 0.30 A (3.6 W) - just over double that.
A D5 pump uses 1.8 A (~22 W).

So with just the pump and 10 fans, we're already up to ~60 watts if they're at higher settings. Then there's lighting controllers, LED strips, fan controllers/hubs...
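A quick tally of those figures (the amp numbers are the ones quoted above; a 12 V rail is assumed):

```python
# Tally of the accessory draw described above; amp figures are the ones
# quoted in the post, and a 12 V supply rail is assumed.
VOLTS = 12.0
accessories = [
    ("LL120 ARGB fan", 10, 0.30),  # (name, count, amps each)
    ("D5 pump",         1, 1.80),
]

total_w = sum(count * amps * VOLTS for _, count, amps in accessories)
print(f"~{total_w:.0f} W of accessories")  # -> ~58 W, before lighting/hubs
```

That's before any lighting controllers or hubs, so it's a meaningful chunk of an idle wall reading.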


----------



## Taraquin (Nov 8, 2021)

Good review, but I think Alder Lake has an unfair advantage with 6000 CL36 RAM vs 3600 CL16-20-20 on the others. I know Rocket Lake is close to its Gear 1 limit, but Comet Lake and Zen 3 should have had 3800 in my opinion. Another option would be testing Alder Lake at 5200-5600 CL38, as this would be comparable to 3600 CL16 on the others.


----------



## W1zzard (Nov 8, 2021)

RandallFlagg said:


> Giving us idle power and max load power consumption in a practical sense is like a car enthusiast site doing a car review and only giving gallons/hr of fuel consumed at idle and at full throttle as a measure of efficiency.  This has absolutely nothing to do with efficiency and they would be laughed out of business for giving such worthless data points.
> 
> The fact this somehow cuts the mustard for PC enthusiasts is a good indicator that a lot of what is in all of these reviews is likely just so much BS, it also says a lot about their readers.


I started a new thread, so we can focus on the topic: https://www.techpowerup.com/forums/...er-consumption-testing-in-cpu-reviews.288761/


----------



## goodeedidid (Nov 8, 2021)

I wonder if the Apple M1 Pro will destroy this... You should also include Apple silicon on the benchmark scoreboards.


----------



## wheresmycar (Nov 8, 2021)

Glad to have followed the thread to the 12th page. I take it 12th Gen temps/power draw are just as decent as Zen 3's?

I wonder if W1zzard can add gaming temps to the mix in future reviews.


----------



## lexluthermiester (Nov 9, 2021)

wheresmycar said:


> I take it 12th Gen temps/power draw are just as decent as Zen 3?


Not quite. Alder Lake is a bit more power hungry than Zen3, a lot more in the upper performance ranges.


wheresmycar said:


> I wander if W1zzard can add gaming temps to the mix in future reviews.


He doesn't really need to. No game that I know of will load a CPU to its max like the other tests have done. Even if you find one that does, all it'll do is cause the CPU to exhibit the same thermal characteristics that have already been shown in the current tests. The testing that has been done tells you everything you need to know. W1zzard's testing didn't miss a beat.


----------



## 5 o'clock Charlie (Nov 9, 2021)

Does anyone have any information on what hardware security mitigations have been implemented in Alder Lake CPUs? I'm having a hard time getting Intel's Ark pages to display this information; it's not on the 12600K product page. I remember Intel posting a spreadsheet of how each fix was implemented back with Whiskey Lake, but I'm not finding anything for this architecture. I'm just curious. Thanks.


----------



## Mussels (Nov 10, 2021)

lexluthermiester said:


> Not quite. Alder Lake is a bit more power hungry than Zen3, a lot more in the upper performance ranges.
> 
> He doesn't really need to. No game that I know of will load a CPU to its max like the other tests have done. Even if you find one that does, all it'll do is cause the CPU to exhibit the same thermal characteristics that have already been shown with the current tests. The testing that has been done tells you everything you need to know. W1zzard's testing didn't miss a single beat.


This is being discussed in another thread, but the basic summary is: it's worth knowing whether your gaming results are costing you 60W or 300W.
Modern CPUs (and now hybrid CPUs), with all the idling, boosting, and preferred-core swapping, really do make it so low-thread-count loads can still use a lot of power.


----------



## lexluthermiester (Nov 10, 2021)

5 o'clock Charlie said:


> Does anyone have any information on what hardware security mitigations have been implemented in the Alder Lake CPUs? I am having a hard time getting Intel's Ark pages to display this information; it is not on the 12600K product page. I remember Intel posting a spreadsheet of how each fix was implemented back with Whiskey Lake, but I am not finding anything for this architecture. I am just curious. Thanks.


What security problems are you concerned with?


Mussels said:


> This is being discussed in another thread


Where's that at?


----------



## pocketjacks (Nov 10, 2021)

I remember when Intel TDPs were 50W, then 65W, then 125W, and now 150W and above.
Is this the new redefinition of performance & efficiency we should expect from Intel? Yes, the 12600K is cheaper, but the extra power draw will offset the savings over AMD, not to mention the new-motherboard requirement for every generation of Intel CPUs going forward. And now that Intel is within a few frames of par with AMD, they will demand top dollar like they used to. And then they will create a chip shortage so they can gouge the consumer like they did not long ago. For performance & efficiency, AMD is still the one, and they have further improvements only months away.


----------



## lexluthermiester (Nov 10, 2021)

pocketjacks said:


> Is this the new redefinition of performance & efficiency we should expect from Intel?


Not just Intel. AMD is also moving to a form of power declaration that is easier for the market to understand and test, likely the same one.


----------



## 5 o'clock Charlie (Nov 10, 2021)

lexluthermiester said:


> What security problems are you concerned with?


I know many of the early security issues (e.g. Spectre, Meltdown) have been mitigated at the hardware level, but I'm curious which others, such as the MDS attacks, have been addressed in hardware and which have not. Last I checked (a year or so ago), some were only mitigated at the firmware and OS level, which costs performance. Hardware mitigations are usually the more efficient implementations.

Personally, I am not concerned about these security issues on personal computers, but I am curious about the server field. For example, my company had to disable HT on our older Intel Xeon line of CPUs. This affected performance significantly on the workloads that are done. Eventually we had to replace all of those with Epyc CPUs, which had the hardware mitigations. The downside is that these were really expensive, with fewer configuration options available from our vendor compared to the Intel equivalent.


----------



## lexluthermiester (Nov 11, 2021)

5 o'clock Charlie said:


> I know many of the early security issues (e.g. Spectre, Meltdown) have been mitigated at the hardware level, but I'm curious which others, such as the MDS attacks, have been addressed in hardware and which have not. Last I checked (a year or so ago), some were only mitigated at the firmware and OS level, which costs performance. Hardware mitigations are usually the more efficient implementations.
> 
> Personally, I am not concerned about these security issues on personal computers, but I am curious about the server field. For example, my company had to disable HT on our older Intel Xeon line of CPUs. This affected performance significantly on the workloads that are done. Eventually we had to replace all of those with Epyc CPUs, which had the hardware mitigations. The downside is that these were really expensive, with fewer configuration options available from our vendor compared to the Intel equivalent.


That's what I thought you meant, but clarification never hurts. As far as I know, Alder Lake has no known hardware security vulnerabilities.


----------



## TheinsanegamerN (Nov 11, 2021)

TheoneandonlyMrK said:


> Most people drive petrol and diesel cars, but the future is?! Not them.


See, the issue with your comparison is that "EV is the future" has been a motto since the 1990s, arguably even earlier. The very first time it was said was about the electric Model T. We all saw how that went. Just because a new tech is good doesn't mean it will dominate, nor is it imminent. Everyone in 2006 who said that EVs would dominate by 2020 looks pretty foolish, what with EVs making up less than 1% of consumer car sales and all.

As far as resolutions go, 4K has been around for years now, and 1440p is not a new resolution by any means. Yet 1440p and 4K combined are less than 10% of the total market, and 25% of the market is sub-1080p. Claiming 4K120 is the future and that 1080p should be ignored is ridiculous; at this rate it'll take decades before either 1440p or 4K gets to half of 1080p's usage numbers, let alone exceeds 1080p in the market.


----------



## TheoneandonlyMrK (Nov 11, 2021)

TheinsanegamerN said:


> See, the issue with your comparison is that "EV is the future" has been a motto since the 1990s, arguably even earlier. The very first time it was said was about the electric Model T. We all saw how that went. Just because a new tech is good doesn't mean it will dominate, nor is it imminent. Everyone in 2006 who said that EVs would dominate by 2020 looks pretty foolish, what with EVs making up less than 1% of consumer car sales and all.
> 
> As far as resolutions go, 4K has been around for years now, and 1440p is not a new resolution by any means. Yet 1440p and 4K combined are less than 10% of the total market, and 25% of the market is sub-1080p. Claiming 4K120 is the future and that 1080p should be ignored is ridiculous; at this rate it'll take decades before either 1440p or 4K gets to half of 1080p's usage numbers, let alone exceeds 1080p in the market.


Hang on, let me just re-read 8 pages ago for context and I'll get back to you.

But you can start with: Steam stats aren't worth much.

And end on: that's the comment you want to drag on?!

Me, quoted again:
"I know, is Steam defined universally as the go-to definition of what people are buying to game on in the future?! No, it's at best an indicator of what they have now.

Do people buy new tech just to continue with what they're doing now, or do a major proportion of new buyers buy parts expecting an improvement or increase in gaming performance?!

I already know the answers to these too."

So again, what do people buy new stuff for, to do the same stuff?!

It will take time obviously, but that doesn't mean only 1080p counts until then. I've been on 4K here for 3 years now, and I still own a 1080p laptop until it dies, and it isn't getting swapped for another 1080p.


----------



## 5 o'clock Charlie (Nov 11, 2021)

lexluthermiester said:


> That's what I thought you meant, but clarification never hurts. As far as I know, Alder Lake has no known hardware security vulnerabilities.


Thank you. It is always good to ask for clarification and not assume (the latter gets me into more trouble than it's worth).
After more searching online, I _finally_ found a security vulnerability spreadsheet that contains a CPU family list from Sandy Bridge through Alder Lake, which is what I have been seeking. So I might as well post it here if anyone else is interested:
https://www.intel.com/content/www/u...-affected-consolidated-product-cpu-model.html


----------



## wheresmycar (Nov 11, 2021)

lexluthermiester said:


> He doesn't really need to. No game that I know of will load a CPU to its max like the other tests have done. Even if you find one that does, all it'll do is cause the CPU to exhibit the same thermal characteristics that have already been shown with the current tests. The testing that has been done tells you everything you need to know. W1zzard's testing didn't miss a single beat.



Not taking anything away from W1zzard's god-like reviews... the best material available for CPU know-how.

Only I'd fancy a quick look at power consumption whilst gaming too. Unfortunately, with the data provided, we don't have these game-specific stats. It's not a problem though, as other reviewers readily have these available, but it would be nice to see all the stats on one page (and I prefer TPU's style of charts).

Update: recently he added this thread: https://www.techpowerup.com/forums/...er-consumption-testing-in-cpu-reviews.288761/ and the good news is he mentions: _"I do have definite plans to add "Gaming Power Consumption", using Cyberpunk 2077 at highest settings, v-sync off"_ (W1zzard you're a legend)


----------



## evernessince (Nov 12, 2021)

RandallFlagg said:


> This mirrors what I've seen at some other sites, including Anandtech.  Peak load benchmarks have really always just been troll bait.



I'd say it's definitely helpful to know peak power consumption so that you can tailor your motherboard, PSU, and cooler to any conditions your CPU will be under.

IMO, if you are fine with 720p or 1080p benchmarks, it's hypocritical to complain about these peak power tests. You can't complain about tests designed to demonstrate peak power consumption when the CPU tests are designed on the same principle.

Alder Lake is a fantastic addition from Intel, but those power consumption numbers are all over the place. I'm not sure what to make of it.


----------



## RandallFlagg (Nov 12, 2021)

evernessince said:


> I'd say it's definitely helpful to know peak power consumption so that you can tailor your motherboard, PSU, and cooler to any conditions your CPU will be under.
> 
> IMO, if you are fine with 720p or 1080p benchmarks, it's hypocritical to complain about these peak power tests. You can't complain about tests designed to demonstrate peak power consumption when the CPU tests are designed on the same principle.
> 
> Alder Lake is a fantastic addition from Intel, but those power consumption numbers are all over the place. I'm not sure what to make of it.



And I think it's hypocritical for you to say the peak power tests are so very useful and, in the same exact post, state:

"...those _*power consumption numbers are all over the place.  I'm not sure what to make of it.*_"

You don't know what to make of it because it is not a realistic scenario. Even your PSU argument falls apart and shows just how misled and/or ignorant you are. The power levels, both peak and average, can *and should* be controlled by a DIY builder by changing the PL1/PL2 and Tau power settings to fit the build. In point of fact, *Intel's technical documentation overtly states that the builder should adjust those settings for the platform's capabilities*. Zen can be unlocked with PBO2 (unlimited power) and consume upwards of 250W as well; it's just that Zen gets no benefit to speak of from it, but Intel does, so no one speaks of doing it on Zen other than to say "not worth it". Somehow this is presented as a negative for Intel.

If you are not capable of or willing to understand those power settings, then don't DIY; go buy an OEM rig and rely on your warranty. They will configure it for you.
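For readers unfamiliar with those settings, here is a toy model of how PL1/PL2/Tau interact. This is an illustration only, not Intel's exact controller (the real implementation uses a running-average power limit described in Intel's datasheets); the wattages are the 12600K's advertised 125W/150W limits, and the 56-second Tau is a typical board default, not a verified value:

```python
# Toy model of Intel's PL1/PL2/Tau turbo budget (illustrative only):
# instantaneous power is capped at PL2, and an exponentially weighted
# moving average of power must stay at or below PL1, with Tau setting
# the effective averaging window.

def allowed_power(requested_w, pl1, pl2, tau_s, dt_s, ewma):
    """Return (granted watts, updated EWMA) for one time step."""
    granted = min(requested_w, pl2)      # hard instantaneous cap (PL2)
    if ewma >= pl1:                      # budget spent: fall to PL1
        granted = min(granted, pl1)
    alpha = dt_s / tau_s                 # EWMA smoothing factor
    ewma = ewma + alpha * (granted - ewma)
    return granted, ewma

# Hypothetical 12600K-like config: PL1=125 W, PL2=150 W, Tau=56 s
ewma, history = 0.0, []
for step in range(200):                  # 200 s of an all-core load
    granted, ewma = allowed_power(300.0, 125.0, 150.0, 56.0, 1.0, ewma)
    history.append(granted)

print(history[0], history[-1])  # boosts at 150 W, settles to 125 W
```

Raising PL1/PL2 to match your cooler and VRMs (or lowering them for a small case) is exactly the knob being argued about here.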


----------



## evernessince (Nov 12, 2021)

RandallFlagg said:


> And I think it's hypocritical for you to say the peak power tests are so very useful and, in the same exact post, state:
> 
> "...those _*power consumption numbers are all over the place.  I'm not sure what to make of it.*_"
> 
> ...



I'm not sure if you've read my profile specs or any of my prior posts, but the personal attacks are completely uncalled for. I miss the days when I could go onto a tech forum and not be personally attacked. Completely unwarranted.



> You don't know what to make of it because it is not a realistic scenario.



That was the point: neither CPU testing nor peak power consumption testing is a realistic scenario. As I stated in my last comment, you have to accept that they are going to have some tests that seek to extract maximum performance. "Realistic" is a moving target, as I stated previously; realistic for one person is not realistic for another. For the vast majority of people, 720p is not realistic, yet it's completely acceptable because it's the best way to extract maximum performance. The same applies to power consumption: while maximum power consumption (just like CPU performance at 720p) is not "realistic" for most, it's an important metric to have nonetheless.


----------



## RandallFlagg (Nov 12, 2021)

evernessince said:


> I'm not sure if you've read my profile specs or any of my prior posts, but the personal attacks are completely uncalled for.



Then don't make personal attacks yourself. If you can't stand the heat, stay out of the kitchen. I don't need to see your profile to know that you are either misrepresenting, or clueless about, the PL1/PL2 power settings. Either way, my comment is on target.



evernessince said:


> I miss the days when I could go onto a tech forum and not be personally attacked.  Completely unwarranted.



Speaking of hypocrisy, do I need to refer you to your very first post in this thread?



evernessince said:


> That was the point, neither CPU testing nor peak power consumption testing are realistic scenarios.  As I stated in my last comment, you either have to accept the fact that they are testing in a manner to extract maximums or you do all testing in a "realistic" manner (which is a moving target as I stated previously, as realistic for one person is not realistic for another).  For the vast majority of people 720p is not realistic yet it's completely acceptable because it's the best way to extract maximum performance.



A false assertion.  You don't have to do anything of the sort.   You can test at different power levels to very clearly illustrate performance / power scaling.


----------



## evernessince (Nov 12, 2021)

RandallFlagg said:


> Then don't make personal attacks yourself.  If you can't stand the heat, stay out of the kitchen.  I don't need to see your profile to know that you are either misrepresenting, or you are clueless, about PL1/PL2 power level settings.  Either way my comment is on target.
> 
> 
> 
> ...



I didn't and feel free to link.


----------



## RandallFlagg (Nov 12, 2021)

evernessince said:


> I didn't and feel free to link.





evernessince said:


> *IMO, if you are fine with 720p or 1080p benchmarks, it's hypocritical to complain about these peak power tests.* *You can't complain about tests designed to demonstrate peak power consumption when the CPU tests are designed on the same principle.*
> 
> Alder Lake is a fantastic addition from Intel, but those power consumption numbers are all over the place. I'm not sure what to make of it.



720p tests are not peak power tests.


----------



## Mussels (Nov 12, 2021)

RandallFlagg said:


> 720p tests are not peak power tests.


They can be, in the same way that furmark is a power hungry beast.

Removing bottlenecks along the way (such as lower resolution and settings) can definitely cause the usage of another part (CPU or GPU) to skyrocket outside typical usage.


----------



## RandallFlagg (Nov 12, 2021)

Mussels said:


> They can be, in the same way that furmark is a power hungry beast.
> 
> Removing bottlenecks along the way (such as lower resolution and settings) can definitely cause the usage of another part (CPU or GPU) to skyrocket outside typical usage.



I'm not aware of any games that can provide both a consistent *and* multi-core/all-core workload, and I think it's pretty clear that references to the 720p and 1080p benchmarks are talking about games. Games simply don't lend themselves to this; yes, they'll be multi-threaded, but it's entirely unlike a render job. Rendering and encoding, as examples (typically used for a "real world" load test), can usually be evenly divided into similar-sized workloads which are all doing the same type of work. Furmark is very close to those types of jobs, but it is an OpenGL test, not a 720p game bench. That flow simply doesn't describe a game, where you will potentially have multiple threads for AI, draw calls, sound, and so on, which are all entirely dissimilar.


----------



## Mussels (Nov 12, 2021)

Yes, it'd be game-related; otherwise why bring in resolution at all? 2D and workload tasks don't really care what your monitor is set to.
There ends up being a minefield of thousands of variables, and W1zz has to choose just a few. But as an example, think of a 720p gamer with Vsync on vs. off: 60FPS vs. 400FPS is going to use MASSIVELY different amounts of power.


----------



## Selaya (Nov 12, 2021)

Games are usually bottlenecked by memory/cache, not IPC. So yeah, even at 720p or 640x480, most will not be able to load a CPU as much as, say, Blender could.


----------



## evernessince (Nov 13, 2021)

RandallFlagg said:


> 720p tests are not peak power tests.



That's not what I was trying to say. I was saying that they are designed to extract maximum performance from the CPU.

I think you may have misunderstood my comment. At no point did I attempt to insult anyone, and when I said I don't know what to make of the power consumption numbers, I meant that from a buyer's perspective. If I am considering purchasing a platform, I have to purchase a mobo and cooler that will work with it in all scenarios, so the question would be whether we target closer to the max power draw or closer to the average, given the large gap between the two. You could make an argument for either.
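On the PSU question, the usual approach is to size for the worst-case simultaneous draw plus headroom, since averages say nothing about the transients the supply must ride out. A rough sketch, with hypothetical wattages and an arbitrary 25% margin (not a recommendation from the review):

```python
# Rough PSU sizing sketch: budget for worst-case simultaneous draw
# plus a safety margin. All numbers are hypothetical.

def psu_size(cpu_peak_w: float, gpu_peak_w: float,
             other_w: float = 75.0, headroom: float = 1.25) -> float:
    """Recommended minimum PSU wattage with a safety margin."""
    return (cpu_peak_w + gpu_peak_w + other_w) * headroom

# A 12600K-class peak (~150 W stock) plus a 300 W GPU:
print(psu_size(150.0, 300.0))  # 656.25 -> a ~750 W unit fits comfortably
```

Sizing off the average instead would leave no margin for the peaks these tests are measuring, which is why the peak numbers are worth publishing at all.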


----------



## RandallFlagg (Nov 13, 2021)

Mussels said:


> Yes, it'd be game-related; otherwise why bring in resolution at all? 2D and workload tasks don't really care what your monitor is set to.
> There ends up being a minefield of thousands of variables, and W1zz has to choose just a few. But as an example, think of a 720p gamer with Vsync on vs. off: 60FPS vs. 400FPS is going to use MASSIVELY different amounts of power.



Depends on how well balanced those threads are. In a game you may have two threads doing draw calls at 20% usage running 60fps at 1440p; then you go to 720p, it runs 300fps, and the two threads are now huffing it at 60%. That is not going to change your AI or sound and so on. That's why I said you need it to be consistent *and* balanced; games simply don't work like that. You can quadruple the rate of draw calls, yes, but the rest remains the same.


----------



## Why_Me (Nov 26, 2021)

kruk said:


> The leaks overhyped Alder Lake to the moon, but in reality it's just a normal generational jump with disappointing power consumption, possible backwards compatibility issues and high platform costs ...


i5 12600K = best overall CPU & best gaming CPU for 2021


Hossein Almet said:


> Sure, it beats the 5600X conclusively, by consuming 90W more power.  Sorry, the 5600X still gets my money.


If you're worried about power then stick to laptops.

AMD Ryzen 5 5600X (AM4, Zen 3, 6 cores / 12 threads, 3.7GHz base / 4.6GHz turbo, 35MB cache, PCIe 4.0, 65W) *£287.48* (www.scan.co.uk)

vs

Intel Core i5 12600KF (S 1700, Alder Lake, 10 cores / 16 threads, 3.7GHz base / 4.9GHz turbo, 20MB cache, 125W) *£259.98* (www.scan.co.uk)


----------



## davyangel (Nov 29, 2021)

Pretty sure your wPrime benchmark in this review is off.
I got 81.847 sec on my 12600K, which I just built, at stock settings.
Pretty sure you need to set the thread count to 20 instead of the default 4, or you'll get the score you got.


----------



## Mussels (Nov 29, 2021)

davyangel said:


> Pretty sure your wPrime benchmark in this review is off.
> I got 81.847 sec on my 12600K, which I just built, at stock settings.
> Pretty sure you need to set the thread count to 20 instead of the default 4, or you'll get the score you got.


It's going to run at the program's default settings; otherwise you aren't benchmarking fairly.

If the processor/OS has issues detecting the number of threads/cores with older programs and behaves poorly in some of them, that's good to know, and the result is still useful and fair since every CPU is tested the same way.

If and when it becomes a situation where wPrime just runs poorly all the time due to it being an older program, I'm sure W1zz would just retire it.


----------



## davyangel (Nov 29, 2021)

Mussels said:


> It's going to run at the program's default settings; otherwise you aren't benchmarking fairly.
> 
> If the processor/OS has issues detecting the number of threads/cores with older programs and behaves poorly in some of them, that's good to know, and the result is still useful and fair since every CPU is tested the same way.
> 
> If and when it becomes a situation where wPrime just runs poorly all the time due to it being an older program, I'm sure W1zz would just retire it.


Yeah, pretty old and outdated program, tbh. It wouldn't even run unless run as Admin, and it won't populate the hardware info from CPU-Z correctly either.


----------



## W1zzard (Nov 30, 2021)

davyangel said:


> Pretty sure your wPrime benchmark in this review is off.
> I got 81.847 sec on my 12600K, which I just built, at stock settings.
> Pretty sure you need to set the thread count to 20 instead of the default 4, or you'll get the score you got.


Yeah, you can hand-tune things to ensure wPrime doesn't end up on the E-cores on Windows 11, and run it with fewer threads for higher perf.

I want a fair apples-to-apples comparison, so I run it the same way on every system, using the advertised thread count.



davyangel said:


> Yeah, pretty old and outdated program, tbh. It wouldn't even run unless run as Admin, and it won't populate the hardware info from CPU-Z correctly either.


I'll probably drop wPrime in the 2022 CPU test system setup, not sure about a replacement yet
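A minimal sketch (my own illustration, not W1zzard's actual harness) of the "advertised thread count" approach: ask the OS for the logical CPU count and drive the benchmark with that, rather than trusting an old program's built-in detection, which may miscount on a hybrid P-core/E-core part:

```python
# Query the OS for logical CPUs and use that as the benchmark thread
# count, instead of an old program's own (possibly wrong) detection.
import os

threads = os.cpu_count() or 1   # e.g. 16 on a 12600K (6P x 2 + 4E)
print(f"Would launch the benchmark with {threads} threads")
# Note: wPrime itself takes its thread count from its GUI settings;
# a command-line flag here would be hypothetical.
```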


----------



## lexluthermiester (Nov 30, 2021)

W1zzard said:


> I'll probably drop wPrime in the 2022 CPU test system setup, not sure about a replacement yet


Prime95? I've always found it more useful.


----------



## md2003 (Jan 15, 2022)

Was watching some videos on YouTube. It seems that many Alder Lake CPUs, including the smaller of the K series, can reach 5.5-5.7GHz single-core overclocks with a daily AIO setup. What do you think of that?


----------



## Hyderz (Jan 15, 2022)

md2003 said:


> Was watching some videos on YouTube. It seems that many Alder Lake CPUs, including the smaller of the K series, can reach 5.5-5.7GHz single-core overclocks with a daily AIO setup. What do you think of that?
> 
> View attachment 232480


seems impressive, depends on how much power it chews up on a daily use...


----------



## fevgatos (Jan 15, 2022)

md2003 said:


> Was watching some videos on YouTube. It seems that many Alder Lake CPUs, including the smaller of the K series, can reach 5.5-5.7GHz single-core overclocks with a daily AIO setup. What do you think of that?
> 
> View attachment 232480


Doesn't even need an AIO at all; temps from reviews are overly exaggerated. I'm running a 12900K on a U12A; I get 70-72°C in Cinebench R20 and around 75-78°C in Cinebench R23. That's at 204-230 watts. It can easily hit 5.5GHz+ on single- or lightly-threaded workloads, at least as far as temperatures are concerned.


----------



## md2003 (Jan 15, 2022)

fevgatos said:


> Doesn't even need an AIO at all; temps from reviews are overly exaggerated. I'm running a 12900K on a U12A; I get 70-72°C in Cinebench R20 and around 75-78°C in Cinebench R23. That's at 204-230 watts. It can easily hit 5.5GHz+ on single- or lightly-threaded workloads, at least as far as temperatures are concerned.


It is not so easy to deconstruct many reviews. My personal experience, since you are mentioning yours: the 12900K I've played with, cooled by an Arctic Freezer 360, easily got up to 95°C in CB23 with the BIOS limited to 250 watts (in a 10-minute stress test). Thus, the scenario examined in each test case might differ in many attributes, hence the variance in results. It's true, though, that after unleashing this CPU's power constraints, it is not that easy to keep it at normal temperatures.


----------



## lexluthermiester (Jan 15, 2022)

@15:05
There is no way a Noctua NH-U12A is going to cool a 12900K as well as a 360 AIO.


----------



## GURU7OF9 (Jan 15, 2022)

I would have preferred K versions of the 6- and 8-core CPUs with no E-cores.
At least have the option to buy a K without E-cores?
Alas, it appears those days are over, with Raptor Lake supposedly supporting up to 16 E-cores!


----------



## fevgatos (Jan 16, 2022)

lexluthermiester said:


> @15:05
> There is no way a Noctua NH-U12A is going to cool a 12900K as well as a 360 AIO.


I'm exaggerating again, aren't I? 



md2003 said:


> It is not so easy to deconstruct many reviews. My personal experience, since you are mentioning yours: the 12900K I've played with, cooled by an Arctic Freezer 360, easily got up to 95°C in CB23 with the BIOS limited to 250 watts (in a 10-minute stress test). Thus, the scenario examined in each test case might differ in many attributes, hence the variance in results. It's true, though, that after unleashing this CPU's power constraints, it is not that easy to keep it at normal temperatures.


Don't know what to tell you, man; for me, to get thermal throttling I need to be clocked at 5.3GHz all-core @ 1.385 volts. Then it draws around 280-300 watts in CB R23 and I hit ~100°C.


----------



## Vayra86 (Jan 16, 2022)

davyangel said:


> Yeah pretty old and outdated program tbh. Wouldn't even run unless run as Admin either and will not populate the hardware info from cpu-z correctly either.



The value of such instruments is pretty much inherent in the fact that they're old and outdated.

It offers a means of comparing CPUs like-for-like over long periods of time, and for CPUs that matters a lot. You want to know how your current model compares against what you're looking to upgrade to, so you want historical data. 5-7 years is not a strange lifetime for a consumer CPU in a regular system; some are even older.

For the same and similar reasons, you really do want to keep your game tests at a lower resolution even if the majority stopped playing at it. It's a scientific method of determining relative performance. You benefit from long-term data because it provides a very no-nonsense view of progress.

Applications evolve alongside CPUs for most use cases, so keeping one or two super-static things is of great value. The biggest selling point of benchmark comparisons is their trustworthiness, the fact that it is "what you see is what you get". Any update to these benchmarks would kill that principle one way or another, invalidating everything produced before the update.


----------

