# Alder Lake CPUs common discussion



## birdie (Nov 4, 2021)

Many people are talking about ADL's massive power consumption, but when you're running games or ordinary Windows applications you don't actually fully load the CPU at all times, so the conclusion of a great many reviews, including here on TPU, doesn't seem fair. It's only when you're running massively parallel, purely computational tasks (encoding, rendering, compilation) that you see it, and even then... Why does it look like Alder Lake CPUs run at their max turbo power limit at all times, when in reality max turbo power applies at most *for a minute*, if I'm not mistaken?

@W1zzard

Now, I have some serious questions about ADL benchmarks which no reviewer has actually taken into consideration. I've already emailed you, but it looks like you've been too busy to reply/address my concerns:

1. It would be nice if you could run CPU-specific tests after the ADL CPUs have settled on their actual TDP, whatever it is, thus excluding the first minute or so when they run at much higher wattages (PL1/PL2/turbo power, whatever).

2. It would be nice if you could pit Ryzen 5000, Rocket Lake and Alder Lake against each other with _all_ of them set to the same TDP, to see what their performance per watt is.

3. I still very much doubt the benefit of having E-cores, so it would be nice if you could run this test: benchmark the P-cores with their frequency limited to that of the E-cores, then compare system power consumption against running the same task on the E-cores. That would show whether such cores were justified in the first place.

4. It would be nice to see HWiNFO screenshots of an idling ADL CPU (any model).
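The frequency-capping test proposed in point 3 could be set up like this on Linux via the cpufreq sysfs interface. This is a hedged sketch: the sysfs paths follow the kernel cpufreq documentation, writing requires root, and both the assumption that logical CPUs 0-15 are the P-cores and the 3.9 GHz E-core ceiling are illustrative values, not confirmed figures.

```python
# Sketch: cap a logical CPU's maximum frequency through cpufreq sysfs.
# Paths per the kernel cpufreq docs; root is needed to actually write them.
def cap_core_khz(cpu, khz, base="/sys/devices/system/cpu"):
    """Write a new scaling_max_freq (in kHz) for one logical CPU."""
    with open(f"{base}/cpu{cpu}/cpufreq/scaling_max_freq", "w") as f:
        f.write(str(khz))

# e.g. cap the assumed P-cores (logical CPUs 0-15) to an assumed
# E-core-class 3.9 GHz before running the benchmark:
# for cpu in range(16):
#     cap_core_khz(cpu, 3_900_000)
```

After the run, restoring the original `scaling_max_freq` values (or rebooting) undoes the cap.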

Here's another thing: for most people out there, their CPUs sit idle ~98% of the time. Both Ryzen 3000 and 5000 CPUs have quite high idle power consumption due to their architecture (core chiplets plus a separate, always-running I/O die).

For instance, the 12600K idles at more than 16 W less than the 5600X, and the 12900K is 21 W "better" than the 5950X:


----------



## R0H1T (Nov 4, 2021)

birdie said:


> *thus excluding the first minute *or so when they run at much higher wattages


You should probably update your data as well ~


> **No More TDP: Base Power and Turbo Power**
> In the past, Intel promoted its processor power as a single number: TDP (Thermal Design Power*). The issue wasn’t so much that this number was wrong, it was because it lacked massive context that wasn’t communicated to anyone. Arguably it took us several years to find out what it really meant, especially in relation to its turbo.
> 
> _*Technically TDP is defined differently to power consumption, however they are effectively interchangeable at this point, both in common parlance and Intel documentation._
> 
> ...

Intel 12th Gen Core Alder Lake for Desktops: Top SKUs Only, Coming November 4th - www.anandtech.com




So basically, ADL is set up to be more inefficient!


----------



## W1zzard (Nov 4, 2021)

birdie said:


> when in reality max turbo power applies at most *for a minute* if I'm not mistaken?


With Alder Lake, PL1=PL2, and it is sustained indefinitely.

Edit: I suspect Intel did this to win Cinebench nT. With R23 they changed it so that the benchmark runs for 10 minutes before taking results, at which point the CPU would be out of Turbo. There are A LOT of activity spikes on a normally used desktop PC, so short Turbo boosts make perfect sense, just not for the benchmarks of most reviewers. Testing the benefit of the E-cores is similarly difficult; I haven't come up with a good test yet, but I'm thinking about it in what little spare time I have.



birdie said:


> I've already emailed you but looks like you've been too busy to reply/address my concerns


Correct, as mentioned somewhere else, I've been benching non-stop for like two weeks now



birdie said:


> For instance, the 12600K idles at more than 16 W less than the 5600X, and the 12900K is 21 W "better" than the 5950X:







Not here. It depends on the selected motherboard too, and possibly on some settings.


----------



## Bill_Bright (Nov 4, 2021)

birdie said:


> This conclusion of great many reviews including here on TPU doesn't seem fair.


It is fair if all the products under review go through the exact same testing and review process, and if any conclusions based on the reviewer's opinions are drawn, those opinions/conclusions are developed and presented without bias.

You are correct when you say running games and apps doesn't fully load the CPU at all times. But that applies to every CPU - so it is a level playing field, thus fair.

If a reviewer only made conclusions based on the factors that made that sample look good, that reviewer would be a "marketing weenie" and not a true reviewer. The comparison would not be a fair comparison. 

There's a reason Ford, Chevy, RAM, Nissan, and Toyota can all claim their 1/2 ton pickups are #1. It is because they all are #1 - just in different categories. One is best at towing, another at hauling, another at 0-60, another at 60-0, etc. etc. But if you listen only to the marketing hype, are you going to learn which pickup is best for you? Nope. 

If the testing procedures are going to be changed to accommodate one specific CPU or family of CPUs, then the testing procedures need to be changed in the same manner for all CPUs in the same way. And that's fine, but then how do you compare the most current CPU with last year's models?


----------



## birdie (Nov 4, 2021)

W1zzard said:


> With Alder Lake PL1=PL2 and is sustained indefinitely



I don't understand. What's the point of Base Power then, which is specified as 125 W for the 12900K?

Can you make it perfectly clear, please, because you make it sound as if the real power consumption of a *fully loaded CPU stays at "Maximum Turbo Power" indefinitely*. This makes no sense at all. Intel has never lied that much. They have twisted facts to look better, but so do many companies, including AMD, NVIDIA, etc.


----------



## W1zzard (Nov 4, 2021)

birdie said:


> Can you make it perfectly clear please because you make it sound as if the real power consumption for a *fully loaded CPU stays at "Maximum Turbo Power" indefinitely*. This makes no sense at all. Intel has never lied that much.


Just answered somewhere else

Intel Core i9-12900K - www.techpowerup.com


----------



## birdie (Nov 4, 2021)

Bill_Bright said:


> It is fair if all the products under review go through the exact same testing and review processes, and if any conclusions based on the reviewer's opinions are drawn, that those opinions/conclusions are developed and presented without bias.


I'm talking solely about "Processor Base Power", which many reviewers have swept under the carpet, as if ADL CPUs boost for hours while consuming Maximum Turbo Power before reaching Base Power, if they ever reach it at all. This cannot be true. The vast majority of cooling solutions on the market cannot deal with a CPU dissipating 241 W indefinitely. We're talking about major throttling and much lower than advertised performance.



W1zzard said:


> Just answered somewhere else
> 
> 
> 
> ...


This cannot be true. It's not that I don't trust you, it just doesn't make any sense. I could understand if it were specific to your motherboard, but if not, we are talking about serious misinformation. Could you ask other reviewers (I know you have close contacts) whether *that's indeed the case for everyone*?

125 W and 241 W are two wildly different figures.


----------



## W1zzard (Nov 4, 2021)

birdie said:


> I'm talking solely about " Processor Base Power" which many reviewers have swept under the carpet as if ADL CPUs boost for hours while consuming Maximum Turbo Power before reaching Base Power if reaching it at all. This cannot be true. Absolute most cooling solutions on the market cannot even deal with CPUs dissipating 241W indefinitely. We're talking about major throttling and a lot lower than advertised performance.
> 
> 
> This cannot be true, not that I don't trust you, but it just doesn't make any sense. I could understand if it were specific to your motherboard but if not, we are talking about serious misinformation. Could you ask other reviewers (I know you have close contacts) if *that's indeed the case for everyone*?
> ...


----------



## birdie (Nov 4, 2021)

It's hard to understand anything from these two graphs. Which one is actual and relevant? The first says ADL CPUs settle on Base Power eventually; the second says they run at PL1/PL2 indefinitely.

What's the "default configuration"? Sorry, this is all utterly confusing. What if my cooler is not capable of dissipating 241W?

Intel's Ark doesn't make it clear either:

Base Power: _The time-averaged power dissipation that the processor is validated to not exceed during manufacturing while executing an Intel-specified high complexity workload at Base Frequency and at the junction temperature as specified in the Datasheet for the SKU segment and configuration._

Maximum Turbo Power: _The maximum sustained (>1s) power dissipation of the processor as limited by current and/or temperature controls. Instantaneous power may exceed Maximum Turbo Power for short durations (<=10ms). Note: Maximum Turbo Power is configurable by system vendor and can be system specific._

The fact that _"Instantaneous power may exceed Maximum Turbo Power for short durations (<=10ms)"_ is even scarier. Should people budget 300 W or 400 W for an ADL CPU *alone*? Only GPUs have had this "nice" feature so far; now we have to deal with it on CPUs as well?
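The PL1/PL2 behaviour being debated here can be sketched with a toy model. This is illustrative only, not Intel's actual controller: the real hardware enforces PL1 via an exponentially weighted moving average of package power over a vendor-configurable time window (tau), and the 56 s tau used below is an assumed example value.

```python
# Simplified, illustrative model of a PL1/PL2/tau power budget
# (EWMA over a time window, loosely following Intel's documented scheme).
def simulate(pl1, pl2, tau_s, demand_w, dt=1.0, seconds=120):
    """Return the package power drawn each time step under the budget."""
    avg = 0.0                 # running average of package power
    alpha = dt / tau_s        # EWMA weight for one time step
    trace = []
    for _ in range(int(seconds / dt)):
        # Allowed to draw up to PL2 while the average stays under PL1,
        # otherwise clamped down to PL1.
        cap = pl2 if avg < pl1 else pl1
        power = min(demand_w, cap)
        avg = (1 - alpha) * avg + alpha * power
        trace.append(power)
    return trace

# "Classic" config: PL1=125 W, PL2=241 W, tau=56 s -> boost, then settle.
classic = simulate(pl1=125, pl2=241, tau_s=56, demand_w=300)
# Alder Lake K-SKU default per reviews: PL1=PL2=241 W -> never settles.
adl = simulate(pl1=241, pl2=241, tau_s=56, demand_w=300)
print(classic[0], classic[-1])   # 241 at the start, settles to 125
print(adl[0], adl[-1])           # 241 at both the start and the end
```

With PL1 raised to equal PL2, the clamp condition never triggers, which is exactly the "sustained indefinitely" behaviour W1zzard describes, and why the two graphs can both be right depending on configuration.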


----------



## repman244 (Nov 4, 2021)

birdie said:


> What if my cooler is not capable of dissipating 241W?



You get a new cooler or a different CPU.

I don't see anything groundbreakingly new about these CPUs when it comes to power consumption. If your system cannot handle the heat output, it will throttle, just like any other CPU would.


----------



## Bill_Bright (Nov 4, 2021)

I wonder how many people will really choose which CPU to buy based on these power consumption specs? I sure won't. I do care (a little - but not much) about power consumption at idle. But I buy based on performance and the intended purpose of the computer. 

All these specs are really going to do for me is help me decide which cooler to get (if no OEM is included) and the size of my PSU. 

When enthusiasts are looking to buy their new 2nd childhood, over-compensating sports car, are they really going to look at fuel economy and horsepower? Or are they going to look at fastest times and top speeds?


----------



## looniam (Nov 4, 2021)

birdie said:


> It's hard to understand anything from these two graphs. Which one of them is actual and relevant? The first one says ADL CPUs settle on Base Power eventually, the second one says they run at PL1/PL2 indefinitely.
> 
> What's the "default configuration"? Sorry, this is all utterly confusing. What if my cooler is not capable of dissipating 241W?


it reads UNLOCKED cpus . . locked parts will have the lower PL1, whereas the K skus will bypass PL1 for PL2 unless you change the default settings - for whatever reason.

imma caveman and i understand that.


----------



## birdie (Nov 4, 2021)

looniam said:


> it reads UNLOCKED cpus . .  locked parts will have the lower PL1 whereas the K skus will bypass PL1 for PL2 unless you change the default settings - for whatever reason.
> 
> imma a caveman and i understand that.


Erm, do you have a source for that? I like your hypothesis, except... why does Intel specify Base Power at all if, according to you, it only applies to locked SKUs?

@W1zzard

BTW, AVX-512 is there and can be unlocked as long as you're willing to sacrifice your E-cores. The performance is outrageous:






8 cores vs 16 cores and more than 3 times faster.
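For anyone who tries the unlock, here is a quick, hedged way to verify whether AVX-512 is actually exposed to software afterwards. It is Linux-only and simply reads the kernel's reported feature flags; it is an illustrative check, not anything Intel-official.

```python
# Check whether the kernel reports the AVX-512 foundation instructions
# ("avx512f" flag) for this CPU, e.g. after disabling the E-cores and
# enabling AVX-512 in the BIOS. Linux-only: parses /proc/cpuinfo.
def has_avx512():
    try:
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("flags"):
                    return "avx512f" in line.split()
    except OSError:
        pass          # no /proc/cpuinfo (non-Linux): report unavailable
    return False

print(has_avx512())
```

On Windows, tools like HWiNFO or CPU-Z show the same instruction-set flags.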


----------



## looniam (Nov 4, 2021)

birdie said:


> Erm, do you have a source for that? I like your hypothesis except ... why does Intel specify Base Power at all which, according to you, applies only to locked SKUs?


the source is several preview/review articles; unless your fingers are broke, use google. im not going to adversarially spoon feed you.

E:
from that anandtech article you used that bench from:



> There is usually a weighted time factor that limits how long a processor can remain in its Turbo state for slowly reeling back, *but for the K processors Intel has made that time factor effectively infinite – with the right cooling, these processors should be able to use their Turbo power all day, all week, and all year.*



take that, hypothesis.


----------



## R0H1T (Nov 4, 2021)

birdie said:


> BTW AVX-512 is there and can be unlocked as long as you're willing to sacrifice your E-cores. The performance is outrageous:


No it isn't, Intel doesn't officially support it ~
View attachment 223669

Intel® Core™ i9-12900K Processor (30M Cache, up to 5.20 GHz) - Product Specifications - www.intel.com



Also, for some reason you glossed over this entire part:


> **Intel Disabled AVX-512, but Not Really**
> One of the more interesting disclosures about Alder Lake earlier this year is that the processor would not have Intel’s latest 512-bit vector extensions, AVX-512, despite the company making a big song and dance about how it was working with software developers to optimize for it, why it was in their laptop chips, and how no transistor should be left behind. One of the issues was that the processor, inside the silicon, actually did have the AVX-512 unit there. We were told as part of the extra Architecture Day Q&A that it would be fused off, and the plan was for all Alder Lake CPUs to have it fused off.
> 
> Part of the issue of AVX-512 support on Alder Lake was that only the P-cores have the feature in the design, and the E-cores do not. One of the downsides of most operating system design is that when a new program starts, there’s no way to accurately determine which core it will be placed on, or if the code will take a path that includes AVX-512. So if, naively, AVX-512 code was run on a processor that did not understand it, like an E-core, it would cause a critical error, which could cause the system to crash. Experts in the area have pointed out that technically the chip could be designed to catch the error and hand off the thread to the right core, but Intel hasn’t done this here as it adds complexity. By disabling AVX-512 in Alder Lake, it means that both the P-cores and the E-cores have a unified common instruction set, and they can both run all software supported on either.
> 
> ...


Intel will likely strip this option out of the BIOS, like they've done multiple times before by blocking non-Z overclocking on other boards!


----------



## birdie (Nov 4, 2021)

looniam said:


> the source is several preview/review articles; unless your fingers are broke, use google. im not going to adversarially spoon feed you.
> 
> E:
> from that anandtech article you used that bench from:
> ...


Thank you! I haven't had time to read the AnandTech review yet. I hope a class-action lawsuit will follow.

I don't know what you're fretting over; I'm just trying to dig up the truth. I hate companies lying to the public.


----------



## looniam (Nov 4, 2021)

birdie said:


> I hope a class action lawsuit will follow. I don't know why you're fretting over, I'm just trying to dig the truth up. I hate companies lying to the public.


lawsuit for what - publicly changing their philosophy?
i am far from fretting. i don't know what your problem is, getting to the truth? well, it's right in front of you.

seems you just want to stir the pot for whatever.

good luck w/that.


----------



## W1zzard (Nov 4, 2021)

birdie said:


> It's hard to understand anything from these two graphs


What's hard to understand?


----------



## birdie (Nov 4, 2021)

At 125 W the 12900K is quite competitive, though it's beaten by the 5950X at just 88 W. It looks like Intel wanted to achieve maximum performance at any cost :-( Let's see what Meteor Lake brings.





Here's in-depth coverage from computerbase.de.



> Neither in Cyberpunk 2077 nor in Doom Eternal, which ran at 720p with maximum details including ray tracing in order to maximize CPU load, were there any clock or performance losses from limiting the Core i9 to a hard 125 watts. In Cyberpunk 2077 that only happened with an 88-watt cap, and in Doom Eternal only at 65 watts. The benchmarks were carried out with a GeForce RTX 3080 Ventus 8G from MSI.







You can safely run the CPU at a 125 W power limit and lose almost nothing in terms of gaming performance.

And it's not even necessary for gaming:





Thanks to computerbase.de for a proper review. I haven't seen this covered anywhere else.


----------



## Ferrum Master (Nov 4, 2021)

The biggest disappointment is the DDR4 vs DDR5 bench at ComputerBase.

It kind of proves that the first gen is always a toss-up... Not mature enough to invest in it.


----------



## birdie (Nov 4, 2021)

Ferrum Master said:


> The biggest disappointment is the DDR4 vs DDR5 bench at ComputerBase.
> 
> It kind of proves that the first gen is always a toss-up... Not mature enough to invest in it.



As far as I can see, DDR5-6200 proves to be the fastest, and the most expensive at that. Considering the price of the CPU + motherboard + DDR5, it's a platform for absolute enthusiasts willing to run Windows 11.

@W1zzard, there's no need to run any additional tests. The ComputerBase.de review has answered all my questions.


----------



## Deleted member 215115 (Nov 4, 2021)

Ferrum Master said:


> the first gen is always a toss... Not mature enough to invest into it.


This is hardly a surprise...


----------



## TheoneandonlyMrK (Nov 4, 2021)

birdie said:


> At 125W 12900K is quite competitive though it's beaten by 5950X at just 88W. Looks like Intel wanted to achieve maximum performance at any cost :-( Let's see what Meteor Lake will bring.
> 
> View attachment 223689
> 
> ...


sorry but we already had an expert in w1zzard, cry us a river or go get a non-K, or just game and nothing else, do as Intel bade you.

THE STOCK SETTINGS FOR A K CPU = BOOST, OR MANUAL OC, NOT MAX ECO BOOST???

jeebs this release has brought em out.


----------



## phanbuey (Nov 4, 2021)

It's still 10 nm... I mean, I know they're calling it 7 or whatever, but it's a 10 nm-class lithography, so it's not going to be as power efficient as Zen 3 on 7 nm. But honestly, an amazing step forward in performance either way.


----------



## The red spirit (Nov 4, 2021)

Bill_Bright said:


> I wonder how many people will really choose which CPU to buy based on these power consumption specs? I sure won't. I do care (a little - but not much) about power consumption at idle. But I buy based on performance and the intended purpose of the computer.
> 
> All these specs are really going to do for me is help me decide which cooler to get (if no OEM is included) and the size of my PSU.
> 
> When enthusiasts are looking to buy their new 2nd childhood, over-compensating sports car, are they really going to look at fuel economy and horsepower? Or are they going to look at fastest times and top speeds?


This is what you hear today, when Intel can't engineer for shit and their flagship isn't a 77 W chip anymore. Rationalization at its finest.



phanbuey said:


> It's 10nm still... I mean I know they're calling it 7 or whatever but it's a 10nm lithography, not going to be as power efficient as zen 3 at 7nm.  But honestly an amazing step forward in performance either way.


They are not really "7nm" or "10nm".


----------



## freeagent (Nov 4, 2021)

I am watching the GN video now, nice! Looks like Intel is back 

I will grab a next gen unit and enjoy what I have for 1 more year


----------



## 15th Warlock (Nov 4, 2021)

It's a very interesting discussion, and nice to see Intel back in the game, competition is what we all need.

I was able to nab a 12900K and an Asus Strix Z690 motherboard, but I still haven't had any luck finding DDR5 modules, hope retailers can restock those soon!

I get my mobo from Amazon on Friday, and my CPU from Best buy on Monday.

Anyone else moving to Alder Lake for their main rig?


----------



## Deleted member 202104 (Nov 4, 2021)

15th Warlock said:


> Anyone else moving to Alder Lake for their main rig.



I had a 12700K and a DDR4 board in my cart at Newegg this morning to try out. It was going to be a few weeks to get the LGA1700 mounting adapters to fit any of my coolers, so I gave up and bought a set of fast RAM for the main rig instead. I usually want it now or not at all when it comes to that type of purchase.


----------



## 15th Warlock (Nov 4, 2021)

weekendgeek said:


> I had a 12700k and DDR4 board in my cart at Newegg this morning to try out.  It was going to be a few weeks to get the LGA1700 mounting adapters to fit any of my coolers.  Gave up and bought a set of fast RAM instead for the main rig.  I usually want it now or not at all when it comes to that type of purchase.



I read Asus mobos have mounting holes that line up with previous gen coolers, I hope I can use my current cooler.

Hope that new RAM pushes your current rig to the limit, no need to "upgrade" from a 5950X!


----------



## Deleted member 202104 (Nov 4, 2021)

15th Warlock said:


> I read Asus mobos have mounting holes that line up with previous gen coolers, I hope I can use my current cooler.
> 
> Hope that new RAM pushes your current rig to the limit, no need to "upgrade" from a 5950X!


From what I've read, you should also be good to go. The board I was willing to spend the money on (read: cheap) said it required an LGA1700-specific bracket.

The RAM is something I've been trying to decide if I should buy, and I don't really need it, but...  you know. 

Keep us posted on the 12900k - excited to hear some first hand experience.


----------



## Deleted member 24505 (Nov 4, 2021)

As I said, I don't give a rat's how much power it uses. My main concern is: is it the fastest chip for my money, and can I cool it? (I have a custom loop, so yes.) Otherwise, what's the problem? Big deal, it might use more power than the equivalent AMD chip, but if it's faster and you can cool and power it, so what? Does the fact that it uses more power and may be less efficient mean it's a bad chip? Not IMO, if it's the fastest for my cash.


----------



## 15th Warlock (Nov 4, 2021)

weekendgeek said:


> From what I've read also you should be good to go.  The board I was willing to spend the money on (read cheap) said it required a LGA1700 specific bracket.
> 
> The RAM is something I've been trying to decide if I should buy, and I don't really need it, but...  you know.
> 
> Keep us posted on the 12900k - excited to hear some first hand experience.



Maybe your cooler manufacturer can send you the bracket? 

Will definitely do! As soon as I can find some RAM to go with it


----------



## docnorth (Nov 4, 2021)

15th Warlock said:


> It's a very interesting discussion, and nice to see Intel back in the game, competition is what we all need.
> 
> I was able to nab a 12900K and an Asus Strix Z690 motherboard, but I still haven't had any luck finding DDR5 modules, hope retailers can restock those soon!
> 
> ...


Probably yes, for a new office PC, but it seems like that's 2H 2022. I don't think I'll have the luxury of waiting for Raptor Lake (or Zen 4).



birdie said:


> At 125W 12900K is quite competitive though it's beaten by 5950X at just 88W. Looks like Intel wanted to achieve maximum performance at any cost :-( Let's see what Meteor Lake will bring.
> 
> View attachment 223689
> 
> ...


Yesterday we saw a last-minute CB23 "leak" before the official reviews. It seems you can set PL1 to 150 W and lose almost nothing, even in MT productivity work. Plus, you can reuse your air cooler.


----------



## Flanker (Nov 5, 2021)

Actually, power consumption is a fair point. It doesn't really affect me that much, unless it's so bad that I'm forced to upgrade my cooler.


----------



## ir_cow (Nov 5, 2021)

Working on a DDR5 memory review right now: 6400 down to 5200. I'll throw in DDR4-3600, 4400 and 5100 if I can. I'm having to retest a few things, which is causing the delay.


----------



## 15th Warlock (Nov 5, 2021)

For those interested in using their existing Corsair AIO cooler, you can purchase an LGA1700 retrofit kit straight from them; it's only $2.99 with free shipping:

LGA1700 Retrofit Kit - www.corsair.com (a kit of standoffs that lets you reuse your existing LGA1200/LGA115x retention brackets and backplates from Corsair AIO coolers on socket LGA1700)






Best Buy just billed my card for the order, so I'm just waiting for the CPU to ship. I had a $350 gift card, so it was only $322 for the 12900K.


----------



## freeagent (Nov 5, 2021)

Why did you scribble out the shipping addy?


----------



## tabascosauz (Nov 5, 2021)

docnorth said:


> Yesterday we saw a last-minute CB23 "leak" before official reviews. It seems you can set PL1 to 150w and lose almost nothing, even for MT productivity work. Plus you can reuse your air cooler.



tbh ADL looks unexpectedly impressive. Yeah, it's hot at 250 W, but that's like 4.9 GHz all-core @ 1.3 V at the same ridiculous density as N7. Running 4.9 GHz @ 1.3 V on a 5800X wouldn't really be any easier on thermals and power, while being quite a bit slower all-core - not to mention I'm pretty sure 4.9 @ 1.3 V would already be asking way too much of the TSMC N7 volt-frequency curve. Also, that 12600K is looking to really shake up the meta.

Intel could have reined it in a bit and still been fast, but I guess they wanted to stick it to the 5950X stock for stock. You can probably cut a bit from the PL and watch the temps come way down, and the Gracemont cores will still mitigate the performance impact.

Not to mention, looking forward to seeing what people can do with undervolting. Love how simple it is to UV an Intel.


----------



## oxrufiioxo (Nov 5, 2021)

freeagent said:


> I am watching the GN video now, nice! Looks like Intel is back
> 
> I will grab a next gen unit and enjoy what I have for 1 more year



Probably best to wait for Meteor Lake... Raptor Lake seems to only add more E-cores.


----------



## freeagent (Nov 5, 2021)

Zen 3 was a heck of a lot of fun to play with, but these look pretty fun too 

It will be nice to see what people can do with them once memory is available.



oxrufiioxo said:


> Probably best to wait for Meteorlake.... Raptorlake seems to only add more E cores


Yessir, I'm still content with my system; I'll wait till ML before I buy again. Good DDR5 has to be out by then, because I'll be using it for a while.


----------



## Flanker (Nov 5, 2021)

Thinking about this, having E- and P-cores makes things really tricky for the OS. I would wait for the software side to catch up before recommending them.


----------



## tabascosauz (Nov 5, 2021)

Flanker said:


> Thinking about this, having E and P cores makes it really tricky for the OS. I would wait for things on the software side to catch up before recommending them.



I think I might have been wrong initially about how the switching works. From reviews, it doesn't seem to behave like Lakefield, so there's not nearly as much switching going on à la AMD (where load is juggled around every few seconds). At least the idea is that loads that should be on P-cores will be planted firmly on P-cores, because Thread Director is helping the scheduler at the hardware level. From AT:



> Intel has said that Windows 11 does all of this. The only thing Windows 10 doesn’t have is insight into the efficiency of the cores on the CPU...Intel says the net result of this will be seen only in run-to-run variation: there’s more of a chance of a thread spending some time on the low performance cores before being moved to high performance, and so anyone benchmarking multiple runs will see more variation on Windows 10 than Windows 11. But ultimately, the peak performance should be identical.



So if there's one niche program that Thread Director hasn't yet been taught to handle, it might still get stuck on the E-cores. But that's less likely on Win 11, it would be a very visible and obvious bug, and I'd imagine Intel has left itself a means to update Thread Director's behaviour through microcode, like AMD does with AGESA.

It all sounds like less random core switching, so more consistent performance than AMD?
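If a niche program does end up stuck on the E-cores, you can also override the scheduler yourself. A hedged Linux sketch: the P-core/E-core numbering is an assumption for illustration (on a 12900K the 8 hyper-threaded P-cores are commonly enumerated as logical CPUs 0-15 and the E-cores as 16-23; verify with `lscpu --extended` on your own system).

```python
import os

# Assumed mapping: logical CPUs 0-15 = P-cores (check lscpu on your box).
P_CORES = set(range(16))

def pin_to_p_cores(pid=0):
    """Pin a process (pid 0 = the calling process) to the assumed P-cores."""
    available = os.sched_getaffinity(pid)
    # Intersect with what's actually online; fall back if the assumed
    # numbering doesn't match this machine.
    target = (P_CORES & available) or available
    os.sched_setaffinity(pid, target)
    return target

pin_to_p_cores()
```

On Windows the equivalent is a process affinity mask (Task Manager or `start /affinity`); Thread Director should make this unnecessary in the common case.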


----------



## lexluthermiester (Nov 5, 2021)

Jay has some interesting perspectives.


----------



## Psychoholic (Nov 5, 2021)

Picked up a 12900K and Asus ROG STRIX Z690-A today at microcenter. 
Went with a DDR4 board since I already have 32 GB of good DDR4 memory, and according to reviews the currently available DDR5 doesn't make much difference.

PSA: my Noctua U12A wouldn't work with this board; I had to buy an AIO due to the height of the VRM heatsinks.

Thankfully, Asus uses oblong holes, so LGA1200 coolers will mount.


----------



## Hyderz (Nov 5, 2021)

I'm much jelly that some members are able to get a system with the new CPU and DDR5. I want to upgrade, but I can't justify it coming from my current system.


----------



## docnorth (Nov 5, 2021)

tabascosauz said:


> I think I might have been wrong initially about how the switching works. From reviews it doesn't seem like it behaves like Lakefield, so there's not nearly as much switching going on a la AMD (where load is juggled around every few seconds). At least the idea is that loads that should be on P-cores will be planted firmly on P-cores because Thread Director is helping out the scheduler on a hardware level. From AT:
> 
> 
> 
> ...


Actually, you had a point. As @W1zzard shows in the 12900K review, the hybrid architecture messes with 4 CPU tests. 3 of them (i.e. wPrime, machine learning - image classification - and database - MySQL) see a huge performance increase just from disabling the E-cores. Probably most of these issues will be resolved soon.


----------



## dyonoctis (Nov 5, 2021)

birdie said:


> At 125W 12900K is quite competitive though it's beaten by 5950X at just 88W. Looks like Intel wanted to achieve maximum performance at any cost :-( Let's see what Meteor Lake will bring.
> 
> View attachment 223689
> 
> ...


That also shows that the E-cores are useful... if you aren't a pure gamer. I'm having doubts about Golden Cove's ability to scale beyond 8 P-cores without having to severely lower the all-core boost. And Raptor Lake will add more E-cores while the P-core count stays the same...


----------



## Zyll Goliat (Nov 5, 2021)

Well... considering the higher price of DDR5 and Z690 motherboards, and the somewhat higher power consumption, but nevertheless really good performance, all I can say is...


----------



## Bomby569 (Nov 5, 2021)

Zyll Goliat said:


> Well... considering the higher price of DDR5 and Z690 motherboards, and the somewhat higher power consumption, but nevertheless really good performance, all I can say is...



I didn't have time to watch much - I only saw the Linus video - but at least for gaming the power consumption is about the same, and that's what matters most to me. Intel has a winner in the 12600; too bad DDR5 and the price of the new boards make it a stupid purchase.


----------



## birdie (Nov 5, 2021)

So, having read quite a lot of reviews, here are some pertinent and important conclusions:

- Intel has pushed their P-cores to extremes in MT scenarios in order to be the absolute performance king: at least faster than the 5900X, and rivaling the 5950X in many cases. This results in insane power consumption out of the box, but only for heavy MT tasks, e.g. video encoding, rendering, software compilation, math calculations - not something average people do daily.
- This extreme power consumption does not translate into everyday scenarios like modestly threaded applications or games - in fact, many reviewers show that ADL CPUs are among the most power efficient in games. Igor's Lab, AnandTech and computerbase.de have shown that limiting their TDP to 125 W or even lower does not meaningfully affect frame rates.
- It seems very likely that if the P-cores' maximum frequency were decreased by just 200-300 MHz, their efficiency would be incredible.
- Factory OC'ing is not new: NVIDIA, AMD and Apple have been doing it for at least a couple of years, and no one is crying foul over that.

*TLDR*: Overall, ADL CPUs are great, save for an extreme factory OC in heavy MT scenarios, which can easily be mitigated by lowering the PL1 limit in the BIOS. At the moment the only real issue is the price of the platform: even though the CPUs themselves are competitively priced, you need a quite expensive motherboard, DDR5 RAM (the faster the better) and a decent cooling solution (preferably an AIO).

Too many reviewers are fishing for views and ad revenue, so loud, disparaging headlines which aren't necessarily representative of the real world are their way of achieving that, which is quite sad.


----------



## TheoneandonlyMrK (Nov 5, 2021)

birdie said:


> So, having read quite a lot of reviews here are some pertinent  and important conclusions:​
> Intel has pushed their P-cores for MT scenarios to extremes to be the absolute performance king and at least be faster than 5900X and rival 5950X in many cases. This results in an insane power consumption out of the box but only for heavy MT tasks, e.g. video encoding, rendering, software compilation, math calculations - not something average people do daily.
> This extreme power consumption does not translate into every day scenarios like modestly threaded applications or games  - in fact many reviewers show that ADL CPUs are the most power efficient in games. Igor's Lab, AnandTech and computerbase.de have shown that limiting their TDP to 125W or even lower does not meaningfully affect frame rates.
> It seems very likely that if P-cores maximum frequency is decreased by just 200-300MHz their efficiency will be incredible.
> ...


While some users see only what they want to, from only their own perspective, while saying everyone else is wrong... hmm.


----------



## bug (Nov 5, 2021)

phanbuey said:


> It's 10nm still... I mean I know they're calling it 7 or whatever but it's a 10nm lithography, not going to be as power efficient as zen 3 at 7nm.  But honestly an amazing step forward in performance either way.


It's more or less the same as Zen's 7nm. That's why they renamed their process: similar transistor densities. For years TSMC and Samsung named their processes to make it look like they were on par with Intel, when in reality their density was always one step behind. Of course, this was all before Intel's 10nm "smashing success".



birdie said:


> So, having read quite a lot of reviews here are some pertinent  and important conclusions:​
> Intel has pushed their P-cores for MT scenarios to extremes to be the absolute performance king and at least be faster than 5900X and rival 5950X in many cases. This results in an insane power consumption out of the box but only for heavy MT tasks, e.g. video encoding, rendering, software compilation, math calculations - not something average people do daily.
> This extreme power consumption does not translate into every day scenarios like modestly threaded applications or games  - in fact many reviewers show that ADL CPUs are the most power efficient in games. Igor's Lab, AnandTech and computerbase.de have shown that limiting their TDP to 125W or even lower does not meaningfully affect frame rates.
> It seems very likely that if P-cores maximum frequency is decreased by just 200-300MHz their efficiency will be incredible.
> ...


This is what AMD used to do with their GPUs: push them to insane power requirements just to be able to claim they're on par with Nvidia. It's how underclocking/undervolting became a thing with AMD owners.

There's a lot more to learn about AL (e.g. a CPU that draws more power, but finishes a task quicker, may still use less energy than a CPU that uses less power, but for longer). But the simple fact is that fully loaded, out-of-the-box, AL burns through a lot of power. On the upside, using AVX512 _doesn't_ result in even more power burnt. It stays within the same limits, which is a first for AVX512.
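To put some made-up numbers on that power-vs-energy distinction (the wattages and run times below are purely illustrative, not measured figures):

```python
# Energy (J) = average power (W) x time (s).
# Hypothetical figures: chip A draws more power but finishes sooner.
def energy_joules(watts: float, seconds: float) -> float:
    return watts * seconds

chip_a = energy_joules(240, 60)    # 14400 J: fast but hungry
chip_b = energy_joules(140, 110)   # 15400 J: frugal but slow

print(chip_a < chip_b)  # True: the hungrier chip used less total energy
```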

If I have a bone to pick with AL, it's the scheduler. Intel only went for Win11 support and apparently even that isn't foolproof yet. Win10 support falls well short (it can be worked around manually, but that's subpar). And Linux patches haven't even begun to land, which is very uncharacteristic for Intel.

That said, I still think a 12600K with a lower-specced mobo (or even better, a 12600 with an H mobo) is/will be great value for the money.


----------



## birdie (Nov 5, 2021)

bug said:


> It's more or less the same as Zen's 7nm. That's why they renamed their process, similar transistor densities. For years TSMC and Samsung named their processes to make it look like they were on par with Intel, where in reality their density was always one step behind. Of course, this all was before Intel's 10nm "smashing success".
> 
> 
> This is what AMD used to do with their GPUs: push them to insane power requirements just to be able to claim they're on par with Nvidia. It's how underclocking/undervolting became a thing with AMD owners.
> ...



I'm 100% sure Windows 10 support will arrive sooner or later, after all the OS will be supported until 2029. For early ADL adopters who insist on running this CPU along with Windows 10 there are at least two solutions:
1) Pinning applications to fast cores (which can be done automatically since there are utilities for that, e.g. dAffinity, Bill's Processor Manager, Process Lasso or even command line/lnk editing - affinity mask calculator).
2) Disabling E-cores altogether.

No big deal as far as I can see.
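For what it's worth, the affinity mask those utilities compute is just a bitmask of logical CPUs. A minimal sketch, assuming the 12900K's P-core threads enumerate first as logical CPUs 0-15 (the actual ordering can vary by board/BIOS, so check before pinning):

```python
# Build an affinity mask covering logical CPUs 0-15
# (assumed here to be the P-core threads on a 12900K; E-cores would be 16-23).
mask = 0
for cpu in range(16):
    mask |= 1 << cpu

print(hex(mask))  # 0xffff
# e.g. usable from cmd as:  start /affinity FFFF myapp.exe
```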


----------



## Bomby569 (Nov 5, 2021)

birdie said:


> So, having read quite a lot of reviews here are some pertinent  and important conclusions:​
> Intel has pushed their P-cores for MT scenarios to extremes to be the absolute performance king and at least be faster than 5900X and rival 5950X in many cases. This results in an insane power consumption out of the box but only for heavy MT tasks, e.g. video encoding, rendering, software compilation, math calculations - not something average people do daily.
> This extreme power consumption does not translate into every day scenarios like modestly threaded applications or games  - in fact many reviewers show that ADL CPUs are the most power efficient in games. Igor's Lab, AnandTech and computerbase.de have shown that limiting their TDP to 125W or even lower does not meaningfully affect frame rates.
> It seems very likely that if P-cores maximum frequency is decreased by just 200-300MHz their efficiency will be incredible.
> ...



Very much in line with Linus's review; I guess I watched the right one. Since he got that big guy (sorry, I don't know his name) his content has been really good.


----------



## ViperXTR (Nov 5, 2021)

Unreleased Intel Core i5-12400F CPU could offer Ryzen 5 5600X performance at half the price, shows early review - VideoCardz.com

Intel Core i5-12400F with 6-cores and 12-threads tested, a matching opponent for Ryzen 5 5600X? Yesterday Intel launched as many as six 12th Gen Core K-Series unlocked CPUs with TDP at 125W. The mid-range segment known as the non-K series with TDP at 65W is not to launch till January next year...

videocardz.com



Pair this with B660 and some decent DDR4 and you get the best budget gaming build.


----------



## Khonjel (Nov 5, 2021)

I think Intel won't be happy with the sales figures with Alder Lake.

1) Needs Windows 11, which some people won't upgrade to because of stability and quality concerns, perceived or otherwise.
2) While Alder Lake's gaming power consumption is comparable to and not very far from Zen 3's, the damage is already done, methinks. People already think Alder Lake is hot and power-hungry while barely edging out Zen 3.
3) Generally unfavourable component cost: GPU, DDR5, motherboard, better cooler and PSU (because of the hot and power-hungry perception).
4) New architecture weirdness. Though I will admit, if people could stick it out through Ryzen 1000 weirdness, they'll stick with Intel too. Then again, I personally know a few people who switched from AMD to Intel because of Ryzen 1000 and 2000 series weirdness.

Intel's got an uphill battle. Though by sheer volume alone it'll trounce AMD anyway.

While I've got no bet on this race, IMO the only silver lining is game publishers/developers removing the tumor that is Denuvo, because it isn't compatible with Alder Lake. I bet they wouldn't have bothered if AMD had been the one to introduce big.LITTLE and faced the same issue.


----------



## bug (Nov 5, 2021)

birdie said:


> I'm 100% sure Windows 10 support will arrive sooner or later, after all the OS will be supported until 2029. For early ADL adopters who insist on running this CPU along with Windows 10 there are at least two solutions:
> 1) Pinning applications to fast cores (which can be done automatically since there are utilities for that, e.g. dAffinity, Bill's Processor Manager, Process Lasso or even command line/lnk editing - affinity mask calculator).
> 2) Disabling E-cores altogether.
> 
> No big deal as far as I can see.


If it were that simple, MS wouldn't have built a new scheduler for Win11 
Pinning may work in some cases, but it won't in others. E.g. if I pin my browser to the P-cores, tabs I'm not watching can't be moved to the E-cores. Not a deal breaker, but a subpar experience for sure.


----------



## birdie (Nov 5, 2021)

bug said:


> If it were that simple, MS wouldn't have built a new scheduler for Win11
> Pinning may work in some cases, but it won't in others. E.g. If I pin my browser the the P core, tabs I don't watch can't be moved to the E cores. Not a deal breaker, but a subpar experience for sure.


It _is_ simple when you know which applications should run on which cores. The real issue is doing that _automatically/intelligently_ without either wasting watts unnecessarily or slowing everything down. 

Even for Windows 11 it's _not_ all rosy yet: I've seen reviews where e.g. databases were tested and ran far slower than on RKL/CML/Zen 3 CPUs because Windows 11 decided to run them, as a background task, on the E-cores.


----------



## bug (Nov 5, 2021)

birdie said:


> It _is_ simple when you know which applications should run on which cores.


I gave you an example above: same app, and you want it to run on both kinds of cores, depending on the circumstances. Not simple.


----------



## RandallFlagg (Nov 5, 2021)

birdie said:


> At 125W 12900K is quite competitive though it's beaten by 5950X at just 88W. Looks like Intel wanted to achieve maximum performance at any cost :-( Let's see what Meteor Lake will bring.
> 
> View attachment 223689
> 
> Here's in-depth coverage from computerbase.de.



Here's the problem with Computerbase.de's report - crippled DDR5 that no early adopter will actually run:


----------



## chrcoluk (Nov 5, 2021)

Khonjel said:


> I think Intel won't be happy with the sales figures with Alder Lake.
> 
> 1) Needs Windows 11 which though not many but few people won't upgrade to because of perceived or otherwise stability and quality concerns.
> 2) While Alder Lake's gaming power consumption is comparable and not very far from Zen 3, the damage is already done methinks. People already think Alder Lake is hot and power hungry which barely edges out Zen 3.
> ...


I don't know where the "needs Windows 11" comes from; apparently the scheduler will help, but it doesn't need Windows 11. I think TechPowerUp are doing a Windows 10 review soon, and Club386 have already tested it on Windows 10.

I think 241W for a CPU released in 2021 is bad news; the PC industry is going against the current eco meta elsewhere. Looking at the power data with the E-cores disabled, they don't actually seem to save power - they're just low-performance cores given a good marketing name.

Agreed on the component cost: the motherboard prices are completely unacceptable, and will DDR5 ever hit DDR4 pricing levels?

Agree also on the architecture changes. Windows 11 is the best-case scenario; you have to account for older and non-Windows OSes as well.


----------



## Bomby569 (Nov 5, 2021)

chrcoluk said:


> I think 241W for a cpu released in 2021 is bad news, the PC industry is going against the current eco meta elsewhere. Looking at the power data for when e-cores were disabled they dont actually seem to be saving power, but rather they just low performance cores given a good marketing name.



In a normal use scenario they use as much power as a 5950X and are more powerful. Only in some specific tasks is the power usage higher than the 5950X's. Power is not a problem for 90% of the people with Alder Lake.


----------



## RandallFlagg (Nov 5, 2021)

birdie said:


> I'm 100% sure Windows 10 support will arrive sooner or later, after all the OS will be supported until 2029. For early ADL adopters who insist on running this CPU along with Windows 10 there are at least two solutions:
> 1) Pinning applications to fast cores (which can be done automatically since there are utilities for that, e.g. dAffinity, Bill's Processor Manager, Process Lasso or even command line/lnk editing - affinity mask calculator).
> 2) Disabling E-cores altogether.
> 
> No big deal as far as I can see.



You probably already know this, but the capability is already there on Win 10; a few registry tweaks enable it. OFC that is probably not 'supported'.

There are already some issues with the scheduler on some games using AL though. OFC, you can disable the E-cores on AL with the Scroll Lock button on some motherboards, so you don't even have to go into the BIOS.

It's worth noting that this type of thing is clearly lowering Alder Lake's comparative performance scores right now.  As patches and fixes are deployed, it's going to get faster in the aggregated results.  

Just take a look at the wPrime benchmark here, or the bizarre MS Flight Sim results at other sites.  These things will get fixed and I think a lot of sites will need to revisit their reviews in a few months.  

Given the number of new techs here though - big.LITTLE, PCIe 5, entirely new core uArch (in fact, two new uArchs on one chip), a chipset that is actually quite changed from previous gen, I'm surprised at this point that more problems have not come up.


----------



## Khonjel (Nov 5, 2021)

Bomby569 said:


> In a normal use scenario they use as much power as a 5950x and are more powerfull. Only if you are doing some specific tasks the power usage is higher then the 5050x. Power is not a problem for 90% of the people with alder lake.


See? This is what I'm talking about. Alder Lake only burns its undies in heavy non-gaming tasks, but the wrong first impression has already been spread around.

And @chrcoluk, Windows 11 IS needed for Alder Lake, unless all you're gonna do is gaming, which isn't that much affected tbh.








Start at 6:41. TPU still can't do YT timestamps.

Seriously, there are so many hoops to jump through to get the best out of Alder Lake. I can see only dedicated fanboys and "enthusiasts" who want the current year's best of the best clamoring to buy them.


----------



## chrcoluk (Nov 5, 2021)

Khonjel said:


> See? This is what I'm talking about. Alder Lake only burns its undies in non-gaming heavy tasks but wrong first impression has already been spread about.
> 
> And @chrcoluk Windows 11 is NEEDED for Alder Lake unless all you're gonna do is gaming which isn't that much affected tbh.
> 
> ...


This is why I wanted a Windows 10 review 

So Windows 10 works, albeit with a significant performance hit; I suppose Windows 10 users can disable the E-cores and treat it as an 8/16 CPU.

In regards to the power, the media does tend to over-represent content-creation workloads, but those workloads shouldn't be ignored completely; even I have started doing CPU-based encoding now, as I record a lot of my gaming using x264 mode in OBS.


----------



## Deleted member 24505 (Nov 5, 2021)

birdie said:


> So, having read quite a lot of reviews here are some pertinent  and important conclusions:​
> Intel has pushed their P-cores for MT scenarios to extremes to be the absolute performance king and at least be faster than 5900X and rival 5950X in many cases. This results in an insane power consumption out of the box but only for heavy MT tasks, e.g. video encoding, rendering, software compilation, math calculations - not something average people do daily.
> This extreme power consumption does not translate into every day scenarios like modestly threaded applications or games  - in fact many reviewers show that ADL CPUs are the most power efficient in games. Igor's Lab, AnandTech and computerbase.de have shown that limiting their TDP to 125W or even lower does not meaningfully affect frame rates.
> It seems very likely that if P-cores maximum frequency is decreased by just 200-300MHz their efficiency will be incredible.
> ...



So all the anti-Intel bullshit about massive power consumption was just the usual anti-Intel TPU crap; well, no surprise for me there. Well done Intel on your comeback.


----------



## bug (Nov 5, 2021)

RandallFlagg said:


> Here's the problem with Computerbase.de's report - crippled DDR5 that no early adopter will run :
> 
> View attachment 223836


It's not that. As Anand explained, DDR5 can only stretch its legs in memory intensive _and_ highly threaded workloads. I.e. mostly science stuff, not games.


Khonjel said:


> Seriously it's so much hoops needed jumping to get the best out of Alder Lake. I can see only dedicated fanboys and "enthusiasts" who want the current year's best of the best clamoring to buy them.


There are models in the pipeline that are all P-cores. If you don't like hoops, that is.
Or, as pointed out above, get a motherboard that will disable the E-cores at the press of ScrollLock.

Tbh, AL has brought Intel back into the game and offers some solid choices (12600K and 12600KF), but it's not a must-have or a game-changing contender. Intel should have sat on it one or two more quarters and figured out the scheduler, instead of bringing it out in the form it is today.

On the other hand, AL changes so much at once - CPU arch, DDR, PCIe and Thunderbolt support, turbo handling (generally a big engineering no-no) - that it's surprising it hasn't turned out even more of a mess.



Tigger said:


> So all the anti Intel bullshit about massive power consumption was just the usual anti Intel TPU crap, well no surprise for me there. Well done Intel on your comeback.


How was it crap? Fully loaded, the 12900K burns through a lot of power - indefinitely for K chips. The fact that not all workloads fully load the chip is not a power-saving feature.


----------



## windwhirl (Nov 5, 2021)

Why Can Digital Rights Management (DRM) Crash Some Games on 12th Gen...

Temporary workaround for games with DRM software that can’t recognize 12th Gen Intel® Core™ Processor, causing games to crash or not load

www.intel.com

[Updated] Games Updated for DRM Issue with 12th Gen Intel®...

Lists games and OSs affected, provides release map for fixes for games with DRM software that can’t recognize 12th Gen Intel® Core™ Processors.

www.intel.com




Intel's articles regarding DRM on Alder Lake and games compatibility. 

Be aware, compatibility is even more broken on Windows 10.


----------



## Bomby569 (Nov 5, 2021)

Regarding W11 being almost a requirement to get the best out of it, I think it's fair tbh. It's new tech, MS is moving on, and they probably don't want the trouble of doing double work on 10 and 11.


----------



## bug (Nov 5, 2021)

Bomby569 said:


> Regarding the w11 almost requirement to get the best out of it, i think it's fair tbh, it's new tech, MS is moving on and probably don't want to have the trouble of doing double work on 10 and 11.


It's fair from a technical point of view. It's unfortunate when you have been behind for years and are looking to move as many SKUs as possible.


----------



## RandallFlagg (Nov 5, 2021)

bug said:


> It's not that. As Anand explained, DDR5 can only stretch its legs in memory intensive _and_ highly threaded workloads. I.e. mostly science stuff, not games.



So are you saying that the use of DDR5-4400 didn't cripple their DDR5 Alder Lake setups? Because that is what I said. The low speed of the DDR5 they used affects latency too, since real latency in nanoseconds is a function of both CL (clocks) and speed (clocks per second), and it does affect their FPS results significantly. That's a 10% drop in FPS vs a tuned DDR4-3200 C12 setup.
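The arithmetic behind that: true CAS latency in nanoseconds is CL cycles divided by the command clock (half the data rate), i.e. 2000 x CL / (MT/s). A quick sketch - the DDR5 CL of 36 below is my own assumption for illustration, not a figure from any review:

```python
def cas_latency_ns(cl: int, data_rate_mts: int) -> float:
    # CL cycles / command clock; command clock (MHz) = data rate (MT/s) / 2
    return 2000 * cl / data_rate_mts

print(cas_latency_ns(12, 3200))            # 7.5 ns for tuned DDR4-3200 C12
print(round(cas_latency_ns(36, 4400), 2))  # 16.36 ns for DDR5-4400 CL36 (assumed CL)
```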


----------



## chrcoluk (Nov 5, 2021)

Bomby569 said:


> In a normal use scenario they use as much power as a 5950x and are more powerfull. Only if you are doing some specific tasks the power usage is higher then the 5050x. Power is not a problem for 90% of the people with alder lake.


Just watched it. I think he overplays PCIe Gen 5 when he says it will allow future GPUs to be fully utilised; Gen 4 makes no difference to the RTX 3000 series vs Gen 3. It's one thing for a manufacturer to overplay it, but a reviewer should know better.


----------



## Bomby569 (Nov 5, 2021)

bug said:


> It's fair from a technical point of view. It's unfortunate when you have been behind for years and are looking to move as many SKUs as possible.



It was their way to get ahead in the race; without those small cores they wouldn't stand a chance.

But I still don't see that as much of a problem (it's most certainly not a deal breaker). I mean, I haven't upgraded, but from what I've seen W11 is pretty good apart from some bugs, and they aren't at the point of destroying your experience. You can live with it.


----------



## bug (Nov 5, 2021)

RandallFlagg said:


> So are you saying that the use of DDR5-4400 didn't cripple their DDR5 Alder Lake setups?   Because that is what I said, and the low speed of DDR5 they used does affect latency too since the real latency in nanoseconds is a function of the CL (clocks) and speed (clocks per second), and it does affect their FPS results significantly.  That's a 10% drop in FPS vs a tuned DDR4-3200 C12 setup.


Cripple? Not by a long shot. Anand didn't see much difference between DDR4-3200 and DDR5-4800 (those are the officially supported speeds, sans overclocking), except for highly multithreaded stuff.
In other words, in the absence of very intensive memory access, DDR4 already offers all the bandwidth you need.



Bomby569 said:


> It was the way to get a ahead of the race, with those small cores with wouldn't stand a change.
> 
> But i still don't see that as much of a problem (it is must certanly not deal breaker), i mean i haven't upgraded but from what i've seen W11 is pretty good apart of some bugs, but they aren't at the point of destroying your experience. You can live with it.


Not disputing any of that, but all those hitches are enough for everyone to remember AL launch as "rushed" or "half-assed". Intel could have done without that.


----------



## tabascosauz (Nov 5, 2021)

Khonjel said:


> Seriously it's so much hoops needed jumping to get the best out of Alder Lake. I can see only dedicated fanboys and "enthusiasts" who want the current year's best of the best clamoring to buy them.



Why are _we_ complaining about possibly jumping through hoops? We're enthusiasts; we're not strangers to tweaking. Whether Alder Lake is the profitable holy grail for Intel is not my concern, and isn't anything I have control over. What I want is control over how my CPU behaves, to the best possible extent, as CPUs get more complex and more intertwined with software and firmware.

When the 11 scheduler forgot about Zen 3, I can only take it up the ass as games think Core 7 (quality ranked #11/12 lmao) on CCD2 is their "preferred core". No control.
Even in normal operation, I watch as Windows constantly juggles between Core 0 (junk) and Core 1, because Windows is still infatuated with Core 0. No control.
After -30 Curve Optimizer, Core 2 is honestly 5.0GHz capable basically all the time, but I can't use it because neither AMD nor Windows thinks it's "preferred". No control.
There is no control or alternative, because in most benches changing Core Affinity in Taskmanager makes the bench extremely unstable for repeated runs. CPU-Z doesn't even work, CB R23 is dubious but will certainly crash. Don't like it? Buy another 5900X in hopes of getting a better one, or live with it because AGESA doesn't change general CPPC behaviour beyond immediate post-launch fixes.

The obvious solution is for AMD to allow advanced users to edit the CPPC ranking, but that's a pipe dream and a half.

Here we have not 1 but *3 options* to prevent "focus switching", or load undesirably jumping to E-cores. AT reiterated this twice, once in the preview and once in the review:





And yet we're complaining about having some semblance of control? Granted, I understand that Thread Director isn't perfect at the moment - it's clearly visible in the occasional game/benchmark, and every reviewer has encountered at least one. But everything has to start somewhere, and ADL looks like it's starting from a better position than either Zen 2 or Zen 3 did at launch, firmware-wise.


----------



## The King (Nov 5, 2021)

For those who are on the fence about DDR4 vs DDR5 ADL boards, this may be useful.


----------



## RandallFlagg (Nov 5, 2021)

bug said:


> Cripple? Not by a long shot. Anand didn't see much difference between DDR4-3200 and DDR5-4800 (those are the officially supported speeds, sans overclocking), except for highly multithreaded stuff.
> In other words, in the absence of very intensive memory access, DDR4 already offers all the bandwidth you need.



I will tell ya, I have never been impressed with Cutress.  He is so wrapped up in his spec marks, he just makes assumptions based on his theories and runs with them, never bothering to check if he is right or not.  

So one of the reasons I like TPU is because they benchmark apps that, in my experience, *most* people actually use.  

AnandTech does not.  They make stupid assumptions that lead to comments like the one you just made.

For the record, millions of people use spreadsheets every single day.  Many of these are highly complex, with hundreds of thousands of cells, dozens of graphs, images and scripts.  I've known people who basically made a career out of knowing how to make stuff like that.

The below are not uncommon use cases. They're fairly normal for power users. And there are a fuck-ton of people using apps like this, as opposed to 'using' specmarks and POV-Ray.

Empirical facts. Tom's used DDR4-3200 and DDR5-4800, similar to AT.

You are looking at >10% on a spreadsheet script due to DDR4 vs DDR5. This particular spreadsheet workload _*likes bandwidth*_:






Oh but wait, *it can also be latency sensitive* :


----------



## Deleted member 24505 (Nov 5, 2021)

The King said:


> For those who are on the fence about DDR4 vs DDR5 ADL boards. This maybe useful.



So there are possibly advantages to going DDR5, though it might be worth waiting till next year when higher-speed, lower-latency kits will be available. I would deffo go Z690/DDR5 now, given the cash, and sit on the slow sticks till faster ones become available to pop straight in.


----------



## RandallFlagg (Nov 5, 2021)

Tigger said:


> So there is possibly advantages to going DDR5, though might be worth waiting till next year when higher speed low latencie kits will be available. I would deffo go Z690/DDR5 now given the cash, sit on the slow sticks till faster ones become available to pop straight in.



DDR5 is out of stock now though. Pretty much everything I see is 4800 and 5200, and it's all gone.


----------



## GerKNG (Nov 5, 2021)

I have my hands on a 12600K + Aorus Pro DDR4.

Am I the only one who is having insane problems with this platform?
3200MT/s RAM (2x16GB) CL16 does not even POST.
The BIOS that was flashed was from August and was not able to boot into anything (I had to flash it before I could even install Windows).

The whole PC is extremely unstable (memtest errors at 2933 Gear 1); only 3000 Gear 2 works somewhat okay.

The E-cores are not used in benchmarks like Cinebench, and I have to reset the BIOS every fifth boot...

Alder Lake is barely functional in this state.


----------



## RandallFlagg (Nov 5, 2021)

GerKNG said:


> i have my hands on a 12600k + Aorus Pro DDR4.
> 
> am i the only one who has insane problems with this platform?
> 3200mb/s RAM (2x16GB) CL16 does not even POST.
> ...



Suggest you check here; you'll find a lot of people testing ADL hands-on. Too much AMD at TPU:









						Overclocking ADL - 12900k etc results, bins and discussion
					





					www.overclock.net


----------



## Psychoholic (Nov 5, 2021)

I plan on keeping my 12900K for a while; if need be, in a year or two I'll switch over to a DDR5 board when the FAST DDR5 is out. I assume there will be pretty huge improvements in DDR5 over the course of a year or 18 months.


----------



## RandallFlagg (Nov 5, 2021)

?

Edit: NVM, AIDA has a BCLK-related bug with AL


----------



## tabascosauz (Nov 5, 2021)

GerKNG said:


> am i the only one who has insane problems with this platform?
> 3200mb/s RAM (2x16GB) CL16 does not even POST.
> the Bios that was flashed was from August and was not able to boot into anything (had to flash it before i could even install windows)



Gigabyte got hit by ransomware in August. Basically all the BIOSes on my B550I Aorus AX went to shit after F12 or so, in that time frame. Customer support was completely offline until like October.

Gigabyte apparently got hit again by the same group in late October. I have a soft spot for Gigabyte boards, but I'd stay away until they actually emerge from this hellhole. Don't expect their boards to be ready.


----------



## bug (Nov 5, 2021)

RandallFlagg said:


> I will tell ya, I have never been impressed with Cutress.  He is so wrapped up in his spec marks, he just makes assumptions based on his theories and runs with them, never bothering to check if he is right or not.
> 
> So one of the reasons I like TPU is because they benchmark apps that, in my experience, *most* people actually use.
> 
> ...


I fail to see the contradiction. Some workloads are impacted, some are not. What Anand says is that most of them fall into the latter category. I'll believe that until more reviews come out saying otherwise.


----------



## birdie (Nov 5, 2021)

A nice compilation from 3dcenter.org



Games (CPU limit), performance relative to the Core i9-12900K (= 100%). The ten CPUs compared: 11600K (6C RKL), 11700K (8C RKL), 11900K (8C RKL), 5600X (6C Zen 3), 5800X (8C Zen 3), 5900X (12C Zen 3), 5950X (16C Zen 3), 12600K (6C+4c ADL), 12700K (8C+4c ADL), 12900K (8C+8c ADL). Each site tested a subset of the ten, hence the varying number of entries per row:

- AnandTech (8T, 1080p 95th): 86.2% / 89.3% / 88.6% / 87.9% / _100%_
- CapFrameX (10T, 720p avg): 87.3% / 89.9% / 88.8% / _100%_
- ComputerBase (9T, 720p avg): 78.9% / 91.6% / 87.4% / 90.5% / 93.7% / 94.7% / 90.5% / 94.7% / _100%_
- Eurogamer (8T, 1080p 5%): 67.8% / 75.3% / 75.9% / 82.0% / 89.0% / _100%_
- Gamers Nexus (7T, 1080p 1%): 87.3% / 93.8% / 85.8% / 90.4% / 91.4% / 91.4% / _100%_
- Golem (8T, 720p P1): 87.0% / 82.1% / 84.6% / _100%_
- Hardwareluxx (4T, 720p avg): 86.5% / 88.4% / 91.4% / 86.2% / 88.6% / 88.7% / 88.5% / 92.2% / _100%_
- Igor's Lab (10T, 720p 99th): 76.9% / 81.3% / 88.4% / 81.7% / 87.3% / 88.4% / 88.1% / 90.6% / 95.0% / _100%_
- Le Comptoir (11T, 1080p 1%): 72.8% / 76.4% / 79.9% / 80.7% / 85.0% / 86.8% / 87.9% / 93.1% / 97.0% / _100%_
- Linus Tech Tips (6T, 1080p 99th): 81.8% / 86.8% / 85.7% / 91.7% / 91.4% / 96.3% / _100%_
- Notebookcheck (9T, 720p avg): 86.7% / 92.3% / 95.5% / 98.9% / 99.6% / 95.4% / 89.2% / _100%_
- PC Games HW (20T, 720p avg): 75.2% / 87.1% / 80.0% / 82.9% / 87.4% / 91.1% / 88.8% / _100%_
- PC World (12T, 720p avg): 80.1% / 85.9% / 87.7% / 91.1% / 91.8% / _100%_
- SweClockers (5T, 720p 99th): 76.6% / 85.9% / 81.9% / 86.9% / 83.6% / 90.3% / _100%_
- TechPowerUp (10T, 720p avg): 81.2% / 84.5% / 86.6% / 85.5% / 89.4% / 90.4% / 89.6% / 93.7% / _100%_
- TechSpot (10T, 1080p 1%): 88.5% / 94.3% / 94.9% / _100%_
- Tom's Hardware (6T, 1080p 99th): 85.2% / 86.4% / 92.3% / 82.6% / 83.9% / 90.8% / 86.4% / 92.5% / _100%_

Averaged game performance and list price:

| | 11600K | 11700K | 11900K | 5600X | 5800X | 5900X | 5950X | 12600K | 12700K | 12900K |
|---|---|---|---|---|---|---|---|---|---|---|
| Avg. game perf. | 78.1% | 82.3% | 86.6% | 83.4% | 87.2% | 89.3% | 89.4% | 91.5% | 95.8% | 100% |
| List price | $237 | $374 | $519 | $299 | $449 | $549 | $749 | $264 | $384 | $564 |



| Game perf. | vs 11600K | vs 11700K | vs 11900K | vs 5600X | vs 5800X | vs 5900X |
|---|---|---|---|---|---|---|
| **Core i5-12600K** | +17.2% | +11.2% | +5.7% | +9.8% | +5.0% | +2.5% |
| **Core i7-12700K** | +22.7% | +16.5% | +10.7% | +15.0% | +9.9% | +7.3% |
| **Core i9-12900K** | +28.1% | +21.5% | +15.5% | +19.9% | +14.7% | +12.0% |


----------



## Operandi (Nov 5, 2021)

birdie said:


> Intel has pushed their P-cores for MT scenarios to extremes to be the absolute performance king and at least be faster than 5900X and rival 5950X in many cases. This results in an insane power consumption out of the box but only for heavy MT tasks, e.g. video encoding, rendering, software compilation, math calculations - not something average people do daily.


Is it just me, or is there something about the way Intel does things that allows their CPUs to scale frequency better at the extreme end of the power scale than AMD's? You can manually push all-core overclocks on AMD CPUs and power goes up, but you get very little from it; with Intel CPUs the power consumption becomes pretty impractical and more than a bit of a problem, but the performance seems to scale better. Not sure if this has been discussed elsewhere, but it's an interesting observation.

If AMD tweaked the Zen 3 core to scale to ADL levels of power, would they be pretty much at parity?


----------



## RandallFlagg (Nov 5, 2021)

birdie said:


> A nice compilation from 3dcenter.org
> 
> 
> 
> ...



That 12600K is just amazing.  I don't recall ever seeing a release from either AMD or Intel where their mid tier performance chip defeated the top tier performance of the last generation CPU in gaming overall.


----------



## GerKNG (Nov 5, 2021)

tabascosauz said:


> Gigabyte got hit by ransomware in August. Basically all the BIOSes on my B550I Aorus AX went to shit after F12 or so, in that time frame. Customer support was completely offline until like October.
> 
> Gigabyte apparently got hit again by the same group in late October. I have a soft spot for Gigabyte boards, but I'd stay away until they actually emerge from this hellhole. Don't expect their boards to be ready.


Do you really think all of these issues are because of the Gigabyte board?
Because it feels like I am testing Alder Lake the day after they made the first internal silicon.


----------



## bug (Nov 5, 2021)

Operandi said:


> Is it just me, or is there something about the way Intel does things that allows their CPUs to scale frequency better at the extreme end of the power scale than AMD's? You can manually push all-core overclocks on AMD CPUs and power goes up, but you get very little from it; with Intel CPUs the power consumption becomes pretty impractical and more than a bit of a problem, but the performance seems to scale better. Not sure if this has been discussed elsewhere, but it's an interesting observation.
> 
> If AMD tweaked the Zen 3 core to scale to ADL levels of power, would they be pretty much at parity?


Intel lets these unlocked CPUs use max power all the time (as opposed to, say, ~1 minute, which was the norm so far). It basically allows the CPU to overclock itself indefinitely, as long as the cooling can keep up. It's an easy way to extract every last bit of juice out of the box, with the downside that effective performance will now vary based on the cooling solution.
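The behavior described here matches Intel's documented PL1/PL2/tau scheme: instantaneous power is capped at PL2, while a moving average of power must stay under PL1; boards that set PL1 = PL2 (the new "max power all the time" default) never drop out of turbo. A minimal sketch, using the commonly reported 12900K limits (125 W / 241 W / 56 s) as illustrative values, not measured data:

```python
def simulate(pl1=125.0, pl2=241.0, tau=56.0, dt=0.1, duration=180.0):
    """Return the time (s) at which the moving-average power budget forces a
    sustained all-core load down from PL2 to PL1, or None if it never does."""
    ewma, t = 0.0, 0.0  # average power starts from idle
    while t < duration:
        if ewma >= pl1:                    # budget exhausted: clamp to PL1
            return t
        # instantaneous power runs at PL2 while budget remains; track the
        # exponentially weighted moving average with time constant tau
        ewma += (dt / tau) * (pl2 - ewma)
        t += dt
    return None

# with classic limits, full turbo lasts roughly 41 s before dropping to PL1
print(f"Turbo (PL2) lasts ~{simulate():.0f} s")
# with PL1 = PL2 = 241 W, the average never exceeds the limit -> no throttle
print(simulate(pl1=241.0, pl2=241.0, duration=60.0))
```

Under this toy model, raising PL1 to match PL2 removes the time limit entirely, which is exactly the "indefinite self-overclock" effect described above.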


----------



## PolRoger (Nov 5, 2021)

GerKNG said:


> i have my hands on a 12600k + Aorus Pro DDR4.
> 
> am i the only one who has insane problems with this platform?
> 3200mb/s RAM (2x16GB) CL16 does not even POST.
> ...


I picked up a Z690i Aorus Ultra (DDR4) and i5-12600K combo yesterday and I'm not having the type of stability issues that you are describing. 

My board also came with an August  F1 BIOS which I did start out with but I've now updated to the launch release F3 BIOS.

I'm testing with W10. I had some quirks/issues (my LAN cut out) but getting the correct drivers via the Gigabyte Launch App helped to clear that up. I'm currently just testing with default auto settings and XMP enabled (using a 2x8GB 1R 3200C14 kit).

I have found one particular issue while running World Community Grid... Where the program only seems to run/load the E-cores while the P-cores all remain idle?? 

However Asus RealBench and Prime95 will both run/stress with all cores fully loaded. I haven't tested/run Cinebench or AIDA64 yet.


----------



## R-T-B (Nov 5, 2021)

tabascosauz said:


> Gigabyte apparently got hit again by the same group in late October.


OT a bit, but do you have any supporting evidence for that part of your claim?  Feel free to PM.


----------



## Operandi (Nov 5, 2021)

bug said:


> Intel lets these unlocked CPUs use max power all the time (as opposed to, say, ~1 minute, which was the norm so far). It basically allows the CPU to overclock itself indefinitely, as long as the cooling can keep up. It's an easy way to extract every last bit of juice out of the box, with the downside that effective performance will now vary based on the cooling solution.


Yes, I am aware of that. My point is there seems to be something different about how Intel CPUs scale performance relative to power. AMD CPUs don't seem to have that top-end ability to push a little extra performance at the cost of extreme power usage. You can do it, and cooling isn't really a problem, but you get almost nothing in return; with Intel's recent designs you get something for that insane power consumption.


----------



## birdie (Nov 5, 2021)

Who are those GODS?





The best 5950X result under Windows: 2104/23600. 12900K is 33% faster in ST. That's impossible.

12900K results sorted by ST and MT.
5950X results sorted by ST and MT. (macOS results are higher and probably shouldn't be taken into consideration, as that OS has a different scheduler and the app is compiled slightly differently.)


----------



## GerKNG (Nov 5, 2021)

PolRoger said:


> I picked up a Z690i Aorus Ultra (DDR4) and i5-12600K combo yesterday and I'm not having the type of stability issues that you are describing.
> 
> My board also came with an August  F1 BIOS which I did start out with but I've now updated to the launch release F3 BIOS.
> 
> ...


Other benchmarks and applications run fine; just Cinebench R15 and R20 (sometimes!) drop the E-cores.

By the way, did you check your core temperatures in HWiNFO?
I have two cores constantly sitting way below ambient and dropping to 0°C (maybe it's just a HWiNFO bug).


----------



## tabascosauz (Nov 5, 2021)

GerKNG said:


> do you really think all of these issues are because of the gigabyte board?
> because it feels like i am testing alder lake the day after they made the first internal silicon



There's a million things it could be, and the ADL owners' sample size isn't nearly big enough, but most things come back to the board. Remember how Rocket Lake had a launch-day microcode patch to improve performance?

Are you still on the F1 BIOS or something? Whichever one it shipped with is not available on the website; the website only lists F4 from November 1st as the initial release.

The 0-degree reading is weird though. Is your HWiNFO up to date? There have been a couple of versions of HWiNFO now, all with improvements to Alder Lake reporting. Possibly a bad CPU (very slim chance though), but firmware also has a hand in just about everything these days.


----------



## PolRoger (Nov 5, 2021)

GerKNG said:


> other benchmarks and applications run fine.
> just cinebench R15 and R20 (sometimes!) drops out the E Cores.
> 
> btw. did you checked your Core Temperatures in HWInfo?
> i have two cores constantly sitting way below ambient and dropping to 0°C (maybe it's just a HWInfo bug)



I have noticed some low temps being reported sometimes? But usually only when a core is in an idle state...

With the new F3 BIOS I've been able to disable the E-cores, so now WCG will run fully loaded @ 12 threads. My Kill A Watt is showing a system reading of ~191/192 watts for this 6C/12T load.

Cooling is open air with an old (4 heat pipe) Noctua NH-U12P with Scythe S-Flex fans in push/pull config.


----------



## GerKNG (Nov 5, 2021)

tabascosauz said:


> Are you still on F1 BIOS or something?


I was not even able to boot to anything (NVMe drives were not even supported) with the F2 BIOS.

I flashed F4 (the only one available) and only after that was I able to install Windows.



PolRoger said:


> I have noticed some low temps being reported sometimes? But usually only when a core is in an idle state...


Exactly this.
At idle I see below-ambient readings on 2-3 cores, and they occasionally drop to 0°C.


----------



## Deleted member 202104 (Nov 5, 2021)

weekendgeek said:


> I had a 12700k and DDR4 board in my cart at Newegg this morning to try out.  It was going to be a few weeks to get the LGA1700 mounting adapters to fit any of my coolers.  Gave up and bought a set of fast RAM instead for the main rig.  I usually want it now or not at all when it comes to that type of purchase.





15th Warlock said:


> I read Asus mobos have mounting holes that line up with previous gen coolers, I hope I can use my current cooler.



I need an Adult.


----------



## Deleted member 215115 (Nov 5, 2021)

RandallFlagg said:


> That 12600K is just amazing.  I don't recall ever seeing a release from either AMD or Intel where their mid tier performance chip defeated the top tier performance of the last generation CPU in gaming overall.


5600X vs 3950X

3600 vs 2700X

Maybe 9700K vs 8700K & 8600K vs 7700K


----------



## RandallFlagg (Nov 5, 2021)

rares495 said:


> 5600X vs 3950X
> 
> 3600 vs 2700X



Neither the 3950X nor the 2700X were the top tier gaming chips of their generation.  That would go to the 10900K/9900K and 8700K.  I guess it isn't really all that impressive a stat when you consider that AMD only had a top tier gaming chip at all for one year out of the past 14 or so years.


----------



## Deleted member 215115 (Nov 5, 2021)

RandallFlagg said:


> Neither the 3950X nor the 2700X were the top tier gaming chips of their generation.  That would go to the 10900K/9900K and 8700K.  I guess it isn't really all that impressive a stat when you consider that AMD only had a top tier gaming chip at all for one year out of the past 14 or so years.


Fair enough. Sometimes I can't read. The 5600X did spank the 10900K in all but a few games though.


----------



## RandallFlagg (Nov 5, 2021)

rares495 said:


> 5600X vs 3950X
> 
> 3600 vs 2700X
> 
> Maybe 9700K vs 8700K & 8600K vs 7700K





rares495 said:


> Fair enough. Sometimes I can't read. The 5600X did spank the 10900K in all but a few games though.



It would be like AMD releasing a 6600X that spanks the 12900K. That didn't happen with Zen 3; the 10900K was still generally faster than the 5600X and 5800X. Of course, at this point I think we'll need a new gen of GPUs to see anything get spanked above 720p.



birdie said:


> Who are those GODS?
> 
> View attachment 223885
> 
> ...



ADL appears to clock to the moon if you can cool it, from what I've seen so far. Lots of seemingly easy 6.4 GHz LN2 overclocks on YouTube that would usually require going through several pre-binned chips. I'm gonna look up the Geekbench details for that run though.

OK, that's a 5950X at 6.2 GHz.

Something's not right on the 12900K; it says it's running at 40 GHz.


----------



## 15th Warlock (Nov 5, 2021)

weekendgeek said:


> I need an Adult.
> 
> View attachment 223887







You're asking for help in the wrong forum, we are all addicted to PC parts here 

And I just can't find any DDR5 to save my life, that's the only piece I'm missing to my Alder Lake puzzle!

Anyone has any suggestions? I never expected the memory would be the hardest part to get for this new build!


----------



## Psychoholic (Nov 5, 2021)

weekendgeek said:


> I need an Adult.
> 
> View attachment 223887



I Noticed you're using a D15 on your current rig.
I am running the same Z690 Board you listed and had some issues with my noctua due to the height of the VRM heatsink.

Just FYI: 






ASUS ROG Strix Z690-A Gaming WiFi D4 | Motherboard Compatibility | Noctua Compatibility Centre (Socket: LGA 1700) - ncc.noctua.at


----------



## birdie (Nov 5, 2021)

RandallFlagg said:


> Something not right on the 12900K, says it's running at 40Ghz


Yeah, I noticed it too; many applications haven't been updated yet to read ADL core frequencies. It's not just this CPU, all ADL results on GB5 are out of this world right now.


----------



## Deleted member 202104 (Nov 5, 2021)

15th Warlock said:


> View attachment 223891
> 
> You're asking for help in the wrong forum, we are all addicted to PC parts here
> 
> ...



Yeah, I know. I have to keep telling myself it's only money.

Yesterday Newegg had some XPG 5200 in a combo with a board, but I just checked and they're sold out now too. All of my usual spots I just checked are also sold out. I'll keep an eye open and PM you if I see anything.



Psychoholic said:


> I Noticed you're using a D15 on your current rig.
> I am running the same Z690 Board you listed and had some issues with my noctua due to the height of the VRM heatsink.
> 
> Just FYI:
> ...



Of course it won't fit.   I have a D15 and a U12S.  Both say they'll fit rotated I guess.  What did you end up doing?

Appreciate the heads up.


----------



## Psychoholic (Nov 5, 2021)

weekendgeek said:


> Yeah, I know.   I have to keep telling myself it's only money.
> 
> Yesterday Newegg had some XPG 5200 in a combo with a board, but I just checked and they're sold out now too.  All of my usual spots I just checked are also   I'll keep an eye open and PM you if I see anything.
> 
> ...



I actually dug out my old Corsair H150i and slapped it on.
My U12A *almost* fit; I was able to screw it down, but it wouldn't make good contact due to touching the VRM heatsink.
I guess rotating it would have worked, but it would have looked kind of funny.


----------



## Deleted member 202104 (Nov 6, 2021)

Psychoholic said:


> I actually dug out my old corsair H150i and slapped it on.
> My U12A "ALMOST" fit and was able to screw it down but wouldnt make good contact due to touching the VRM heatsink.
> I guess rotating it would have worked but would have looked kind of funny.



Thanks.  I have a few others I could try - there's no info on their sites regarding fit, so I'll see what happens.  I suppose this is a sign I should pick up an AIO.


----------



## tabascosauz (Nov 6, 2021)

weekendgeek said:


> Thanks.  I have a few others I could try - there's no info on their sites regarding fit, so I'll see what happens.  I suppose this is a sign I should pick up an AIO.



The reviews had the 12900K running at a crazy 4.9 @ 1.3 V. Optimum Tech was already able to do 5.1 @ 1.21 V. The 12700K is obviously not as well binned, but undervolting looks very promising indeed; there might be lots of power and temperature to shave off through UV. I'm confident that the U12A shouldn't have much trouble after tweaks, if it doesn't hit the VRM heatsink.

Even if Asus has the LGA115x mounting holes, I'd still ask Noctua for the new 1700 kit. The Z-height seems different. It's just an email away from a free kit anyway.


----------



## Deleted member 24505 (Nov 6, 2021)

Guess I'll have to buy an LGA1700-compatible waterblock for my upgrade too.


----------



## 15th Warlock (Nov 6, 2021)

weekendgeek said:


> Yeah, I know.   I have to keep telling myself it's only money.
> 
> Yesterday Newegg had some XPG 5200 in a combo with a board, but I just checked and they're sold out now too.  All of my usual spots I just checked are also   I'll keep an eye open and PM you if I see anything



Thank you! And good luck with your build as well!


----------



## Deleted member 202104 (Nov 6, 2021)

tabascosauz said:


> The reviews had 12900K running at a crazy 4.9 @ 1.3V. Optimumtech was already able to do 5.1 @ 1.21V. 12700K obviously not as well binned, but undervolting looks very promising indeed. Might be lots of power and temp to shave off through UV. I'm confident that U12A shouldn't have that much trouble after tweaks, if it doesn't hit the VRM heatsink.
> 
> Even if Asus has the LGA115x mounting holes, I'd still ask Noctua for the new 1700 kit. The Z-height seems different. Just an email away from a free kit anyways.



I currently have the U12S on a 10850K, and with some voltage tweaks it keeps it under 80°C doing a Cinebench run @ ~175 W. Gaming is no problem @ stock clocks with unlocked power limits. I was thinking I'd try that on the 12700K, so I might mount it 90 degrees in the meantime until I can find something that fits correctly. And good point on the kit from Noctua; I'll send an email off tonight.

I'm looking forward to UV on ADL - I haven't had much patience with CO on 16 cores...


----------



## lexluthermiester (Nov 6, 2021)

weekendgeek said:


> I need an Adult.


You might have come to the wrong thread...


weekendgeek said:


> View attachment 223887


Nice selection. Just a heads up: that board has a firmware update available, and it would be a good idea to apply it before installing an OS. Also, go with Windows 11 and make sure you're on build 22000.282 at least.


----------



## RandallFlagg (Nov 6, 2021)

Welp, that didn't take long. The 5600X still has another $90 it needs to fall, but the 5800X is about right, give or take a bit. We'll see if this makes its way to Amazon soon, but I bet it does:


----------



## Deleted member 202104 (Nov 6, 2021)

RandallFlagg said:


> Welp that didn't take long.  5600X still has another $90 it needs to fall but the 5800X is about right, give or take a bit.  We'll see if this makes its way to Amazon soon, but I bet it does :
> 
> *View attachment 223950*



Sweet baby jeebus.

The 5800x is where it should have been from the beginning, and honestly, the 5600x at $199-219 would be perfect.  Slot the 5600G at $169 and they'd have something for the entry level again.

The 5950x at NewEgg was $739 today, which I believe is their lowest so far.


----------



## ir_cow (Nov 6, 2021)

It seems these E-cores are unpredictable. After running a bunch of memory benchmarks, I kept getting strange results in Far Cry 5. I'm not sure why I didn't try this before, but basically if I leave the E-cores enabled, the avg FPS is 183-196, and it changes every time I run the benchmark. If I disable the E-cores, it is consistently 183-184.

Looks like I'll have to toss out the Far Cry 5 benchmark for now. It's unreliable.

DDR5-5600 CL36

Enabled:
Run 1: 192.4
Run 2: 189.4
Run 3: 193.8
Run 4: 185.2
Run 5: 195.2

Disabled:
Run 1: 183.3
Run 2: 184.9
Run 3: 183.6
Run 4: 184.4
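For what it's worth, plugging the run numbers above into a few lines of Python quantifies the spread (same data as listed, nothing assumed beyond it):

```python
from statistics import mean, pstdev

enabled  = [192.4, 189.4, 193.8, 185.2, 195.2]   # E-cores enabled
disabled = [183.3, 184.9, 183.6, 184.4]          # E-cores disabled

# the enabled runs average higher, but their run-to-run spread is several
# times larger than with E-cores off
for name, runs in (("E-cores on ", enabled), ("E-cores off", disabled)):
    print(f"{name}: avg {mean(runs):.1f} fps, spread ±{pstdev(runs):.1f} fps")
```

So the complaint isn't just about the absolute FPS; it's that the enabled configuration is measurably noisier between identical runs.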


----------



## RandallFlagg (Nov 6, 2021)

weekendgeek said:


> Sweet baby jeebus.
> 
> The 5800x is where it should have been from the beginning, and honestly, the 5600x at $199-219 would be perfect.  Slot the 5600G at $169 and they'd have something for the entry level again.
> 
> The 5950x at NewEgg was $739 today, which I believe is their lowest so far.



Yup.  This is only at MC right now though, and really only the 5800X rates a good deal.  

OTOH, Rocket Lake 11600K is now $199.  That's only like $20 more than an 11400, a really good deal at least until 5600X price drops.


----------



## lexluthermiester (Nov 6, 2021)

ir_cow said:


> It seems these E-core are unpredictable. After running a bunch of memory benchmarks, I kept getting strange results for Far Cry 5. I'm not sure why I didn't try this before but basically if I leave E-Cores enabled the FSP avg is 196 - 183. it will change every time I do the benchmark. If I disable the E-cores, it is constantly 183-184.
> 
> Looks like I'll have to toss out the Far Cry 5 benchmark for now. Its unreliable.
> 
> ...


Out of 180+ fps you're concerned with an average variance of 4 fps? That's margin-of-error kind of variance. Get over it. Sorry if I seem over-critical, but the scientist in me saw the average 4 fps difference and shook his head. The FC5 benchmark is fine and valid.


----------



## birdie (Nov 6, 2021)

I haven't seen any reviews which directly answer the question: why did Intel create and use E-cores?

The answer is simple: E-cores improve MT performance without blowing up your power budget - they are a _lot_ more effective in terms of performance per watt in MT tasks than P-cores.


----------



## lexluthermiester (Nov 6, 2021)

birdie said:


> I haven't seen any reviews which directly answer the question: why did Intel create and use E-cores.
> 
> The answer is simple: E-cores improve MT performance without blowing up your power budget - they are a _lot_ more effective in terms of performance per watt in MT tasks than P-Cores.


And? What point are you trying to make with that statement?


----------



## Solid State Soul ( SSS ) (Nov 6, 2021)

Intel's victory here is not as impressive when you consider that Alder Lake consumes much more power than AMD's 5000-series CPUs; total system power draw is 100 W or more higher with Alder Lake vs Ryzen 5000. Since the Ryzen 2000 series, AMD has continued to dethrone Intel while also improving efficiency, while Intel tries to catch up by increasing power draw without end. With such an efficiency advantage, AMD has a lot of headroom to fight back.


----------



## ViperXTR (Nov 6, 2021)

Solid State Soul ( SSS ) said:


> Intel's victory here is not as impressive when you consider that Alder Lake consumes much more power than AMD's 5000-series CPUs; total system power draw is 100 W or more higher with Alder Lake vs Ryzen 5000. Since the Ryzen 2000 series, AMD has continued to dethrone Intel while also improving efficiency, while Intel tries to catch up by increasing power draw without end. With such an efficiency advantage, AMD has a lot of headroom to fight back.


It consumes more power, but in return it gives better performance in highly threaded productivity tests and scenarios. Gaming-wise, Intel is actually more power efficient. And at idle, Intel is either the same or lower power because of those E-cores - and this is done on 10 nm while AMD is at 7 nm.


----------



## Solid State Soul ( SSS ) (Nov 6, 2021)

ViperXTR said:


> it consumes more power, but in return it gives better performance on highly threaded productivity tests and scenarios. Gaming wise, Intel is actually more power efficient. And when idle, Intel is either same or lower power because of those E cores, and this is done with 10nm while AMD is at 7nm


AMD is more efficient without needing E-cores, that's the main thing! So not only will next-gen Ryzen presumably have a better core than Intel, but they will be all performance cores and would still end up being more efficient!


----------



## ViperXTR (Nov 6, 2021)

Solid State Soul ( SSS ) said:


> AMD is more efficient without needing E cores, thats the main thing! so not only next gen Ryzen will presumably have better core than intel, but they will be all performance core and would still end up being more efficient!


But the end result, the power draw, is the main thing, right? Also, take a look at that 32-thread 5950X vs the 24-thread 12900K.

Also, I'm very curious how it plays with RPCS3. Perhaps it requires a bit of modification, but I can already see it ploughing through this PS3 emulator (one of the few apps/games that takes advantage of AVX-512 to boost performance).


----------



## FireFox (Nov 6, 2021)

15th Warlock said:


> Anyone else moving to Alder Lake for their main rig?


Me being an Intel fanboy means yeah, I will.
744,62€ for a CPU hurts.


----------



## birdie (Nov 6, 2021)

FWIW: just released GeekBench 5.4.3 fixes CPU frequency detection for Alder Lake CPUs.


----------



## oxrufiioxo (Nov 6, 2021)

15th Warlock said:


> Anyone else moving to Alder Lake for their main rig?



Tempting, but the 12900K doesn't offer enough over my 5950X to justify it... I have been contemplating a 12700K to replace my 5800X-based system once DDR5 is easier to get, mostly just to play around with. I'd probably still get a ridiculous $600+ mobo though, just out of principle... I really want to see what Zen 3D brings first, as I might grab a 16-core variant of it and just run two 16-core systems as my mains.


----------



## ViperXTR (Nov 6, 2021)

12th-series CPUs started appearing at my local distributors, and as expected, early adopter's tax (that's equivalent to 426.47 USD).


----------



## FireFox (Nov 6, 2021)

ViperXTR said:


> 12th series cpus started appearing on my local distributors and as expected, early adopters tax (that's equivalent to 426.47  USD)
> View attachment 223973


Too expensive for an i5


----------



## Totally (Nov 6, 2021)

Bill_Bright said:


> I wonder how many people will really choose which CPU to buy based on these power consumption specs? I sure won't. I do care (a little - but not much) about power consumption at idle. But I buy based on performance and the intended purpose of the computer.
> 
> All these specs are really going to do for me is help me decide which cooler to get (if no OEM is included) and the size of my PSU.
> 
> When enthusiasts are looking to buy their new 2nd childhood, over-compensating sports car, are they really going to look at fuel economy and horsepower? Or are they going to look at fastest times and top speeds?


----------



## 15th Warlock (Nov 6, 2021)

oxrufiioxo said:


> Tempting, but the 12900k doesn't offer enough over my 5950X to justify..... I have been contemplating a 12700k to replace my 5800X based system though once DDR5 is easier to get mostly just to play around with I'd probably still get a ridiculous 600+ usd mobo though.... I really want to see what Zen3D brings first though as I might grab a 16 core varient of it and just run two 16 core systems as my mains


Yea, couldn't agree more; no sense in replacing your 5950X with a 12900K. As far as the DDR5 situation goes, have you considered the Asus Z690 Strix D4? It's more affordable at $349, and you can use DDR4. I'm seriously considering sending back the Asus mobo I got and replacing it with the DDR4 version; I saw der8auer's video, and for gaming, DDR5 just doesn't justify its crazy price tag.




FireFox said:


> Me been an Intel fanboy means yeah i wil
> 744,62€ for a CPU hurts


Whoa! Yes, that hurts indeed; better to wait for the dust to settle and prices to become more palatable. I got lucky finding my CPU at Best Buy at $619 - I think they had the lowest price for this CPU. I've seen them restock a couple more times since I ordered mine, and at one point yesterday even their local store in town had some i9s in stock. I hope Intel isn't affected by the chip shortage, since they have their own fabs.


----------



## Dristun (Nov 6, 2021)

FireFox said:


> Too expensive for an i5


The best price anyone's gonna get in Russia through official sources is going to be around 400 USD, i.e. EUR pricing plus 20% VAT. That's more or less how everyone has been setting MSRPs here for a while. Gray markets might get down to around 350 in a few months if availability is good.


----------



## FireFox (Nov 6, 2021)

15th Warlock said:


> better to wait for the dust to settle, and prices to become more palatable,


I agree.
My current build is just 11 months old, so I will be upgrading around the middle of April 2022. Also, I saw that the motherboard I want (Asus ROG Maximus Z690 Hero) currently costs 700€+.


----------



## birdie (Nov 6, 2021)

Hardwareluxx ran an interesting test comparing the 12900K's performance at different power targets.

Looks like 160-180W is the sweet spot for this CPU. Going above means a disproportionate increase in power consumption vs. a very dubious performance increase.
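A toy diminishing-returns curve makes the point concrete. The cube-root exponent below is an arbitrary illustration (not Hardwareluxx's measured data), chosen only to show why the last stretch of power budget buys so little:

```python
# Toy model: assume MT performance grows with the cube root of power above
# a 125 W baseline. Purely illustrative; real scaling depends on the chip.
def perf(watts, base_w=125.0):
    return (watts / base_w) ** (1 / 3)

for w in (125, 160, 180, 241):
    print(f"{w:>3} W -> {perf(w):.2f}x relative performance")
# under this curve, going 180 -> 241 W (+34% power) buys only ~10% more perf
```

Any concave perf-vs-power curve produces the same qualitative conclusion: past the knee, each extra watt returns less, which is why a 160-180 W cap looks like the sweet spot.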





Has anyone seen an IPC comparison between Alder Lake and Tiger Lake? I can't find anything. Would be nice to know how much better ADL is than TGL, if at all. Should I wait for an ADL laptop or go for Black Friday discounts?


----------



## oxrufiioxo (Nov 6, 2021)

15th Warlock said:


> Yea, couldn’t agree more, no sense in replacing your 5950X with a 12900K, as far as the DDR5 situation goes, have you considered the Asus Z690 Strix D4? It’s more affordable at $349, and you can use DDR4. I’m seriously considering sending back the Asus mobo I got, and replacing it for the DDR4 version, I saw de8auer’s video, and for gaming, using DDR5 just doesn’t justify its crazy price tag.



It would technically replace my 5800X regardless I would keep the 5950X as my primary or secondary computer. For me if I'm gonna do it it's got to be the Hero or higher just out of principle and half the reason I would do it is to mess around with DDR5.... I don't really care about  cost I'm at a point in life where that is irrelevant.

I do think Meteor Lake will be special and worth waiting for if Intel can keep this momentum, unlike the sub-10% ST performance gains we got from Sandy Bridge to Skylake... For all we know, Intel could be stuck on 10 nm for the next half decade like they were with Skylake on 14 nm.


----------



## bug (Nov 6, 2021)

birdie said:


> Has anyone seen an IPC comparison between Alder Lake and Tiger Lake? I can't find anything. Would be nice to know how much ADL is better than TGL if at all  Should I wait for an ADL laptop or go for Black Friday discounts?


You're not likely to find a comparison like that, with Tiger Lake being a mobile part.

That said, there are some improvements in select scenarios: https://www.anandtech.com/show/1704...hybrid-performance-brings-hybrid-complexity/5


> At the highest level, the P-core supports a 6-wide decode (up from 4), and has split the execution ports to allow for more operations to execute at once, enabling higher IPC and ILP from workflow that can take advantage.


Sadly, no numbers.


----------



## PolRoger (Nov 6, 2021)

15th Warlock said:


> Yea, couldn’t agree more, no sense in replacing your 5950X with a 12900K, as far as the DDR5 situation goes, have you considered the Asus Z690 Strix D4? It’s more affordable at $349, and you can use DDR4. I’m seriously considering sending back the Asus mobo I got, and replacing it for the DDR4 version, I saw de8auer’s video, and for gaming, using DDR5 just doesn’t justify its crazy price tag.
> 
> 
> 
> Whoa! Yes that hurts indeed, better to wait for the dust to settle, and prices to become more palatable, I got lucky finding my CPU at BestBuy, at $619, I think they had the lowest price for this CPU, I’ve seen them restock a couple more times since I ordered my CPU, and at one point even their local store in town had some i9s in stock yesterday, I hope Intel isn’t affected by the chip shortage, since they have their own fabs.



My main setups are running open-air bench style with custom-water-cooled Ryzen 59xx/39xx CPUs. My last Intel was back on Z170/Z270 Skylake/Kaby Lake, when I did a late-cycle transition to a 1st-gen Ryzen 1700X/X370. I'll continue to look forward to new AMD Ryzen tech, but I'm not opposed to Intel either and have run plenty of Blue platforms.

Alder Lake has piqued my interest enough to give Intel's new (mainstream) platform a spin as an additional (fun/hobby) setup. My i5-12600K/Z690 (DDR4) combo is a nice upgrade over my previous 6700K/7700K!

Some of the new ADL pricing is on the high side, but that has always seemed to be the case with the release of new tech.

DDR5 should eventually follow the same downward price trend as DDR4/DDR3/DDR2... However, I do find the rising prices of higher-end motherboards over the last few generations interesting. Older boards like the LGA 775 ASUS Maximus/Rampage Formula/Extreme, or even the (HEDT) LGA 1366 X58 EVGA Classified and ASUS Rampage Extreme II/III, were not cheap boards back in their day, but these newer high-end boards are pushing ~$600/700/800, even $1000+?!


----------



## lexluthermiester (Nov 6, 2021)

birdie said:


> Has anyone seen an IPC comparison between Alder Lake and Tiger Lake?


Steve did. He was very vocal about it.


----------



## tabascosauz (Nov 6, 2021)

Has anyone seen any reviews, or does anyone have an ADL CPU with DDR4, that can probe the Gear 1 limit on these new boards? It seems DDR5 might be suffering from the Gear 2 penalty to some degree, so the actual difference between DDR4 and DDR5 is still unclear.

RKL was 3600/3733. Ali only stated that 3200 runs at Gear 1 because he always runs that same 3200CL14 XMP; no mention of how high Gear 1 can go on ADL, or how it compares to RKL.

Seems kind of important, since if you're building Alder Lake with DDR4, chances are there may be a high-end B-die kit lying around that you want to keep using.


----------



## ir_cow (Nov 6, 2021)

lexluthermiester said:


> Out of 180+ fps you're concerned with an average variance of 4 fps? That's margin-of-error variance. Get over it. Sorry if I seem over-critical, but the scientist in me saw the average 4 fps difference and shook his head. The FC5 benchmark is fine and valid.


It absolutely matters for a review. I removed other variables and locked the CPU frequency to avoid large swings. I should be able to run these tests in 6 months and get the same results. If today I get 196 (which I did) and yesterday 183, which is correct? Those who are shelling out $300+ for memory will want to know if what they are buying has any impact on games.



tabascosauz said:


> Has anyone seen any reviews, or does anyone have an ADL CPU with DDR4, that can test the limit of Gear 1 on these new boards? It seems DDR5 might be suffering from the Gear 2 penalty to some degree, so the actual difference between DDR4 and DDR5 is still unclear.
> 
> RKL was 3600/3733. Ali only stated that 3200 runs at Gear 1 because he always only runs that same 3200CL14 XMP; no mention of how high Gear 1 can go on ADL, or how it compares to RKL.
> 
> Seems kinda important, since if you're building Alder Lake with DDR4, chances are there may be a high-end B-die kit lying around that you want to keep using.


I'm actually working on that today. Seeing how high I can go with Gear 1 for DDR4. I suspect you need to raise the VCCSA to like 1.3 V or higher just like 11th Gen.


----------



## PolRoger (Nov 6, 2021)

I'm running now at 3600C15 (Gear 1) with E Cores disabled for a WCG/issue. I just set timings at 15-15-15-35-50 and all other timings on auto. Dram at 1.35v and VCCSA at 1.110v. With more time I'll push/test higher and tighten up secondary timings.


----------



## Vader (Nov 6, 2021)

Operandi said:


> Is it jut me or is there something about the way Intel does things that allow their CPUs to scale frequency at the extreme ends of power scale better than AMD CPUs.  You can manually push all core overclocks on AMD CPUs and power goes up but you get very little from it where as Intel CPUs yeah the power consumption becomes pretty impractical and more than a bit of a problem but you get performance that seems to scale better.  Not sure if this has been discussed elsewhere but its an interesting observation.
> 
> If AMD tweaked the Zen 3 core to scale to ADL levels of power they'd be pretty much at parity?


You're on the right track. There was one overclocker who pushed an AMD CPU to 5GHz and higher using LN2 and ran some benchmarks. What he found was that Ryzen CPUs tend to hit a bottleneck of some kind when pushed far from stock clocks, and subsequently performance doesn't scale well. His conclusion was that AMD set their Ryzen CPUs at the right clock speed; more speed would have had a bigger effect on power consumption than on performance. We'll see if they can improve on that in following generations.


----------



## 15th Warlock (Nov 6, 2021)

tabascosauz said:


> Has anyone seen any reviews, or does anyone have an ADL CPU with DDR4, that can test the limit of Gear 1 on these new boards? It seems DDR5 might be suffering from the Gear 2 penalty to some degree, so the actual difference between DDR4 and DDR5 is still unclear.
> 
> RKL was 3600/3733. Ali only stated that 3200 runs at Gear 1 because he always only runs that same 3200CL14 XMP; no mention of how high Gear 1 can go on ADL, or how it compares to RKL.
> 
> Seems kinda important, since if you're building Alder Lake with DDR4, chances are there may be a high-end B-die kit lying around that you want to keep using.


Der8auer has a good video comparing DDR5 to DDR4, and so does KitGuru; you can find them on YouTube.

So I also ordered the Asus Z690 Strix-A D4; it'll get here tomorrow. No chance in hell I'll find DDR5 before that, so I'll test my ADL with my DDR4 3600CL16 kit before I get a chance to test it with the Strix-F mobo. Both boards are pretty much the same, down to the power delivery system.

I might end up sticking with DDR4 until DDR5 goes through its growing pains; for gaming, it'll have minimal effect.

Once DDR5 hits mainstream we'll see 6000MHz kits and higher become common; as it is now, you can't even buy a paltry 4800MHz kit...


----------



## PolRoger (Nov 6, 2021)

Running/testing 3733C16 Gear 1...  Manually set 16-16-16-36-52 with the rest of the timings on auto. DRAM @1.35v and VCCSA @1.120v showing just a quick AIDA64 memory stress.


----------



## GerKNG (Nov 7, 2021)

not far away from a 5900x


----------



## phanbuey (Nov 7, 2021)

PolRoger said:


> Running/testing 3733C16 Gear 1...  Manually set 16-16-16-36-52 with the rest of the timings on auto. DRAM @1.35v and VCCSA @1.120v showing just a quick AIDA64 memory stress.



Holy sh*t... my chip is coming on Monday....  I hope it can hit that in gear 1.  Super helpful.

AIDA memory stress sucks - use Karhu RAM Test or TestMem 5


----------



## GerKNG (Nov 7, 2021)

btw:
Intel seems to have released the first Intel Management Engine software for Z690 (it appeared today on the support site for my motherboard).
It fixed the one "no driver found" device in my device manager, and I can finally play AC Valhalla for some reason (it didn't launch before).


----------



## mstenholm (Nov 7, 2021)

GerKNG said:


> not far away from a 5900x


I’m sure there will come a 12600K OC thread sooner or later.


----------



## 15th Warlock (Nov 7, 2021)

Just got my CPU tonight. Guess BestBuy shipped it from a distribution center only 10 miles away from my small, middle-of-nowhere town.


Super stoked to try it!


----------



## PolRoger (Nov 7, 2021)

phanbuey said:


> AIDA memory stress sucks - use Karhu RAM Test or TestMem 5



I don't have Karhu but I've got TestMem5. I still need to get the preset with the 20-pass loop going on this test drive. Even so, AIDA memory stress does and will fail if you're having big memory stability issues. I've been running 3+ hrs of WCG load and so far no WHEAs... no BSOD.


----------



## phanbuey (Nov 7, 2021)

PolRoger said:


> I don't have Karhu but I've got TestMem5. I still need to get the preset with the 20-pass loop going on this test drive. Even so, AIDA memory stress does and will fail if you're having big memory stability issues. I've been running 3+ hrs of WCG load and so far no WHEAs... no BSOD.



I used to use AIDA exclusively... but then I passed 24-hour AIDA runs and kept getting a WHEA once EVERY 3 DAYS and randomly failing to wake from suspend... just weird stuff... I thought "must be something else, the RAM is stable" -- nope, I had a bad stick of RAM (it failed in Karhu in less than 150% coverage at stock; narrowing down the stick took ~3 mins vs MONTHS of testing with AIDA, and all is stable now).

If you look online at OC communities, AIDA is specifically on the "do not use" list; you have to run it FOREVER, and multiple times, to find instability. It can fail unstable RAM in 5 mins, or pass it for 3 hours... it just ends up wasting your time as you work toward your max OC.

This was not doable with my aida results:




I only reboot for updates with the current RAM OC, and the proper testing utilities let you confidently overclock tertiaries and tune RTL/IOL without having to sacrifice anything to the RAM stability gods.


----------



## tabascosauz (Nov 7, 2021)

PolRoger said:


> I don't have Karhu but I've got TestMem5. I still need to get the preset with the 20-pass loop going on this test drive. Even so, AIDA memory stress does and will fail if you're having big memory stability issues. I've been running 3+ hrs of WCG load and so far no WHEAs... no BSOD.



Not sure about @1usmusv3, but just one run of @anta777extreme1 ensures almost-total stability for mild OCs: 1.5 hours on a 2x8GB kit and 2 hours on a 2x16GB kit. It's still recommended to run it a few times (I do 3) and verify with another test, like overnighting with HCI.

Untested memory is a bad call on a daily rig. You will never see reboots if you're not running memory-intensive loads, but sfc /scannow will always find the damage. Windows is extremely enthusiastic about self-destructing on unstable memory.


----------



## lexluthermiester (Nov 7, 2021)

birdie said:


> Maybe I'm stupid or inattentive today but I don't see a single Tiger Lake CPU in the video. I certainly didn't mean Rocket Lake (loosely based on Ice Lake) which is a desktop CPU.


Nope, that was my bad, I should have explained. Tiger Lake is effectively the mobile (10nm) version of Rocket Lake. Tiger Lake performance numbers will be lower than Rocket Lake's, so you can derive Tiger Lake from there. Right now there is no way to compare directly; apples to apples just isn't a viable option for the moment. As Alder Lake leaps ahead of Rocket Lake by a big margin in most cases, we can safely conclude the difference will be bigger with Tiger Lake, hence Steve's video comparing the desktop 11xxx series to the new 12xxx series. It's desktop vs desktop, but if you look at those numbers and the ones W1zzard has done, then compare to the earlier Rocket Lake vs Tiger Lake tests, you can work out what the direct comparison would be.



ir_cow said:


> If today I get 196 (which I did) and yesterday 183. Which is correct?


Likely both, as, again, it's margin-of-error variance, and the variances are in favor of faster performance. What you likely have is a Windows service running, shutting down or stopping during your testing and restarting as testing continues. It happens; it's a Windows thing. And as such everyone will have a similar experience, so your testing results are still valid. Include all results and average between them. Alternatively, you can find the service causing the variance and disable it during testing.

To throw out results that do not meet with your satisfaction is to effectively cheat the testing (not accusing you of cheating, only saying that is the effective result), and that will not give a clear picture of the actual performance.


----------



## phanbuey (Nov 7, 2021)

tabascosauz said:


> Not sure about @1usmusv3, but just one run of @anta777extreme1 ensures almost-total stability for mild OCs: 1.5 hours on a 2x8GB kit and 2 hours on a 2x16GB kit. It's still recommended to run it a few times (I do 3) and verify with another test, like overnighting with HCI.
> 
> Untested memory is a bad call on a daily rig. You will never see reboots if you're not running memory-intensive loads, but sfc /scannow will always find the damage. *Windows is extremely enthusiastic about self-destructing on unstable memory.*


 Well put.


----------



## PolRoger (Nov 7, 2021)

Please excuse my previous post if any of you thought I was trying to make claims of some kind of true tested stability. This is a new setup/platform and I'm just starting to get a feel for it. I was attempting to show a quick memory stress using Gear 1, and then I left it running for a few hours on WCG and it was still going when I came back.

Here are two more showing AIDA memory benchmarks for both Gear 1 and Gear 2... I'm not sure if I can get my 12600K sample to run Gear 1 @2000MHz?? 

4000C17 Gear 2:





3900C17 Gear 1:


----------



## phanbuey (Nov 7, 2021)

PolRoger said:


> Please excuse my previous post if any of you thought I was trying to make claims of some kind of true tested stability. This is a new setup/platform and I'm just starting to get a feel for it. I was attempting to show a quick memory stress using Gear 1, and then I left it running for a few hours on WCG and it was still going when I came back.
> 
> Here are two more showing AIDA memory benchmarks for both Gear 1 and Gear 2... I'm not sure if I can get my 12600K sample to run Gear 1 @2000MHz??
> 
> ...



No man, not at all... we love your results, just trying to save you stress (lol)


----------



## birdie (Nov 7, 2021)

FWIW, only memtest86 can test your _entire_ memory. Its free version is enough for the vast majority of people, including OC/tech enthusiasts. Highly recommended. The application is signed, so there's no need to disable EFI Secure Boot.





Windows' built-in memory test can do it too, but it's quite simplistic and not as thorough as memtest86, which can find errors not detectable by the Windows memory checker.


----------



## lexluthermiester (Nov 7, 2021)

birdie said:


> FWIW, only memtest86 can test your _entire_ memory. Its free version is enough for the vast majority of people, including OC/tech enthusiasts. Highly recommended. The application is signed, so there's no need to disable EFI Secure Boot.
> 
> Windows' built-in memory test can do it too, but it's quite simplistic and not as thorough as memtest86, which can find errors not detectable by the Windows memory checker.


See PM.


----------



## chrcoluk (Nov 7, 2021)

lexluthermiester said:


> Steve did. He was very vocal about it.


DDR4 vs DDR4, right?

Most of the reviewers seemed to only do DDR5.

Having thought about this product release some more, I think I over-praised it.

- Some of the performance gain is from DDR4 to DDR5, masked by the launch reviews being DDR5 only.
- Any OS aside from Windows 11 is likely to yield problems; the solution is probably to disable E-cores. Curious how long it will take Linux and BSD to get scheduler updates. It took a while to resolve the Ryzen issues, and those were less complicated to fix.
- DDR5 is expensive right now; coupled with motherboard prices, it is a very expensive generation.
- Power consumption for me is still too high; Intel needs to do a lot of work in this area.

DDR5 will get cheaper as production ramps up, but by the time that happens we will have a newer Intel chip.

It's a shame we've seen no boards with dual DDR4/DDR5 slots. Users have to choose between gimped performance or elevated prices for RAM.


----------



## ir_cow (Nov 7, 2021)

lexluthermiester said:


> Likely both, as, again, it's margin-of-error variance, and the variances are in favor of faster performance. What you likely have is a Windows service running, shutting down or stopping during your testing and restarting as testing continues. It happens; it's a Windows thing. And as such everyone will have a similar experience, so your testing results are still valid. Include all results and average between them. Alternatively, you can find the service causing the variance and disable it during testing.
> 
> To throw out results that do not meet with your satisfaction is to effectively cheat the testing (not accusing you of cheating, only saying that is the effective result), and that will not give a clear picture of the actual performance.


I get where you are coming from, but it is not a Windows service. It is directly related to the E-cores and how Windows is scheduling them. It may be a "valid" result for a user, but it is not consistent across 8 different memory frequencies, making it worthless for data collection. I can only explain this anomaly as an E-core scheduling problem.

So either I include two tests, one with it enabled and one without, or I remove it completely, which is what I am doing. It is garbage data that is unrelated to system memory and the review. The article is not about Windows 11 gaming or GPU performance. It is a memory review.

Edit: Skewing the results or removing data to show favoritism is a real thing, yes. In this case I am removing it because the benchmark data for this game is unrelated to the memory and only muddies the overall data set. So I'll ask: if DDR5-5200 and DDR5-6400 get anywhere from 183 to 196 avg per run, how does that contribute to the overall conclusion of anything that isn't a Windows 11 gaming performance review? In my opinion it does not add to the conclusions that are relevant to the review.

I only posted originally to let people know that this could be a problem in other games as well. Not to turn it into a debate about review ethics.


----------



## 15th Warlock (Nov 7, 2021)

chrcoluk said:


> DDR4 vs DDR4 right?
> 
> Most of the reviewer's seemed to only do DDR5.
> 
> ...










5600X getting trounced by the 12600K using DDR4, and in some cases even the 5800X gets creamed; no, it's not thanks to DDR5 as you can see.

And if you have the right cooler, power consumption really is a non-issue after all.


----------



## The red spirit (Nov 7, 2021)

birdie said:


> FWIW, only memtest86 can test your _entire_ memory. Its free version is enough for the vast majority of people, including OC/tech enthusiasts. Highly recommended. The application is signed, so there's no need to disable EFI Secure Boot.
> 
> Windows' built-in memory test can do it too, but it's quite simplistic and not as thorough as memtest86, which can find errors not detectable by the Windows memory checker.


I wouldn't recommend it at all. In my experience, it never detects unstable RAM. I have used prime95 and some other memory testing software, but memtest86 just never does what it claims.


----------



## ir_cow (Nov 7, 2021)

The red spirit said:


> I wouldn't recommend it at all. In my experience, it never detects unstable RAM. I have used prime95 and some other memory testing software, but memtest86 just never does what it claims.


I always go Memtest86 > Memtest64 > AIDA64 > Prime95. Best way to eliminate problems. Memtest86 is great for narrowing down voltage problems and avoiding corrupting Windows. Tests 5-7 are 99% a memory controller problem, 1-2 is low memory voltage, and 3-4 is timings. Follow these and you can save yourself a lot of headaches.
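The test-number heuristic above could be jotted down as a tiny lookup table. This is just a sketch of this poster's rule of thumb, not official PassMark guidance:

```python
# Rough mapping of failing memtest86 test numbers to likely causes,
# per the rule of thumb above (a forum heuristic, not PassMark docs).
LIKELY_CAUSE = {
    1: "low DRAM voltage",
    2: "low DRAM voltage",
    3: "timings too tight",
    4: "timings too tight",
    5: "memory controller (try more VCCSA)",
    6: "memory controller (try more VCCSA)",
    7: "memory controller (try more VCCSA)",
}

def diagnose(failing_tests):
    """Return the set of suspected causes for a list of failing test numbers."""
    return {LIKELY_CAUSE.get(t, "unknown - retest") for t in failing_tests}

print(diagnose([5, 6]))  # both point at the IMC: raise VCCSA first
```

The point of the set is that several failing tests with the same root cause collapse into a single suggested fix.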


----------



## mstenholm (Nov 7, 2021)

It seems like this thread has outlived itself.


----------



## RandallFlagg (Nov 7, 2021)

Well, this is interesting.


----------



## tabascosauz (Nov 7, 2021)

RandallFlagg said:


> Well, this is interesting.



You know what's the funny part about this? It shouldn't take that much effort for AMD to do something about this. I've brought this up a few times before.

The 2CCD CPUs function very differently from the 5600X and 5800X. If Windows is doing some low-priority task in the background, it can use a low-quality core on the 2nd CCD (which happens pretty often, actually) to keep it off the important cores on the 1st CCD, which are always better binned as a whole. This CCD2 core can regularly see pretty high clocks/usage/power draw, almost as high as the preferred cores.

In that sense, it achieved some semblance of big.LITTLE among homogeneous cores before Alder Lake came around. Unfortunately, CPPC and Windows only understand how to use that one core on CCD2 for this. At any higher priority, the low-priority load won't keep expanding into CCD2; it'll just be treated as another task for the preferred cores, then the rest of CCD1 if it needs more threads.

But something tells me AMD won't, because it's not wise to place your trust anywhere near AGESA if the last two years have taught me anything, especially for already-released products. Perhaps changes will be made for Zen 4.


----------



## docnorth (Nov 7, 2021)

GerKNG said:


> not far away from a 5900x


You finally made it work.



birdie said:


> I haven't seen any reviews which directly answer the question: why did Intel create and use E-cores.
> 
> The answer is simple: E-cores improve MT performance without blowing up your power budget - they are a _lot_ more effective in terms of performance per watt in MT tasks than P-Cores.


Actually, E-cores allow Intel to clock the P-cores much higher. So Intel dominates in single- or low-core performance and stays competitive with the 5950X in MT (also by increasing consumption). A, let's say, 12 P-core CPU could maybe achieve the same by allowing high clocks on up to 3-4 cores but drastically reducing frequencies under MT load. Obviously Intel's engineers know better than me which plan is better for now _and_ years to come.


----------



## bug (Nov 7, 2021)

tabascosauz said:


> You know what's the funny part about this? It shouldn't take that much effort for AMD to do something about this. I've brought this up a few times before.
> 
> The 2CCD CPUs function very differently from the 5600X and 5800X. If Windows is doing some low-priority task in the background, it can use a low-quality core on the 2nd CCD (which happens pretty often, actually) to keep it off the important cores on the 1st CCD, which are always better binned as a whole. This CCD2 core can regularly see pretty high clocks/usage/power draw, almost as high as the preferred cores.
> 
> ...


Not sure whether you're aware, but the difference between a so called high or low quality core on Ryzen is dictated by manufacturing differences. It's probably something like 3%.


----------



## tabascosauz (Nov 7, 2021)

bug said:


> Not sure whether you're aware, but the difference between a so called high or low quality core on Ryzen is dictated by manufacturing differences. It's probably something like 3%.



Okay, cool story?  CCD2 will still always be binned worse as a whole, it's how 5900X and 5950X work.

That's completely unrelated to what I was saying. The point is that AMD understands the importance of heterogeneous compute, but only sees fit to dedicate a single core to background tasks, while the remaining 5 or 7 cores on CCD2 (which may even surpass CCD1 cores in quality) will never do anything beyond participate in all-core.

They have the cores to make it happen. Those cores aren't doing anything 90% of the time.


----------



## PolRoger (Nov 7, 2021)

With my 12600K, even though I've been able to load Windows with memory at Gear 1 @1933/1950MHz, I've been having stability issues. I'm thinking Gear 1 @1900MHz may end up being a good daily-use setting for this particular sample.

HyperPi 32M with 12 threads... (E-cores disabled), 3800C16, DRAM ~1.380v and VCCSA ~1.170v:






A little bump up on the core overclock on this sample... P-cores set to 50x and E-cores set to 38x, AVX offset @ -2 (48x), Vcore set to 1.235v. Temps are probably getting close to the limits of this old Noctua NH-U12P cooler.


----------



## GerKNG (Nov 8, 2021)

docnorth said:


> You finally made it work.


Yeah, and the RAM even runs at 3600 Gear 1 now.
I was used to Intel's "oh no, RAM runs above 2666! Let's run the VCCIO/SA at 1.6V".
But that's not the case anymore. It seems to just stay at the stock 1.05V (SA) and only changes when you actually do it manually.

At 1.1V everything is fine (passed 1000% memtest).


----------



## bug (Nov 8, 2021)

tabascosauz said:


> Okay, cool story?  CCD2 will still always be binned worse as a whole, it's how 5900X and 5950X work.
> 
> That's completely unrelated to what I was saying. The point is that AMD understands the importance of heterogeneous compute, but only sees fit to dedicate a single core to background tasks, while the remaining 5 or 7 cores on CCD2 (which may even surpass CCD1 cores in quality) will never do anything beyond participate in all-core.
> 
> They have the cores to make it happen. Those cores aren't doing anything 90% of the time.


Low-powered tasks can run on a single core just fine. The more you distribute those tasks, the more you prevent cores from sleeping/power gating. Never assume you've thought of something that a whole team of engineers who earn their living from it haven't thought of before. I know I don't.


----------



## ir_cow (Nov 8, 2021)

Got to DDR4-4200 in 1:1 Ratio. DDR4-4400 Boots, but it will BSOD without 1.4v on the SA


----------



## PolRoger (Nov 8, 2021)

ir_cow said:


> Got to DDR4-4200 in 1:1 Ratio. DDR4-4400 Boots, but it will BSOD without 1.4v on the SA


Curious as to how much VCCSA for Gear 1 @2000MHz were you needing?


Also on a side note... My last/previous AIDA screenshot proved unstable at 5GHz all core. I think the -2 AVX offset and the FPU stress component was dropping too many cores down to 48x. I took the same settings and disabled the E-cores (for the WCG scheduling bug) and the OC quickly triggered "Watch Dog Timeouts" while running WCG load with 12 threads.
Currently running WCG @49x (12T) with ~1.25v load. I think I'm going to need better cooling to stabilize and run @50x all core.


----------



## GerKNG (Nov 8, 2021)

Anyone with a 12600K got an idea why the CPU constantly drops clocks on single cores as soon as you are above 4.8 GHz?
It has nothing to do with power limits, AVX offsets, full OC modes to compensate for the broken clock-speed reports, TVB clipping, current excursion protections, or anything else related to power, thermals or offsets.

It begins at 4.9 GHz by dropping back to 4.8.
Above that it is always a 200 MHz drop:
4.8 GHz = 4.8 forever.
4.9 GHz = drops to 4.8 every couple of seconds on 1-2 cores under normal load like gaming (heavy load like rendering does it far less often).
5.0 GHz = drops to 4.8.
5.1 GHz = drops to 4.9.
5.2 GHz = drops to 5.0.
...
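For what it's worth, the reported numbers fit a simple rule: the dip is 200 MHz below the set clock, floored at 4.8 GHz. A small sketch of that reading (my own fit to the numbers in the post, not a confirmed mechanism):

```python
def observed_dip_ghz(set_clock_ghz: float) -> float:
    """Fit to the reported 12600K behavior: single cores briefly fall
    200 MHz below the set all-core clock, but never below 4.8 GHz.

    This only summarizes the numbers posted above; it is not a
    confirmed mechanism.
    """
    if set_clock_ghz <= 4.8:
        return set_clock_ghz              # 4.8 GHz and below hold steady
    return max(4.8, set_clock_ghz - 0.2)  # otherwise dip 200 MHz, floor 4.8

for clk in (4.8, 4.9, 5.0, 5.1, 5.2):
    print(f"{clk:.1f} GHz set -> dips to {observed_dip_ghz(clk):.1f} GHz")
```

Printing the table reproduces the list above, which is what makes it look like a fixed offset bin rather than thermals.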

I saw it in Optimum Tech's video about the 12600K as well. Look at P-core #4.

starts at 8:44










any ideas how to fix that?


----------



## ir_cow (Nov 8, 2021)

PolRoger said:


> Curious as to how much VCCSA for Gear 1 @2000MHz were you needing?
> 
> 
> Also on a side note... My last/previous AIDA screenshot proved unstable at 5GHz all core. I think the -2 AVX offset and the FPU stress component was dropping too many cores down to 48x. I took the same settings and disabled the E-cores (for the WCG scheduling bug) and the OC quickly triggered "Watch Dog Timeouts" while running WCG load with 12 threads.
> Currently running WCG @49x (12T) with ~1.25v load. I think I'm going to need better cooling to stabilize and run @50x all core.



I don't know for 2000MHz. I had it at 1.25v VCCSA for 2200MHz. The 1.4v was a Hail Mary just to see if I could get it into Windows. Not something I would use daily, personally. Not even sure if it's stable. Just enough to get into Windows.


----------



## PolRoger (Nov 8, 2021)

GerKNG said:


> Anyone with a 12600K got an idea why the CPU constantly drops clocks on single cores as soon as you are above 4.8 GHz?
> It has nothing to do with power limits, AVX offsets, full OC modes to compensate for the broken clock-speed reports, TVB clipping, current excursion protections, or anything else related to power, thermals or offsets.
> 
> it begins at 4.9 Ghz by dropping it back to 4.8.
> ...



I'm not sure if it is something similar... but I did notice ~200MHz core drops when enabling AIDA FPU @50x.
I was thinking it was some kind of BIOS auto AVX offset, as it doesn't do that with just AIDA CPU enabled. You might try disabling the E-cores and testing just the P-cores with your board to see if it still does it. I'm not seeing any MHz drop at all while running WCG @49x (12T).


----------



## ir_cow (Nov 8, 2021)

4000 1:1. I just enabled XMP on this Crucial Ballistix MAX for convenience. Don't care about timings yet.

Edit: Just tried 1T. Works as well.


----------



## 15th Warlock (Nov 8, 2021)

Another review showing the difference between DDR4 and DDR5 in terms of performance, or lack thereof...

What I find funny is that a couple of these videos use one of the lowest-end Asus boards for DDR4 and usually pit it against a high-end, expensive board; in most cases the difference is only a couple of percentage points, even accounting for the memory difference.


----------



## chrcoluk (Nov 8, 2021)

15th Warlock said:


> 5600X getting trounced by the 12600K using DDR4, and in some cases even the 5800X gets creamed; no, it's not thanks to DDR5 as you can see.
> 
> And if you have the right cooler, power consumption really is a non-issue after all.


I never said it was all down to DDR5, please don't put words in my mouth.

The new release is a mixture of a new CPU architecture and major chipset changes.

Also, if you want me to check content, please give me the point in the video to skip to, or a text article, thanks; I haven't got time to watch a 20-minute video which is mostly slides.

Also, regarding power: some of us live in a country with really expensive, supply-constrained electricity and also want greener levels of power usage; the issue is not about having the right PSU etc.



15th Warlock said:


> Another review showing the difference between DDR4 and DDR5 in terms of performance, or lack thereof...
> 
> What I find funny is, a couple of these videos use one of the lowest end Asus boards for DDR4, and usually they pit that against a high end, expensive board, in most cases the difference is a couple percent points only, even accounting for the memory difference.


Shame he didn't put older-gen CPUs on the same graph, as we don't have the baseline data otherwise. Media reviewers really need to think about what they're doing here; corners are often cut to get content out quicker.

So he needed, on those graphs:

11900K with DDR4
12900K with DDR4
12900K with DDR5

It was pointless putting the 12600K on there as well, as it wasn't tested with DDR4.

So yes, only a few percent, but we don't know the overall percentage as it wasn't put on the graph (unless we watch/read another review).

If I were upgrading today I would very likely go DDR4: better latency and a huge cost saving, which could potentially offset replacing the board with a DDR5 one later, after DDR5 prices drop.


----------



## GerKNG (Nov 8, 2021)

PolRoger said:


> I'm not sure if it is something similar... but I did notice ~200MHz core drops when enabling AIDA FPU @50x.
> I was thinking it was some kind of BIOS auto AVX offset, as it doesn't do that with just AIDA CPU enabled. You might try disabling the E-cores and testing just the P-cores with your board to see if it still does it. I'm not seeing any MHz drop at all while running WCG @49x (12T).


I have everything that could possibly influence the clock speed disabled in the BIOS.
And if it were a BIOS setting, why does it only happen above 4.8? It basically makes overclocking for games useless.


----------



## Outback Bronze (Nov 8, 2021)

GerKNG said:


> I have everything that could possibly influence the clock speed disabled in the BIOS.
> And if it were a BIOS setting, why does it only happen above 4.8? It basically makes overclocking for games useless.



What OS are you running it on? On the 11700K I've been playing around with, I could not get all-core clocks above 4.8/4.9GHz no matter what BIOS settings I had. Believe it or not, it was the power setting in Windows I had to change to get it to clock further. That was on Win 10. Have you tried the power settings in Windows yet?


----------



## GerKNG (Nov 8, 2021)

Outback Bronze said:


> What OS are you running it on? On the 11700K I've been playing around with, I could not get all-core clocks above 4.8/4.9GHz no matter what BIOS settings I had. Believe it or not, it was the power setting in Windows I had to change to get it to clock further. That was on Win 10. Have you tried the power settings in Windows yet?


Windows 11 Pro.
I tried power plans and basically everything you can do in the BIOS.
I flashed it again and installed everything Intel provides for Z690/ADL.
Above 4.8 all-core = dropping to 4.8, or a 200MHz drop when at 5.1.
Pretty sad, since I think my 12600K is pretty decent... it runs P95 at around 1.28V load Vcore @ 5GHz/4.1GHz/4.1GHz (P/E/Ring),
but even in P95 it barely drops to 4.8 at all. In games it's all over the place.


----------



## PolRoger (Nov 8, 2021)

GerKNG said:


> Windows 11 Pro.
> I tried power plans and basically everything you can do in the BIOS.
> I flashed it again and installed everything Intel provides for Z690/ADL.
> Above 4.8 all-core = dropping to 4.8, or a 200MHz drop when at 5.1.
> ...


Maybe it is something BIOS related? We are both running Gigabyte DDR4 boards?...
It is still quite early with ADL. I'm wondering if this also happens with the new i7/i9 CPUs?? How about other board manufacturers? I'm not much of a gamer so I can't really test that aspect, but I'm going to continue to see if I can notice this happening while running/testing. The WCG W10 scheduling bug is kind of a bummer but that should get sorted out eventually. I've been running overnight with a 6c/12T all-P-core load (E-cores disabled) @49x with ~1.25v Vcore, and HWiNFO shows 4.9GHz with no drops. If I turn off WCG and idle, it just sits/maintains at 4.9GHz.

Edit: It could also somehow be software reporting related... ADL is new and I would expect refinements/improvements to upcoming beta HWiNFO release versions.


----------



## RandallFlagg (Nov 8, 2021)

Seems like people over at overclock.net are having a lot of success overclocking both DDR4 and DDR5 on Z690. 

This one on DDR4 reportedly couldn't get 3866 to work before, but on ADL has it running DDR4-4100 C14. And this is with a relatively cheap Strix D4:





This guy OC'd some DDR5-4800 to 6200 C32. The total latency here, 51.5 ns, is right in line with the DDR4-4100 C14 above. This is on an ultra-expensive Apex though ($840).





DDR5-6600 CL32 Trident-Z spotted :









Thinking that within a year, DDR5 is going to dust DDR4.  Not yet, but not that far off.


----------



## GerKNG (Nov 8, 2021)

PolRoger said:


> Maybe it is something BIOS related? We are both running Gigabyte DDR4 boards?...
> It is still quite early with ADL. I'm wondering if this also happens with the new i7/i9 CPUs?? How about other board manufacturers? I'm not much of a gamer so I can't really test that aspect, but I'm going to continue to see if I can notice this happening while running/testing. The WCG W10 scheduling bug is kind of a bummer but that should get sorted out eventually. I've been running overnight with a 6c/12T all-P-core load (E-cores disabled) @49x with ~1.25v Vcore, and HWiNFO shows 4.9GHz with no drops. If I turn off WCG and idle, it just sits/maintains at 4.9GHz.
> 
> Edit: It could also somehow be software reporting related... ADL is new and I would expect refinements/improvements to upcoming beta HWiNFO release versions.


I made two screenshots to show what I mean.
I don't think it is a software problem.
Why would it only be a thing above 4.8, and then specifically 100 MHz at 4.9 and 200 MHz above 4.9?


----------



## RJARRRPCGP (Nov 8, 2021)

ir_cow said:


> View attachment 224308
> 
> 4000 1:1 . I just enabled XMP on this Crucial Ballistix MAX for convenience. Don't care about timings yet.
> 
> Edit: Just tried 1T. Works as well.


AIDA64 displaying "0.0 ns"?



chrcoluk said:


> If I was upgrading today I very likely would be going ddr4, better latency and huge cost saving, which would potentially offset replacing the board with ddr5 at a later date after ddr5 prices drop.


Reminds me of LGA 775, when both DDR2 and DDR3 existed. The last time Intel did this was with off-die RAM controller CPUs. Did Intel put them back on the motherboard? LOL.

It's like me in 2013, when I chose a motherboard with DDR2, because I couldn't afford more RAM on top of that!


----------



## seth1911 (Nov 8, 2021)

That's wrong: with 1151/v2 you could use DDR3 too, but only ASRock and Biostar had cheap boards for them.

I have an H310 here with 2x DDR3; it worked fine with a 9700, but I had to give the 9700 back to a friend.


----------



## PolRoger (Nov 8, 2021)

GerKNG said:


> I made two screenshots to show what I mean.
> I don't think it is a software problem.
> Why would it only be a thing above 4.8, and then specifically 100 MHz at 4.9 and 200 MHz above 4.9?



I see what you're saying, and I can see it on my setup too while running AIDA stress at P49x/E38x... As early adopters I guess we'll have to deal with some issues that can hopefully get sorted out.

BTW... I downloaded a newly released "beta" HWiNFO today.


----------



## GerKNG (Nov 8, 2021)

PolRoger said:


> I see what you're saying, and I can see it on my setup too while running AIDA stress at P49x/E38x... As early adopters I guess we'll have to deal with some issues that can hopefully get sorted out.
> 
> BTW... I downloaded a newly released "beta" HWiNFO today.
> 
> View attachment 224418


I really hope this gets sorted out quickly. Until this is fixed I'm running my 12600K at 4.8 (almost 100 mV less VCore, and the performance is barely different).


----------



## ir_cow (Nov 9, 2021)

RJARRRPCGP said:


> AIDA64 displaying "0.0 ns"?


That is normal if you leave core integrity enabled.


----------



## phanbuey (Nov 9, 2021)

So got my new chip in tonight and slapped it into the rig asap -- holy crap it's fast.












So far passing stability on the MSI PRO Z690-A at 5.2 GHz - temps don't go beyond 82 C in a hot af room (81 F) under stress.

Haven't touched memory or ring... going to stabilize the core at 5.2/5.1 first.


----------



## lexluthermiester (Nov 9, 2021)

phanbuey said:


> holy crap it's fast.


Yeah it is. Damn.


phanbuey said:


> Haven't touched memory or ring... going to stabilize the core at 5.2/5.1 first.


Been watching OCing efforts elsewhere on the net and Alder Lake seems to have a general 5.1/5.2 ceiling. If you can get 5.1 out of it without high voltage, call it a day.


----------



## Deleted member 24505 (Nov 9, 2021)

Just got 1400 quid to blow, so I'm gonna get a 12600K with a Z690 D4 board. So looking forward to it. The only thing is, I'll have to get an LGA1700-compatible water block too for my loop.


----------



## GerKNG (Nov 9, 2021)

lexluthermiester said:


> If you can get 5.1 out of it without high voltage, call it a day.


But what is the new high or safe voltage compared to 14 nm?
Not that we end up like with Zen 2, where people pumped 1.4 V+ with high LLC into their 3700s.
Normally I'd go 50-100 mV (max) above stock.


----------



## 15th Warlock (Nov 9, 2021)

I got my Asus Strix D4 board last night, I just returned the DDR5 version of this very same board. Now I’m just waiting for the retrofit kit for my AIO cooler to put my system together.

Question to the pros in this thread, are you enabling XMP profiles on your BIOS in order to run your RAM at Gear 1? Or are you manually setting the timings for your particular memory kits?

Any help with this will be greatly appreciated!


----------



## GerKNG (Nov 9, 2021)

15th Warlock said:


> I got my Asus Strix D4 board last night, I just returned the DDR5 version of this very same board. Now I’m just waiting for the retrofit kit for my AIO cooler to put my system together.
> 
> Question to the pros in this thread, are you enabling XMP profiles on your BIOS in order to run your RAM at Gear 1? Or are you manually setting the timings for your particular memory kits?
> 
> ...


XMP up to 3200 runs at Gear 1;
you have to set it manually when you go above that.
Intel seems to have a locked stock voltage for VCCSA now, instead of just going crazy with auto voltages.
If you run a 3600 kit you need ~1.1 V VCCSA to run at Gear 1.

I just set XMP (3600 C16), Gear 1 and 1.1 V SA. Without the SA at 1.1 V I can't even POST.


----------



## 15th Warlock (Nov 9, 2021)

GerKNG said:


> XMP up to 3200 runs at Gear 1;
> you have to set it manually when you go above that.
> Intel seems to have a locked stock voltage for VCCSA now, instead of just going crazy with auto voltages.
> If you run a 3600 kit you need ~1.1 V VCCSA to run at Gear 1.
> ...


Thank you so much for your response! So with your kit, running at 3200 allowed you to run at tighter CAS latency timings?


phanbuey said:


> So got my new chip in tonight and slapped it into the rig asap -- holy crap it's fast.
> 
> View attachment 224467
> 
> ...



I love your system! Super clean build!


----------



## RandallFlagg (Nov 9, 2021)

GerKNG said:


> i really hope this gets sorted out quickly. until this is fixed i run my 12600k at 4.8 (almost 100mv less Vcore and the performance is barely different)



This may be relevant. I don't know if these are adjustable; possibly hidden BIOS settings:



*Power Limit 3 (PL3):* A threshold that if exceeded, the PL3 rapid power limiting algorithms will *attempt to limit the duty cycle of spikes above PL3 by reactively limiting frequency. This is an optional setting*
*Power Limit 4 (PL4)*: A limit that will not be exceeded, the PL4 power limiting algorithms *will preemptively limit frequency to prevent spikes above PL4.*
*Turbo Time Parameter (Tau)*: An averaging constant used for PL1 exponential weighted moving average (EWMA) power calculation.
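The Turbo Time Parameter can be made concrete with a small sketch: PL1 is enforced against an exponentially weighted moving average of package power, and Tau sets how slowly that average tracks the instantaneous draw, which is why a chip can sit well above PL1 for tens of seconds after a load starts. The numbers below (241 W burst, 56 s Tau, 20 W idle) are made-up illustration values, not Intel's actual firmware algorithm or defaults:

```python
import math

def ewma_power(samples, dt, tau, start=0.0):
    """Exponentially weighted moving average of package-power samples.

    dt    - sampling interval in seconds
    tau   - Turbo Time Parameter (averaging constant) in seconds
    start - initial value of the average (e.g. idle power)
    """
    alpha = 1.0 - math.exp(-dt / tau)  # per-sample weight from the time constant
    avg = start
    history = []
    for p in samples:
        avg += alpha * (p - avg)
        history.append(avg)
    return history

# Hypothetical load: a 241 W burst for 60 s, then back to 20 W idle.
samples = [241.0] * 60 + [20.0] * 60
avg = ewma_power(samples, dt=1.0, tau=56.0, start=20.0)

# The averaged power climbs toward 241 W far more slowly than the raw
# draw; PL1 enforcement clamps frequency once this average reaches PL1
# (e.g. 125 W), not when the instantaneous draw does.
print(f"averaged power after 30 s of burst: {avg[29]:.0f} W")
print(f"averaged power after 60 s of burst: {avg[59]:.0f} W")
```

With a large Tau, the average is still well under the raw 241 W draw even a full minute into the burst, which matches the "max turbo power applies for roughly a minute" behavior discussed earlier in the thread.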


----------



## GerKNG (Nov 9, 2021)

15th Warlock said:


> Thank you so much for your response! so with your kit, running at 3200 allowed you to run at tighter CAS latency timings?


It behaves pretty much like Ryzen and 11th/10th gen:
what was stable before is stable now as well. All it needs is manual VCCSA.
3200 CL14 worked just like with my 5800X, but 3600 Gear 1 with XMP gives me better (2-3%) gaming performance. (I have Trident Z Neo 16 GB DIMMs (32 GB total), 16-19-19-39.)



RandallFlagg said:


> This may be relevant.  I don't know these are adjustable, possibly hidden BIOS settings :
> 
> 
> 
> ...


None of these three settings are available in my Gigabyte BIOS.


----------



## 15th Warlock (Nov 9, 2021)

GerKNG said:


> It behaves pretty much like Ryzen and 11th/10th gen:
> what was stable before is stable now as well. All it needs is manual VCCSA.
> 3200 CL14 worked just like with my 5800X, but 3600 Gear 1 with XMP gives me better (2-3%) gaming performance. (I have Trident Z Neo 16 GB DIMMs (32 GB total), 16-19-19-39.)
> 
> ...


Excellent, I’ll try that, my kit has the same timings as yours! Thank you again for your help, you mind if I ask for advice once I put my system together?


----------



## GerKNG (Nov 9, 2021)

15th Warlock said:


> Excellent, I’ll try that, my kit has the same timings as yours! Thank you again for your help, you mind if I ask for advice once I put my system together?


no problem


----------



## Deleted member 24505 (Nov 9, 2021)

I think I'm gonna go for the 12700K/Z690/2x8 GB DDR4 3600 C18. Can't wait to play with it. Just need a block for the LGA 1700 socket.


----------



## phanbuey (Nov 9, 2021)

Tigger said:


> Just got 1400 quid to blow, so i'm gonna get a 12600k with a z690 D4 board.  so looking forward to it. Only thing is, i'll have to get a lga1700 compatible water block too for my loop.


You most likely don't need it... the holes on most water blocks fit LGA 1200 and are then cut out to also fit X299, so they overlap the 1700 holes. My block went right in, no issues whatsoever.






They make it sound like the holes are so different, but they're only off by about 1.5 mm - super close, and in between X299 and 1200. 99% of water blocks that have the cutouts for Intel sockets will work as is.


----------



## Deleted member 24505 (Nov 9, 2021)

phanbuey said:


> You most likely don't need it... the holes on most water blocks fit LGA 1200 and are then cut out to also fit X299, so they overlap the 1700 holes. My block went right in, no issues whatsoever.
> 
> View attachment 224525
> 
> They make it sound like the holes are so different, but they're only off by about 1.5 mm - super close, and in between X299 and 1200. 99% of water blocks that have the cutouts for Intel sockets will work as is.



My CPU is a Ryzen 2600X, so socket AM4; I doubt my block will fit on LGA1700.


----------



## phanbuey (Nov 9, 2021)

Tigger said:


> My CPU is a Ryzen 2600X, so socket AM4; I doubt my block will fit on LGA1700.



It didn't come with an LGA Bracket? -- In any case most LGA 1200 blocks will work; in case that info saves you $$.


----------



## Deleted member 24505 (Nov 9, 2021)

phanbuey said:


> It didn't come with an LGA Bracket? -- In any case most LGA 1200 blocks will work; in case that info saves you $$.



My block is a xspc raystorm


----------



## phanbuey (Nov 9, 2021)

XSPC Raystorm Intel Aluminum Bracket - Black RAYSTORM-INTEL-ALUM-BRKT-BK-D (performance-pcs.com)

frozen cpu has a $7 bracket too.


----------



## docnorth (Nov 9, 2021)

RandallFlagg said:


> Seems like people over at overclock.net are having a lot of success overclocking both DDR4 and DDR5 on Z690.
> 
> This one on DDR4 reportedly couldn't get 3866 to work before, but on ADL has it running DDR4-4100 C14 .  And this is with a relatively cheap Strix D4 :
> 
> ...


Midrange boards, like the Asus Prime-A, already support 6000 MT/s (per the Asus webpage).


----------



## Deleted member 24505 (Nov 9, 2021)

phanbuey said:


> XSPC Raystorm Intel Aluminum Bracket - Black RAYSTORM-INTEL-ALUM-BRKT-BK-D (performance-pcs.com)
> 
> frozen cpu has a $7 bracket too.



So this will fit LGA1700?


----------



## phanbuey (Nov 9, 2021)

Tigger said:


> So this will fit LGA1700?


Yes, that one will fit.


----------



## unclewebb (Nov 9, 2021)

GerKNG said:


> i really hope this gets sorted out quickly.


Does HWiNFO show the reasons for throttling for the new 12th Gen CPUs? It should. 

There might be a clue in there what is causing the throttling that you are seeing. It might be caused by the voltage regulators or a current limit that is set too low. When stress testing, the Limit Reasons data should all show No. 

Here is an example of this data on a 10th Gen CPU.


----------



## PolRoger (Nov 9, 2021)

Tigger said:


> So this will fit LGA1700?





phanbuey said:


> Yes, that one will fit.



I also have the XSPC Raystorm waterblock. The Intel and AMD versions were sold as two separate blocks, but you could buy adapter kits to convert from one to the other. The aluminum bracket will fit on top of the AMD version block, but you still need the bottom bracket and the correct posts, springs, nuts etc. I believe Intel/AMD have a different threading pattern for the female bottom-bracket holes. The new stock metal 1700 OEM socket plate also has 4 rectangular bumps from the socket attachment screws, while the original XSPC Intel 115x bottom bracket has holes or a cutout for just 3 bumps (1156/1155 has triangular socket attachment screws). It looks like the original XSPC 115x bottom bracket won't lay flat against the back of the motherboard?? One side fits but the other won't, and will sit higher due to the two corresponding rectangular bumps on the opposite side of the 1700 OEM socket plate.


----------



## phanbuey (Nov 9, 2021)

PolRoger said:


> I also have the XSPC Raystorm waterblock. The Intel and AMD versions were sold as two separate blocks, but you could buy adapter kits to convert from one to the other. The aluminum bracket will fit on top of the AMD version block, but you still need the bottom bracket and the correct posts, springs, nuts etc. I believe Intel/AMD have a different threading pattern for the female bottom-bracket holes. The new stock metal 1700 OEM socket plate also has 4 rectangular bumps from the socket attachment screws, while the original XSPC Intel 115x bottom bracket has holes or a cutout for just 3 bumps (1156/1155 has triangular socket attachment screws). It looks like the original XSPC 115x bottom bracket won't lay flat against the back of the motherboard?? One side fits but the other won't, and will sit higher due to the two corresponding rectangular bumps on the opposite side of the 1700 OEM socket plate.



The bottom brackets will not fit - only the top mount. You can use bolts with no backplate (and not over-tighten the block), or use another bracket / make your own using slot brackets.


----------



## Deleted member 24505 (Nov 9, 2021)

Am I just better off buying a new block? I was looking at this:
https://www.corsair.com/uk/en/Categ...PU-Water-Block/p/CX-9010013-WW#tab-tech-specs

I already have the loop set up, so I may as well water cool the i7 12700K, as it seems they "may" run a tad hot.


----------



## PolRoger (Nov 10, 2021)

unclewebb said:


> Does HWiNFO show the reasons for throttling for the new 12th Gen CPUs? It should.
> 
> There might be a clue in there what is causing the throttling that you are seeing. It might be caused by the voltage regulators or a current limit that is set too low. When stress testing, the Limit Reasons data should all show No.
> 
> ...



Gigabyte Z690i @P49xE39x AIDA stress:

Edit: Better snapshot of HWiNFO while running AIDA stress.


----------



## lexluthermiester (Nov 10, 2021)

GerKNG said:


> but what is the new high or safe voltage compared to 14nm?
> not that we end like with Zen 2 where people pumped 1.4V+ with high LLC into their 3700
> normally i'd go (max) 50-100mv above stock.


The general vibe I'm getting (keep in mind I have no hands-on with AL yet) is that voltage should be kept below 1.4 V. The observed voltages from Intel's automatic regulation seem to max out at around 1.35 to 1.375, so it should be safe to bump it a bit higher. But remember, all the testing done so far also seems to indicate that Intel is really pushing the limits of AL dies to get the most from them, so not a lot of headroom exists for OCing. There's not enough raw data about Intel's new 7nm(ish) process to know where the absolute limits are.

Given how expensive and in-demand they are, my current advice to anyone OCing Alder Lake is to go easy on the voltage and be happy (for now) with whatever OC you can get from default or near-default voltages. The benefit is not worth the risk at this time. Once more testing has been done by reviewers and "elite" overclockers, the limits will be better fleshed out.


----------



## GerKNG (Nov 10, 2021)

PolRoger said:


> Gigabyte Z690i @P49xE39x AIDA stress:
> 
> Edit: Better snapshot of HWiNFO while running AIDA stress.


Looks almost like Alder Lake "K" CPUs get treated like non-K SKUs.


----------



## W1zzard (Nov 10, 2021)

GerKNG said:


> Looks almost like Alder Lake "K" CPUs get treated like non-K SKUs.
> 
> View attachment 224647


What else would you expect? Infinite CPU clock?


----------



## GerKNG (Nov 10, 2021)

W1zzard said:


> What else would you expect? Infinite CPU clock?


That I get an unlocked CPU when I buy an unlocked CPU?


----------



## W1zzard (Nov 10, 2021)

GerKNG said:


> That I get an unlocked CPU when I buy an unlocked CPU?


It is unlocked, but it will only run at X MHz by default.


----------



## GerKNG (Nov 10, 2021)

W1zzard said:


> It is unlocked, but it will only run X MHz by default


We're not talking about a stock CPU.

This is a *manual overclock* to 5 GHz all-core, and Alder Lake currently forces the CPU down to 100 MHz below the single-core boost, or -200 MHz when you're more than 200 MHz above the single-core boost (like in my screenshot).

4.9 is the SC boost.
Going to 4.9 all-core = 4.8
5.0 GHz all-core = 4.8
5.1 GHz all-core = 4.9
5.2 GHz all-core = 5.0
...
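The pattern described above can be restated as a simple rule. Here is a minimal sketch of the observed clamp, assuming a 4.9 GHz single-core boost; this is a hypothetical model fitted to the frequencies reported in this thread, not any documented Intel or Gigabyte behavior:

```python
def effective_all_core(target_mhz, sc_boost_mhz=4900):
    """Model of the clamp observed on this board: requests up to 100 MHz
    above the single-core boost get forced down to 100 MHz below it, and
    anything further above loses a flat 200 MHz. Hypothetical rule fitted
    to the reported numbers, not documented behavior."""
    if target_mhz <= sc_boost_mhz - 100:
        return target_mhz              # 4.8 GHz and below run as set
    if target_mhz <= sc_boost_mhz + 100:
        return sc_boost_mhz - 100      # 4.9 and 5.0 both land on 4.8
    return target_mhz - 200            # 5.1 -> 4.9, 5.2 -> 5.0, ...

for mhz in (4800, 4900, 5000, 5100, 5200):
    print(f"{mhz} MHz all-core -> {effective_all_core(mhz)} MHz")
```

If the rule holds, the only way to actually run above 4.8 effective is to set the multiplier at least 300 MHz above the single-core boost, which matches the 5.1 -> 4.9 and 5.2 -> 5.0 observations.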


----------



## chrcoluk (Nov 10, 2021)

Reports of AL having issues with some Denuvo games - is this true?

I am guessing that if true, simply disabling the E-cores will fix it.


----------



## phanbuey (Nov 10, 2021)

Interesting - I did not see this behavior on the 12600K on the Aorus Z690I we built.

That was ratio 51, auto voltage, efficient turbo -> on.

We did notice TVB ratio clipping doesn't work even though it's an option, likely because the 12600K doesn't have TVB.



chrcoluk said:


> Reports of AL having issues with some Denuvo games - is this true?
> 
> I am guessing that if true, simply disabling the E-cores will fix it.



Or just set default affinity to a P core.
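For anyone wanting to try the affinity workaround, here is a hedged sketch of building a P-core-only affinity mask. It assumes the usual 12th-gen enumeration where the hyperthreaded P-cores come first as logical CPUs 0-11 and the E-cores after (verify against your own topology); `game.exe` and `pid` are placeholders:

```python
def p_core_mask(p_cores=6, threads_per_core=2):
    """Build an affinity bitmask covering only the P-cores, assuming they
    enumerate first as logical CPUs 0..(p_cores*threads_per_core - 1).
    That ordering is the common case on 12th gen, but it is an assumption:
    check your own CPU topology before relying on it."""
    return (1 << (p_cores * threads_per_core)) - 1

mask = p_core_mask()  # 12600K: 6 P-cores x 2 threads -> bits 0..11
print(f"P-core affinity mask: 0x{mask:X}")

# The mask could then be applied when launching the game, e.g.:
#   Windows cmd:  start /affinity FFF game.exe
#   psutil:       psutil.Process(pid).cpu_affinity(list(range(12)))
```

On a 12600K this yields `0xFFF`; on an 8 P-core 12900K it would be `0xFFFF`.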


----------



## PolRoger (Nov 10, 2021)

GerKNG said:


> looks almost like alder lake "K" CPUs get treated like non K SKUs.
> 
> View attachment 224647





W1zzard said:


> What else would you expect? Infinite CPU clock?





GerKNG said:


> that i get an unlocked CPU when i buy an unlocked CPU?





W1zzard said:


> It is unlocked, but it will only run X MHz by default


I'm wondering if this is now a BIOS-related issue for Gigabyte motherboards? Or is this some kind of Intel ADL "limit" that applies to all of the other motherboard makers? If ASUS, MSI, and ASRock can design a BIOS to circumvent the kind of ADL default rules we are seeing here, then Gigabyte should also be able to come up with a BIOS that does the same?

EDIT: 
Maybe this is just a Gigabyte BIOS issue and is related to AVX??...

AIDA stress utilizes AVX and so does Prime95, and they both will drop the clocks on various cores while stressing with a manual all-core OC. Disable AVX instructions in Prime95 and I'm not seeing the drop in core speeds. WCG, which I believe doesn't use AVX, will also run without lowering clocks on all cores with E-cores disabled (which is due to the WCG W10 scheduling bug).


----------



## GerKNG (Nov 10, 2021)

PolRoger said:


> I'm wondering if this is now a BIOS-related issue for Gigabyte motherboards? Or is this some kind of Intel ADL "limit" that applies to all of the other motherboard makers? If ASUS, MSI, and ASRock can design a BIOS to circumvent the kind of ADL default rules we are seeing here, then Gigabyte should also be able to come up with a BIOS that does the same?
> 
> EDIT:
> Maybe this is just a Gigabyte BIOS issue and is related to AVX??...
> ...


Optimum Tech's 12600K review has the same problem,
and that's on a Z690 Unify.

The throttling happens completely randomly:
sometimes at idle on the desktop, ALL the time in CoD Vanguard for some reason,
NEVER in Warzone,
every couple of minutes in WoW.

I have every single option in the BIOS set to "unlock" every single thing that might throttle the CPU.
Above 4.8 GHz = down to 4.8 GHz.
At 4.8 GHz = stays forever at 4.8 GHz.

Looks more like a bug to me, tbh.


----------



## phanbuey (Nov 10, 2021)

GerKNG said:


> Optimum Tech's 12600K review has the same problem,
> and that's on a Z690 Unify.
> 
> The throttling happens completely randomly:
> ...



It really does sound like a bug... I am going to try CoD Vanguard to see if this is happening for me.


----------



## unclewebb (Nov 10, 2021)

@GerKNG - The IA: Max Turbo Limit throttling reason can get triggered when a CPU is idle. This one is usually not that important or meaningful. 

Run a full load stress test. Take a screenshot of the limit reasons while the CPU is fully loaded and while the CPU is throttling.






@PolRoger - Your screenshot indicates that it might be related to IccMax or PL4. Have a look in the BIOS to see what IccMax is set to. Setting IccMax to the maximum, 255.75, used to work to eliminate this type of throttling for 10th and 11th Gen CPUs. I am not sure if 12th Gen use the same maximum. Both the CPU core and cache need to be set to the same value. If there is a bug in the BIOS, it might only be setting the core properly and not the cache / ring. 

It might not be possible to adjust PL4 in the BIOS but have a look for that setting.


----------



## GerKNG (Nov 10, 2021)

unclewebb said:


> @GerKNG - The IA: Max Turbo Limit throttling reason can get triggered when a CPU is idle. This one is usually not that important or meaningful.
> 
> Run a full load stress test. Take a screenshot of the limit reasons while the CPU is fully loaded and while the CPU is throttling.
> 
> ...


Your PL4 limit is for the uncore/ring.
It's constantly flagged because the ring is dynamic up to 4.5 GHz but only runs at 3.6 under load.

My throttle reason is in the screenshot I posted: max turbo frequency (just like it would be on a non-K chip).


----------



## phanbuey (Nov 10, 2021)

I don't actually know what's going on anymore....

I just keep increasing the multiplier and the numbers keep going up. Scores are starting to get into 'joke' territory.


----------



## RandallFlagg (Nov 10, 2021)

phanbuey said:


> View attachment 224683
> 
> 
> I dont actually know what's going on anymore....
> ...



You are getting close to +50% over my 10850K in single thread CPU-Z.  

Can you tell a difference in the UI response with that massive single thread performance, or was it already in the realm of not noticeable to normal humans?


----------



## phanbuey (Nov 10, 2021)

RandallFlagg said:


> You are getting close to +50% over my 10850K in single thread CPU-Z.
> 
> Can you tell a difference in the UI response with that massive single thread performance, or was it already in the realm of not noticeable to normal humans?


I think it's mostly placebo, but it for sure feels faster.

5.4 failed unfortunately - it died after some time with LinPack... back down to 5.3 now, and going to run it for a few days before moving on.






Haven't touched the E-cores yet, but I'm in 3900X Cinebench MT territory.


----------



## RandallFlagg (Nov 10, 2021)

> 2000 single core in Cinebench R23 with a 12600K, good grief.  

I guess the days when the XX600K chips were just defective XX900K chips and generally couldn't clock as high are gone. That's a different die on that chip from the 12700K/12900K.


----------



## tabascosauz (Nov 10, 2021)

RandallFlagg said:


> > 2000 single core in Cinebench R23 with a 12600K, good grief.
> 
> I guess the days where the XX600K were just defective XX900K chips and generally couldn't clock as high are gone.  That's a different die on that chip from the 12700K/12900K.



Isn't the other type of ADL die only 6+0, so the 12600K is still just the same die as the other two? Nevertheless, very impressive.

I'm always jealous how comprehensive the performance data and reporting (IA limit reasons etc) are on Intel since even Haswell.


----------



## RandallFlagg (Nov 10, 2021)

tabascosauz said:


> Isn't the other type of ADL die only 6+0, so 12600K is still just the same die as the other two? Nevertheless, very impressive.
> 
> I'm always jealous how comprehensive the performance data and reporting (IA limit reasons etc) are on Intel since even Haswell.



I think the 12600K is the top tier of the 6+4 die. The yet-to-be-released lower SKUs like the 12400 (6+0) are supposedly defective 12600Ks.

From what I understand, Intel has two dies: one is 8+8 (the 12900K) with its lower-rated SKUs and defects, the other 6+4 (the 12600K) with its lower-rated SKUs and defects.

I put it rather harshly, but this is of course the way all the vendors work, including AMD and Nvidia.

In short, the old way is why top SKUs usually had the highest OC potential in single thread, despite having more points of failure (more cores). This is true of Zen also. In this new model, the 12600K may wind up hitting the highest GHz / OC, since it has fewer things to fail, being a 6+4 instead of an 8+8 die to start with.


----------



## tabascosauz (Nov 10, 2021)

RandallFlagg said:


> From what I understand, Intel has two dies: one is 8+8 (the 12900K) with its lower-rated SKUs and defects, the other 6+4 (the 12600K) with its lower-rated SKUs and defects.



Do you have a source for this? All I can find are various outlets all unanimously reporting on the same MSI ADL overview, which showed one 8+8 and one 6+0. They made a big hubbub about it because the dies are located a bit differently, so the hotspot may be different on the upcoming 6+0:


----------



## RandallFlagg (Nov 10, 2021)

tabascosauz said:


> Do you have a source for this? All I can find are various outlets all unanimously reporting on the same MSI ADL overview, which showed one 8+8 and one 6+0. Made a big hubbub about it because the dies are located a bit differently so the hotspot may be different on upcoming 6+0:



Yeah, I may be mistaken. I thought I had seen a die shot of the 6+4 that showed 6 Golden Coves and a 4-core Gracemont cluster. Looking again, all I find is the same as your post:


----------



## phanbuey (Nov 10, 2021)

Good to know either way.


----------



## PolRoger (Nov 10, 2021)

phanbuey said:


> I think it's mostly placebo but it for sure feels faster .
> 
> 5.4 failed unfortunately - died after some time with the LinPack... back down to 5.3 now and going to run it for a few days before moving on.
> 
> ...



I don't think my sample is as strong as yours... Mine needs more voltage for ~100 MHz less at ~52x... (52P/38E)

Also... if people think they're going to be running these new ADL CPUs at 50x to 53x (all-core) with long-term heavy loads on average air coolers and thin 240 AIOs... I think they will be disappointed.

CPUZ bench (52P/38E):






Stock defaults with XMP enabled Gear 1:


----------



## unclewebb (Nov 10, 2021)

PolRoger said:


> Mine needs more voltage


This is not a fair comparison. The CPU-Z screenshot that @phanbuey posted is showing the Core VID voltage.

Your screenshot shows actual core voltage. That is the important number.


----------



## RandallFlagg (Nov 10, 2021)

PolRoger said:


> I don't think my sample is as strong as yours... Mine needs more voltage for ~100 MHz less at ~52x... (52P/38E)
> 
> Also... if people think they're going to be running these new ADL CPUs at 50x to 53x (all-core) with long-term heavy loads on average air coolers and thin 240 AIOs... I think they will be disappointed.
> 
> ...




That's actually a hefty boost from default - looks like +10% single and multi in CPU-Z.

Can you guys show the peak power / temp while running that?

This is my 10850K - note I've got more running than this, as it's a workday, so it's not a clean bench, but it's ballpark. It basically runs 210 W during the MT test:


----------



## phanbuey (Nov 10, 2021)

My friend just picked up a 12700K as well, and it's overheating like crazy on his AIO... here it is at 240-280 W at 4.9 all-core; he cannot get 5.0 stable at the moment. Here is his Cinebench @ 4.9, 95 C temps:






It seems the 12700Ks are much harder to push? W1zz's was 5.0 at 1.4 V, and I'm seeing 3 other reviews of the 12700K deeming it 'not worth it' to overclock. Could just be the lottery/BIOSes? Seems weird for 2 more cores to make such a difference (although TPU's 12600K also didn't really OC).

@PolRoger -- what motherboard are you running?


----------



## RandallFlagg (Nov 10, 2021)

phanbuey said:


> My friend just picked up a 12700K as well, and it's overheating like crazy on his AIO... here it is at 240-280 W at 4.9 all-core; he cannot get 5.0 stable at the moment. Here is his Cinebench @ 4.9, 95 C temps:
> 
> 
> View attachment 224710
> ...



240 W is insane for 4.9. It sounds to me like the BIOS isn't tuned well. He is just running 4.9 on the P-cores, right? Just sayin...

PolRoger is using an Aorus Ultra ITX :


----------



## Devon68 (Nov 10, 2021)

phanbuey said:


> reviews deeming it 'not worth' to overclock.


It might get even better after the beta testing is done - new motherboards might come out, as well as better coolers.


----------



## Outback Bronze (Nov 10, 2021)

Interesting read here for Alder Lake CPU's:

Maximus Z690 and Alder Lake: Modern CPU's require Modern Overclocking Solutions. - CPUs, Motherboards, and Memory - Linus Tech Tips


----------



## fevgatos (Nov 10, 2021)

rares495 said:


> Fair enough. Sometimes I can't read. The 5600X did spank the 10900K in all but a few games though.


Nah, it didn't. I mean, at stock maybe, yeah. The 10900K has insane overclocking headroom, be it cache at 5.0 GHz, RAM at 4400C16-4600C16, or frequency at 5.1-5.3 GHz. The 5600X doesn't hold a candle to it. And yes, I assume people that spend ~500 for a 10900K will OC the crap out of it. I have some numbers from my 10900K in in-game benchmarks that I can share with you; it regularly beats / trades blows with a highly tuned 5950X.


----------



## mstenholm (Nov 10, 2021)

5600X and 10900K? The headline is Alder Lake. Take your disagreements elsewhere, please.


----------



## TheoneandonlyMrK (Nov 10, 2021)

mstenholm said:


> 5600X and 10900K? The headline is Alder Lake. Take your disagreements elsewhere, please.


Probably should have made it a club then. No offense intended.


----------



## phanbuey (Nov 10, 2021)

@RandallFlagg -- Everything identical except motherboard + Cpu and bios settings.

12600K @ 5.3 GHz, 40 ring, 3600 MHz Gear 1 CL16, not yet tuned.  EDIT: I set it to lowest, not sure why it said custom... will rerun soon, but I didn't touch any settings between the two.




10850K @ 5.1 GHz - 4000 MHz RAM CL17 4x8GB tuned timings, 44 ring:





The 12600K is super fast and powerful... but man those old Skylakes are still fast AF.  Enjoy your 10850  



12600K

10850 was around 7.4 seconds.


----------



## PolRoger (Nov 10, 2021)

phanbuey said:


> *My friend just picked up a 12700K and it's overheating like crazy on his AIO..*.  here is his 12700K at 240W-280W at 4.9 all core, cannot get 5.0 stable at the moment, here is his cinebench @ 4.9, 95C temps:
> 
> Seems the 12700Ks are much harder to push?  W1zz's was 5.0 at 1.4V ~ seeing 3 other reviews of the 12700K deeming it 'not worth' to overclock.  Could just be the lottery/bioses?  Seems weird for 2 more cores to make such a difference (altho TPU's 12600K also didn't really OC).
> 
> @PolRoger -- what motherboard are you running?


I was testing my chip at 50x with Prime95 non-AVX, 6C/6T and E-cores disabled, using a thin 240 rad.  Stress was running fine at first with temps in the low to mid 70s. I left the room for a while, and when I came back later to check, it had completely heat-soaked the radiator: temps were in the high 90s, some cores had bumped up to 100C, and thermal throttling had been triggered. It was still running Prime with no errors though. I shut it off!


----------



## Psychoholic (Nov 10, 2021)

I have my 12900K limited to 200W package power; it doesn't seem to drop the score much.
Actually, I feel like the sweet spot is around 180W package power.

180W package power maxes around 70C and doesn't drop scores much more than this.

I haven't even attempted to OC it because it's so damn fast anyway I see no need.


----------



## Deleted member 24505 (Nov 10, 2021)

I intend to get a 12700K soon, probably this week. I think I will just run it stock, tbh, till there are a few more people on here fiddling with them. I have no idea on the board yet, though, apart from Z690 and DDR4.


----------



## freeagent (Nov 10, 2021)

phanbuey said:


> I think it's mostly placebo, but it for sure feels faster.
> 
> 5.4 failed unfortunately - it died after some time with LinPack... back down to 5.3 now and going to run it for a few days before moving on.
> 
> ...


Man, that thing is a savage. Very nice.


----------



## Caring1 (Nov 11, 2021)

PolRoger said:


> I was testing my chip at 50x with Prime95 non-AVX, 6C/6T and E-cores disabled, using a thin 240 rad.  Stress was running fine at first with temps in the low to mid 70s. I left the room for a while, and when I came back later to check, it had completely heat-soaked the radiator: temps were in the high 90s, some cores had bumped up to 100C, and thermal throttling had been triggered. It was still running Prime with no errors though. I shut it off!


Would a decent tower cooler with airflow over the VRMs make a difference?


----------



## PolRoger (Nov 11, 2021)

Caring1 said:


> Would a decent tower cooler with airflow over the VRMs make a difference?


I'm running open-air bench style... I prefer to run thick 420 rads on my main rigs... currently 5950X/5900X. From what I've seen, the 12600K runs fine @ stock settings (all core), which drops to 45x on my setup. With a decent tower cooler and/or a thin 240-type rad I was OK @ 48x all core. If you want to run Alder Lake @ 50x-52x(+) all core long term with high loads, you will need top-notch cooling.


----------



## AlwaysHope (Nov 11, 2021)

birdie said:


> FWIW only memtest86 can test your _entire_ memory. Its free version is enough for absolute most average people including OC/tech enthusiasts. Highly recommended. The application is signed so there's no need to disable EFI secure boot.
> 
> View attachment 224142
> 
> Windows built-in memtest can do it too but it's quite simplistic and not as thorough as memtest86 which can find errors not detectable by Windows memory checker.


I agree with you on Memtest86, but you're aware that Windows has 3 levels of its memory checker & that cache can be disabled?
I don't think MS are going to code an app poorly if their flagship OS is relying on it.


----------



## phanbuey (Nov 11, 2021)

Alder Lake REALLY scales with cache & RAM -- ring frequency, in the games I'm testing, makes a bigger fps difference than core clock - SOTTR, BL3 and HZD benchmarks, and CP2077 and Outer Worlds playtime.

Here is the pass 1 tuned result -
RAM 4x8GB 4000 MHz Gear 1 (2000% Karhu RAM test stable) - 17 18 18 37 | TRRDS/TRRDL/TFAW = 4/5/20 | TRFC = 400 (200ns) - no RTL/IOL tuning yet.
Cpu - P cores 5.3ghz at 1.30-1.32, E cores at stock
Ring - 44x





The biggest jumps were all from RAM and RING (as is usually the case in SOTTR - but I am also noticing it in other games (Outer Worlds etc.)) - markedly smoother with the ring and cache OC, minimal difference from core clock.





3080 is undervolted @ 1815 MHz -- not bad for a "6"-core.  The biggest jump in the Timespy score was also RAM+cache driven.  Basically a minuscule difference from 5.0 to 5.3 core (except the massive echub).  Huge double-digit % gains from cache and RAM at 4000 Gear 1.  The platform seems starved of memory/cache with heaps of clock/IPC in the tank.
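As a sanity check on the tRFC figure in the tuning list above (400 cycles quoted as 200 ns at DDR4-4000), here is a minimal cycle-to-nanosecond conversion sketch; the helper name `timing_to_ns` is just for illustration:

```python
def timing_to_ns(cycles: int, transfer_rate_mts: int) -> float:
    """Convert a DRAM timing in clock cycles to nanoseconds.

    DDR moves data twice per clock, so the actual clock in MHz is half
    the MT/s rating, and one cycle lasts 1000 / clock_mhz ns.
    """
    clock_mhz = transfer_rate_mts / 2
    return cycles * 1000 / clock_mhz

# tRFC = 400 cycles on a DDR4-4000 kit (2000 MHz actual clock):
print(timing_to_ns(400, 4000))  # 200.0 ns, matching the value quoted above
```

The same conversion works for any of the timings in the list (e.g. CL17 at 4000 MT/s is 8.5 ns).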


----------



## RandallFlagg (Nov 11, 2021)

phanbuey said:


> Alder lake REALLY scales with cache & RAM -- ring frequency, in the games im testing,  makes a bigger fps difference than core clock - SOTTR, BL3 and HZD benchmarks, and CP2077 and Outer worlds playtime.
> 
> Here is the pass 1 tuned result -
> ram 4x8gb 4000MHz Gear 1 (2000% kuhru ram test stable) - 17 18 18 37 | TRRDS/TRRDL/TFAW = 4/5/20 | TRFC= 400(200ns) - no RTL/IOL tuning yet.
> ...



I've been wondering about that.  DDR5 is in its infancy; ADL so far seems able to keep scaling if you can keep it fed.

There are some folks on overclock.net that have it up to DDR5-6400 C30.  Because of its speed, this memory actually has lower absolute latency than my DDR4-3200 C16:
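The latency claim is easy to verify: first-word CAS latency in nanoseconds works out to 2000 × CL / (MT/s). A quick sketch (the helper is illustrative, not from any post in this thread):

```python
def cas_latency_ns(cl: int, transfer_rate_mts: int) -> float:
    """First-word CAS latency in ns: CL clock cycles at half the MT/s rate."""
    return 2000 * cl / transfer_rate_mts

ddr5 = cas_latency_ns(30, 6400)   # DDR5-6400 C30
ddr4 = cas_latency_ns(16, 3200)   # DDR4-3200 C16
print(f"DDR5-6400 C30: {ddr5:.3f} ns")  # 9.375 ns
print(f"DDR4-3200 C16: {ddr4:.3f} ns")  # 10.000 ns
```

So despite the higher CL number, the DDR5-6400 kit's absolute CAS latency is lower, because its clock runs twice as fast.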


----------



## lexluthermiester (Nov 11, 2021)

chrcoluk said:


> Reports of AL having issues with some denuvo games, is this true?


It is true. Denuvo will not run on Alder Lake, regardless of OS version.



Psychoholic said:


> I have my 12900k limited to 200w package power, it doesnt seem to drop the score much.
> Actually, i feel like the sweet spot is around 180w package power.
> 
> 180 Package Power maxes around 70c and doesnt drop scores much more than this.
> ...


These are the experiences I've seen most people having. The "sweet spot" seems to vary a bit from sample to sample, which looks like a silicon-lottery scenario. Alder Lake performance is outstanding, no doubt, but Intel seems to have pushed the architecture close to its limits, so OCing is showing limited returns.


----------



## Outback Bronze (Nov 11, 2021)

Z690 Prime D4 with 12700KF in my HTPC : )





Had to make sure the old Noctua was going to be compatible. It wasn't a snug fit but works.






Made sure paste was squished.





Heatsink on : )





My HTPC Case.





All Installed : )





Noticed temps are quite hot, but the Noctua is an HTPC one whose name I can't remember, so it's all good.

I'm running at 4.4 GHz all core with HT off and no E-cores atm. Temps under RealBench stress are about 60C, gaming 50C. This is with a -0.160V undervolt, everything stable with Gear 1, 3467 MHz 15-15-15-28 2T atm.

Now I think the BIOS needs some work. As soon as I take it off "Sync All Cores" it won't boot Windows, even if I un-sync them and they are all @ 4.4 GHz. I would like 2 cores running @ 4.8 GHz with 6 @ 4.4 GHz.

Running Windows 10 atm with no real issues except for the OC BIOS problem stated above.

Cheers.


----------



## lexluthermiester (Nov 11, 2021)

Outback Bronze said:


> Made sure paste was squished.


That paste needs to be spread across the entire IHS. This is important. Alder Lake dies are long.


----------



## Outback Bronze (Nov 11, 2021)

lexluthermiester said:


> That paste needs to be spread across the entire IHS. This is important. Alder Lake dies are long.



That was just to test to make sure it was going to make contact which it did. That HSF is not made for Socket 1700.

The next application I did properly.


----------



## lexluthermiester (Nov 11, 2021)

Outback Bronze said:


> That was just to test to make sure it was going to make contact which it did. That HSF is not made for Socket 1700.
> 
> The next application I did properly.


Ah ok.


----------



## PolRoger (Nov 11, 2021)

Outback Bronze said:


> That was just to test to make sure it was going to make contact which it did. That HSF is not made for Socket 1700.
> 
> The next application I did properly.


That looks like an NH-U9B variant?? You need to get the correct new Noctua NM-i17xx bottom bracket adaptor for your cooler. The original 115X bracket might work for short-term temporary use, but there are some changes in the ADL socket specs vs 115X. There are two types of new Noctua brackets available depending on the actual cooler revision etc. I got two of them to go with my two older Noctua coolers.

Check the Noctua website for the correct match to your specific cooler version.

NM-i17xx-MP78 mounting-kit (noctua.at)

NM-i17xx-MP83 mounting-kit (noctua.at)


----------



## Outback Bronze (Nov 11, 2021)

PolRoger said:


> You need to get the correct new Noctua NM-i17xx bottom bracket adaptor for you cooler.



I doubt it. It's been like this for a few days with temps staying the same. It's been screwed on pretty tight, so I can't see how it's going to move? I'll keep monitoring it though. Cheers.


----------



## GerKNG (Nov 11, 2021)

made two videos of the downclocking issue, including the BIOS settings... (I tested per-core overclocks, specific ratios per core, everything on auto/enabled/disabled, and manual voltage/current limits in the IA VR configs)

the first one is 100 MHz above the SC boost (5 GHz), and it downclocks to 4.8 to stay below the 4.9 max turbo.
the second one is 4.8 GHz (100 MHz below the SC boost), and it immediately stops downclocking.

12600k frequency Problem #1 (5Ghz) - YouTube
12600k frequency Problem #2 (4.8Ghz) - YouTube


----------



## RandallFlagg (Nov 11, 2021)

Outback Bronze said:


> I doubt it. It's been like this for a few days with temps staying the same. It's been screwed on pretty tight, so I can't see how it's going to move? I'll keep monitoring it though. Cheers.



Would be interested in seeing what HWInfo shows for power and temp while running CPU-Z Bench.  If I move to Alder Lake it would likely be the mATX version of your motherboard.


----------



## phanbuey (Nov 12, 2021)

Grabbed top spot for 12600K CPU in timespy for now... People still waiting on their deliveries


----------



## AlwaysHope (Nov 12, 2021)

Outback Bronze said:


> ...
> Made sure paste was squished.
> ...


Oh my... I would never do that with TIM. I could never be certain no tiny air pockets got stuck when screwing down the HS.


----------



## Outback Bronze (Nov 12, 2021)

AlwaysHope said:


> Oh my... I would never do that with TIM. I could never be certain no tiny air pockets got stuck when screwing down the HS.



Not sure I understand you there matey? Are you talking about the type of application I applied? I do a line down the middle of the CPU. 

I was just testing to make sure the TIM "did" get compressed between the CPU and HSF as the bracket for the Noctua is not made for socket 1700 and so far its working sweet once I did a proper application of TIM.


----------



## lexluthermiester (Nov 12, 2021)

AlwaysHope said:


> Oh my... I would never do that with TIM. I could never be certain no tiny air pockets got stuck when screwing down the HS.


Air pockets are almost a myth these days. It's just not going to happen.


----------



## RandallFlagg (Nov 12, 2021)

phanbuey said:


> View attachment 224875
> 
> Grabbed top spot for 12600K CPU in timespy for now... People still waiting on their deliveries



That's an excellent score, given that the #1 non-HEDT (two-memory-channel) score on the planet is only 25% higher:





SkatterBencher got 5.7 GHz on a 12900K with open-loop water:


----------



## phanbuey (Nov 12, 2021)

Once I get some more time I will try to find a golden core or 2 to pump to 5.5 ghz...


----------



## RandallFlagg (Nov 12, 2021)

phanbuey said:


> ram 4x8gb 4000MHz Gear 1 (2000% kuhru ram test stable) - 17 18 18 37 | TRRDS/TRRDL/TFAW = 4/5/20 | TRFC= 400(200ns) - no RTL/IOL tuning yet.



Hey what brand of memory is that?  

I think I'm going to upgrade my storage to a single m.2 2TB and my RAM to DDR4-4000, was looking at some Trident Z  F4-4000C18D-32GTZR.  Those timings you have are pretty tight compared to the memory I was looking at though.


----------



## phanbuey (Nov 12, 2021)

RandallFlagg said:


> Hey what brand of memory is that?
> 
> I think I'm going to upgrade my storage to a single m.2 2TB and my RAM to DDR4-4000, was looking at some Trident Z  F4-4000C18D-32GTZR.  Those timings you have are pretty tight compared to the memory I was looking at though.


Teamgroup Dark Pro - Samsung B-die... just got it to 4000 17-17-17-36 -- great kit

TEAMGROUP T-Force Dark Pro Samsung IC 16GB Kit (2x8GB) DDR4 Dram 3200MHz (PC4-25600) CL14 Desktop Memory Module Ram (Gray) - TDPGD416G3200HC14ADC01 at Amazon.com

This is the kit


----------



## AlwaysHope (Nov 13, 2021)

lexluthermiester said:


> Air pockets are almost a myth these days. It's just not going to happen.


Interesting use of the word "almost". Does not rule out "never".


Outback Bronze said:


> Not sure I understand you there matey? Are you talking about the type of application I applied? I do a line down the middle of the CPU.
> 
> I was just testing to make sure the TIM "did" get compressed between the CPU and HSF as the bracket for the Noctua is not made for socket 1700 and so far its working sweet once I did a proper application of TIM.


As soon as the HS makes contact with the TIM after you've applied it to the top of the CPU, NEVER lift it off, even slightly, unless you want to do a complete remount & another dose of your TIM. That's an invitation for air to get in & thus get trapped. Air is a poor conductor of heat, so introducing it here is *not a good idea*.


----------



## Outback Bronze (Nov 13, 2021)

AlwaysHope said:


> NEVER lift it off,



I didn't, mate. It was just to show you guys (and myself, for that matter) that the old HSF made contact with the TIM on a new socket. I seem to have confused a few people here. I did state "made sure paste was squished" and showed pictures. lol, I'm trying to help you guys 



AlwaysHope said:


> remount & another dose of your TIM



That's exactly what I did.


----------



## The King (Nov 13, 2021)

E-Cores only Gaming test.


Spoiler: The E-Cores were not meant to be used This Way... but it's Amazing!


----------



## chrcoluk (Nov 13, 2021)

Seen der8auer's video on the E-cores; they're like a 7700K core, so way more powerful than the old Atom cores.  The size increase on the cores needed to get that ~40% IPC improvement shows how hard it is to achieve the recent performance gains.


----------



## fevgatos (Nov 13, 2021)

RandallFlagg said:


> Hey what brand of memory is that?
> 
> I think I'm going to upgrade my storage to a single m.2 2TB and my RAM to DDR4-4000, was looking at some Trident Z  F4-4000C18D-32GTZR.  Those timings you have are pretty tight compared to the memory I was looking at though.


Don't get the C18; those are not B-die. Either go for the 4000C16, which are relatively cheap, or if you wanna go balls to the wall you can go for the 4000C14, but the pricing on them is insane. Also, Vipers are always a good and cheap option.


----------



## lexluthermiester (Nov 13, 2021)

AlwaysHope said:


> Interesting use of the word "almost". Does not rule out "never".


It leaves an allowance for the inexperienced who really screw up mounting a heatsink. It does happen from time to time. As a general rule, anyone with even a shred of competence will get TIM properly applied and a heatsink mounted to a CPU with the needed pressure/tension to ensure no air "pockets" or "bubbles" will be present when finished.


----------



## RandallFlagg (Nov 13, 2021)

fevgatos said:


> Don't get the c18, those are not bdie. Either go for the 4000c16 which are relatively cheap, or if you wanna go balls to the wall you can go for the 4000c14, but the pricing on them is insane. Also Vipers are always a good and cheap option



Yeah and I just realized those are single rank as well, what I currently have is dual rank.


----------



## phanbuey (Nov 13, 2021)

Going to test a clean install of W11 later today to see if the numbers are better


----------



## looniam (Nov 13, 2021)

AlwaysHope said:


> Interesting use of the word "almost". Does not rule out "never".


well, he does claim to be a scientist, ya' know - with statements like that . . .


----------



## AlwaysHope (Nov 13, 2021)

Outback Bronze said:


> I didn't, mate. It was just to show you guys (and myself, for that matter) that the old HSF made contact with the TIM on a new socket. I seem to have confused a few people here. I did state "made sure paste was squished" and showed pictures. lol, I'm trying to help you guys
> 
> 
> 
> That's exactly what I did.


No prob. I just haven't seen anyone do that on a PC enthusiasts forum before. At least you had the guts to do that.


----------



## Nike_486DX (Nov 13, 2021)

And all we needed was a Core 2 Duo, not another Pentium 4 abomination.  Let's imagine this thread is about AMD FX, discussing some questionable Bulldozer/Piledriver advantages, when in reality we should be talking about performance per watt and comparing the 5950X vs the 12900K under continuous 100% load.  And the fact that Intel once again changed to yet another socket without making major changes to the CPU itself (20 cores, 3nm, how about that?)


----------



## AlwaysHope (Nov 14, 2021)

Nike_486DX said:


> And all we needed was a Core 2 Duo, not another Pentium 4 abomination.  Let's imagine this thread is about AMD FX, discussing some questionable Bulldozer/Piledriver advantages, when in reality we should be talking about performance per watt and comparing the 5950X vs the 12900K under continuous 100% load.  And the fact that Intel once again changed to yet another socket without making major changes to the CPU itself (20 cores, 3nm, how about that?)


All good points there, especially 2 sockets in 1 year!


----------



## Outback Bronze (Nov 14, 2021)

AlwaysHope said:


> No prob. I just haven't seen anyone do that on a PC enthusiasts forum before. At least you had the guts to do that.



It was worth a shot, wasn't it? What have I got to lose? And I wanted to set things up : )

Mate, I've been building PCs for like 20 years and done all sorts of mods, especially with water cooling. I'm the practical type.


----------



## Psychoholic (Nov 16, 2021)

So my "LGA 1700" kit from Corsair for my H150i will be here tomorrow.
The "kit" is just screws... shorter, I assume, due to the lower LGA1700 z-height, in order to apply the correct mounting pressure.

Will be interesting to see if mounting pressure makes a difference.  It is already 74C or so at full load using the LGA1200 screws.


----------



## lexluthermiester (Nov 16, 2021)

Psychoholic said:


> Will be interesting to see if mounting pressure makes a difference.


My guess is there will be little, if any.


----------



## Psychoholic (Nov 16, 2021)

lexluthermiester said:


> My guess is there will be little, if any.



Agreed.. but for $2.99 and free shipping we'll do it anyway


----------



## RandallFlagg (Nov 16, 2021)

fevgatos said:


> Don't get the c18, those are not bdie. Either go for the 4000c16 which are relatively cheap, or if you wanna go balls to the wall you can go for the 4000c14, but the pricing on them is insane. Also Vipers are always a good and cheap option



Well, in searching for truly good / fast DDR4 memory, one thing I learned:

it's not cheaper than DDR5, at least not for 32GB / 2x16GB sticks, comparing to DDR5 MSRP.  32GB DDR4-4000+ kits at CL18 and below start around $270.

I think if I were to go all out on a new build I'd want this instead :


----------



## ir_cow (Nov 16, 2021)

@RandallFlagg make sure to get a MB that supports 6400! I'm hoping TeamGroup sends this kit to me for a proper review.


----------



## RandallFlagg (Nov 17, 2021)

So this is the 2nd time I've seen a gaming + streaming benchmark where the 12900K was able to flex a bit.  The other one was vs the 5950X and the results were similar, almost twice as fast.   Now we see why; looks like DDR5 should be the go-to choice for streamers and such:






Source :   https://occlub.ru/testodrom/80697-hp-v10-rgb-ddr4-3600-2x-8-gb-obzor-chto-ona-ispolnjaet/4


----------



## fevgatos (Nov 17, 2021)

RandallFlagg said:


> Well, in searching for truly good / fast DDR4 memory, one thing I learned.
> 
> It's not cheaper than DDR5, at least not for 32GB / 2x16GB sticks and comparing to DDR5 MSRP.  32GB DDR4-4000+ with CL18 and below start around $270.
> 
> ...


Well, a really good DDR4 kit can cost upwards of 500€ for 32GB. That is... a lot of money. The thing with DDR4 is, because of its maturity, you can get away with a cheaper kit and probably clock it to the same frequency and latency as the more expensive ones. My 4000C17 can be clocked to 4500C16, and that's old RAM with the old die that doesn't really clock that well. Nowadays all RAM kits use the new die and are pretty good at OCing.

Now, with that said, I still went for DDR5 + an Apex. 1-2 years down the line I plan to upgrade my DDR5 to something insane; being stuck with a DDR4 mobo would suck.


----------



## Vayra86 (Nov 17, 2021)

The King said:


> E-Cores only Gaming test.
> 
> 
> Spoiler: The E-Cores were not meant to be used This Way... but it's Amazing!


He might say 'there is barely any load on the P-cores', but I'm seeing a constant 22% load on one while running his gaming load. And it's not the only one either. This is not unlike what happens now: the fat gaming thread takes a full core. Not convinced here that there is an advantage or something 'amazing', just games that work better on threading due to the APIs.

And you're not telling me 22% on a P-core is to run 'background tasks' while running a bench.

He's also seeing a 66W package... which is not any kind of step forward for gaming on Intel. They used to deliver packages that maxed at 65W~77W that gamed perfectly, for over a decade now. So all we know is that a big bunch of E-cores will still need P-core logic to function as they do. It's really an accelerator, more than a real core. Smart, but the advantage is still misty.


----------



## lexluthermiester (Nov 17, 2021)

RandallFlagg said:


> So this is the 2nd time I saw a gaming + streaming benchmark where 12900K was able to flex a bit.  The other one was vs 5950X and the results were similar, almost twice as fast.   Now we see why, looks like DDR5 should be the go-to choice for streamers and such :
> 
> 
> View attachment 225478
> ...


Let's not over-react. That is one result out of all the others which show DDR4 has the advantage. Additionally, there might be a factor involved in the result that needs fleshing out.

It is very unwise to make a buying decision based on a single result that has not been seen by anyone else on the net.


----------



## RandallFlagg (Nov 17, 2021)

lexluthermiester said:


> Let's not over-react. That is one result out of all the others which show DDR4 has the advantage. Additionally, there might be a factor involved in the result that needs fleshing out.
> 
> It is very unwise to make a buying decision based on a single result that has not been seen by anyone else on the net.



Not the first time I saw that; there just aren't many benchmarks of gaming + streaming, as it's not part of the cookie-cutter tests most sites do.  It seems that for gaming + streaming, DDR5 just takes over.  The higher-end DDR5 isn't slower than DDR4 either, just expensive (though higher-end DDR4 is also expensive - not talking a ton of $$ difference).

Edit: should probably also throw in, like @ir_cow pointed out - the motherboard to support these speeds is also quite expensive.

ofc, high-end DDR5 is unobtainium right now, but I doubt it will remain that way.


----------



## lexluthermiester (Nov 17, 2021)

RandallFlagg said:


> Not the first time I saw that, there just aren't many benchmarks of gaming + streaming, not part of the cookie cutter tests most sites do.  It seems that for gaming + streaming, DDR5 just takes over.  The higher end DDR5 isn't slower than DDR4 either, just expensive (though, higher end DDR4 is also expensive - not talking a ton of $$ difference).
> 
> Edit: should probably also throw in like @ir_cow  pointed out - the motherboard to support these speeds is also quite expensive.
> 
> ...


For the time being, tight-timing DDR4 outperforms DDR5 in 99% of use-case scenarios. That will of course change, but not for another 18 to 24 months. Until then, DDR4 with tight timings is the smart choice.


----------



## fevgatos (Nov 17, 2021)

lexluthermiester said:


> For the time being, tight-timing DDR4 outperforms DDR5 in 99% of use-case scenarios. That will of course change, but not for another 18 to 24 months. Until then, DDR4 with tight timings is the smart choice.


No. Just no.


----------



## RandallFlagg (Nov 17, 2021)

lexluthermiester said:


> For the time being, tight-timing DDR4 outperforms DDR5 in 99% of use-case scenarios. That will of course change, but not for another 18 to 24 months. Until then, DDR4 with tight timings is the smart choice.



The early DDR5 reviews were done mostly with DDR5-4800 and DDR5-5200, and I also bought into that line for a bit, but the data in those early takes was incomplete.   In particular, this line of thought comes from places like AnandTech, which I've repeatedly noted does not reflect what an enthusiast will actually use in either hardware or software - think using DDR4-2933 C20 on an Apex motherboard.  It's not even reflective of what an OEM system will perform like.  It then gets regurgitated everywhere.

So you might want to check TPU's DDR5-6000 vs DDR4 article, or the video I linked to.  Higher-speed (6000+) XMP DDR5 is outperforming the highest-end DDR4.   And (edit) lower-end DDR5 is being tuned to get results unobtainable on DDR4 rigs even when tuned.
None of that means that DDR4 isn't the best *deal*, but high-end DDR5 is demonstrably outperforming high-end DDR4 now.


----------



## lexluthermiester (Nov 17, 2021)

RandallFlagg said:


> The early DDR5 reviews were done mostly with DDR5-4800 and DDR5-5200


I'm talking about DDR4 3600 with low timings.


RandallFlagg said:


> and I also bought into that line for a bit, but the data on those early takes was incomplete.


Sorry, the following was not incomplete;








DDR4 vs. DDR5 on Intel Core i9-12900K Alder Lake Review - www.techpowerup.com

The Intel Alder Lake platform has support for both DDR5 and DDR4 memory. We ran 38 application benchmarks and 10 games at multiple DDR4 configurations to learn what performance to expect when using DDR4 vs. DDR5 on 12th Gen, and whether there's a point at which DDR4 performance can beat the much...
				



This testing showed very clearly that currently, DDR4 has the performance advantage.


RandallFlagg said:


> but high end DDR5 is demonstrably outperforming high end DDR4 now.


Sorry, that is just not correct.


----------



## RandallFlagg (Nov 17, 2021)

lexluthermiester said:


> I'm talking about DDR4 3600 with low timings.
> 
> Sorry, the following was not incomplete;
> 
> ...



Are we looking at the same review?  There's actually no major sub-category where DDR4 wins vs a DDR5-6000 kit on TPU's review.


----------



## lexluthermiester (Nov 17, 2021)

RandallFlagg said:


> Are we looking at the same review?  There's actually no major sub-category where DDR4 wins vs a DDR5-6000 kit on TPU's review.
> 
> View attachment 225529
> 
> View attachment 225530


It would seem we are interpreting the results differently.


----------



## oxrufiioxo (Nov 17, 2021)

So an $800 kit of RAM beats a $hit 3600 kit by 1% in gaming. A win's a win, I guess......


----------



## RandallFlagg (Nov 17, 2021)

IDK, I just found out that according to CPU-Z I own all RAM.

DDR4-20500 C18 FTW, baby.  26 GHz ring too.


----------



## SuperMumrik (Nov 17, 2021)

Tuned DDR4 4000C15 Gear 1 beats tuned Micron DDR5 ICs in gaming, but I highly doubt that you can beat tuned 6400C28 with any D4 config.


----------



## lexluthermiester (Nov 17, 2021)

SuperMumrik said:


> but I highly doubt that you can beat tuned 6400c28 with any d4 config.


Yeah try buying some of that. Not gonna happen for a year or more.


----------



## SuperMumrik (Nov 17, 2021)

lexluthermiester said:


> Yeah try buying some of that. Not gonna happen for a year or more.


You can tweak random Hynix 4800 MHz ICs there with some luck in the binning. But ofc those are a nightmare to get atm.


----------



## Psychoholic (Nov 17, 2021)

For anyone wondering... these are the Corsair mounting screws for LGA1200 vs LGA1700.
LGA1200 on the left.


----------



## phanbuey (Nov 18, 2021)

SuperMumrik said:


> Tuned ddr4 4000c15 gear 1 beats tuned micron ddr5 ic's in gaming, but I highly doubt that you can beat tuned 6400c28 with any d4 config.


1. Great avatar.  and 2. totally agree


----------



## RandallFlagg (Nov 18, 2021)

More news / leaks on the 12400 :









Intel Core i5-12400 6 Core CPU Matches The AMD Ryzen 5 5600X In PugetBench Benchmarks, Little i5 To Destroy Top Ryzen 5 In Perf Per $ - wccftech.com

Benchmarks of Intel Core i5-12400 Alder Lake CPU have leaked out and show great performance per dollar against the AMD Ryzen 5 5600X.


----------



## tabascosauz (Nov 18, 2021)

Psychoholic said:


> for anyone wondering..  these are the corsair mounting screws for LGA1200 vs LGA1700
> LGA1200 on the Left.



Which is why I was going to say that Asus including the LGA1200 holes may not work perfectly; there's also a z-height difference between the LGA1200 and LGA1700 packages that a lot of Asus owners seem to have glossed over.

Noctua's page mentions this as well:


----------



## RandallFlagg (Nov 18, 2021)

Alder Lake-P/H, which is probably 35/45W.  So far this appears to be an absolute beast for mobile.

I may just go back to a laptop once these release, since desktop GPUs cost as much as a laptop with a GPU.  These scores put the mobile 12700H in the same multi-core performance area as a 3900X and 12600K, and above any Zen 3 in single core (slightly below the ADL desktop K series); it squashes the 5980HX, which is AMD's current top-of-the-line Zen 3 mobile part.









Intel Core i7-12700H outperforms Ryzen 9 5900HX by 47% in leaked Cinebench benchmarks - videocardz.com

Intel Core i7-12700H in Cinebench: Notebookcheck gained access to information on the unreleased Intel Alder Lake-P laptop CPU. The website shares Cinebench R20 and R23 scores featuring the unreleased Core i7-12700H Alder Lake-P CPU. This particular processor is equipped with 14 cores and 20 threads...


----------



## The red spirit (Nov 18, 2021)

RandallFlagg said:


> More news / leaks on the 12400 :
> 
> 
> 
> ...


A bit of a shame that it doesn't have any E-cores at all and that it has a base clock of just 2.5 GHz. At least it has a respectable boost. BTW, so far it is slower than the 5600X, which is also worrying. Anyway, does not having E-cores mean that Denuvo works?


----------



## lexluthermiester (Nov 18, 2021)

Who cares if Denuvo works? That crap needs to die, like all DRM...


----------



## phanbuey (Nov 20, 2021)

Final 24/7 settings on the 12600K -  settled in at ~5.3 ghz
























Ok time to stop benching and play some games.


----------



## Outback Bronze (Nov 20, 2021)

phanbuey said:


> Final 24/7 settings on the 12600K - settled in at ~5.3 ghz



What's your temps like?


----------



## phanbuey (Nov 20, 2021)

Outback Bronze said:


> What's your temps like?



80 °C max on cores and 75 °C on the package/ring during Time Spy / heavy load, 60-63 °C during gaming. It touches 90 °C during Cinebench R23 after about 5-6 minutes; I have the mobo throttle it at 90.


----------



## gerardfraser (Nov 20, 2021)

I also have settled on a daily setting on the 12900K.
5400 MHz P-Core
4000 MHz E-Core
4100 MHz Ring

Forza Horizon 5 12900K 5400Mhz 4K HDR PC Gameplay CPU Voltage,Temperature OSD Shown​


----------



## birdie (Nov 20, 2021)

Some interesting findings from a German forum.

_"Setting the TDP to 175-200 W __appears__ to be the sweet spot, losing just under 5% performance for an efficiency gain of ~30%. At 175 W, the Core i9-12900K offers 95% of stock performance while drawing 75 W less power, gaining an efficiency advantage of 35%. The temperature also drops by 20 degrees, allowing for prolonged and more consistent boosts."_
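Those percentages are easy to sanity-check. A quick sketch, using only numbers implied by the quote (stock draw of 250 W, since 175 W is described as "75 W less"; these are assumptions from the quote, not measurements):

```python
# Sanity check of the quoted efficiency numbers. Assumed inputs (taken
# from the quote above, not measured here): stock draw 250 W
# (175 W + 75 W), and 95% of stock performance at the 175 W limit.
stock_power = 250.0    # W
limited_power = 175.0  # W
rel_perf = 0.95        # fraction of stock performance retained

# Efficiency = performance per watt, normalized so stock = 1.0
gain = (rel_perf / limited_power) / (1.0 / stock_power) - 1.0
print(f"efficiency gain: {gain:.0%}")  # ~36%, close to the quoted ~35%
```

So the quoted ~35% figure is internally consistent with the 95%-at-175 W claim.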









The 12900K can actually be forced to use at most 25 W or even ... 2 W of power. How is that even possible if it idles at around 6 W?


----------



## phanbuey (Nov 21, 2021)

birdie said:


> Some Interesting findings from a german forum.
> 
> _"Setting the TDP to 175-200 W __appears__ to be the sweet spot, losing just under 5% performance for an efficiency gain of ~30%. At 175 W, the Core i9-12900K offers 95% of stock performance while drawing 75 W less power, gaining an efficiency advantage of 35%. The temperature also drops by 20 degrees, allowing for prolonged and more consistent boosts."_
> 
> ...


That's crazy; CPU-Z is not even an intense stress test. Hitting 100 °C on that means you instathrottle in Cinebench/Prime.




Guys, PSA: if you're running 4 DIMMs on Alder Lake, the turnaround timings have a MASSIVE performance impact in Gear 1.





Going to play a bit more, but this is with tRDRD_SG/DG/DR/DD @ 7/4/77 -- 16,445 in Time Spy, up from ~15,900-16,000.
SOTTR went from 281 to 291 avg, with higher minimums.


----------



## oxrufiioxo (Nov 21, 2021)

phanbuey said:


> That's crazy, CPU Z is not even an intense stress test.  to hit 100C on that means you instathrottle on Cinebench/Prime.
> 
> 
> 
> ...



I think I've seen the 12900K over 360 fps in the SOTTR benchmark; the 5950X seems to cap out around 350 fps. You definitely need really good dual-rank RAM with tight timings, though.


----------



## phanbuey (Nov 21, 2021)

oxrufiioxo said:


> I think I've seen the 12900K over 360 fps in the SOTTR benchmark; the 5950X seems to cap out around 350 fps. You definitely need really good dual-rank RAM with tight timings, though.



Yeah, the 12900K has the extra cache and 2 more cores that SOTTR loves. It does go up to 400 in some areas fine, so I think the cap is hardware-bound. The difference between builds is huge (~+70 fps avg with the Steam update to v453.0_64 :/), so I just use it to compare against my own scores to see whether my tinkering is having a positive or negative effect, lol.


----------



## RandallFlagg (Nov 21, 2021)

Don't forget tWRWR_dd, WRRD_dd, RDWR_dd and so on: the timings for reads/writes to the other DIMM.

Are there any new settings with DDR5?


----------



## phanbuey (Nov 22, 2021)

RandallFlagg said:


> Don't forget tWRWR_dd , WRRD_dd, RDWR_dd and so on - timings for read/write to the other DIMM.
> 
> Are there any new settings with DDR5?


Just tweaked those last night... they do make a tiny difference, but not like the read turnarounds. Still playing with the read-to-write and write-to-read turnaround delays; at this point I feel like I've hit the point of diminishing returns.

EDIT: those timings together are tricky af. If they are out of sync, some really weird stuff starts happening in the frame-pacing department. I think I found a combo that works (tested in 5 games), but man... the FPS gain per hour of tinkering definitely took a dive with the full set of turnarounds vs just the read-to-read and write-to-write.


----------



## GerKNG (Nov 23, 2021)

Assassin's Creed Valhalla is now fixed (works with E-cores enabled, just like it's supposed to).


----------



## phanbuey (Nov 26, 2021)

Best CPUs of 2021 (Gaming, Workstation, Budget, & Disappointment) - YouTube

I agree, except for the 11900K bit. From a consumer point of view, yes, 11th gen was atrocious, but I think Ian had it right that this was really an internal exercise in adapting designs to different nodes / backporting more than anything else.

Rocket Lake is a Success for Intel  - YouTube


----------



## RandallFlagg (Nov 26, 2021)

phanbuey said:


> Best CPUs of 2021 (Gaming, Workstation, Budget, & Disappointment) - YouTube
> 
> I agree, except for the 11900K bit. From a consumer point of view, yes, 11th gen was atrocious, but I think Ian had it right that this was really an internal exercise in adapting designs to different nodes / backporting more than anything else.
> 
> Rocket Lake is a Success for Intel  - YouTube



Yeah, he was too harsh on RKL. From a gaming perspective, sure, there's nothing there vs Comet Lake; but you could say the same of Zen 3 vs Comet Lake in gaming.

However, they are significantly faster in single- and lightly-threaded apps, particularly web, which means a snappier feel for the end user. This is why, for example, RKL was >10% faster than its nearest competitor at converting files in iTunes, which is something a lot of people actually do, as opposed to running Blender. It's also faster in many games that are either lightly threaded or bound to a single limiting thread, and that describes most games in existence, new and old.


----------



## gerardfraser (Nov 26, 2021)

Some 12900K some cheap MSI board
CPU 1.16v LLC Mode 2 + Adaptive voltage+offset
P-Core 5100
E-Core 40
Ring auto


----------



## phanbuey (Nov 26, 2021)

gerardfraser said:


> Some 12900K some cheap MSI board
> CPU 1.16v LLC Mode 2 + Adaptive voltage+offset
> P-Core 5100
> E-Core 40
> Ring auto


Cheap MSI boards are amazing on Intel.


----------



## Zubasa (Nov 30, 2021)

phanbuey said:


> Cheap MSI boards are amazing on Intel.


Especially their DDR4 boards; they are at the top of the game right now in terms of DDR4 support on 12th gen.


----------



## Deleted member 24505 (Dec 4, 2021)

Here are my stock scores in R23 and Geekbench. I have added my temps too, so you can see the temp with the CPU at 100%. I notice it's at 3.6 GHz; is that the max stock on a 12700K?


----------



## Outback Bronze (Dec 4, 2021)

Tigger said:


> I notice its at 3.6ghz, is that the max stock on a 12700k?



You mean 4.6?

Pretty sure mine will boost to 4.7 GHz stock all-core.


----------



## Psychoholic (Dec 4, 2021)

Tigger said:


> Here are my stock scores on R23 and geekbench. I have added my temps too, you can see temp with CPU at 100% I notice its at 3.6ghz, is that the max stock on a 12700k?



If you're referring to the 3.6 GHz in Cinebench, it is reporting your base clock.


----------



## RandallFlagg (Dec 4, 2021)

New leak:


----------



## Selaya (Dec 4, 2021)

Iirc all B660s are DDR4 (only Z690 has DDR5 versions), which makes sense: if you're going for a B660, you're unlikely to drop $300 on a pair of DDR5 sticks or high-end B-die, and will instead settle for the cheapest 3600-C16 or something.


----------



## Deleted member 202104 (Dec 4, 2021)

Might be a bit off.  The PRO Z690-A is the same price


----------



## RandallFlagg (Dec 4, 2021)

weekendgeek said:


> Might be a bit off.  The PRO Z690-A is the same price
> 
> View attachment 227556



Well it is Twitter, and they don't usually have that many mATX boards.


----------



## Zubasa (Dec 4, 2021)

RandallFlagg said:


> weekendgeek said:
> 
> 
> > Might be a bit off.  The PRO Z690-A is the same price
> ...


It is possible for the lower-end chipset to be at almost the same price.
B550 is an example: there are plenty of boards that cost more than the cheapest X570 boards, even with better components.
The cost of the PCH is a rather small part of the retail price.
B560 allows RAM OC and MCE, and some boards allow adjusting the PL and turbo within the locked ratios.
The higher-end B660 boards had better have VRMs good enough to support a 12900 non-K under MCE.


----------



## lexluthermiester (Dec 4, 2021)

Tigger said:


> is that the max stock on a 12700k?


You may need to adjust your UEFI settings to raise the power levels.



Outback Bronze said:


> You mean 4.6?


Nope, look at the screenshot. It shows 3.61 GHz. Of course, Cinebench also says he's on Windows 10... I think Cinebench might be screwy.


----------



## Outback Bronze (Dec 4, 2021)

lexluthermiester said:


> look at the screenshot



His 100% in HWM says 4,579.2 MHz though, no? This is what I was going by, not CB.


----------



## Deleted member 24505 (Dec 4, 2021)

Outback Bronze said:


> His 100% in HWM says 4,579.2 MHz though, no? This is what I was going by, not CB.



thx, i never noticed that.


----------



## Outback Bronze (Dec 4, 2021)

Tigger said:


> thx, i never noticed that.



Looks like nobody did matey : )


----------



## Deleted member 24505 (Dec 4, 2021)

Outback Bronze said:


> Looks like nobody did matey : )



The temp wasn't too bad for 100% either; probably got a bit of headroom left too.


----------



## Outback Bronze (Dec 4, 2021)

Tigger said:


> The temp wasn't too bad for 100% either; probably got a bit of headroom left too.



Saw that. Yes good temps to push further.


----------



## Deleted member 24505 (Dec 5, 2021)

https://valid.x86.fr/au1wdb


----------



## Psychoholic (Dec 5, 2021)

Nice, our single thread scores are pretty much identical!  
Defaults with a slight undervolt to run cool on air.









						Intel Core i9 12900K @ 4900 MHz - CPU-Z VALIDATOR
					

[tjhavj] Validated Dump by BEAST (2021-12-05 04:07:36) - MB: Asus ROG STRIX Z690-A GAMING WIFI D4 - RAM: 32768 MB




					valid.x86.fr


----------



## birdie (Dec 8, 2021)

Intel has shortened the list of games with DRM/Denuvo that malfunctioned under Windows 10 and 11 in combination with the #AlderLake processors. Only the following are still affected:

Assassin’s Creed: Valhalla
Fernbus Simulator
Madden 22
Source: List of Games Affected by DRM Issue in 12th Gen Intel Core Processors for Windows 10 and 11.


----------



## RandallFlagg (Dec 8, 2021)

birdie said:


> Intel has shortened the list of games with DRM/Denuvo that malfunctioned under Windows 10 and 11 in combination with the #AlderLake processors. Only the following are still affected:
> 
> Assassin’s Creed: Valhalla
> Fernbus Simulator
> ...



Noticed I got a couple of Windows updates yesterday (I'm on Win 11), going to guess that was the fix.


----------



## vMax65 (Dec 9, 2021)

Tigger said:


> Here are my stock scores on R23 and geekbench. I have added my temps too, you can see temp with CPU at 100% I notice its at 3.6ghz, is that the max stock on a 12700k?
> View attachment 227549


The score looks a bit on the low side, and the P-cores should be running at 4.7 GHz. Okay, okay, I was reading too fast; I see it is all sorted now!


----------



## Deleted member 24505 (Dec 9, 2021)

vMax65 said:


> The score looks a bit on the lower side and the P-Cores should be running at 4.7GHz. Okay, okay was reading too fast and noticed it is all sorted now!



That's completely stock. i don't think it's too bad.


----------



## vMax65 (Dec 9, 2021)

Tigger said:


> That's completely stock. i don't think it's too bad.


Yep, I missed part of the responses, as you mentioned core clocks in the 3.7 GHz range, but that was cleared up in a later post. I have pretty much the exact same setup, with the 12700K and the Strix-A D4, which for my first ASUS motherboard is not bad at all. Hope you took the cashback offer from ASUS, which made it better value... Stock I am getting just over 23,178 and overclocked a fraction over 24,079, so you are absolutely in the same ballpark, as I have DDR4-3600 vs your DDR4-3200.

PS with the EK setup, what temps do you get for CinebenchR23 at stock if you don't mind me asking? Was thinking of going for a custom loop rather than the AIO at some point?


----------



## Deleted member 24505 (Dec 9, 2021)

vMax65 said:


> Yep, I did miss a part of the responses as you mentioned core clocks in the 3.7GHz but that was cleared up in the later post. I have pretty much the exact same setup with the 12700K and the Strix-A D4 which for my first ASUS motherboard is not bad at all. Hope you took the cash back offer from ASUS which made it better value... Stock I am getting just over 23,178 and overclocked a fraction over 24,079 so you are absolutely in the same ballpark as I have DDR4 3600 vs DDR4 3200.
> 
> PS with the EK setup, what temps do you get for CinebenchR23 at stock if you don't mind me asking? Was thinking of going for a custom loop rather than the AIO at some point?



Look at the max temp in the SS. I have since added an EK 360 PE rad to the EK 280 CE. Temps are pretty good tbh-
idle




Gaming-


----------



## Caring1 (Dec 10, 2021)

birdie said:


> Some worrisome information:
> 
> 
> 
> ...


Already posted in this thread:








						Intel LGA 1700 socket problem??
					

Found this, its about the LGA 1700 CPU/socket bending causing bad temps. Pretty interesting. Glad i'm using the EK block with the pretty hefty EK LGA 1700 plate.  https://www.igorslab.de/en/bad-cooling-at-alder-lake-problems-at-socket-lga-1700-on-the-lane-among-all-remedies/




					www.techpowerup.com


----------



## RandallFlagg (Dec 10, 2021)

birdie said:


> Some worrisome information:
> 
> 
> 
> ...



Read through that and some of the comments.

Seems like two problems: one is that some of the CPUs are not flat on top; the other has to do with damage to the socket (a soft socket).

I'm wondering if the LGA 1200 coolers + adapter affect the second item too. Supposedly some of them are pretty tight, as the ADL chips are taller. That, plus a convex heat spreader (even taller), would I imagine lead to a problem with any weakness in the socket.

i.e. :


----------



## Zubasa (Dec 11, 2021)

RandallFlagg said:


> Read through that and some of the comments.
> 
> Seems like two problems, one is that some of the CPUs are not flat on the top.  Other has to do with damaging the socket, soft socket.
> 
> ...





I am pretty sure it is the other way around: the overall Z-height is lower on LGA1700, meaning there is less material to resist bending.


----------



## Deleted member 24505 (Dec 11, 2021)

Could a less-than-1 mm decrease in Z-height make so much difference? Here are pics of the paste from when I took my EK block off when I changed cases. Maybe the thick EK backplate is working.


----------



## vMax65 (Dec 11, 2021)

Tigger said:


> Look at max temp on the SS. i have since added a EK 360 PE rad to the EK 280 CE. temps are pretty good tbh-
> idle
> View attachment 228293
> Gaming-
> View attachment 228294


Thanks your idle and gaming temps are very good...been a long time since I did a custom loop but I am going to jump back in! Thanks again.


----------



## TheoneandonlyMrK (Dec 11, 2021)

Tigger said:


> Look at max temp on the SS. i have since added a EK 360 PE rad to the EK 280 CE. temps are pretty good tbh-
> idle
> View attachment 228293
> Gaming-
> View attachment 228294


Wouldn't actually running Cinebench for ten minutes and showing those temps tell him and us more about the efficacy of your cooling? I'm not sure idle temps are useful, personally.

Just intrigued by what custom cooling can do for, say, crunching loads on those chips.


----------



## Deleted member 24505 (Dec 11, 2021)

Shows max temps, just ran R23


----------



## gerardfraser (Dec 14, 2021)

Cinebench R20 run.
Adaptive voltage + offset at -165
LLC mode 2
AC load 25
DC load 100
51 P-core
40 E-core
Load voltage 1.12v @ 66°C


----------



## Deleted member 24505 (Dec 20, 2021)

Cashback approved


----------



## 1100R (Dec 20, 2021)

Tigger said:


> Cashback approved
> View attachment 229553


Here's mine


----------



## lexluthermiester (Dec 20, 2021)

Folks, Professor Barnatt just did a video that explains a lot that many have found confusing about the Pcore/Ecore dynamic. Learned a few things I didn't know and as some of you know I keep up on specs and whatnot. Worth a watch.


----------



## GerKNG (Dec 20, 2021)

birdie said:


> Intel has shortened the list of games with DRM/Denuvo that malfunctioned under Windows 10 and 11 in combination with the #AlderLake processors. Only the following are still affected:
> 
> Assassin’s Creed: Valhalla
> Fernbus Simulator
> ...


Valhalla has long been fixed (several weeks ago).


----------



## Deleted member 24505 (Dec 20, 2021)

Well, it seems Intel's adoption of big.LITTLE might not be the big joke it was made out to be. Won't be long till AMD announces their take on it.


----------



## 1100R (Dec 20, 2021)

I ordered this backplate to take advantage of  my Bykski CPU waterblock. I hope it arrives soon.


----------



## AlwaysHope (Dec 21, 2021)

Tigger said:


> Well seems Intels adoption of Big.Little might not be the big joke it has garnered. Won't be long till AMD announce their take on it.


Chipzilla has always led the industry with innovations; big.LITTLE is no exception.


----------



## Deleted member 24505 (Dec 21, 2021)

AlwaysHope said:


> Chipzilla has always led the industry with innovations, big.LITTLE is no exception.



big.LITTLE is a good idea; phones have been using it for ages, and no one using a phone whines about it. Intel using it is a good idea: properly implemented, it is a good energy-saving system. There's no point using a 500 hp truck to go to the shops, is there? The little Fiesta is great for that, so I get it. Maybe some don't.
AMD will implement it at some point, no doubt.


----------



## ratirt (Dec 21, 2021)

Tigger said:


> big.LITTLE is a good idea; phones have been using it for ages, and no one using a phone whines about it. Intel using it is a good idea: properly implemented, it is a good energy-saving system. There's no point using a 500 hp truck to go to the shops, is there? The little Fiesta is great for that, so I get it. Maybe some don't.
> AMD will implement it at some point, no doubt.


I thought phones use only little cores, with no big. None of the cores are x86 architecture; they're ARM. Phones are also different devices than a computer chip. Just because phones use smaller chips doesn't mean computers have to as well, just because nobody is complaining. This approach is there to reduce power only; that is the only logical reason, and a marketing one.
Phones need way less processing power, and the difference in cores in a phone comes down to money saving, with higher clocks being faster.


----------



## Deleted member 24505 (Dec 21, 2021)

ratirt said:


> I though phones use only little with no big. None of the cores is x86 architecture but ARM. Phones are also different devices than a computer chip. Just because phones are using smaller chips doesnt mean the computers has to as well because nobody is complaining. This approach is to reduce power only. That is the only logical reason and marketing reason.
> They need way less processing power and the difference in cores in the phone is due to money saving and clock higher which is faster.



The way phones use it is similar to PC use: a big chip/section for high-power apps, a small chip/section for low-power apps. The intention is to create a multi-core processor that can adjust better to dynamic computing needs and use less power than clock scaling alone, which is exactly what Intel's big.LITTLE is doing.
*ARM big.LITTLE* is a heterogeneous computing architecture developed by ARM Holdings, coupling relatively battery-saving and slower processor cores (_LITTLE_) with relatively more powerful and power-hungry ones (_big_). Typically, only one "side" or the other will be active at once, but all cores have access to the same memory regions, so workloads can be swapped between big and LITTLE cores on the fly.
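In practice the OS scheduler (Thread Director on ADL) decides where threads land, but you can also steer things by hand with CPU affinity. A minimal sketch, assuming a hypothetical 12600K-style enumeration where the 12 hyper-threaded P-core threads appear as logical CPUs 0-11 and the 4 E-cores as CPUs 12-15 (the actual numbering can differ per board/OS):

```python
# Sketch: building a CPU-affinity mask to confine a background task to
# E-cores only. The core layout below is an assumption for a 12600K-like
# chip, not something reported in this thread.

P_THREADS = 12  # 6 P-cores x 2 threads (SMT)
E_CORES = 4     # E-cores have no SMT: 1 thread each

def ecore_mask(p_threads: int, e_cores: int) -> int:
    """Bitmask selecting only the E-core logical CPUs."""
    mask = 0
    for cpu in range(p_threads, p_threads + e_cores):
        mask |= 1 << cpu
    return mask

mask = ecore_mask(P_THREADS, E_CORES)
print(hex(mask))  # 0xf000 -> logical CPUs 12-15
# On Windows this value could be passed to SetProcessAffinityMask();
# on Linux, the equivalent is os.sched_setaffinity(pid, {12, 13, 14, 15}).
```

The point of the heterogeneous design is that the scheduler does this routing automatically, per thread, based on load.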


----------



## ratirt (Dec 21, 2021)

Tigger said:


> the way phones use it is similar to PC use. big chip/section for high power apps, Small chip/section for low power apps. The intention is to create a multi-core processor that can adjust better to dynamic computing needs and use less power than clock scaling alone. which is exactly what Intels Big.Little is doing.
> *ARM big.LITTLE* is a heterogeneous computing architecture developed by ARM Holdings, coupling relatively battery-saving and slower processor cores (_LITTLE_) with relatively more powerful and power-hungry ones (_big_). Typically, only one "side" or the other will be active at once, but all cores have access to the same memory regions, so workloads can be swapped between Big and Little cores on the fly.


Yes, I agree, but why do you think this is a good approach for a PC? Just because phones, as battery-powered devices that need this solution, are using it? No matter how you slice it, it all comes down to balancing power consumption with performance, and that is why Intel used this approach (not to mention raising the core count to tackle AMD's products). It still doesn't explain any other benefit this approach gives the PC market. The way I see it, using little cores is better for Intel and AMD because they don't need to advance their technology that much: they just use slower, less power-hungry cores, show some sort of improvement, and still advertise the CPU as 16-core even though half of them are small cores. Don't you see there is some sort of marketing scheme here? Now, you say it would be nice to have this approach in a PC. I understand it for the smartphone market, but a PC?
When you say high-power apps, you mean more demanding apps, which will use the processor's high-performance cores all the time. Those cores can run 'low-power apps' (as you would put it) as well, and faster than any smaller core. The fact is that it's not necessary, so smaller cores are fast enough to keep things going: they use less power, and since they are obviously smaller than the big cores, you can pack in more of them. Either way, I don't think this approach is a good idea for a PC just because phones are using it and nobody complains.


----------



## lexluthermiester (Dec 21, 2021)

Tigger said:


> Well seems Intels adoption of Big.Little might not be the big joke it has garnered.


Some people thought it was a gimmick, but they delivered.


----------



## Deleted member 24505 (Dec 21, 2021)

ratirt said:


> Yes I agree but why do you think this is a good approach for a PC? Just because phones are using it and as a battery powered devices need this solution? No matter how you slice it, it all comes down to power consumption balance with performance and that is why Intel used this approach. Not to mention, change the core count to tackle AMD's products. It still doesn't explain any other benefit that would this approach give to a PC market. How I see it, It is better to use little core for Intel and AMD because they don't need to advance their technology that much. They just use slower, less power hungry cores show some sort of improvement and still advertise a CPU as 16c despite half of them is small cores. Don't you see that this is some sort of marketing scheme here? Now you say it would have been nice to have this approach in a PC. I understand the smartphone market but PC?
> When you say high power apps? Meaning more demanding apps which will use the processors high performance cores all the time. These cores can do 'lower power apps' (you would say it that way) as well and faster than any smaller core. The fact is it is not necessary and thus smaller cores are fast enough to keep things going. Use less power and have more cores since the smaller cores are smaller than bigger obviously and you can pack more. Either way, I dont think this approach is a good idea for a PC, just because phones are using it and nobody complains.



Well, don't people whine when a CPU uses "too" much power? Imo big.LITTLE is a good way to change that. There's no point using high-power cores for small tasks.


----------



## ratirt (Dec 21, 2021)

Tigger said:


> Well don't people whine when a CPU uses "too" much power. Imo big.little is a good way to change it. No point using high power cores for small tasks.


Yes, but it doesn't matter. You have to use shortcuts to bring it lower, don't you?
There is a point here: big cores can do both, but the small cores are there for small tasks, which would have been inefficient on the big ones.


----------



## hat (Dec 21, 2021)

Tigger said:


> Well don't people whine when a CPU uses "too" much power. Imo big.little is a good way to change it. No point using high power cores for small tasks.


And yet ADL still sucks down hundreds of watts. Not that I'm complaining about that. I just don't think this design has much place in the desktop PC space. Maybe laptops, but I thought all the existing tech we had already worked well enough. Now we have funky, weird silicon that requires a bunch of work on the software side to get it to work correctly. Or maybe I'm just misunderstanding something?


----------



## GerKNG (Dec 21, 2021)

Complaints about pulling more power are only a thing for people from the other "tribe", so they have something left after losing on everything else.


----------



## Deleted member 24505 (Dec 21, 2021)

As far as big.LITTLE is concerned, like it or not, all CPUs will be this way soon.


----------



## lexluthermiester (Dec 21, 2021)

hat said:


> And yet ADL still sucks down hundreds of watts.


Only under max load. Most loads, even heavy gaming, do not force Alder Lake to max power draw. W1zzard's testing clearly showed this, and his testing is echoed by many other sites' reviewers as well. It is a point we really shouldn't be harping on.



Tigger said:


> As far as Big.little is concerned, like it or not, all CPU's will be this way soon.


Not all. For example, the i5-12400 has no E-cores at all. Not everything needs that balance of power/economy.


----------



## Deleted member 24505 (Dec 21, 2021)

lexluthermiester said:


> Only under max load. Most loads, even heavy gaming, do not force AlderLake to max power draw. W1zzard's testing clearly showed this and his testing is echoed by many other sites reviewers as well. It is a point we really shouldn't be harping on..
> 
> 
> Not all. For example, the i5-12400 has no Ecores at all. Not everything needs that balance of power/economy.



Maybe they will just use it for the higher-end ones. Is there gonna be a 4P 2E chip for 10 cores, I wonder?


----------



## phanbuey (Dec 21, 2021)

hat said:


> And yet ADL still sucks down hundreds of watts. Not that I'm complaining about that. I just don't think this design has much place in the desktop PC space. Maybe laptops, but I thought all the existing tech we had already worked well enough. Now we have funky, weird silicon that requires a bunch of work on the software side to get it to work correctly. Or maybe I'm just misunderstanding something?



You have silicon that can now run threads on different core types depending on <whatever>, and that enables a performance advantage even on worse-performing silicon.

Instead of having just inefficient cores and pumping 400 W through them to get close to Zen 3's multi-threaded performance, you can now pump a max of 241 W through your super-fat, power-hungry cores, smash single-core and all lightly-threaded/latency-sensitive workloads for the first 16 threads, and then match on MT performance even though your big-core design had no chance in hell of doing that by itself.

More importantly, you can now run threads on different types of cores... in the future even different ISAs (maybe ARM cores? maybe GPU/GPGPU instructions?) and route all of them in real time to the correct piece of silicon. Seems like it could be a big deal. On one hand you can squeeze out performance, and on the other you can add accelerators/different core types/architectures to tailor the chip to whatever it's supposed to be good at.

I'm inclined to agree with Tigger: all future chips will use this sooner or later.


----------



## Deleted member 24505 (Dec 22, 2021)

Imo, with a few BIOS updates and some scheduler tweaks from Microsoft, ADL could get a whole lot better. It seems to me the motherboard manufacturers and Microsoft are still working out the nitty-gritty. Laugh and call us BIOS testers if you like, but I really believe these chips will get better once they have.


----------



## vMax65 (Dec 24, 2021)

My current score in CB R23 and a HWiNFO screenshot... Still not happy, as vcore seems a tad high, though temps are no issue.


----------



## fevgatos (Dec 24, 2021)

hat said:


> And yet ADL still sucks down hundreds of watts. Not that I'm complaining about that. I just don't think this design has much place in the desktop PC space. Maybe laptops, but I thought all the existing tech we had already worked well enough. Now we have funky, weird silicon that requires a bunch of work on the software side to get it to work correctly. Or maybe I'm just misunderstanding something?


Yes, you are. ADL sucks down hundreds of watts running ALL-CORE WORKLOADS AT 5 GHz. Try doing that with a Ryzen CPU. You know what you'd need? Yeah, a bunch of LN2 canisters. So with that in mind, ADL is extremely efficient. Saying "it sucks down hundreds of watts" is just dumb; every CPU can suck down hundreds of watts if you push it, and the 12900K was pushed from the factory. If you don't like it, lower the limits; problem solved.


----------



## Deleted member 24505 (Dec 24, 2021)

fevgatos said:


> Yes, you are. ADL sucks down hundreds of watts running ALL-CORE WORKLOADS AT 5 GHz. Try doing that with a Ryzen CPU. You know what you'd need? Yeah, a bunch of LN2 canisters. So with that in mind, ADL is extremely efficient. Saying "it sucks down hundreds of watts" is just dumb; every CPU can suck down hundreds of watts if you push it, and the 12900K was pushed from the factory. If you don't like it, lower the limits; problem solved.



It's the same and only argument all Ryzen fans have against ADL. I'm over it; they can say what they like. My chip is fast and runs cool.


----------



## fevgatos (Dec 25, 2021)

Tigger said:


> It's the same and only argument of all ryzen fans against ADL, i'm over it, they can say what they like. My chip is fast and runs cool.


It would be absolutely fine to complain about the power consumption if it were actually power hungry. It is not; it's just pushed out of the factory, something you can fix in literally 5-10 seconds by lowering the power limit.
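For what it's worth, that quick fix doesn't even need a trip to the BIOS on Linux; the kernel's intel_rapl powercap interface exposes the limits in microwatts. A sketch only (the 175 W PL1 value echoes the "sweet spot" figure quoted earlier in the thread; the zone path can vary per system, and most people would just set this in the BIOS):

```shell
# Sketch: capping an ADL chip's long/short-term power limits via Linux's
# intel_rapl powercap interface (values in microwatts; needs root).
RAPL=/sys/class/powercap/intel-rapl:0
echo 175000000 | sudo tee "$RAPL/constraint_0_power_limit_uw"  # PL1 = 175 W
echo 241000000 | sudo tee "$RAPL/constraint_1_power_limit_uw"  # PL2 = 241 W
cat "$RAPL/constraint_0_power_limit_uw"                        # verify
```

The setting is not persistent across reboots, which is another reason the BIOS route is the usual one.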

Anyone with a brain would realize that being able to hit 5.3+ GHz all-core is an achievement on its own. Instead we have people complaining. Does anyone think for a second what you'd actually need to push a Ryzen CPU to those clock speeds in Blender? Subzero, dry ice and LN2 canisters.

Sadly I'm still stuck without DDR5 RAM; supposedly it's coming on the 29th, so I can do some testing on a U12A and see how far that can get me. I've had the CPU and mobo for a month now :/


----------



## Zubasa (Dec 25, 2021)

fevgatos said:


> Yes, you are. ADL sucks down hundreds of watts running ALL-CORE WORKLOADS AT 5 GHz. Try doing that with a Ryzen CPU. You know what you'd need? Yeah, a bunch of LN2 canisters. So with that in mind, ADL is extremely efficient. Saying "it sucks down hundreds of watts" is just dumb; every CPU can suck down hundreds of watts if you push it, and the 12900K was pushed from the factory. If you don't like it, lower the limits; problem solved.


So since all-core workloads don't matter, what's the point of the i9 over the i7?
The same argument has been going on and on; the fact is the i9 is not as efficient as its competition, no matter how you spin it.
Not that I care much about efficiency on desktop, but there is nothing wrong with calling a duck a duck.


----------



## Deleted member 24505 (Dec 25, 2021)

Some people build a desktop system with high-power components, then undervolt and/or downclock everything because it's using "too" much power. What is the point in that?

Zubasa said "Not that I care about efficiency much on Desktop"; neither do I.


----------



## Zubasa (Dec 25, 2021)

Tigger said:


> Some people build a desktop system with high-power components, then undervolt and/or downclock everything because it's using "too" much power. What is the point in that?
> 
> Not that I care about efficiency much on Desktop; neither do I.


Agreed.
I just can't wrap my head around people getting an i9 and not having enough cooling for it.
Then they insist the reviews are all wrong because only gaming loads matter.
IMO in that case they should have gotten the i7 or i5 and saved themselves the trouble.
The i5, besides being more efficient than the 5800X, is cheaper and performs better most of the time.
It's the same BS round and round again: Ryzen fans insisting that ADL isn't better in games, and i9 fans insisting that their CPU is actually more efficient.


----------



## fevgatos (Dec 25, 2021)

Zubasa said:


> So since all-core workloads don't matter, what's the point of the i9 over the i7?
> The same argument has been going on and on; the fact is the i9 is not as efficient as its competition, no matter how you spin it.
> Not that I care much about efficiency on desktop. But there is nothing wrong with calling a duck a duck.


I did not say that all-core workloads don't matter. I'm saying you are not supposed to run those workloads at 5 GHz all-core frequencies. If Blender is your game, power limit it to a reasonable consumption that your cooler can handle. As you would do with every other CPU, right?



Zubasa said:


> Agree.
> I just can't wrap my head around people getting an i9 and not having enough cooling for it.
> Then they just insist the reviews are all wrong because only gaming loads matter.
> IMO in that case they should have gotten the i7 or i5 and saved themselves the trouble.
> ...


You are missing the point again. Yes, the 12900K at stock power limits is inefficient in all-core workloads. That's because of the insane all-core frequency it is running at. Try running an all-core workload at that frequency on a 5950X. Tell me the results.

If you actually do a test with lower power limits, the 12900K is extremely efficient. It's not as good as a 5950X in all-core workloads, but it beats everything else, and it stomps the 5950X in lightly threaded tasks and games, both in performance and efficiency. Der8auer ran a test and found that the 12900K is up to 50% more efficient than the 5950X in gaming.


----------



## Zubasa (Dec 25, 2021)

fevgatos said:


> I did not say that all-core workloads don't matter. I'm saying you are not supposed to run those workloads at 5 GHz all-core frequencies. If Blender is your game, power limit it to a reasonable consumption that your cooler can handle. As you would do with every other CPU, right?
> 
> 
> You are missing the point again. Yes, the 12900K at stock power limits is inefficient in all-core workloads. That's because of the insane all-core frequency it is running at. Try running an all-core workload at that frequency on a 5950X. Tell me the results.
> ...


Comparing frequencies across different architectures is pointless. Also, the i9 does not sustain 5 GHz on the P-cores under its stock 241 W PL; it hovers around 4.9 GHz.
And yes, Intel did state to reviewers that this is how the 12900K is intended to be used.



https://www.techpowerup.com/forums/...ts-between-50-w-and-241-w.289572/post-4663182

Also, there is a fatal flaw in a lot of the arguments presented out there: assuming AMD is incapable of reducing power limits as well.
Fact is, reducing the power limit on Zen 3 also barely impacts gaming performance. The OEM 5800 non-X runs at a PL of less than 65 W and has almost identical gaming performance.
Also, unlike manually tweaking the PL on Intel, Eco Mode (running at 65 W) is a standard option on AMD's 105 W TDP CPUs. The 5800 non-X is basically doing that out of the box.









As for the i9, it is not even capable of scaling below 75 W; at 50 W it actually slows down enough that it uses more energy in the end.
Not that it matters, because the i9 is NOT a laptop CPU and doesn't need to scale all the way down to laptop power levels.
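That "uses more energy in the end" effect is just energy = power x time: a lower power limit only saves energy if the runtime doesn't stretch faster than the power drops. A quick back-of-the-envelope sketch; all the wattages and runtimes below are invented for illustration, not measured figures:

```python
# Energy = average power x runtime. A lower power limit only saves energy
# if the runtime doesn't grow faster than the power drops.
# All numbers below are hypothetical, for illustration only.

def energy_wh(power_w: float, runtime_s: float) -> float:
    """Total energy in watt-hours for one fixed workload."""
    return power_w * runtime_s / 3600

# The same hypothetical render job under three power limits:
runs = {
    "241 W (stock PL)": (241, 600),   # fast, but far up the V/f curve
    "75 W": (75, 1500),               # much slower, yet less total energy
    "50 W": (50, 2600),               # so slow that total energy rises again
}

for label, (watts, seconds) in runs.items():
    print(f"{label}: {energy_wh(watts, seconds):.1f} Wh")
```

With these made-up runtimes the 75 W run uses the least total energy, while 50 W is slow enough that the job burns more watt-hours than at 75 W.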


----------



## fevgatos (Dec 25, 2021)

Zubasa said:


> Comparing frequencies across different architectures is pointless. Also, the i9 does not sustain 5 GHz on the P-cores under its stock 241 W PL; it hovers around 4.9 GHz.


Yes and no. All CPUs run under the same laws of physics. Forcing a transistor to work at 5 GHz consumes a lot of power, no matter the architecture; 5 GHz is way outside the efficiency curve. So the fact that Intel CPUs can do it on normal cooling you can buy off the shelf is impressive to me. Ryzen needs exotic cooling, LN2 canisters and the like.
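The physics here is the usual CMOS dynamic-power relation, P ≈ C·V²·f, and voltage has to rise along with frequency, so power grows much faster than linearly near the top of the curve. A rough sketch; the V/f pairs below are invented for illustration, not measurements of any real chip:

```python
# Rough CMOS dynamic power model: P ~ C * V^2 * f.
# Voltage must rise with frequency, so power climbs far faster than
# linearly near the top of the V/f curve. These V/f pairs are made up
# for illustration, not measured from any real CPU.

def rel_power(volts: float, ghz: float) -> float:
    """Dynamic power relative to a 3.5 GHz / 0.90 V baseline."""
    base = 0.90**2 * 3.5
    return (volts**2 * ghz) / base

vf_points = [(3.5, 0.90), (4.5, 1.10), (5.0, 1.25), (5.6, 1.45)]
for ghz, volts in vf_points:
    print(f"{ghz} GHz @ {volts} V -> {rel_power(volts, ghz):.1f}x power")
```

With these assumed voltages, a 60% clock bump (3.5 to 5.6 GHz) costs roughly 4x the dynamic power, which is why the last few hundred MHz are so expensive on any architecture.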

The point is, if you don't push the 12900K to insane frequencies in all-core workloads, it is actually a really efficient CPU. Whether it's more efficient than a 5950X (the smaller Ryzens like the 5800X / 5600X can't really compete here) depends on the workload, and frankly it doesn't matter. They are both efficient; there isn't the huge gap everyone on the internet is talking about where the 12900K consumes double. As you can see from your graph, even at the stock 4.9 GHz / 240 W PL (which is extremely bad in terms of efficiency) it is more efficient than a 3900X. Is the 3900X suddenly bad at efficiency? If I go back to the comment section for that CPU, will I see people complaining about how inefficient it is? I don't think so.

Obviously, if you want to be running Blender and Cinebench all day long and efficiency is of the utmost importance to you, the 5950X is the better CPU. If you want to do anything else, the 12900K is just unbeatable. Lightly threaded tasks, single-thread performance, gaming performance: it's just a beast.


----------



## vMax65 (Dec 25, 2021)

This whole efficiency argument on desktop between Ryzen and Alder Lake is pointless, especially for gaming and pro workloads. The actual power usage difference balances out over time and is minuscule over the lifetime of the parts, especially as we never run CPUs at max core and power levels 24/7 unless we are semi-professionals or professionals, and then you tend to buy Threadripper or higher. For gaming and workloads like rendering, encoding, etc., both CPU architectures find a happy balance, with one providing a bit more performance at the cost of energy and the other being just a tad more efficient versus performance (core for core). And let's be honest: Intel have made huge strides, getting to a point where they can match or even exceed Ryzen with a big/small design which is innovative and will only get better as they optimise going forward. Running under 'normal' conditions it is highly efficient, especially the 12600K and 12700K, whilst the 12900K is a halo product along with the 5900X and 5950X just based on pricing. If you really are all into efficiency and worried about your power bills, buy sub-65 W parts and be done with it; we are, after all, talking about being enthusiasts...

AMD, on the other hand, are just as creative (if not more, as they were the underdog!) and will no doubt hit back with more efficiency and performance, so we all win as Intel and AMD go at each other. I like the fact that they have taken different routes to achieve performance and efficiency.

I have never worried about my power bill due to my PC running games, streaming and encoding... I mean, we are not even talking about GPU power usage, which is going up and up...

Is one better than the other? No, they just do things differently in getting to the same result, which in my world is great as we actually have a choice. I have to admit, this whole AMD vs Intel thing just does my head in... just buy what suits your budget for your use case, and be it AMD or Intel, it is the right decision for you. There is no wrong decision, not at the current place we are in CPU maturity. We are in fact lucky that AMD provided competition and Intel responded...


----------



## Deleted member 24505 (Dec 26, 2021)

Good video by Der8auer. Love the way he says E-cores are not crap cores, like some Fidiots think. "Makes no sense having Discord running on a P core", so true.


----------



## gerardfraser (Dec 27, 2021)

lexluthermiester said:


> Only under max load. Most loads, even heavy gaming, do not force Alder Lake to max power draw. W1zzard's testing clearly showed this, and his results are echoed by many other sites' reviewers as well. It is a point we really shouldn't be harping on.
> 
> 
> Not all. For example, the i5-12400 has no E-cores at all. Not everything needs that balance of power/economy.


Hmmm, a PC game pulling 186 W on one of my 12900K CPUs. I love it, but the truth is the 12900K is hot and eats more watts than any Intel/AMD CPU I've owned over the last couple of years:
2600X/3600X/3600XT/3800X/3800XT/5600X/5800X/10850K, although the 10850K could hit 300+ watts.

Some of the comments from a few people blow my mind.

Video where the screenshot is from


----------



## Outback Bronze (Dec 27, 2021)

gerardfraser said:


> Hmmm, a PC game pulling 186 W on one of my 12900K CPUs



Is it me, or is that running @ 5.6 GHz? Not bad for 186 W, if I do say so myself.


----------



## gerardfraser (Dec 27, 2021)

Outback Bronze said:


> Is it me, or is that running @ 5.6 GHz? Not bad for 186 W, if I do say so myself.


Yeah, it was 5600 MHz, but I've owned two 12900Ks and only run them at 5400 MHz for PC gaming; the highest I tried was 5700 MHz. To be clear though, at 4900 MHz I still got the same FPS on the 12900K; it's just bigger numbers, bigger ego, e-peen.


----------



## Zubasa (Dec 27, 2021)

gerardfraser said:


> Yeah, it was 5600 MHz, but I've owned two 12900Ks and only run them at 5400 MHz for PC gaming; the highest I tried was 5700 MHz. To be clear though, at 4900 MHz I still got the same FPS on the 12900K; it's just bigger numbers, bigger ego, e-peen.


I wonder if there is clock stretching involved, since you are getting the same performance.


----------



## fevgatos (Dec 27, 2021)

gerardfraser said:


> I love it, but the truth is the 12900K is hot and eats more watts than any Intel/AMD CPU I've owned over the last couple of years.


Because no other CPU you owned was running at 5.6 GHz. Try a 5950X at 5.6 GHz and come back to tell me how much it consumes.


----------



## Outback Bronze (Dec 27, 2021)

fevgatos said:


> Because no other CPU you owned was running at 5.6 ghz. Try a 5950x at 5.6ghz and come back to tell me how much it consumes..



Good luck with a 5950X @ 5.6 GHz.


----------



## Zubasa (Dec 27, 2021)

Outback Bronze said:


> Good luck with a 5950x @ 5.6Ghz


Good luck running anything @ 5.6 GHz all-core on ambient. Realistically, 5.4-5.5 is where Alder Lake settles with really good cooling.


----------



## fevgatos (Dec 27, 2021)

Outback Bronze said:


> Good luck with a 5950x @ 5.6Ghz


That's my point.


----------



## Deleted member 24505 (Dec 27, 2021)

World's fastest ADL -


----------



## lexluthermiester (Dec 27, 2021)

gerardfraser said:


> Hmmm, a PC game pulling 186 W on one of my 12900K CPUs. I love it, but the truth is the 12900K is hot and eats more watts than any Intel/AMD CPU I've owned over the last couple of years:
> 2600X/3600X/3600XT/3800X/3800XT/5600X/5800X/10850K, although the 10850K could hit 300+ watts.
> 
> Blows my mind with some comments from a few people in the comments.
> ...


And that is with your CPU OC'd to 5.58 GHz on the P-cores. 186 W is not bad with that voltage and OC. It actually supports the point I was trying to make. Thank you for showing a good example!


gerardfraser said:


> To be clear though at 4900Mhz I still got the same FPS on 12900K


If you're hitting the performance ceiling, you really don't need the OC.


----------



## Bomby569 (Dec 27, 2021)

I think what matters is the performance you get per watt (or dollar); whether it does 6 GHz or not is just technical detail. If another CPU can do the same with less wattage, then it sucks; if not, good for it.
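For what it's worth, "performance per watt" is just a benchmark score divided by average package power; a tiny sketch with placeholder numbers (not real review data):

```python
# Performance per watt = benchmark score / average package power.
# Scores and wattages below are placeholders, not real review figures.

def perf_per_watt(score: float, avg_watts: float) -> float:
    """Efficiency in points per watt for one benchmark run."""
    return score / avg_watts

cpus = {
    "CPU A": (27000, 240),  # higher score, much higher draw
    "CPU B": (26000, 140),  # slightly lower score, far lower draw
}

for name, (score, watts) in cpus.items():
    print(f"{name}: {perf_per_watt(score, watts):.0f} points/W")
```

With these placeholder numbers, CPU B wins on efficiency even though CPU A tops the raw score chart, which is exactly the distinction being argued here.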


----------



## lexluthermiester (Dec 27, 2021)

Bomby569 said:


> I think what matters is the performance you get per watt (or dollar); whether it does 6 GHz or not is just technical detail. If another CPU can do the same with less wattage, then it sucks; if not, good for it.


That is a narrow way of thinking. Does the CPU in question render the best performance? Does it provide the functionality desired? In the case of the 12700K, the answer is yes. The difference in power used for such a high-end part compared to other high-end CPUs is rather trivial. We're not talking about CPUs for mobile platforms, where performance per watt is a chief concern. We're talking about high-end desktop platforms, where performance is the chief concern.


----------



## Bomby569 (Dec 27, 2021)

lexluthermiester said:


> That is a narrow way of thinking. Does the CPU in question render the best performance? Does it provide the functionality desired? In the case of the 12700K, the answer is yes. The difference in power used for such a high-end part compared to other high-end CPUs is rather trivial. We're not talking about CPUs for mobile platforms, where performance per watt is a chief concern. We're talking about high-end desktop platforms, where performance is the chief concern.



I did write "if another CPU can do the same...", so I guess it was clear I was comparing performance per watt in identical workloads, not the performance per watt of a Celeron and a 12900K.


----------



## Deleted member 24505 (Dec 27, 2021)

People running a high-end desktop should not really be whining about power use.


----------



## Bomby569 (Dec 27, 2021)

Tigger said:


> People running a highend Desktop should not really be whining about power use.



Depends where you live, I guess; electricity is really expensive where I live. If I can do the same and spend less, I prefer not to waste money.
I know in places like the US people don't even consider that, because gas and electricity are dirt cheap.


----------



## fevgatos (Dec 27, 2021)

Bomby569 said:


> Depends where you live, I guess; electricity is really expensive where I live. If I can do the same and spend less, I prefer not to waste money.
> I know in places like the US people don't even consider that, because gas and electricity are dirt cheap.


Well, if you are gaming and you care about power consumption, you should get an Alder Lake, because it is WAY more efficient than Ryzen. If you are working, then frankly it shouldn't really matter; both the 5950X and the 12900K are extremely efficient (with the correct power limits), and you are supposed to be getting paid for your work, so a 2-3€ difference in electricity per month shouldn't make or break you. Or if it does, find a better job, I guess.


----------



## TheoneandonlyMrK (Dec 27, 2021)

AlwaysHope said:


> Chipzilla has always led the industry with innovations, big.LITTLE is no exception.


big.LITTLE is indeed no exception: Arm did it first, proved it, etc. So I wouldn't shout about Intel's innovation on that point, personally.
There's little innovation here, in fact.
ADL is good, don't get me wrong; I just disagree that it's innovative.
Copying isn't leading.


----------



## Bomby569 (Dec 27, 2021)

fevgatos said:


> Well, if you are gaming and you care about power consumption, you should get an Alder Lake, because it is WAY more efficient than Ryzen. If you are working, then frankly it shouldn't really matter; both the 5950X and the 12900K are extremely efficient (with the correct power limits), and you are supposed to be getting paid for your work, so a 2-3€ difference in electricity per month shouldn't make or break you. Or if it does, find a better job, I guess.



If you look, I was actually not defending either; just saying I would choose the one that did the same for less wattage.


----------



## Vayra86 (Dec 27, 2021)

Tigger said:


> People running a highend Desktop should not really be whining about power use.



Power = heat = noise.

I do like silence, and lower-power components are enablers of that, while inefficient high power is not.


----------



## Deleted member 24505 (Dec 27, 2021)

Vayra86 said:


> Power = heat = noise.
> 
> I do like silence and lower power components are enablers, while inefficient high power is not.



I have 6 fans in my PC: 3 120s on the top rad, two 140s on the bottom, 1 120 on the back, and I can hardly hear it. Just because you have a high-end desktop does not mean it needs to sound like a jet engine.


----------



## Vayra86 (Dec 27, 2021)

Tigger said:


> I have 6 fans in my PC, 3 120 on top rad, two 140 on bottom, 1 120 on the back and can hardly hear it. Just because you have a highend desktop does not mean it needs to sound like a jet engine.



It doesn't have to. I'm saying that lower power / higher efficiency means you can reduce noise further, maybe even cool passively, whereas with inefficient components that is going to get more difficult.

This difficulty results in higher cost of cooling solutions, higher space requirements, and more heat dumped into the room. So I think these are all very good reasons to prefer efficiency over inefficiency. Ergo, power matters.


----------



## Deleted member 24505 (Dec 27, 2021)

Well, in winter in the UK my PC can dump as much heat as it wants into my room. It is 7°C outside atm, maybe lower with wind.


----------



## Vayra86 (Dec 27, 2021)

Tigger said:


> Well in winter in the UK, my PC can dump as much heat as it wants into my room. It is 7c outside atm, maybe lower with wind.



Yep, and if you live in another country your air conditioner is on 24/7 to keep cool. But even in a well-insulated house and a small-sized man cave, those 400 watts of exhaust heat can get you sweaty.


----------



## Deleted member 24505 (Dec 27, 2021)

Vayra86 said:


> Yep, and if you live in another country your air conditioner is on 24/7 to keep cool. But even in a well-insulated house and a small-sized man cave, those 400 watts of exhaust heat can get you sweaty.



I doubt my PC will ever exhaust 400 watts, not unless I care about benchmarks, which I don't really. Gaming, films/vids/music/browsing is not going to push my 12700K anywhere near 400 watts. Shame, as it means my room is not so warm. Maybe I need to crack out some balls-to-the-wall benches to warm myself up.


----------



## TheoneandonlyMrK (Dec 27, 2021)

Tigger said:


> Well in winter in the UK, my PC can dump as much heat as it wants into my room. It is 7c outside atm, maybe lower with wind.


I agree on the heat dumping in the UK, but I don't think your situation and mine are the only perspectives on it.
And my 12-fan PC is fairly quiet at full load 24/7. I don't count idle noise as a worthy point to mention, personally; any take on a personal PC's noise output should be judged at load for an hour. All PCs should be quiet at idle, you know.

I'd recommend Folding@home and WCG for a nice comfortable room.


----------



## fevgatos (Dec 27, 2021)

A PC should never get noisy. Having to crank the fans so high that they start being noisy means there is inadequate airflow. Usually fans are completely silent below 1000 RPM, and you shouldn't need more than that unless you are, like, benching.

Personally I don't mind a little noise; I'm more concerned about temps. So my Fractal Torrent gets a little bit noisy, but it's exhausting 600+ watts during gaming, and it keeps my 3090 at 65 degrees while doing it. I can make it absolutely quiet, but then the GPU temps will go to 70.


----------



## Deleted member 24505 (Dec 27, 2021)

The good thing with a water loop is that the CPU and GPU are not dumping their heat into the case; it gets exhausted from the radiators.


----------



## fevgatos (Dec 27, 2021)

Tigger said:


> Good thing with a water loop is, the CPU and GPU are not dumping their heat into the case, but it is getting exhausted from the radiators


Yeah, I always felt that watercooling should either be done on both or not at all.


----------



## lexluthermiester (Dec 27, 2021)

Bomby569 said:


> Depends where you live, I guess; electricity is really expensive where I live.


Then don't buy highend PC parts.


----------



## Bomby569 (Dec 27, 2021)

lexluthermiester said:


> Then don't buy highend PC parts.



That makes no sense; just buy efficient ones. Btw, even if your electricity is dirt cheap, you should care about making bad use of it, for obvious reasons.


----------



## lexluthermiester (Dec 27, 2021)

Bomby569 said:


> that makes no sense, just buy efficient ones.


Whether it makes sense to you is not relevant. People can buy what they want. You don't get to dictate what people buy. Quit trying to.


Bomby569 said:


> Btw even if your electricity is dirt cheap you should care about making bad use of it for obvious reasons


While I pay the bills, I decide how to use my power. Don't tell people how to live.

However, you're once again missing some context. Alder Lake CPUs are only power hogs WHEN MAXED OUT, which is something most users will rarely, if ever, do. So quit harping on about power usage, because it's NOT a big deal.


----------



## ExcuseMeWtf (Dec 27, 2021)

The i3 definitely looks VERY interesting; however, availability and price of entry-level mobos will be a determining factor for it.


----------



## Bomby569 (Dec 27, 2021)

lexluthermiester said:


> Whether it makes sense to you is not relevant. People can buy what they want. You don't get to dictate what people buy. Quit trying to.
> 
> While I pay the bills, I decide how to use my power. Don't tell people how to live.
> 
> However, you're once again missing some context. Alder Lake CPUs are only power hogs WHEN MAXED OUT, which is something most users will rarely, if ever, do. So quit harping on about power usage, because it's NOT a big deal.



I'm just giving my opinion, as I have been doing since the beginning; you're the one that replied to me trying to convince me you're right and my opinion is wrong.


----------



## lexluthermiester (Dec 28, 2021)

Bomby569 said:


> I'm just giving my opinion, as I have been doing since the beginning


There is a difference between offering an opinion for insight and preference, and stating your opinion as a basis of merit while at the same time imposing it on others. Plenty of other users share their opinions here. Do you see me calling everyone out? No? Hmm... I have no problem calling people out...


Bomby569 said:


> you're the one that replied to me trying to convince me your right and my opinion is wrong.


That's because your opinion is not supported by factual information and merit.

The Alder Lake CPUs are TOP performers. And while they can use a lot of power, 99.9% of the time they don't. You're trying to convince everyone that they are a waste because they are so power hungry, while at the same time subtly implying Ryzen is better. NEWS FLASH: the only two real competitors to the 12900K/12700K are the 5900X and 5950X, and they use a ton of power too when maxed out full tilt.


----------



## ratirt (Dec 28, 2021)

lexluthermiester said:


> That's because your opinion is not supported by factual information and merit.
> 
> The Alder Lake CPUs are TOP performers. And while they can use a lot of power, 99.9% of the time they don't. You're trying to convince everyone that they are a waste because they are so power hungry, while at the same time subtly implying Ryzen is better. NEWS FLASH: the only two real competitors to the 12900K/12700K are the 5900X and 5950X, and they use a ton of power too when maxed out full tilt.


Actually, that is not true. The 5900X and 5950X are very similar in power usage: when stressed they actually use about the same power (around a 10 W difference), and that is in a stress test, which is around what the 12900K uses when capped at PL1 100 W and PL2 100 W. So if AMD's use a ton of power, then how would you describe Alder Lake's power usage? Because if you set the PL1 and PL2 cap to 125 W, it can go over 200 watts during a stress test. That's a ton as well, if AMD's is. I know gaming is different, but you argued that AMD's use a ton of power when stressed. I disagree, especially if you put ADL in the comparison. Not to mention, ADL can go even above 300 W if you let it.


----------



## vMax65 (Dec 28, 2021)

An interesting article reviews the 12700K vs the 5800X: when gaming they use the same power, while the 12700K runs cooler (surprising to me). Overall, the performance difference is fairly large across production-type workloads for the 12700K vs the 5800X, and smaller in gaming, but the 12700K still comes out on top; balanced against the power draw, efficiency turns out roughly equal, with higher performance on Intel's side.

Intel Core i7-12700K vs AMD Ryzen 7 5800X Performance Review - Page 8 of 9 - The FPS Review (www.thefpsreview.com): "We pit the Intel Core i7-12700K CPU vs the AMD Ryzen 7 5800X head-to-head in benchmarks and games and find out who comes out on top."

I am still confused that efficiency is an issue, especially at the desktop level. Yes, Ryzen will be more efficient on the power side, but that is balanced by efficiency on the performance side with Intel 12th gen. Do you want more performance or more power efficiency? In gaming there is no difference. In production-type workloads (and it really depends on which one), Ryzen becomes more power efficient at the cost of performance.

No wrong decision though as both AMD and now Intel make great CPU's...


----------



## Braegnok (Dec 28, 2021)

Running default settings, my i9-12900K draws 18 W at idle and 200 W under heavy load.




My graphics card is the power hog @ 400W


----------



## GerKNG (Dec 28, 2021)

lexluthermiester said:


> The AlderLake CPUs are TOP performers. And while they can use a lot of power, 99.9% of the time, they don't.


This.

My 12600K, even at almost 1.45 V at 5.1 GHz, pulls barely 60 W in games (even BF2042 averages out at around 55 W).

At 4.9 GHz (it just needs around 1.22 V) I see less power draw in games than with my Ryzen 3 3100 (just above 35 W).


----------



## lexluthermiester (Dec 28, 2021)

ratirt said:


> Not to mention, AL can go even above 300W if you let it.


Again with the people missing context...



ratirt said:


> Actually, that is not true. The 5900X and 5950X are very similar in power usage: when stressed they actually use about the same power (around a 10 W difference), and that is in a stress test.


You have actually READ the reviews of Alder Lake CPUs, right? Go read them again, and specifically look here:

Intel Core i9-12900K Review - Fighting for the Performance Crown (www.techpowerup.com): "The Intel Core i9-12900K is Intel's flagship processor for the Alder Lake architecture. In our testing, we saw fantastic gaming performance from this new processor. Not only low-threaded tests have improved, the 12900K can even beat AMD at highly threaded workloads."

Intel Core i7-12700K Review - Almost as Fast as the i9-12900K (www.techpowerup.com): "With the Core i7-12700K, Intel has released a formidable competitor to AMD's Ryzen 5800X and even 5900X. Thanks to eight powerful Golden Cove cores, the processor handles all workloads very well, including gaming. Compared to the i9-12900K, it runs almost as fast, but much cooler, with better..."

Let's compare to the tests on the Ryzen models:

AMD Ryzen 9 5950X Review (www.techpowerup.com): "Ryzen 9 5950X is AMD's flagship 16-core, 32-thread monster. It offers outstanding application performance, your productivity tasks will complete faster than before. Thanks to the Zen 3 IPC advantage, it also excels in gaming, even winning against Intel's Core i9-10900K."

AMD Ryzen 9 5900X Review (www.techpowerup.com): "The Ryzen 9 5900X dominates Intel's Core i9-10900K in our testing because of AMD's massive IPC improvements. At $550, this processor is certainly not cheap, but it offers so much more performance, especially single-threaded, that AMD has a clear winner on their hands."

Hmm... Now isn't that interesting...


----------



## ratirt (Dec 29, 2021)

lexluthermiester said:


> Again with the people missing context...
> 
> 
> You have actually READ the reviews of AlderLake CPUs, right? Go read them again and specifically look here;
> ...


I really have no idea what you are trying to say here.
You said that the 5900X and 5950X use a ton of power while stressed. My question was: if that is the case, what will you say about ADL using the same or more? And yet you give me these articles. Maybe you should read my question and simply answer with your take on the subject instead of sending articles.


----------



## phanbuey (Dec 29, 2021)

All these chips, for daily or enthusiast use, are more or less at parity: at theoretical max, ADL pulls more, but not massively so for the performance increase, up to the 5950X. For hardcore workstation tasks like 24/7 Blender, the 5950X wins big (same performance but lower power = savings over years of 24/7 use). For everyday stuff it's almost the same, if not favoring ADL a bit.





Using the same power and running cooler while getting more fps.
Intel Core i7-12700K vs AMD Ryzen 7 5800X Performance Review - Page 8 of 9 - The FPS Review

As DDR5-5200 is slower than tuned DDR4 at the moment, the performance numbers even the baby Alder Lakes put out are chart-topping. 3D cache is going to change all that, but for now ADL is the top setup, and cheap for what it is.

My current OC on the 12600K at 5.3 is pulling around 190 W in Cinebench and getting decently better numbers than a 5800X with similar temps, and the 5800X is at around the same power draw at ~180 W. So as Alder Lake scales down it becomes a bit more power efficient than Zen 3 (we will probably see this in mobile).

I think the 5950X is the real standout in terms of MT power usage over 24/7 tasks, mainly because the 12900K has to be pushed way beyond the point of diminishing returns to compete.


----------



## ratirt (Dec 29, 2021)

phanbuey said:


> All these chips, for daily use or enthusiast use are more or less at parity -- at theoretical max AL pulls more, but not massively for the performance increase up to the 5950x. For hardcore workstation tasks like 24/7 blender the 5950x wins big. For every day stuff it's almost the same if not favored a bit to AL.


I think that is a very mature conclusion. The difference would be hard to notice, and yet ADL is a tad faster in certain loads, like gaming across the board at lower resolutions. Well, no wonder it is faster: it's just been released, and we all know how sensitive Intel is about gaming and the performance crown. So no shock for me here.


----------



## lexluthermiester (Dec 29, 2021)

ratirt said:


> I really have no idea what you are trying to say here.


Oh, that is for certain.


ratirt said:


> Maybe you should read my question and simply answer, what is your take on that subject instead of sending articles?


The articles ARE my answer. Read, inwardly digest, and maybe you might strike upon the point. I'm not spending an hour of my time typing out an explanation that is readily obvious from the comparative data on offer in those articles. People who can't connect the dots are not my problem.


----------



## ratirt (Dec 30, 2021)

lexluthermiester said:


> Oh, that is for certain.
> 
> The articles ARE my answer. Read, inwardly digest, and maybe you might strike upon the point. I'm not spending an hour of my time typing out an explanation that is readily obvious from the comparative data on offer in those articles. People who can't connect the dots are not my problem.


You might just tell everyone what you mean instead of telling them to go read. That's rude.
AL uses more power than the respective 5000-series CPU. That is what I get. Not sure what your take is.


----------



## lexluthermiester (Dec 30, 2021)

ratirt said:


> You might just tell everyone what you mean instead of telling them to go read. That's rude.


Why should I have to repeat what's already been clearly stated?


ratirt said:


> AL uses more power than the respective 5000-series CPU.


True, but not by much.


ratirt said:


> That is what I get. Not sure what your take is.


It's a thinker...


----------



## fevgatos (Dec 30, 2021)

ratirt said:


> You might just tell everyone what you mean instead of telling them to go read. That's rude.
> AL uses more power than the respective 5000-series CPU. That is what I get. Not sure what your take is.


Power consumption is irrelevant; what matters is efficiency. In gaming and productivity (anything except 24/7 rendering), Alder Lake is hands-down better in terms of efficiency. In rendering, it's the high-end Ryzens.


----------



## Deleted member 24505 (Dec 30, 2021)

As far as I am concerned, if you push any CPU to the max it won't be efficient.


----------



## The King (Jan 2, 2022)

Some R23 MT and ST info on the 12400 ADL.


----------



## TheoneandonlyMrK (Jan 2, 2022)

So what are the thoughts of owners of 12th-gen parts on the removal of AVX-512 by BIOS update? Seems like a dik move to me.
Could it affect hardware benches? Does C23 use AVX-512? I am obviously not sure how much AVX-512 is used at the minute, though I am aware of its performance if used.


----------



## Deleted member 24505 (Jan 2, 2022)

The score is better than a stock 5600X, I believe.


----------



## Psychoholic (Jan 2, 2022)

TheoneandonlyMrK said:


> So what are the thoughts of owners of 12th-gen parts on the removal of AVX-512 by BIOS update? Seems like a dik move to me.
> Could it affect hardware benches? Does C23 use AVX-512? I am obviously not sure how much AVX-512 is used at the minute, though I am aware of its performance if used.



Doesn't really bother me.
IIRC it is already disabled by default, and the only way to enable it is to disable the E-cores (and not all motherboards support enabling it then).

I am not sure how often it is used either; as far as I know C23 does not use it.


----------



## Deleted member 202104 (Jan 2, 2022)

Psychoholic said:


> Doesn't really bother me.
> IIRC it is already disabled by default, and the only way to enable it is to disable the E-cores (and not all motherboards support enabling it then)
> 
> I am not sure how often it is used either; as far as I know C23 does not use it.



This is correct - AVX-512 isn't an officially supported extension on Alder Lake. C23 uses AVX (AVX2, I think), but not AVX-512.


----------



## ir_cow (Jan 2, 2022)

TheoneandonlyMrK said:


> So what are the thoughts of owners of 12th-gen parts on the removal of AVX-512 by BIOS update? Seems like a dik move to me.
> Could it affect hardware benches? Does C23 use AVX-512? I am obviously not sure how much AVX-512 is used at the minute, though I am aware of its performance if used.


AVX-512 is only enabled via a BIOS bypass when the E-cores are disabled. The next BIOS updates will remove this completely per Intel's request.


----------



## lexluthermiester (Jan 2, 2022)

ir_cow said:


> AVX-512 is only enabled via a BIOS bypass when the E-cores are disabled. The next BIOS updates will remove this completely per Intel's request.


My question is why disable AVX512 at all? What's the technical purpose? Kinda stupid...


----------



## Deleted member 24505 (Jan 2, 2022)

lexluthermiester said:


> My question is why disable AVX512 at all? What's the technical purpose? Kinda stupid...


Intel accidentally confirmed that AVX-512 would not work on Alder Lake because its Gracemont Efficiency cores simply don't support AVX-512.

https://hothardware.com/news/rumor-claims-intel-will-forcibly-disable-avx-512-on-alder-lake

They might be able to do a bios/windows update to fix it somehow

The weird thing is, the high-end CPUs will have no AVX-512, but the P-core-only ones could end up being able to use it, as they have no E-cores.


----------



## lexluthermiester (Jan 3, 2022)

Tigger said:


> Intel accidentally confirmed that AVX-512 would not work on Alder Lake because its Gracemont Efficiency cores simply don't support AVX-512.
> 
> https://hothardware.com/news/rumor-claims-intel-will-forcibly-disable-avx-512-on-alder-lake
> 
> ...


That doesn't really explain much about why those instructions, which were deliberately built-in, are being disabled. Why do they need to be?


----------



## Deleted member 24505 (Jan 3, 2022)

lexluthermiester said:


> That doesn't really explain much about why those instructions, which were deliberately built-in, are being disabled. Why do they need to be?



My guess is, the E-cores do not support AVX-512, so if an app using AVX-512 is scheduled onto an E-core it would cause an error. Even though the P-cores can run it, I guess they chose to disable it on both.


----------



## lexluthermiester (Jan 3, 2022)

Tigger said:


> My guess is, the E-cores do not support AVX-512, so if an app using AVX-512 is scheduled onto an E-core it would cause an error.


But that can be handled programmatically. I'm not buying that reason. Nothing personal against you; we know you're just theorizing.


----------



## ir_cow (Jan 3, 2022)

My guess is it's reserved for 13th-gen HEDT or 12th-gen Xeon, and Intel doesn't want buyers that need AVX-512 to just use an i9 Alder Lake instead for a lot cheaper. That, or the AVX-512 unit is somehow actually broken at the core level and will give invalid results in scientific applications. Why Intel put it on Alder Lake in the first place if it isn't supported, we can only guess.


----------



## lexluthermiester (Jan 3, 2022)

ir_cow said:


> My guess is it's reserved for 13th-gen HEDT or 12th-gen Xeon, and Intel doesn't want buyers that need AVX-512 to just use an i9 Alder Lake instead for a lot cheaper. That, or the AVX-512 unit is somehow actually broken at the core level and will give invalid results in scientific applications. Why Intel put it on Alder Lake in the first place if it isn't supported, we can only guess.


Though that idea has some plausibility, it really wouldn't make much sense either.


----------



## phanbuey (Jan 3, 2022)

+1 to the thread director not being able to properly deal with it yet.

Gracemont supports AVX, AVX2, and AVX-VNNI, but not AVX-512. That's the most probable reason -- something is not working right when deciding which cores to use during AVX work.


----------



## looniam (Jan 3, 2022)

lexluthermiester said:


> Though that idea has some plausibility, it really wouldn't make much sense either.


when did artificial market segmentation ever make sense to a consumer?


----------



## lexluthermiester (Jan 3, 2022)

phanbuey said:


> Gracemont supports AVX, AVX2, and AVX-VNNI, but not AVX-512. That's the most probable reason -- something is not working right when deciding which cores to use during AVX work.


But again, core selection can be handled programmatically, so why disable it instead of releasing a patch that just fixes the core-selection problem? It would take all of two lines of code to instruct the Windows scheduler to select a P-core when a call to the AVX-512 instruction set is made.


----------



## phanbuey (Jan 3, 2022)

lexluthermiester said:


> But again, core selection can be handled programmatically, so why disable it instead of releasing a patch that just fixes the core-selection problem? It would take all of two lines of code to instruct the Windows scheduler to select a P-core when a call to the AVX-512 instruction set is made.



Well, that's the assumption... 'can be handled programmatically' -- but will it be? Is it at all times? Does the CPU crash when the instruction is sent without special programmatic care? What about virtual machines and cloud providers / Hadoop clusters that use ISA-L / kernel virtualization that can mask cores? Legacy software in universities? Do they have to rewrite software, or will it just take a dirt nap when it tries to use a Gracemont core? There are just so many failure points for a mass-produced product that's going to be sitting in a poorly optimized/supported Dell somewhere -- especially since legacy code that would crash it is already out and in use.

It's such a perfect setup for a small disaster and bad publicity that Intel probably decided to only release chips that are 100% stable under all circumstances and serve customers that need AVX-512 with a different product line.


----------



## lexluthermiester (Jan 3, 2022)

phanbuey said:


> does the CPU crash when the instruction is sent without special programmatic care?


No, the instructions are dumped, an error code is generated and sent back to the OS.


phanbuey said:


> Legacy software in universities?


Legacy software would not call the AVX512 instructions and would not be affected.


----------



## phanbuey (Jan 3, 2022)

lexluthermiester said:


> No, the instructions are dumped, an error code is generated and sent back to the OS.



Correct; sure, and then the software crashes...

And you have a CPU that can sort of support AVX-512.

It's definitely technically possible. It's just a question of whether they had enough time to make this implementation good enough, and the answer is no: it was clearly in the works and got dropped.



lexluthermiester said:


> Legacy software would not call the AVX512 instructions and would not be affected.


Why not? There's software out now (basically all AVX-512-enabled software) that runs AVX-512 and isn't AL-optimized = legacy.


----------



## lexluthermiester (Jan 3, 2022)

phanbuey said:


> Correct, sure then the software crashes....


No, the software just stops. There is a difference between a software crash and a halt-on-error.


----------



## phanbuey (Jan 3, 2022)

lexluthermiester said:


> No, the software just stops. There is a difference between a software crash and a halt-on-error.



Not the point. It depends on how it's coded. You don't want your software stopping, whether you catch the error or not. Stopping because of an error is synonymous with a crash and would be treated in the news as such.

The point is the CPU can't support AVX-512 100% at the hardware level, and Intel can't control the software that will run on these chips, so they have to say that it doesn't support it. Otherwise they risk lawsuits, bad press, customer dissatisfaction, loss of shareholder value -> lawsuits from shareholders vs. management, etc.


----------



## lexluthermiester (Jan 3, 2022)

phanbuey said:


> Depends on how it's coded.


True.


phanbuey said:


> You don't want your software stopping whether you catch the error or not.


Not true. And, yes, yes you do.


----------



## Deleted member 24505 (Jan 3, 2022)

phanbuey said:


> Not the point. It depends on how it's coded. You don't want your software stopping, whether you catch the error or not. Stopping because of an error is synonymous with a crash and would be treated in the news as such.
> 
> The point is the CPU can't support AVX-512 100% at the hardware level, and Intel can't control the software that will run on these chips, so they have to say that it doesn't support it. Otherwise they risk lawsuits, bad press, customer dissatisfaction, loss of shareholder value -> lawsuits from shareholders vs. management, etc.



They possibly can fix it in the future if they choose to. The fact that for now they are forcing motherboard makers to disable it in the BIOS means they might have no intention of re-enabling it in the future.


----------



## phanbuey (Jan 3, 2022)

lexluthermiester said:


> Not true. And, yes, yes you do.


I'll be sure to tell my boss that if one of our apps goes down.  "But sir, it caught the exception - and logged it successfully before it kicked all those doctors out in the middle of their medical visits!"

While we are at it, we should rename Crash to Desktop (CTD) to  Desired Successful Halt On Error (DSHOE)



Tigger said:


> They possibly can fix it in the future if they choose to. The fact that for now they are forcing motherboard makers to disable it in the BIOS means they might have no intention of re-enabling it in the future.


Right -- agreed. To be fair, it wasn't ever 'fully' functional -- you had to disable the E-cores for it to work, and only a few board makers supported it (not MSI). It's basically a half-finished feature that some motherboards found a BIOS hack for.


----------



## lexluthermiester (Jan 3, 2022)

phanbuey said:


> I'll be sure to tell my boss that if one of our apps goes down.  "But sir, it caught the exception - and logged it successfully before it kicked all those doctors out in the middle of their medical visits!"
> 
> While we are at it, we should rename Crash to Desktop (CTD) to  Desired Successful Halt On Error (DSHOE)


I think you misunderstood what I was describing. And since you insist on being unpleasant...


----------



## Deleted member 24505 (Jan 3, 2022)

phanbuey said:


> I'll be sure to tell my boss that if one of our apps goes down.  "But sir, it caught the exception - and logged it successfully before it kicked all those doctors out in the middle of their medical visits!"
> 
> While we are at it, we should rename Crash to Desktop (CTD) to  Desired Successful Halt On Error (DSHOE)
> 
> ...



Might have to check my BIOS to see if it is still there, or gone with the latest BIOS. I keep my board up to date as the platform is so new.


----------



## Wirko (Jan 3, 2022)

lexluthermiester said:


> No, the instructions are dumped, an error code is generated and sent back to the OS.


More exactly, an invalid opcode exception is generated. The exception handler (interrupt handler), if it exists, can then emulate the AVX-512 instruction using other available instructions, then return control to the thread that caused the exception. Even if that slows the program down to nearly zero, it prevents it from crashing. 

The other alternative is to suspend the thread and make sure it continues its execution on a P core.

That's the theory at least; all modern CPUs can do that, but I don't know if it has ever been implemented.


----------



## phanbuey (Jan 3, 2022)

lexluthermiester said:


> I think you misunderstood what I was describing. And since you insist on being unpleasant...


I was legitimately trying to be funny by showing the point from a business user point of view.  Since my entire angle is that this is a business decision rather than an engineering decision.

Being unpleasant is one of my features though, so there's that.


----------



## lexluthermiester (Jan 3, 2022)

phanbuey said:


> I was legitimately trying to be funny by showing the point from a business user point of view.  Since my entire angle is that this is a business decision rather than an engineering decision.
> 
> Being unpleasant is one of my features though, so there's that.


Ah, I took that very differently. Fair enough.



Wirko said:


> More exactly, an invalid opcode exception is generated.


Yes. I was trying to keep the description simple so that non-coders would understand the idea.


The only thing I can think of that legitimately makes sense is that there is some form of power-gating flaw in the AVX-512 section of the die that causes that section to draw more power than needed, and disabling it is the only fix. Otherwise, disabling it makes no sense whatsoever and it's just dumb.


----------



## Deleted member 24505 (Jan 4, 2022)

lexluthermiester said:


> Ah, I took that very differently. Fair enough.
> 
> 
> Yes. I was trying to keep the description simple so that non-coders would understand the idea.
> ...


If you look at the links here, Lex, it actually seems to use less power with AVX-512 enabled.
https://www.techpowerup.com/forums/...-on-alder-lake-processors.290460/post-4676108


----------



## ExcuseMeWtf (Jan 5, 2022)

Placed a tentative order on 12100F + 16GB DDR4 3200 MHz kit.

Not finalizing it, as we don't have budget mobos yet, and I'm not willing to splurge on Z690. If I understand right, H610 only supports single-channel memory and no XMP? That basically means avoiding it for anything beyond office applications and aiming for H670 or B660?


----------



## Deleted member 24505 (Jan 5, 2022)

ExcuseMeWtf said:


> Placed a tentative order on 12100F + 16GB DDR4 3200 MHz kit.
> 
> Not finalizing it, as we don't have budget mobos yet, and I'm not willing to splurge on Z690. If I understand right, H610 only supports single-channel memory and no XMP? That basically means avoiding it for anything beyond office applications and aiming for H670 or B660?



The upcoming MSI B/H motherboards are looking good


----------



## Sithaer (Jan 6, 2022)

ExcuseMeWtf said:


> Placed a tentative order on 12100F + 16GB DDR4 3200 MHz kit.
> 
> Not finalizing it, as we don't have budget mobos yet, and I'm not willing to splurge on Z690. If I understand right, H610 only supports single-channel memory and no XMP? That basically means avoiding it for anything beyond office applications and aiming for H670 or B660?



I just checked, and the first H/B mobos showed up at one of the biggest retailers in my country, where I also buy most of my new stuff.

ASUS PRIME H610M-A D4 ~ 120 $
ASUS PRIME B660M-A D4 ~ 172 $

and for the lolz the ASUS ROG STRIX B660-F is ~ 288 $

The cheapest Z690 is ~ 236 $ in comparison.

Yeah, that's just a nope. I know these are only the early birds, but still, that's high even by my country's standards.

Btw those prices already include the 27% VAT we have here.


----------



## ExcuseMeWtf (Jan 6, 2022)

Yeah, Asus is really overpriced.

Couldn't find MSI for good price either, ended up going for Gigabyte B660M Gaming instead for ~$145 or so.

Higher mobo prices are the flip side of the situation, sadly. I remember getting mobos for half that price until this point, and they weren't bottom-of-the-barrel ones either. Then again, my current systems are i5-2xxx, so about a decade old at this point; an upgrade is probably warranted anyways.

Also apparently getting a Gear Up Bundle for free with that CPU.


----------



## Sithaer (Jan 6, 2022)

ExcuseMeWtf said:


> Yeah, Asus is really overpriced.
> 
> Couldn't find MSI for good price either, ended up going for Gigabyte B660M Gaming instead for ~$145 or so.
> 
> ...



Yeah mobo prices are sure getting out of hand nowadays.

Back in 2018 May when I built my current rig for the most part, that B350 F Strix I have cost me around 130 $ brand new and at the time that was one of the better B350 mobos available.

If you are upgrading from Sandy Bridge then the 12100 will be a solid upgrade for sure; heck, even I wouldn't mind that, but with these mobo prices I think I'm better off buying a second-hand R5 3600 and calling it a day. _'That's the last gen my mobo supports.'_


----------



## ExcuseMeWtf (Jan 6, 2022)

Also thought about going Team Red, but then checked local availability and pricing... Yeaaaaaah, I'd probably pay as much for a comparable Ryzen CPU alone as I paid for this whole set of CPU + mobo + RAM.
Nothing against second-hand either; in fact both my current i5s were bought used and are still going strong a decade later. Being non-K surely helps, as I know they couldn't have been OC'd to any significant degree, and likely not at all.


----------



## vMax65 (Jan 7, 2022)

I am sure MSI and ASROCK will have some B and H series motherboards priced well. The ASRock B660 Pro RS is now available in the UK at £129 and the Gigabyte B660 MATX is £103 and I think they will go a little lower as we get deeper into the launch.


----------



## ExcuseMeWtf (Jan 7, 2022)

> and the Gigabyte B660 MATX is £103


Might be the same one I ordered.

Not actually available yet though, gotta wait 7 more days for delivery. Not a big deal, I have a running PC after all.

Found a few tests on YT meanwhile (embedded videos):

If those are to be believed, I def made a good choice there.


----------



## catulitechup (Jan 11, 2022)

Interesting offer on Newegg; a quick search for the cheapest B660 (ASRock) mainboard available:









ASRock B660M-HDV LGA 1700 Intel B660 SATA 6Gb/s DDR4 Micro ATX Motherboard - Newegg.com
www.newegg.com


----------



## ExcuseMeWtf (Jan 11, 2022)

Another test (embedded video):

Strong gaming showing, mediocre productivity. Shouldn't really expect much better for the latter given the specs/budget, I guess.

Regardless, it's coming to me today and will sure make a sweet upgrade over Sandy lol.


----------



## Wirko (Jan 11, 2022)

ExcuseMeWtf said:


> Regardless, it's coming to me today and will sure make a sweet upgrade over Sandy lol.


12100 - 2100 = 10000
...
12100 - 2700 = 9400

In any case, it's over 9000.


----------



## Deleted member 24505 (Jan 11, 2022)

What are these?

Intel Processor Number detection for Core i5-12490F, 12500E, 12600HE
Intel Processor Number detection for Core i7-12700E, 12700TE, 12800HE
Intel Processor Number detection for Core i9-12900E, 12900H, 12900TE
Saw them in an update notification for AIDA. Never heard of the E versions.


----------



## Noah Katz (Jan 11, 2022)

Are CPU benchmarks like these https://www.cpubenchmark.ne... based on base clock or turbo boost?

How does the CPU know when to invoke turbo boost?

Can I expect continuous turbo boost if I'm running a demanding single-threaded application (CAD), since only one core will be used for this?

If using only one core isn't enough to prevent local overheating, does anyone know if the CPU's are able to shift processing to a different core?


----------



## catulitechup (Jan 11, 2022)

ExcuseMeWtf said:


> Another test:
> 
> 
> 
> ...



Curiously, GamersNexus paid $130 for the 12100F?

Because on Newegg it sold out at $109.


----------



## Kissamies (Jan 13, 2022)

What a disappointment that Celerons are still 2c/2t ones.


Tigger said:


> What are these?
> 
> Intel Processor Number detection for Core i5-12490F, 12500E, 12600HE
> Intel Processor Number detection for Core i7-12700E, 12700TE, 12800HE
> ...


Wikipedia says that the E versions are for embedded use. TE ones are the low-power variants of those.


----------



## lexluthermiester (Jan 13, 2022)

Maenad said:


> What a disappointment that Celerons are still 2c/2t ones.


Let's be fair, the Celeron range has always been the budget and bargain lineup. I'll bet the prices are low.


----------



## Kissamies (Jan 13, 2022)

lexluthermiester said:


> Let's be fair, the Celeron range has always been the budget and bargain lineup. I'll bet the prices are low.


Yes they have been, but even in basic tasks 2c/2t is so low-end that I just can't understand it anymore. They should have those as 2c/4t SKUs and Pentiums as 4c/4t.

I still remember when Celerons were a great bang for the buck for a budget user (or even gamer) in the Core2 era as they overclocked somewhat okay.


----------



## TheoneandonlyMrK (Jan 13, 2022)

Noah Katz said:


> Are CPU benchmarks like these https://www.cpubenchmark.ne... based on base clock or turbo boost?
> 
> How does the CPU know when to invoke turbo boost?
> 
> ...


I can tell you it would move a workload around to optimise load and performance; you can also tune it to do what you want, but I think it would mostly just downclock when thermally limited, within a power budget.
Others know more on this.

As for turbo, everyone has their foot to the floor, don't they! Intel and AMD spank 'em till they're hot.


----------



## fevgatos (Jan 13, 2022)

Just got my DDR5 RAM, so I was able to play around with my 12900K a bit. On a U12A I get around 70-73C during Cinebench R20 and 45 to 51C during gaming. So yeah... impossible to cool, right?


----------



## Noah Katz (Jan 13, 2022)

Maenad said:


> Yes they have been, but even in basic tasks 2c/2t is so low-end that I just can't understand it anymore.



FWIW









						Intel's $42 Celeron G6900 Alder Lake CPU Gets OC'd And Then Beats A Core i9-10900K
					

We don't expect a whole lot from a sub-$50 processor, and we most certainly would not anticipate it hanging with a CPU that costs $400 more, let alone beating it.




					hothardware.com


----------



## TheoneandonlyMrK (Jan 14, 2022)

fevgatos said:


> Just got my DDR5 RAM, so I was able to play around with my 12900K a bit. On a U12A I get around 70-73C during Cinebench R20 and 45 to 51C during gaming. So yeah... impossible to cool, right?


Be nice to see CB23 instead of its lightweight prequel.
It's going to be a great PC though


----------



## fevgatos (Jan 14, 2022)

TheoneandonlyMrK said:


> Be nice to see CB23 instead of its lightweight prequel.
> It's going to be a great PC though


It hits around 77 to 79 in that one. A little high for my taste, but I expected thermal throttling, so yeah...

With the AI OC at 5.1 all-core it was hitting 84-86.


----------



## AlwaysHope (Jan 14, 2022)

Noah Katz said:


> FWIW
> 
> 
> 
> ...


If it's benched with Linux...


----------



## lexluthermiester (Jan 14, 2022)

Maenad said:


> Yes they have been, but even in basic tasks 2c/2t is so low-end that I just can't understand it anymore.


While I see your point, remember, these are the new performance P-cores. Also remember, I and others got Windows 11 running on a Core2Quad. The one I tested was a Q8400, and it ran well. I can't remember who, but someone in the forums is running a C2Q as a daily driver.

These Celerons will not win any competitions, but at about $50 they'll be a decent budget offering for anyone not doing anything intense.


Maenad said:


> They should have those as 2c/4t SKUs and Pentiums as 4c/4t.


And they'll likely fill out the lineup just like you suggest. I think these 2c/2t offerings will be the bottom of the barrel. Remember, Intel needs time to build up stock of binned dies, just like AMD and Nvidia did.



Noah Katz said:


> FWIW
> 
> 
> 
> ...


Good find! And Welcome to TPU!


----------



## Deleted member 24505 (Jan 14, 2022)

fevgatos said:


> Just got my ddr5 ram, so I was able to play with my 12900k around a bit. On a U12A i get around 70-73C during cinebench R20 and 45 to 51 during gaming. So yeah....impossible to cool right?



Don't forget the joke part too. Why so much hate on ADL? I don't get it.


----------



## fevgatos (Jan 14, 2022)

Tigger said:


> Don't forget the Joke part too. Why so much hate on ADL I don't get it.


You know why  

Gaming though was a revelation. Temperatures are stupidly low (my 11600K was scorching) and the fps is insane. Dropped to the lowest resolution possible with DLSS ultra performance (everything else ultra, including RT) and managed to hit 204 fps in Cyberpunk. That is out of this world.

Also it seems Alder Lake isn't THAT memory-sensitive. Not a huge difference between 4800 RAM with no XMP and 6000C36 with XMP on. Noticed around a 10-12% difference, which is peanuts compared to older generations. My 10900K could get anywhere up to 40-50% more framerate going from 3200 to 4400C16.


----------



## ExcuseMeWtf (Jan 14, 2022)

Celerons/Pentiums make for suitable office PCs with H610 mobo.

I think where I am is where the fun starts  (i3 + B660. 4C/8T that can hang on with past i5s + XMP. That Gigabyte I bought has pretty poor connectivity options so caveat emptor. I'll be fine though. When I get this thing to POST that is)


----------



## lexluthermiester (Jan 14, 2022)

Tigger said:


> Why so much hate on ADL? I don't get it.


You know the answer to that...



fevgatos said:


> My 10900K could get anywhere up to 40-50% more framerate going from 3200 to 4400C16


That is mathematically impossible.


----------



## fevgatos (Jan 14, 2022)

Wanna elaborate on the mathematical impossibility of it?


----------



## lexluthermiester (Jan 14, 2022)

fevgatos said:


> Wanna elaborate on the mathematical impossibility of it?


Sure, I'll take a stab at it. With DDR4, going from 3200 MHz to 4400 MHz is about a 27% jump in raw clock speed.
However, going from 3200 to 4400 would *REQUIRE* more relaxed timings, thus diminishing the actual effect of the speed increase.
In the real world, you might have gained as much as a 10% boost in overall RAM performance, maybe even 15% if you bought some 4400 with incredible timings.

However, there is no effing way you got a 40% to 50% increase in performance from that memory speed jump!
So either you are wildly exaggerating or you are full of what comes out of the south end of a north-bound moose...


----------



## fevgatos (Jan 14, 2022)

lexluthermiester said:


> Sure, I'll take a stab at it. With DDR4, going from 3200 MHz to 4400 MHz is about a 27% jump in raw clock speed.
> However, going from 3200 to 4400 would *REQUIRE* more relaxed timings, thus diminishing the actual effect of the speed increase.
> In the real world, you might have gained as much as a 10% boost in overall RAM performance, maybe even 15% if you bought some 4400 with incredible timings.
> 
> ...


Why would you need more relaxed timings? Sorry, but it's obvious to me you've never tinkered with DDR overclocking. A 3200C16 XMP kit gets around 45K bandwidth and 55 ns latency. A tuned 4400C16 gets 70K bandwidth and 35 to 37 ns latency.

I don't have a 10900K anymore, but a few days ago I tested an 11600K. With a 3200C16 kit it was getting 65 fps in CP2077 in front of V's apartment. With 3333C12 manually tuned it was getting over 90.

A 30%+ performance gain in gaming is pretty common with memory tuning. Especially if you go all the way down to RTLs; I've seen Ryzens get an 80% increase in minimum fps.


----------



## lexluthermiester (Jan 14, 2022)

fevgatos said:


> Sorry, but it's obvious to me you've never tinkered with DDR overclocking.


Irony.


fevgatos said:


> A 3200C16 XMP kit gets around 45K bandwidth and 55 ns latency. A tuned 4400C16 gets 70K bandwidth and 35 to 37 ns latency.


That's total nonsense. But even if it were true, you're still not getting anywhere near 40% performance improvement. Period, end of story, full stop.


----------



## Noah Katz (Jan 14, 2022)

fevgatos said:


> I don't have a 10900K anymore, but a few days ago I tested an 11600K. With a 3200C16 kit it was getting 65 fps in CP2077 in front of V's apartment. With 3333C12 manually tuned it was getting over 90.





lexluthermiester said:


> Irony.





lexluthermiester said:


> That's total nonsense. But even if it were true, you're still not getting anywhere near 40% performance improvement. Period, end of story, full stop.




I haven't a clue who's correct, but as a rule I go with assertions supported by numbers over data-free certainty.


----------



## lexluthermiester (Jan 14, 2022)

Noah Katz said:


> I haven't a clue who's correct, but as a rule I go with assertions supported by numbers over data-free certainty.


OK, let's do the simple math. What is the difference between 3200 and 4400? 1200. And 1200 is about 27% of 4400 (4400/1200 ≈ 3.7). Let's remember that number.

Now let's talk about timings.
The best example of 4400 is here;








G.SKILL Trident Z Royal Series 16GB (2 x 8GB) 288-Pin PC RAM DDR4 4400 (PC4 35200) Desktop Memory Model F4-4400C16D-16GTRS - Newegg.com
www.newegg.com
				



This is the best 4400 kit I could find.

With 3200, we'll stay with the same brand and product line:








G.SKILL Trident Z Royal Series 16GB (2 x 8GB) 288-Pin RGB DDR4 SDRAM DDR4 3200 (PC4 25600) Desktop Memory Model F4-3200C16D-16GTRS - Newegg.com
www.newegg.com
				




Now let's look closely at those timings.
4400 MHz: 16-19-19-39

3200 MHz: 16-18-18-38

Doing the math, those differences are minor until you factor in the speed difference. In the case of these two kits, the approximate differences would amount to a 21% performance gap.

However, if we go with a better performing kit of 3200mhz the result changes;








G.SKILL Ripjaws V Series 16GB (2 x 8GB) DDR4 3200 (PC4 25600) Desktop Memory Model F4-3200C14D-16GVK (www.newegg.com)
				




Now let's compare again.
4400 MHz: 16-19-19-39

3200 MHz: 14-14-14-34

Hmmm... the math shows the performance difference shrinking greatly, because the 3200 MHz kit now performs better thanks to its tighter timings.

What have we learned here? Simple: the maximum increase from 3200 MHz to 4400 MHz, assuming similar timings, is about 21%, not the 40% to 50% claimed by @fevgatos. And that assumes one would intentionally buy the lower-performing 3200 MHz kit. The smart buy is the kit with the tighter timings, and at that point the performance difference becomes far less rosy.

So once again, the claim of 40% to 50% is hogwash. Total and complete nonsense.
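For anyone who wants to re-check the arithmetic in this post, here is a minimal sketch (plain Python; the kit numbers are the ones quoted above) that prints both ways the 3200-to-4400 gap can be expressed, plus the flat CAS latency of each kit in nanoseconds:

```python
# Percentage gap between the two transfer rates, both ways it can be quoted.
low, high = 3200, 4400               # MT/s for the two kits above
diff = high - low                    # 1200
print(round(diff / high * 100, 1))   # 27.3 (1200 as a share of 4400)
print(round(diff / low * 100, 1))    # 37.5 (uplift going from 3200 to 4400)

# First-word CAS latency in ns: CL * 2000 / transfer rate (MT/s).
print(round(14 * 2000 / low, 2))     # 8.75 ns (3200 CL14)
print(round(16 * 2000 / high, 2))    # 7.27 ns (4400 CL16)
```

The latency lines quantify why the tighter 3200 CL14 kit narrows the gap: its first-word latency is much closer to the 4400 CL16 kit than the 3200 CL16 kit's 10 ns.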


----------



## Noah Katz (Jan 14, 2022)

That's better, but fevgatos gave (possibly erroneous) measurements vs. your perhaps well-founded conjecture; I don't know enough about where the bottlenecks etc are to judge for myself.


----------



## lexluthermiester (Jan 14, 2022)

Noah Katz said:


> That's better, but fevgatos gave (possibly erroneous) measurements vs. your perhaps well-founded conjecture; I don't know enough about where the bottlenecks etc are to judge for myself.


That's the thing: the percentage math doesn't lie. The laws of physics require that two similar things perform similarly. When you increase the speed of one, its performance will increase by a proportional degree over the other.


----------



## phanbuey (Jan 14, 2022)

I overclocked my RAM with my 10850K @ 5.1 -- and I think the SOTR trial would go from a 124-125 minimum CPU FPS with XMP 3200 14-14-14-31 to something like a 152 minimum with 4133 C17 and tuned subtimings -- so about a ~22% difference between the two kits in min FPS for about ~29% more bandwidth. That was probably the most sensitive bench that I could reliably reproduce.

If you OC ram though, you're probably also yeeting the cache and core so you _can_ get 40-50% more FPS, but yeah probably not just from ram alone.


----------



## fevgatos (Jan 14, 2022)

lexluthermiester said:


> Ok. Let's do the simple math. What is the number difference between 3200 and 4400? 1200. 4400 /1200 = 3.6. When converted into what percentage of 4400 1200 is, it's about 27%. Ok, let's remember that number.
> 
> Now let's talk about timings.
> The best example of 4400 is here;
> ...


I specifically compared a 3200c16 kit running XMP with a b-die kit being manually tuned. OBVIOUSLY, if you compare a b-die kit to another b-die kit (in your case, the 3200c14 you just put up there) the difference shrinks. Also, as I've said before, you have no clue how DDR OC works. Judging the first 4 numbers is just... wrong. Tuning a DDR4 kit involves 30+ different timings, all of which are going to be lower on a tuned 4000+ kit than on a 3200c16. Again, these are things that anyone who has tried memory OCing would know. 

Also, comparing percentages between frequency and CLs is just wrong. The CL timing has to be translated into actual nanoseconds by relating it to the frequency; the formula is CL / frequency x 2000. Basically a 4000c16 kit, besides the bandwidth, has much lower latency than a 3200c16 kit. That's without touching any other timings. As for the secondaries and tertiaries themselves, a 3200c16 for example runs tRFC at 560 on XMP, while you can tighten it down all the way to ~280-320 on a b-die kit. tREFI runs at 8092 on a 3200c16 kit; you can run it at 65535 on a b-die kit. How do you compare that with a calculator? 
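The conversion just described (CL / frequency x 2000) generalizes to any timing measured in clock cycles, since the memory clock in MHz is half the MT/s transfer rate. A minimal sketch:

```python
# Convert a DDR4 timing from clock cycles into nanoseconds.
# Memory clock (MHz) = transfer rate (MT/s) / 2, so ns = cycles * 2000 / MT/s.
def timing_ns(cycles, mts):
    return cycles * 2000 / mts

print(timing_ns(16, 3200))   # 10.0 ns  (3200 CL16)
print(timing_ns(16, 4000))   # 8.0 ns   (4000 CL16: lower latency at the same CL)
print(timing_ns(560, 3200))  # 350.0 ns (the XMP tRFC figure quoted above)
print(timing_ns(300, 3200))  # 187.5 ns (a tuned b-die tRFC from the same post)
```

Same formula, applied to CAS and to tRFC alike: the cycle counts only mean something once converted to real time.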

So, question, have you ever tuned your ram or are you just theorycrafting?



phanbuey said:


> I overclocked my ram with my 10850K @ 5.1 - and I think SOTR Trial would go from 124-125 minimum CPU with XMP 3200 14-14-14-31 to something like 152 min with 4133 C17 with tuned subs -- so about a ~22% difference between the two kits in min FPS for about ~29% more bandwith.  That was probably the most sensitive bench that I could reliably reproduce for me.
> 
> If you OC ram though, you're probably also yeeting the cache and core so you _can_ get 40-50% more FPS, but yeah probably not just from ram alone.


That's a GPU bottleneck. The 10850k can get over 220 in SOTR. Check the game's CPU numbers under the average FPS; that's how many FPS your CPU can do if there is no GPU bottleneck. I was getting around 240-250 with tuned RAM.


----------



## phanbuey (Jan 14, 2022)

fevgatos said:


> That's a GPU bottleneck. The 10850k can get over 220 in SOTR. Check the game CPU numbers under the average FPS. That's how many FPS your CPU can do if there is no GPU bottleneck. Was getting around 240-250 with tuned ram.


So this was the freeware version that I did that RAM difference bench on -- I have the full version of the game, and it was 238 at 1080p lowest, so your numbers definitely line up... Unfortunately I didn't write down RAM testing settings or anything on the later builds because I am lazy af.


----------



## fevgatos (Jan 14, 2022)

I was talking about the highest preset, basically the exact same settings Guru3D tests. Here is one of my runs; I was testing the 3090 in this one, but it doesn't really matter, you can see the CPU game graph for CPU results.







And here it is at 1080p highest


----------



## The King (Jan 15, 2022)




----------



## lexluthermiester (Jan 15, 2022)

fevgatos said:


> I specifically compared a 3200c16 kit running XMP with a b-die kit being manually tuned. OBVIOUSLY, if you compare a b-die kit to another b-die kit (in your case, the 3200c14 you just put up there) the difference shrinks. Also, as I've said before, you have no clue how DDR OC works. Judging the first 4 numbers is just... wrong. Tuning a DDR4 kit involves 30+ different timings, all of which are going to be lower on a tuned 4000+ kit than on a 3200c16. Again, these are things that anyone who has tried memory OCing would know.
> 
> Also, comparing percentages between frequency and CLs is just wrong. The CL timing has to be translated into actual nanoseconds by relating it to the frequency; the formula is CL / frequency x 2000. Basically a 4000c16 kit, besides the bandwidth, has much lower latency than a 3200c16 kit. That's without touching any other timings. As for the secondaries and tertiaries themselves, a 3200c16 for example runs tRFC at 560 on XMP, while you can tighten it down all the way to ~280-320 on a b-die kit. tREFI runs at 8092 on a 3200c16 kit; you can run it at 65535 on a b-die kit.


Folks, this is a textbook example of someone trying to move the goalposts rather than admitting they were wrong or exaggerating. It's also an example of someone letting their pride override logic.


fevgatos said:


> How do you compare that with a calculator?


Very easily.


fevgatos said:


> So, question, have you ever tuned your ram or are you just theorycrafting?


No, I've never done anything of that sort... never.... not at all..... /S

Alder Lake doesn't have a magical IMC, and the laws of physics prevail. There is no possible way to get a 40% to 50% RAM performance boost going from 3200 MHz RAM to 4400 MHz RAM. End of story, full stop. And now it's time to use a button...


----------



## fevgatos (Jan 15, 2022)

lexluthermiester said:


> Alderlake doesn't have a magical IMC and the laws of physics prevail. There is no possible way to get a 40% to 50% RAM performance boost going from 3200mhz RAM to 4400mhz RAM. End of story, full stop. And now it's time to use a button...


I was talking about Comet Lake actually, not Alder Lake. But it doesn't matter, you are wrong either way. There is no moving of goalposts; from my very first post I was talking about a 3200c16, I even wrote it, lol. To use your argument against you: maybe because you don't want to admit you're wrong, you pretend that I was talking about a b-die 3200c14.

If I install back my 11600k and run cyberpunk in front of Vi's apartment with 2 different ram configurations and prove you wrong, do I get something?


----------



## TheoneandonlyMrK (Jan 15, 2022)

fevgatos said:


> I was talking about Comet Lake actually, not Alder Lake. But it doesn't matter, you are wrong either way. There is no moving of goalposts; from my very first post I was talking about a 3200c16, I even wrote it, lol. To use your argument against you: maybe because you don't want to admit you're wrong, you pretend that I was talking about a b-die 3200c14.
> 
> If I install back my 11600k and run cyberpunk in front of Vi's apartment with 2 different ram configurations and prove you wrong, do I get something?


Validated, because at the minute I think it's you stretching reality. I have OC'd memory (I haven't messed with tertiary timings, though), but still, show us the way. Or do you think we wouldn't want an extra 50% FPS?


----------



## fevgatos (Jan 15, 2022)

TheoneandonlyMrK said:


> Validated, because at the minute I think it's you stretching reality. I have OC'd memory (I haven't messed with tertiary timings, though), but still, show us the way. Or do you think we wouldn't want an extra 50% FPS?


What ram and CPU do you have? 

What timings have you tinkered with already? 

Generally speaking, the "easy" way of doing it is setting IO/SA to the max voltages you are comfortable running 24/7 (that's up to you, honestly), setting VDIMM to 1.5 V, enabling XMP and starting to up the frequency. Then you need multiple TM5 runs with the 1usmus or anta777 Extreme config (10 cycles) every time you change any timing, to validate stability. It's not something you can do over a weekend, unless you already know the rough capabilities of your DIMMs and IMC. If you don't want to spend an eternity, the primaries, tREFI and tRFC are the most important timings. You can use AIDA64 to test your latency and see your progress, but make sure to boot in safe mode for consistent results. Every app running in the background makes it inconsistent (Steam by itself, for example, adds 2-3 ns to your latency). 

I can post some old gaming benches where I ran 3200c16 and 4400 tuned, but would you actually believe it if I told you it's just the RAM making the difference? If yes, I can try to find them, no problem. 

*The 50% difference obviously applies to CPU-bound runs, I mean it's self-evident, right? Don't expect to run 1440p ultra or 4K and see any meaningful difference.


----------



## TheoneandonlyMrK (Jan 15, 2022)

fevgatos said:


> What ram and CPU do you have?
> 
> What timings have you tinkered with already?
> 
> ...


I'm sorry, but initially it seemed you were talking about gaming FPS gains. I have run b-die on tight 1usmus settings, all of them too, and didn't see 50% gains in anything; different platform, though.
So like I said, you need to validate your statement. Surely someone has demonstrated your point in a provable way; it shouldn't take you rebuilding anything, since it's a big world and I know others OC memory.
I'll have a look round too.


----------



## fevgatos (Jan 15, 2022)

Well, for example, this is my 10900k running AOTS DX12 stock with tuned RAM. A 10900k stock with XMP gets around 130 in the normal batch. You can validate that with CapFrameX; they did some runs back when 5800x performance was leaked. It's not the most memory-sensitive game, but it's up there. AOTS runs the CPU logic completely independently of the GPU, so the numbers are consistent across GPUs (had a 1080 Ti in this one). Biggest gains are in Far Cry 6 and SOTR.








TheoneandonlyMrK said:


> I'm sorry, but initially it seemed you were talking about gaming FPS gains


I do, but that only applies to me and my specific use case. I'm playing Warzone and COD at 5120*1440, but with most settings on low and DLSS balanced. With these settings I'm mostly CPU-bound, so for me RAM makes a difference even at that resolution. Other people who prefer to run high or ultra will be GPU-bound, so they don't need to tinker with their RAM as much. Also, some games just don't care about RAM at all, so even if you are playing at 720p RAM won't make a difference (the whole Crysis series, for example). RAM makes a difference when there are lots of cache misses, so the cache has to fetch data from memory.


----------



## lexluthermiester (Jan 15, 2022)

fevgatos said:


> I was talking about Comet Lake actually, not Alder Lake. But it doesn't matter, you are wrong either way. There is no moving of goalposts; from my very first post I was talking about a 3200c16, I even wrote it, lol. To use your argument against you: maybe because you don't want to admit you're wrong, you pretend that I was talking about a b-die 3200c14.
> 
> If I install back my 11600k and run cyberpunk in front of Vi's apartment with 2 different ram configurations and prove you wrong, do I get something?


You have proven nothing. You are only making claims that are unsupported by reality. We all know you're full of moose droppings, but hey, do carry on.


----------



## fevgatos (Jan 15, 2022)

lexluthermiester said:


> You have proven nothing. You are only making claims that are unsupported by reality. We all know you're full of moose droppings, but hey, do carry on.


I don't care about proving anything. Everyone who tinkers with RAM knows this. You specifically have no idea how memory OC works; that much you have proven. You don't know how timings scale with frequency, and instead you are comparing flat timings across DIMMs with different frequencies. I'm sorry, but it's obvious you don't know much about the topic, so why are you so invested in your opinion when it's obviously uninformed?


----------



## lexluthermiester (Jan 15, 2022)

fevgatos said:


> I don't care about proving anything.


Because you can't. You made a crazy, impossible claim and got called out for it. But I digress; we've strayed from the topic..


----------



## fevgatos (Jan 15, 2022)

lexluthermiester said:


> Because you can't. You made a crazy, impossible claim and got called out for it. But finally, I digress, we've strayed from the topic..


You are right, if that makes you feel better. Ignorance is bliss.


----------



## fusseli (Jan 15, 2022)

Have you guys never seen the table someone made? The combination of CAS latency and RAM clock is what makes speed, it's that simple. This is why I have 3600 C14 and not something else; it's basically as fast as 4600 C18. That 4000 C15 is nice stuff, lexluthermiester.
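That equivalence is easy to verify with the first-word-latency formula mentioned earlier in the thread (CL x 2000 / MT/s). A quick sketch, using the kits named in this post:

```python
# First-word latency (ns) = CL * 2000 / transfer rate (MT/s).
# 3600 C14 and 4600 C18 land within ~0.05 ns of each other.
for name, (mts, cl) in {
    "3600 C14": (3600, 14),
    "4600 C18": (4600, 18),
    "4000 C15": (4000, 15),
}.items():
    print(name, round(cl * 2000 / mts, 2), "ns")
```

It prints roughly 7.78 ns, 7.83 ns and 7.5 ns respectively, which is the whole point of that table: equal latency classes cut diagonally across speed bins.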


----------



## The King (Jan 16, 2022)

Intel is going to lock this down IMHO. Get the BIOS now if you have the Z690 Apex or Hero.


----------



## Deleted member 24505 (Jan 16, 2022)

The King said:


> Intel is going to lock this down IMHO. Get the BIOS now if you have the Z690 Apex or Hero.



Nice, ADL is so crappy isn't it /s

Unlinked BCLK is so awesome


----------



## birdie (Jan 16, 2022)

The King said:


> Intel is going to lock this down IMHO. Get the BIOS now if you have the Z690 Apex or Hero.


I'm sure Intel will absolutely ask ASUS and other vendors (ASRock?) to close this loophole, because it makes their product segmentation null and void.



Tigger said:


> Nice, ADL is so crappy isn't it /s



I see quite the opposite. Not sure if it's a joke, sarcasm or something else. ADL is actually great.


----------



## lexluthermiester (Jan 16, 2022)

birdie said:


> I'm sure Intel will absolutely ask ASUS and other vendors (ASRock?) to close this loophole, because it makes their product segmentation null and void.


Yeah, they'll get right on that!.... 



birdie said:


> Not sure if it's a joke, sarcasm or something else. ADL is actually great.


It was sarcasm, thus the " /s "...


----------



## Deleted member 24505 (Jan 16, 2022)

birdie said:


> I'm sure Intel will absolutely ask ASUS and other vendors (ASRock?) to close this loophole, because it makes their product segmentation null and void.
> 
> 
> 
> I see quite the opposite. Not sure if it's a joke, sarcasm or something else. ADL is actually great.



I use /s for sarcasm as there is no other way to get it across. I actually have a 12700k and luuuurve it



lexluthermiester said:


> Yeah, they'll get right on that!....



Lots of people will be storing that 0811 BIOS for future reference, I'm sure.


----------



## The King (Jan 16, 2022)

Tigger said:


> I use /s for sarcasm as there is no other way to get it across. I actually have a 12700k and luuuurve it
> 
> 
> 
> Lots will be storing that 0811 bios for future reference i am sure.


People who buy a 12400 and OC it to beat your 12700K will luuuurvee it even more.  
j/k


----------



## Deleted member 24505 (Jan 16, 2022)

The King said:


> People who buy a 12400 and OC it to beat your 12700K will luuuurvee it even more.
> j/k



Mine still has more cores


----------



## lexluthermiester (Jan 16, 2022)

The King said:


> People who buy a 12400 and OC it to beat your 12700K will luuuurvee it even more.
> j/k


Until they try to run something that needs more cores/threads.


----------



## Deleted member 24505 (Jan 16, 2022)

lexluthermiester said:


> Until they try to run something that needs more cores/threads.



Which is what the video shows. No doubt the OC'd 12400 is impressive, but who is gonna buy a top-end Asus board with DDR5 and put a 12400 in it........ no one.

There is a small chance of modded BIOSes appearing for lower-end boards, though.


----------



## fevgatos (Jan 17, 2022)

These are some results on a 12900k with a U12A. Temps hit 90C. Single-core is at 5.6-5.7 GHz depending on the workload, and multi is 5.2-5.3 GHz. Using OCTVB to downclock at 85C.


----------



## Jonny21 (Jan 17, 2022)

Hi everyone, I wanted to understand something about the i7-12700k CPU. Using CPU-Z on my CPU, the cache is described as in this image [https://ibb.co/kc22fvK]. But looking on the net I found it described like this [https://ibb.co/R0ntLqH].
Is there an explanation for this? Are there perhaps 2 different variants? Or is mine a faulty unit?
I also ask because I am having some small problems with this CPU. For example, when I start a video (using MPC-HC, MPC-BE, VLC or even the player integrated in Win11), I always get a little stutter at the start (and even in the middle if I skip around), problems I didn't have with my previous CPU (i7-3770).


----------



## The King (Jan 17, 2022)

Jonny21 said:


> Hi everyone, I wanted to understand something about the i7-12700k CPU. Using CPU-Z on my CPU, the cache is described as in this image [https://ibb.co/kc22fvK]. But looking on the net I found it described like this [https://ibb.co/R0ntLqH].
> Is there an explanation for this? Are there perhaps 2 different variants? Or is mine a faulty model?
> I also ask because I am having some small problem with this CPU, for example when I start a video (using MPC-HC, MPC-BE, VLC or even the one integrated in Win11), I always have a little stutter when starting the video (and even in the middle if I zap) problems I didn't have with the previous CPU (i7-3770).


One CPU-Z screenshot is version 1.98, the other is 1.99. There are changes between the two versions.

Try with version 1.98 and see if it shows the same.


----------



## Jonny21 (Jan 17, 2022)

The King said:


> One CPU-Z screenshot version is 1.98 the other is 1.99. There are changes between the two versions.
> 
> Try with version 1.98 and see if it shows the same.


I had already tried, no difference.


----------



## Deleted member 24505 (Jan 17, 2022)

Here is mine on 1.99.0; I think it's correct: 8+4 = 12 cores, 8x2+4 = 20 threads.

Tried videos with VLC and no stutter when starting up or skipping around; it is instant.
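The core/thread arithmetic behind that line can be written out explicitly (a trivial sketch of the 12700K's topology):

```python
# Alder Lake i7-12700K: 8 P-cores with Hyper-Threading + 4 E-cores (no HT).
p_cores, e_cores = 8, 4
cores = p_cores + e_cores        # 12 cores total
threads = p_cores * 2 + e_cores  # 20 threads: only P-cores contribute 2 each
print(cores, threads)            # 12 20
```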


----------



## Jonny21 (Jan 17, 2022)

Tigger said:


> Here is mine on 1.99.0; I think it's correct: 8+4 = 12 cores, 8x2+4 = 20 threads.
> 
> Tried videos with VLC and no stutter when starting up or skipping around; it is instant.


In the Windows Task Manager, how is the cache reported? For me it is:


----------



## Deleted member 24505 (Jan 17, 2022)

Jonny21 said:


> In the Windows Task Manager, how is the cache reported? For me it is:


----------



## Jonny21 (Jan 17, 2022)

Tigger said:


> View attachment 232806


Thanks for the screenshot. At this point I am wondering, is it really possible that I have a faulty CPU???


----------



## The King (Jan 17, 2022)

Jonny21 said:


> Thanks for the screenshot. At this point I am wondering, is it really possible that I have a faulty CPU???


Maybe the issue is with your motherboard. Are you running the latest BIOS?


----------



## Jonny21 (Jan 17, 2022)

The King said:


> Maybe the issue is with your motherboard. Are you running the latest BIOS?


The motherboard is the ROG STRIX Z690-G GAMING WIFI, I have already updated the BIOS to the latest version, 0811.


----------



## The King (Jan 17, 2022)

Intel Core i7 12700 @ 4688.53 MHz - CPU-Z VALIDATOR
[s0ida0] Validated Dump by Anonymous (2022-01-15 02:06:15) - MB: MSI MAG B660M MORTAR WIFI DDR4 (MS-7D42) - RAM: 32768 MB (valid.x86.fr)

Intel Core i5 12500 @ 4389.26 MHz - CPU-Z VALIDATOR
[93qy2b] Validated Dump by Anonymous (2022-01-14 03:09:32) - MB: MSI MAG B660M MORTAR WIFI DDR4 (MS-7D42) - RAM: 16384 MB (valid.x86.fr)

----------



## Jonny21 (Jan 17, 2022)

I tinkered with the BIOS a bit; if I disable virtualization, the cache is reported correctly (but I still have the stutters in videos...). Curiously, both CPU-Z and the Intel Processor Identification Utility say my CPU doesn't support VT-x if I keep it on, but say it supports it if I turn it off. I'm not understanding anything anymore.


----------



## Deleted member 24505 (Jan 17, 2022)

Mine does not support hyper threading


----------



## The King (Jan 17, 2022)

Intel Celeron Overclocked by 57% and new 4-Core Records - non-K OC


----------



## arabus (Jan 17, 2022)

ASRock PG Velocita also has BCLK OC.


----------



## thelawnet (Jan 17, 2022)

The King said:


> Intel is going to lock this down IMHO. Get the BIOS now if you have the Z690 Apex or Hero.


Meh.

12400f = $175 (4.4GHz)
12400 = $200
12500 = $210 (4.6GHz)
12600kf = $270 (4.9GHz)
12600k = $295

So if they buy a $700 motherboard, they can get the 12400F or 12400 to 5.1 GHz, which will still lose to a 12600K, which is barely slower per core but has more cores. And they only needed to spend $500 more on the board.


----------



## The King (Jan 17, 2022)

thelawnet said:


> Meh.
> 
> 12400f = $175 (4.4GHz)
> 12400 = $200
> ...


As already stated, it is possible other board vendors may enable this in their BIOS at some point, provided they have an external clock gen / hardware support on the board.
ASRock is known for doing things like this even if Intel has something to say about it.

I would be surprised if Intel does not do something about it. However, if they let ASUS be, there is nothing stopping other vendors from doing the same with cheaper boards..


----------



## phanbuey (Jan 17, 2022)

This is how OG overclocking used to be. I think they will let it trickle down; the effort to curb it and the mindshare damage wouldn't be worth it -- plus, they will need all the tricks to stay competitive in the mid-range in a few months.

I could even see them enable it on purpose on 700-series chipsets and support it officially for Raptor Lake to entice people to switch over.


----------



## Mindweaver (Jan 17, 2022)

So, I bit the bullet and bought an i7 12700k ($382) and an ASUS ROG Strix Z690-A Gaming WiFi D4 (DDR4) last week. I'm still waiting on the board to get here. I will be upgrading my Ryzen 2600 system. The funny part is that 5 weeks ago I wasn't looking to upgrade.. until.. I have to share this story.. lol So, I get home from the office and my daughter is sitting at my desk playing Minecraft and watching YouTube on the other monitor. I go over to her and she flips her headphones down, looks at me straight-faced and says these magic words... "Dad, I'm a gamer now. I want my own PC." lol 

I was stoked! Well, at first, but she has been on my computer playing games and watching YouTube ever since.. lol I haven't sat down to play any games since. So, last week when Newegg had the i7 on sale for $382 I jumped on it to get my desktop back. lol The crazy part is that I just built a gaming PC for her and her older sister that I put in the loft over a year ago. It's an i7 4770, 16GB, 1TB HDD, with an AMD R9 something, I think 270x or 280x. They only use it to play YouTube, Hulu, and Netflix.. lol

So, now I'll have a 12th gen this weekend. lol And my 9-year-old daughter is a gamer. :D I was really going to hold out for the new Ryzen, but finding out it will be DDR5-only kind of pushed me to bite the bullet now. I do feel like 12th gen is the Core Duo, and the next gen will be the sweet spot, kind of like the Core 2 Duo of processors. You know, less power, runs way cooler, has better overclocking headroom. Anyway, I just saw this thread and really haven't been keeping up with Alder Lake other than that they are hot and fast... So, what did I get myself into? I have already spec'd out an i9 to replace my i7 5820k. lol


----------



## lexluthermiester (Jan 17, 2022)

Mindweaver said:


> So, I bit the bullet and bought an i7 12700k ($382) and a ASUS ROG Strix Z690-A Gaming WiFi D4 DDR4 last week. I'm still waiting on the board to get here. I will be upgrading my Ryzen 2600 system. The funny part is that 5 weeks ago I wasn't looking to upgrade.. Until.. I have to share this story.. lol So, I get home from the office and my daughter is sitting at my desk playing Minecraft and watching youtube on the other monitor. I go over to her and she flips her headphones down and looks at me straight faced and says these magic words.... "Dad, I'm a gamer now.". I want my own pc. lol
> 
> I was stoked! well at first but now she has been on my computer playing games and watching youtube ever since.. lol I haven't sat down to play any games since. So, last week when Newegg had the i7 on sale for 382 i jumped on it to get my desktop back. lol The crazy part is that I just built a gaming pc for her and her older sister that I put in the loft over a year ago. It's an i7 4770, 16gb, 1tb hhd, with an AMD R9 something I think 270x or 280x. They only use it to play youtube, Hulu, and Netflix.. lol
> 
> So, now I'll have a 12th gen this weekend. lol and my 9 year old daughter is a gamer. :d I was really going to hold out for the new Ryzen, but finding out it will be ddr5 only kind of pushed me to bit the bullet now. I do feel like the 12th gen is the Core Duo and the next Gen will be the sweet spot kind of like the Core 2 Duo of processors. You know less power and runs way cooler and has better overclocking head room. Anyway, I just seen this thread and really haven't been keeping up with the new alder lake other than they are hot and fast... So, what did I get myself into? I have already spec'd out an i9 to replace my i7 5820k. lol


That is a very common reason to upgrade. You've got a great CPU that should last you a while. And your daughter inherited(presumably) a solid system as well.


----------



## Mindweaver (Jan 17, 2022)

lexluthermiester said:


> That is a very common reason to upgrade. You've got a great CPU that should last you a while. And your daughter inherited(presumably) a solid system as well.


Thanks! Yeah, she will be getting either my R5 2600 or my wife's i7 4790k. Probably the R5 2600, 16GB 3000, 1TB SSD, GTX 970, Corsair 600 modular, Phanteks P400A or P300. I just bought a P400A to replace my Cooler Master Storm Scout; I haven't even taken it out of the box yet. I did buy a 360 AIO, the Cooler Master MasterLiquid ML360 Mirror ARGB. I'd have to order the 1700 bracket if I didn't get an Asus board; they seem to be the only ones supporting both 1200 and 1700.


----------



## Deleted member 24505 (Jan 17, 2022)

Not that hot; my 12700k is fine gaming. Obviously if you run benches balls-out it will be hot, but otherwise it's fine. What do benches matter anyway, only epeen, nothing of value. The 12700k is a fine chip. I have exactly the same board too, btw, and have had zero problems with it.


----------



## lexluthermiester (Jan 17, 2022)

Mindweaver said:


> Thanks! Yeah she will be getting either my R5 2600 or my wife's i7 4790k.


Ah! Unless your wife is also a gamer, I would give the 2600 to your daughter. Especially given the other parts you mention.


----------



## phanbuey (Jan 17, 2022)

I would upgrade that 2600 with a used 5800x when the 3d v cache comes out.


----------



## TheoneandonlyMrK (Jan 17, 2022)

The King said:


> Intel Celeron Overclocked by 57% and new 4-Core Records - non-K OC


For this to be useful in a gaming rig you're going to need quite an expensive board, with not just one external BCLK clock generator chip; you will need two, like the Crosshair VII Hero has for example, because PCIe and motherboard-attached devices in general (like USB chips) don't like their base clock increasing by 57%.
At all.
Makes for an awesome news piece, and I love old-school OCs, but it's important to inform anyone thinking of doing this that they'll need the right board in terms of BIOS and overclocking features; not many have separate clock generators for the core and the PCIe bus.


----------



## MxPhenom 216 (Jan 17, 2022)

The red spirit said:


> This is what you hear today, when Intel can't engineer for shit and when their flagship isn't 77W chip anymore. Rationalization at its finest.
> 
> 
> *They are not really "7nm" or "10nm".*


This.

Also, Intel Superfin (10nm) has the same density as TSMC 7.


----------



## Mindweaver (Jan 17, 2022)

Tigger said:


> Not that hot, my 12700k is fine gaming. obvs if running benches balls out it will be hot, but otherwise is fine. What do benches matter anyway, only epeen, nothing of value. The 12700k is a fine chip. I have exactly the same board too btw, had zero problems with it.


Yeah, I don't care about benches either. I was just saying that's the only thing I've heard so far. I did check out a few reviews and saw idle and even gaming temps were fine. The only one that really has heat problems is the i9, but I don't see anyone buying an i9 and not buying a decent cooler. But yeah man, I can't wait to get the board. Just wondering which EK block did you use, 1200 or 1700? The only real problem I've seen with Alder Lake is the lack of cooler options. If I want a 1700 mounting bracket for my 360 AIO I'd have to order it from Cooler Master. I'm surprised more board makers didn't do what Asus did in adding both 1200 and 1700 support. 



phanbuey said:


> I would upgrade that 2600 with a used 5800x when the 3d v cache comes out.


I really debated getting the 5800x, but since I have to get a board and CPU to build my daughter's rig anyway, I just went with the newer setup. If I didn't, then I would have just bought the 5800x. Also, I would get it for her, but she is only playing Minecraft and Roblox.. lol Now, if the 5800x ever has a really great deal then I might, but by the time she needs something new, DDR5 or DDR6 should be here.. lol I'm just stoked that AMD is finally switching to LGA over PGA.


----------



## fevgatos (Jan 17, 2022)

Mindweaver said:


> The only one that really has heat problems is the i9, but I don't see anyone buying an i9 and not buying decent cooler


Nah, don't believe what you hear around. At first I was worried as well, after seeing the reviews and users on forums saying how it's scorching and that it needs a custom water cooler, since I had just bought a U12A. Here are my results running every stress test I had available. Granted, no Prime95, but I doubt it can get much worse than AIDA CPU+FPU.

Max temperature was 70°C, and that was a spike; usually it was sitting between 58 and 65.

It's not a cheap cooler, but you can get similar or even better results with cheaper ones (like the AK620 from Deepcool).


----------



## Deleted member 24505 (Jan 17, 2022)

Mindweaver said:


> Just wondering which EK block did you use 1200 or 1700?



My block is the EK Supremacy Classic nickel; I might rebuy the copper version, maybe. It is LGA 1200, but there is a lovely thick LGA 1700 backplate you can get for it, or for other EK blocks. I already had the loop, so I just bought a new CPU block.





At some point I would like to get the Quantum though, as it looks cleaner, but this one is not too expensive and temps are fine. Loop pic in my sig.


----------



## Mindweaver (Jan 18, 2022)

fevgatos said:


> Nah, don't believe what you are hearing around. At first I was worried as well after seeing the reviews and users on forums saying how it's scorching and that it needs a custom water cooler, since I had just bought a u12a. Here are my results running every stress I had available. Granted, no prime 95, but I doubt it can get much worse than AIDA CPU+FPU.
> 
> Max temperature was 70c, there was a spike, usually it was sitting between 58 and 65.
> 
> It's not a cheap cooler, but you can get similar or even better results with cheaper ones (like the ak620 from deepcool).


That's great to know! I'm really debating on upgrading my 5820k to a 12900k. I really like my 5820k but some of the usb ports are starting to flake out.



Tigger said:


> My block is the EK supremacy classic nickel, might rebuy the copper version maybe. It is LGA 1200 but there is a LGA 1700 lovely thick back plate you can get for it, or other EK blocks. I already had the loop, so just bought a new CPU block.
> At some point i would like to get the quantum though as it looks cleaner, but this is not too expensive, and temps are fine. loop pic in my sig


Nice!


----------



## lexluthermiester (Jan 18, 2022)

Mindweaver said:


> That's great to know! I'm really debating on upgrading my 5820k to a 12900k.


That's a no-brainer... go for it. Although, if you're willing to overclock a little, get another 12700k instead and save some of your money for a GPU upgrade (if you need one).


----------



## Fleurious (Jan 18, 2022)

Quite happy with my 12700k's performance under a Noctua NH-D15S. Hottest core was 79°C after a 10-minute Cinebench R23 run. All stock CPU settings, paired with 32GB of G.Skill 6000 CL36 memory.


----------



## fevgatos (Jan 19, 2022)

Fleurious said:


> Quite happy with my 12700k's performance under a Noctua NH-D15s.  Hottest core was 79c after a 10min Cinabench R23 run.  All stock CPU settings paired with 32GB of G.Skill 6000 CL36 memory.


Very similar results on a U12A and 12900k (got a 77°C package temp). What's your ambient?


----------



## Fleurious (Jan 19, 2022)

fevgatos said:


> Very similar results on a u12a and 12900k (got 77c package temp). What's your ambient?



Ambient temps in my computer room are usually 23-24°C.


----------



## birdie (Jan 20, 2022)

B660 BCLK OC'ing is working!

Intel Core i5 10400F rivals ... Ryzen 7 5800X!










And power consumption is just around 115W (vs ~144W for Ryzen).

Golden Cove is really Golden.


----------



## Deleted member 24505 (Jan 20, 2022)

birdie said:


> B660 BCLK OC'ing is working!
> 
> Intel Core i5 10400F rivals ... Ryzen 7 5800X!
> 
> ...



Yikes this is a real bonus.


----------



## Mindweaver (Jan 21, 2022)

My board is out for delivery! I'm ready to start building. Also, I was checking out Newegg and they have the i7 12700k on sale for $389, then a $25 discount making it $364! I'm tempted to return mine and buy it, but I don't want to wait another week just to save 25 bucks.. lol


----------



## Deleted member 24505 (Jan 21, 2022)

Mindweaver said:


> My board is out for delivery! I'm ready to start building it. Also, I was checking out Newegg and they have the i7 12700k on sale for 389 then a 25 dollar discount making it 364! I'm tempted to return mine and buy it but I don't want to wait another week to get it to only save 25 bucks.. lol



What cooler you using? My bundle cost me £699-£75 from Asus cashback


----------



## lexluthermiester (Jan 21, 2022)

Mindweaver said:


> Also, I was checking out Newegg and they have the i7 12700k on sale for 389 then a 25 dollar discount making it 364! I'm tempted to return mine and buy it but I don't want to wait another week to get it to only save 25 bucks.. lol


Wouldn't be worth the hassle.


----------



## Mindweaver (Jan 21, 2022)

Tigger said:


> What cooler you using? My bundle cost me £699-£75 from Asus cashback


A 360 AIO, the Cooler Master MasterLiquid ML360 Mirror ARGB. It has LGA 1200 mounting; I would have to order the LGA 1700 bracket from Cooler Master's website.



lexluthermiester said:


> Wouldn't be worth the hassle.


Yeah, I'm not waiting another week.


----------



## Deleted member 24505 (Jan 21, 2022)

Mindweaver said:


> 360 AIO Cooler Master Master Liquid ML360 Mirror ARGB. It has LGA 1200 I would have to order the LGA 1700 from Cooler Master's website.
> 
> 
> Yeah, I'm not waiting another week.



The Asus Z690-A WiFi D4 has LGA 1200 holes, so it will fit till you get the 1700 plate.

You got your board yet?


----------



## Mindweaver (Jan 21, 2022)

Tigger said:


> The asus z690-a wifi d4 has lga 1200 holes so will fit till you get the 1700 plate.
> 
> You got your board yet?


No, it's still out for delivery. I don't think I'm going to get the 1700 plate. One of the biggest reasons I picked Asus is that they have both 1200 and 1700 holes. I wish more had gone that route, given the lack of 1700 coolers.


----------



## Deleted member 24505 (Jan 21, 2022)

Mindweaver said:


> No, it's still out for delivery now. I don't think I'm going to get the 1700 plate. That's one of the biggest reasons I picked Asus is because they have 1200 and 1700 holes. I wish more would have went that route due to the lack of 1700 coolers.



There is no reason for the others not to do it.


----------



## sneekypeet (Jan 21, 2022)

Tigger said:


> There is no reason for the others not doing it.



There is an argument out there that LGA115x/1200 holes do not allow for proper pressure and surface mating, which is improved using LGA1700 hardware. I have seen this in the wild and even heard it from cooler manufacturers.


----------



## Deleted member 202104 (Jan 21, 2022)

Mindweaver said:


> No, it's still out for delivery now. I don't think I'm going to get the 1700 plate. That's one of the biggest reasons I picked Asus is because they have 1200 and 1700 holes. I wish more would have went that route due to the lack of 1700 coolers.



Just check that you've got good contact.  I bought an Asus originally for the same reason.  Found out that Noctua still recommended using their LGA1700 mounting kit due to a slight height difference between 1200 and 1700.

Sounds like @Tigger's works fine though so you'll probably be good.


----------



## lexluthermiester (Jan 21, 2022)

sneekypeet said:


> There is an argument out there that LGA115x/1200 holes do not allow for proper pressure and surface mating, which is improved using LGA1700 hardware. Have seen this in the wild and even hear it from Cooler manufacturers.


There has been evidence that lends credence to this, but it seems conditional on the mounting mechanism and how the pressure is applied.


----------



## Deleted member 24505 (Jan 21, 2022)

weekendgeek said:


> Just check that you've got good contact.  I bought an Asus originally for the same reason.  Found out that Noctua still recommended using their LGA1700 mounting kit due to a slight height difference between 1200 and 1700.
> 
> Sounds like @Tigger's works fine though so you'll probably be good.



My block is not actually stated as 1700-compatible. The holes are elongated for the other compatible boards/sockets, and because of this it fits fine on the 1700 with the hefty thick 1700 backplate. I have pics of the paste spread, which IMO is very good. I will post them if anyone wants to see them.

If you need me to, I will even try mine with the 1200 backplate and post the paste spread. Yours should be fine @Mindweaver

Edit: it actually doesn't even say LGA1200 on the box, but I'm sure it does on the web page.


----------



## lexluthermiester (Jan 21, 2022)

Tigger said:


> i will post them if anyone wants to see them.


You should, it might be helpful to someone.


----------



## Deleted member 24505 (Jan 21, 2022)




----------



## lexluthermiester (Jan 21, 2022)

Tigger said:


> [attachments: paste spread photos]


Looks like your CPU has a very slight dome shape to it.


----------



## phanbuey (Jan 21, 2022)

Mine looks like this too... I'm pretty close to lapping it...


----------



## Deleted member 24505 (Jan 21, 2022)

lexluthermiester said:


> Looks like your CPU has a very slight dome shape to it.



Could be the jet plate in the block, iirc they do make the block bulge don't they?


----------



## Mindweaver (Jan 21, 2022)

The board has arrived! Also, I'll look at getting the 1700 mounting bracket if my temps look off.


----------



## Deleted member 24505 (Jan 21, 2022)

Mindweaver said:


> The board has arrived! Also, I'll look at getting the 1700 mounting bracket if my temps look off.



If you have enough paste, fit it then take it off so we can see what your spread is like.


----------



## Fouquin (Jan 21, 2022)

phanbuey said:


> Mine looks like this too... im pretty close to lapping it...



If you're going to lap it, also check on how your temps respond to the ILM washer mod. Lapping the dome down and leaving the stock ILM tension may lead to you inadvertently warping the IHS, leaving you again with worse thermals.


----------



## Deleted member 24505 (Jan 22, 2022)

Fouquin said:


> If you're going to lap it, also check on how your temps respond to the ILM washer mod. Lapping the dome down and leaving the stock ILM tension may lead to you inadvertently warping the IHS, leaving you again with worse thermals.



I saw a video of that washer mod by der8auer and it made virtually no difference, and I am pretty sure he is reliable. I am not lapping mine, I don't want to void the warranty (yet), and my temps are very acceptable.


----------



## Fouquin (Jan 22, 2022)

Tigger said:


> I saw a video of that washer mod by Der8auer and it made virtually no difference and i am pretty sure he is reliable. I am not lapping mine, don't want to void the warranty(yet) and my temps are very acceptable.



Igor and Buildzoid are both as reliable in my book. Two out of three say it works, and they also stated it's not going to affect every board as it's down to a variety of factors from ILM manufacturer to board design. The entire reason for mentioning it is that _all _information on the subject is worth looking into before taking action.


----------



## lexluthermiester (Jan 22, 2022)

Tigger said:


> iirc they do make the block bulge don't they?


That's a good question. You could always eyeball it..



Mindweaver said:


> The board has arrived! Also, I'll look at getting the 1700 mounting bracket if my temps look off.


I predict you will be having a bunch of fun tonight!


----------



## phanbuey (Jan 22, 2022)

Intel Core i5 12600K @ 5486.58 MHz - CPU-Z VALIDATOR (x86.fr)

These chips are fun...

@Fouquin -- I did try that actually, but it didn't do much, I'm afraid...

I have a 9°C difference between hottest and coldest cores at full load - contact is definitely iffy. My 24/7 settings clip using TVB at 80°C (-1x to 52x) and then again at 85°C (-2x to 51x), so it never breaks 83°C, and 90% of the time it stays below 80 during CB R23 runs.




Can't get too mad at those temps coming from the 10850K.

TVB clipping settings during AVX are a little weird though, since it doesn't drop multis consistently; still fiddling with it. It offsets from the default clocks of the chip :/ and even if I OC on the BCLK, TVB is still offsetting based on the stock clocks.

Would be nice to have 5.5 GHz all-core until a certain point, then have it gently drop to 5.4, 5.3, 5.2 to keep it below 85°C.
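
The stepped behaviour described above - dropping one multiplier bin per temperature threshold crossed - boils down to a simple lookup. A minimal sketch, using the 80°C/85°C thresholds from this post; the function and policy details are illustrative, since real TVB is implemented in CPU firmware:

```python
# Illustrative sketch of temperature-stepped multiplier clipping (TVB-style).
# Thresholds and bin drops mirror the settings described above, not an Intel API.

def clipped_multiplier(base_multi: int, temp_c: float,
                       steps=((80, 1), (85, 2))) -> int:
    """Drop `base_multi` by the bin count of the hottest threshold crossed."""
    drop = 0
    for threshold, bins in steps:  # steps ordered coolest-first
        if temp_c >= threshold:
            drop = bins            # the hottest crossed threshold wins
    return base_multi - drop

BCLK_MHZ = 100
for temp in (70, 80, 85):
    multi = clipped_multiplier(53, temp)
    print(f"{temp} C -> {multi}x = {multi * BCLK_MHZ} MHz")
# 70 C keeps 53x (5300 MHz); 80 C clips to 52x; 85 C clips to 51x
```

The "gently drop to 5.4, 5.3, 5.2" wish amounts to adding more `(threshold, bins)` steps to that table.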


----------



## The King (Jan 22, 2022)

Intel Core i5 12490F @ 5701.16 MHz - CPU-Z VALIDATOR (valid.x86.fr)

[cscyap] Validated Dump by FUN (2022-01-20 13:24:11) - MB: Asus ROG MAXIMUS Z690 APEX - RAM: 32768 MB


----------



## Shrek (Jan 22, 2022)

phanbuey said:


> Mine looks like this too... im pretty close to lapping it...



Could it be intentional, given that the center will get hottest and so expand more?

Like the way pistons are made slightly off-round.


----------



## Deleted member 24505 (Jan 22, 2022)

The King said:


> View attachment 233489
> 
> 
> 
> ...



Nice, but 1.69 V... hope you have good cooling.


----------



## fevgatos (Jan 22, 2022)

phanbuey said:


> View attachment 233481
> Intel Core i5 12600K @ 5486.58 MHz - CPU-Z VALIDATOR (x86.fr)
> 
> These chips are fun...
> ...


5.5 all-core is kind of a no-go unless you have an extremely good sample. 5.4 is doable on a 12900k; dunno about your 12600k. Probably easier to handle the temps, but it still requires more volts to reach, because of the better binning on the i9.

I'm running 5.3 right now because I want it whisper quiet, and at 5.4 it definitely isn't, even in gaming. So 5.3, with a -1 bin at 85°C and -2 at 90°C. It can run CB R23 at 5.2, but for anything more extreme than that it drops to 5.1. Single-core is at 5.6 at 1.51 V, because I don't want to push it to the 1.58 required for 5.7.

What's your cooler?


----------



## TheoneandonlyMrK (Jan 22, 2022)

Tigger said:


> Could be the jet plate in the block, iirc they do make the block bulge don't they?


No, the block doesn't bulge unless the jet plate gets blocked, and it's not the copper base that deforms anyway, so it wouldn't affect your CPU IHS.
At any rate, if it's not causing a temperature issue, it's not an issue.
Unless that's already CPU warping.

Could/did you get your eye down on it? From the top it's hard to judge, but with the CPU and HSF fitted to the motherboard, what did the compression amount to - warped board/CPU/socket, or all fine and level? Personally I would want to know, since it could start causing issues down the road.


----------



## Deleted member 24505 (Jan 22, 2022)

I'm running my 12700k (gasp) stock. It is a big step up from my previous setup, and even stock, a 12700k is a pretty good CPU.


----------



## phanbuey (Jan 22, 2022)

fevgatos said:


> 5.5 all cores is kind of a no go unless you have an extremely good sample. 5.4 is doable on a 12900k, dunno about your 12600k. Probably easier to handle the temps but still requires more volts to reach cause of better binning on the i9.
> 
> I'm running 5.3 right now cause I want it to be whisper quiet, and at 5.4 it definitely isn't, even in gaming. So 5.3 and -1 bin at 85 -2 at 90c. Can run cbr23 at 5.2, but for anything more extreme than that it drops to 5.1. Single is at 5.6 at 1.51 volts cause i don't want to push it to the 1.58 required for 5.7.
> 
> What's your cooler?



This is pretty much my situation; my chip is binned a bit worse than yours at the higher volts (I can't complete CPU-Z runs at 1.56 V at 5.6). 5.3 (well, 5.287 due to the BCLK clock gen) is silent and super easy to cool at 1.28 V; to bump up to 5.4 I need 1.36 V, and 5.5 needs close to 1.5 V. So I just stay at 5.3 for day-to-day and gaming.

My cooler is a push-pull 280mm loop with an EK Supremacy EVO block from forever ago.


----------



## Deleted member 24505 (Jan 22, 2022)

The best-value CPU from ADL right now is a 12400 on a DDR5 board. Hopefully BCLK OC will come to DDR4 boards, but if it does, AMD will be in trouble indeed, as there would be no point buying a 5600X/5800X or maybe even a 5900X unless you need them for something other than gaming.


----------



## lexluthermiester (Jan 22, 2022)

Andy Shiekh said:


> Could it be intentional? given that the center will get hottest and so expand more?


Unlikely. It's just imperfect manufacturing. This has been going on since the Pentium MMX days.


----------



## TheoneandonlyMrK (Jan 22, 2022)

Tigger said:


> The best cpu now from ADL is a 12400 and a DDR5 board. Hopefully BCLK OC will come to DDR4 boards, but if it does, AMD will be in trouble indeed as there would be no point buying a 56/58 or maybe even 5900x unless you need them for other than gaming.


BCLK overclocking, in useful terms, requires a dual clock generator for two separate domains, plus the controls in the BIOS; few boards will have this.
You can't run PCIe devices like NVMe drives and the GPU on a bus clocked up 50%.
In reality, BCLK is only useful for CPU benchmarks, because nothing on the PCIe bus will take more than a 5-10 MHz increase without dying or failing hard.
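
The reason a BCLK bump ripples into everything is that each component's frequency is just its ratio times the base clock. A rough sketch with illustrative ratios (not from any particular board; boards with an external clock generator decouple PCIe from the CPU/RAM domain entirely):

```python
# Effective frequencies when BCLK is raised and components share one clock
# domain. Ratios below are illustrative examples, not any specific board.

def effective_mhz(bclk: float, ratio: float) -> float:
    """Frequency of a component = its multiplier ratio x the base clock."""
    return bclk * ratio

domains = {
    "CPU core (49x)": 49,        # 4900 MHz at the stock 100 MHz BCLK
    "DRAM (18x, DDR4-3600)": 18, # 1800 MHz memory clock
    "PCIe reference (1x)": 1,    # spec-sensitive: very little headroom
}

for bclk in (100.0, 103.0):
    print(f"BCLK = {bclk} MHz")
    for name, ratio in domains.items():
        print(f"  {name}: {effective_mhz(bclk, ratio):.1f} MHz")
```

A 3% BCLK bump is a 3% bump everywhere on that domain, which is why the CPU and RAM gain a little while PCIe devices start misbehaving long before the CPU runs out of headroom.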


----------



## Deleted member 24505 (Jan 22, 2022)

birdie said:


> Looks like Intel is not going to close this loophole but they took the "it's dangerous and outside of specs" stance. So, they sort of showed they are against it to placate their investors and fans but in the end they'll most likely be more financially successful this way, not to mention all the love from their fans who want to save on CPUs and motherboards.



I might even experiment with BCLK on my 12700k; I know it is possible on this board. Should be interesting with a combination of multi and BCLK - maybe a lower multi and higher BCLK.


----------



## phanbuey (Jan 24, 2022)

Tigger said:


> I might even experiment with my 12700k with BCLK, i know it is possible on this, should be interesting with a combination of multi and BCLK. Maybe lower multi and higher BCLK



My peripherals start acting up when messing with BCLK - keyboard turns off, webcam, etc.


----------



## Deleted member 24505 (Jan 24, 2022)

phanbuey said:


> My peripherals start acting up when messing with BCLK -  keyboard turns off web cam etc.



How high have you gone up?

How are they getting to 137 using BCLK OC on the 12400s?


----------



## phanbuey (Jan 24, 2022)

Tigger said:


> How high have you gone up?
> 
> How are they getting to 137 using BCLK OC on the 12400's?



So I've gone to just north of 120, because I wanted the TVB clipping to work properly on the chip - 45x at 5.4 - and it does, by knocking down to a ~43 multi during Cinebench runs. It ran fine for a day, and then when I woke the computer the next morning, my webcam just stopped working during a Teams call - so I unplugged it and plugged it back in, and as soon as I did that, my keyboard shut off.

Unplugging that and plugging it back in fixed that as well... so I thought, "OK, too high - might just want to tweak a few MHz to optimize the OC", and went to like 103-ish... Everything seemed fine, then one of my SATA drives randomly stopped showing up... not sure if related or what. When I put it back to 100/default I had no issues, so I just left it there.


----------



## The King (Jan 24, 2022)

@Tigger
I believe the limit for most boards with ADL K-series CPUs is below 103, e.g. 102.98. The highest I've seen on HWBOT is 103.9. You will not be able to do 137 BCLK on a K-series ADL, only on the non-K ADL. AFAIK








Splave's HWBOT x265 Benchmark - 4k score: 32.66 fps with a Core i5 12600KF (hwbot.org)

The Core i5 12600KF @ 6694 MHz scores 32.66 fps in the HWBOT x265 Benchmark - 4k. Splave ranks #3 worldwide and #2 in the hardware class.




If it were possible to do high BCLK on ADL K CPUs, the guys on HWBOT would be the first to do so.


----------



## phanbuey (Jan 24, 2022)

The King said:


> @Tigger
> I believe the limit for most boards with ADL K series CPUS are below 103 eg 102.98. The highest I seen on HWBOT is 103.9. You will not be able to do 137 BCLK on a K series ADL only on the non K ADL. AFAIK
> 
> 
> ...



I can post easily at 137 and go into windows, they definitely work at that bclk, just on my specific board weird stuff starts happening.


----------



## The King (Jan 24, 2022)

phanbuey said:


> I can post easily at 137 and go into windows, they definitely work at that bclk, just on my specific board weird stuff starts happening.


Most likely because your board does not have an external clock gen. The high-end ASUS D5 boards have external clock gens, so when you OC the BCLK on those boards, only the RAM and CPU are affected, not the other components.


----------



## Deleted member 24505 (Jan 24, 2022)

The King said:


> Most likely because your board does not have an external clock gen. The high end ASUS D5 boards have external clock gens, So when you
> OC the BCLK on those boards only the RAM and CPU are affected not the other components.



Mine is a D4 board so it would probably cause problems too


----------



## The King (Jan 24, 2022)

Tigger said:


> Mine is a D4 board so it would probably cause problems too


I'm not sure if your board has an external clock gen or not. It may have one, like the D5 version.


----------



## Deleted member 24505 (Jan 24, 2022)

The King said:


> I'm not sure if you board has an external clock gen on not. It may have one like the D5 version.



For some weird reason my RAM usually shows inexact numbers, like 1798/99 instead of 1800, so I have my BCLK at 101. I guess there's no need, but it bugs me. It does give a slight boost to memory and core frequencies though, lol, and has had no ill effects.


----------



## phanbuey (Jan 24, 2022)

Tigger said:


> For some weird reason my ram usually shows not exact numbers, like 1798/99 for 1800, so i have my BCLK at 101. i guess there is no need but it bugs me. It does give a slight boost to mem, and core frequencies though lol, and had no ill effects.


Interesting.. mine is actually the opposite way -- my RAM actually runs slightly lower due to BCLK, but reports just fine on that screen.


----------



## Deleted member 24505 (Jan 24, 2022)

I have a few pics where my bus is under 100, only by a tad, but it bugs me, so I added 1 to balance it out.


----------



## Mindweaver (Jan 24, 2022)

Wow, OK, so I finally have my rig up and running. I did not start building it until yesterday. I decided to grab another PSU: a Corsair RM Series RM750. I just swapped the GPU and RAM out of my old rig to keep all of my daughter's stuff set up. I ordered some more stuff today.. lol I bought 3x ARGB fans for my 360 radiator. The 3x RGB fans on my Phanteks P400A aren't that great.. plus only 2 of the 3 work - the lights work on all 3, but one will not spin up. The biggest reason I bought new fans is that no matter what I did, I could not get all three Phanteks fans to work with the 3-way splitter that came with my Cooler Master fans. Right now I have all three of those fans going to non-CPU fan headers, my pump on another, and the 3x Corsair fans - mounted to the back and 2x top - on the CPU header. It's crazy; it sits around 29-33°C.

I did almost screw up. I downloaded our RealTemp and was monitoring temps with it. I was messing around with the sensor test and downloaded Prime95.. I haven't used P95 in around 8-10 years... lol I started it and wow, it sounded like a Huey taking off. I watched my temps rise very quickly and noticed my fan speeds kept ramping up, then realized I had the case fans connected to the CPU header to trick the BIOS. I think it hit 95°C before I was able to stop it.. well, for a second or two. lol I kicked myself a few times for that, but everything looks fine.

Oh, and WOW, this thing is fast! I need to test out some games, but so far everything is much faster. I've tested SQL, jumped on a few Terminal Server sessions, and had multiple Chrome, Firefox, and IE tabs going. I'm very impressed.










As requested here is my spread.


----------



## Deleted member 24505 (Jan 24, 2022)

Mindweaver said:


> Wow, ok so I finally have my rig up and running. I did not start building it until yesterday. I decide to grabbing another PSU. I bought a Corsair RM Series RM750. I just swapped gpu's and Ram out of my old rig to keep all of my daughter stuff setup. I ordered some more stuff today.. lol I bought 3x argb fans for my 360 radiator. The 3x rgb on my Phantek 400A aren't that great.. plus only 2x work out of the 3. The lights work on all 3 but one will not start. The biggest reason I bought new fans is that no matter what I did I could not get all three phantek fans to work with the 3 way splitter that came with my Cooler master fans. Right now I have all three fans going to 3x non cpu fan header. I have my pump on one and then I have the 3x corsair mounted to the back and 2x top and have it in the CPU header. It's crazy it sits around 29-33c degrees.
> 
> I did almost screw up. I downloaded our RealTemp and was monitoring temps with it. I was messing around with the sensor test and downloaded Prime 95c.. I haven't used P95 in around 8-10 years... lol I started it and wow it sounded like a Huey taking off.. I started watching my temps rise very quickly. I noticed my fan speeds kept ramping up to then realize I had the case fans connected to the CPU header to trick bios. I think it hit 95 before I was able to stop it.. Well for a second or two. lol I kicked myself a few times for that but everything looks fine.
> 
> ...



Nice, the 12700k even stock is zippy, and the spread looks pretty good. Isn't that a heavy board, though?


----------



## lexluthermiester (Jan 24, 2022)

phanbuey said:


> I can post easily at 137


I think this is why Intel is concerned and unhappy.








Intel Not Happy About BCLK Overclocking of 12th Gen CPUs, Warns of Damage (www.techpowerup.com)

You may, or may not, have noticed that in certain parts of the interweb, groups of people generally referred to as "Overclockers" have managed to get their cheap Celeron G6900s and Core i3-12100s to run at much higher clock speeds than Intel intended, and now the company is unhappy...




Normally I would tell Intel to put a cork in it, but the way CPUs and motherboard components work these days, there is very real potential for damage to the hardware.

Folks, the general word to the wise is not to push these chips above 118 MHz BCLK, otherwise you risk permanent damage. I've seen this statement in a few different places.



Mindweaver said:


> I think it hit 95 before I was able to stop it.. Well for a second or two.


You're fine, you didn't hurt anything.


----------



## Mindweaver (Jan 24, 2022)

Tigger said:


> Nice, the 12700k even stock is zippy. spread looks pretty good.


Yeah, it's much faster than my R5 2600. Thanks! I cleaned up the spread, so I might pull it again later to check things. It's crazy how thick MX-5 is and how difficult it is to spread.. lol


----------



## phanbuey (Jan 24, 2022)

My Intel MacBook basically spent the last 5 years regularly sitting at 92°C during use, and that thing is still going... Intel chips can take some heat. I think that was the only good thing to come out of the Pentium 4.


----------



## Deleted member 24505 (Jan 24, 2022)

Mindweaver said:


> Yeah it's much faster than my R5 2600. Thanks! Yeah I cleaned up the spread to I might pull it again later to check things. It's crazy how thick MX-5 is and difficult to spread.. lol



I had a 2600x before this, the difference is mighty, kinda like going from a 2600x to a 5900x.


----------



## Mindweaver (Jan 24, 2022)

Tigger said:


> I had a 2600x before this, the difference is mighty, kinda like going from a 2600x to a 5900x.


My old board did not like my memory's XMP profile and I had to set it manually to 3200 MHz.. I couldn't go any higher. This board has no problem at 3600 MHz. I'll probably play around with timings and try to overclock it a bit. Also, this new 1TB 980 M.2 SSD is blazing fast. I need to connect the rest of my drives.


----------



## Cutechri (Jan 24, 2022)

I'd love to play around with one of these new Alder Lake chips - the first generation from Intel in a while to legitimately impress me. However, I've already got a 5900X and no real reason to buy a CPU.

Also, the i3-12100F looks like the definite value king right now; shame the board pricing situation is still in shambles, and a shame that AMD has nothing to compete at this price point.



Tigger said:


> I had a 2600x before this, the difference is mighty, kinda like going from a 2600x to a 5900x.


Well, I felt a difference going from a 3900X to a 5900X, let alone that, lol. Though I don't know if it was because that specific 3900X had some weird quirks, or the generational improvement is just that good.


----------



## Deleted member 24505 (Jan 24, 2022)

Mindweaver said:


> My old board did not like my memories XMP profile and I had to manually set it to 3200mhz.. I couldn't go any higher. This board has no problem at 3600mhz. I'll probably play around with timings and try to overclock it a bit. Also, this new 1tb 980 M.2 SSD is blazing fast. I need to connect the rest of my drives.



It seems the sweet spot for ADL is 3600 CL14; I guess even CL16 is OK. Mine will run at 4000 CL16, but it is slower than 3600, so I have mine at 3600 CL14. I have a WD SN850 1TB for my games, which is blazing fast too, and a Gen3 x4 drive for boot. So nice having 4x Gen4 M.2 slots.


----------



## Mindweaver (Jan 25, 2022)

Holy mother of god, this thing is fast! I know these are just benchmarks, but it's so much smoother now.

Ryzen R5 2600 - RTX 2070
3DMark - Fire Strike




i7 12700K - RTX 2070




Ryzen R5 2600 - RTX 2070
VRMark




i7 12700K - RTX 2070


----------



## Cutechri (Jan 25, 2022)

Indeed it is. Had I not already been on an AM4 platform and found the 5900X at 380€ brand new, I would have held out with my i7-8700 rig to build a new Alder Lake rig, probably with the i7-12700K. Very impressed with these CPUs, and I hope both Intel and AMD keep going at it like this for years to come.


----------



## The King (Jan 26, 2022)

Some ADL 12700H benchmarks for those who may be interested.






MACHENIKE S17T 7M - Geekbench Browser (browser.geekbench.com)

Benchmark results for a MACHENIKE S17T 7M with an Intel Core i7-12700H processor.












Intel Core i7 12700H @ 4090 MHz - CPU-Z VALIDATOR (valid.x86.fr)

[345kh8] Validated Dump by Anonymous (2022-01-21 09:22:37) - MB: MACHENIKE S17T 7M - RAM: 32768 MB


----------



## birdie (Jan 28, 2022)

A deep dive into ADL, covering performance, power efficiency, IPC, and more. Highly recommended:









Alder Lake's Power Efficiency - A Complicated Picture (chipsandcheese.com)

Reviews across the internet show Alder Lake getting very competitive performance with very high power consumption. For example, Anandtech measured 272 W of package power during a POV-Ray run.




A commentary from a Russian forum:



> Another interesting read, but here the author seems to have lost the plot. Fine, in the first line he repeats the uninformed claim that "Reviews across the internet show Alder Lake getting very competitive performance with very high power consumption". But then it gets worse. For some reason he wanted to compare one P-core to one E-core (more precisely, four to four) in terms of power consumption, to see how much less power the E-core would need to draw to come out ahead of the P-core. A pointless exercise, since the cores are positioned quite differently: four E-cores occupy the area of two P-cores (not four). And his own graphs show that perfectly - four E-cores deliver more than half the performance of four P-cores at a quarter of the power (or at comparable power for vectorized code). (And if you subtract the uncore power, the picture improves even further.)
> 
> Such a careful experimenter, and so clueless about what the measurements actually mean...
> 
> Then come the interesting comparisons with Zen (though for some reason he couldn't find Zen 3 and tested Zen 2). Here, too, there is a silly, incorrect statement - "Golden Cove is a vector monster, and Zen 2 can't beat it with the same core power draw" - when in that case their vector performance is the same. The most interesting thing about the mobile Zen parts is their behaviour at low power; people already knew they are efficient there, and the graphs confirm it (although it would be more correct to compare them against mobile ADL parts, which are not available yet). The rest of the Zen data is of little use, as it is limited to modest power levels below the typical ones.


----------



## Deleted member 24505 (Jan 29, 2022)

Mindweaver said:


> Wow, ok so I finally have my rig up and running. I didn't start building it until yesterday. I decided to grab another PSU; I bought a Corsair RM Series RM750. I just swapped the GPU and RAM out of my old rig to keep all of my daughter's stuff set up. I ordered some more stuff today.. lol I bought 3x ARGB fans for my 360 radiator. The 3x RGB fans on my Phanteks 400A aren't that great.. plus only 2 of the 3 work. The lights work on all 3 but one will not spin up. The biggest reason I bought new fans is that no matter what I did, I could not get all three Phanteks fans to work with the 3-way splitter that came with my Cooler Master fans. Right now I have all three fans going to 3x non-CPU fan headers. I have my pump on one, and the 3x Corsair fans (mounted to the back and 2x top) on the CPU header. It's crazy, it sits around 29-33°C.
> 
> I did almost screw up. I downloaded our RealTemp and was monitoring temps with it. I was messing around with the sensor test and downloaded Prime95.. I haven't used P95 in around 8-10 years... lol I started it and wow, it sounded like a Huey taking off.. I started watching my temps rise very quickly. I noticed my fan speeds kept ramping up, then realized I had the case fans connected to the CPU header to trick the BIOS. I think it hit 95 before I was able to stop it.. well, for a second or two. lol I kicked myself a few times for that, but everything looks fine.
> 
> ...



There's a new BIOS, 1003, for these boards today. Lots of changes, it seems; I just flashed it, as I think with ADL being so new it's worth keeping up with updates for the first six months.


----------



## The King (Jan 29, 2022)

Tigger said:


> There's a new BIOS, 1003, for these boards today. Lots of changes, it seems; I just flashed it, as I think with ADL being so new it's worth keeping up with updates for the first six months.
> View attachment 234436


If anyone is doing ADL non-K OC on some of the Z690 mobos, they should watch out for number 5. Not sure if Asus can do this, but if they prevent or block BIOS downgrades, then some boards may lose the BCLK OC permanently.

Just something to be on the lookout for, especially on the Asus D5 boards.


----------



## Deleted member 24505 (Jan 29, 2022)

The King said:


> If anyone is doing ADL non-K OC on some of the Z690 mobos, they should watch out for number 5. Not sure if Asus can do this, but if they prevent or block BIOS downgrades, then some boards may lose the BCLK OC permanently.
> 
> Just something to be on the lookout for, especially on the Asus D5 boards.



My board is a D4 so I can't do it anyway, and I have a K CPU.


----------



## Mindweaver (Feb 1, 2022)

It's crazy how much my Ryzen 5 2600 was holding back my RTX 2070. This new rig feels so much snappier.


----------



## fevgatos (Feb 1, 2022)

Mindweaver said:


> It's crazy how much my Ryzen 5 2600 was holding back my RTX 2070. This new rig feels so much snappier.


Yeah, especially if you play on anything but ultra settings. For example, high vs ultra may net you a 30-40% fps gain, and suddenly your 2070 is getting the same fps as a 2080 Ti would at ultra.


----------



## Mindweaver (Feb 1, 2022)

fevgatos said:


> Yeah, especially if you play on anything but ultra settings. For example, high vs ultra may net you a 30-40% fps gain, and suddenly your 2070 is getting the same fps as a 2080 Ti would at ultra.


Yeah, it really feels like I upgraded my GPU with the gains I have over my old system. It would have been a waste to upgrade my RTX 2070 on the Ryzen 2600 system.


----------



## Deleted member 24505 (Feb 1, 2022)

Mindweaver said:


> Yeah, it really feels like I upgraded my GPU with the gains I have over my old system. It would have been a waste to upgrade my RTX 2070 on the Ryzen 2600 system.



The 12700K is certainly punchy; even my 980 Ti is doing much better, about a 20-30 fps boost in Far Cry 6 over the 2600X.


----------



## Mindweaver (Feb 2, 2022)

Tigger said:


> The 12700K is certainly punchy; even my 980 Ti is doing much better, about a 20-30 fps boost in Far Cry 6 over the 2600X.


I need you to fill out your specs. I keep forgetting what you have.. lol It would be cool if everyone who upgraded to Alder Lake posted what they upgraded from and whether they can tell any difference.


----------



## Deleted member 24505 (Feb 2, 2022)

Mindweaver said:


> I need you to fill out your specs. I keep forgetting what you have.. lol It would be cool if everyone who upgraded to Alder Lake posted what they upgraded from and whether they can tell any difference.



There ya go. Fitting the whopper EK 360 XE 60 mm radiator tomorrow, hoping to see a temp drop.


----------



## 80-watt Hamster (Feb 2, 2022)

Is anyone else finding themselves surprised that 2c/2t CPUs are still a thing as of Alder Lake?  When 6c/12t is available for $200, $80 for 1/3 the cores and 1/6 the threads seems bananas.  I guess they still need to differentiate Celeron from Pentium from i3 from i5.  Feels like they should just drop the Celeron brand and its configuration at this point. :shrug:


----------



## phanbuey (Feb 27, 2022)

80-watt Hamster said:


> Is anyone else finding themselves surprised that 2c/2t CPUs are still a thing as of Alder Lake?  When 6c/12t is available for $200, $80 for 1/3 the cores and 1/6 the threads seems bananas.  I guess they still need to differentiate Celeron from Pentium from i3 from i5.  Feels like they should just drop the Celeron brand and its configuration at this point. :shrug:



I think they're still good for POS systems and very basic office PCs, but yeah, outside of some niche uses these are pretty pointless beyond the OEM space. They don't make a ton of financial sense for individuals; maybe if you're ordering a few hundred of them, the savings add up.


----------



## ThrashZone (Feb 27, 2022)

Hi,
Price point filler.


----------



## Deleted member 24505 (Feb 28, 2022)

Well, I did the washer mod on my board. I will test the results over the next day or so. I have added pics of my CPU's flatness as it was when I took it out of the socket; it is pretty flat. The ruler is not perfect, but it is a straight edge.


----------



## birdie (Apr 9, 2022)

*Intel Comments On Alder Lake's Warping and Bending Issues, Mods Void Warranty*

"_We have not received reports of 12th Gen Intel Core processors running outside of specifications due to changes to the integrated heat spreader (IHS). Our internal data show that the IHS on 12th Gen desktop processors may have slight deflection after installation in the socket. Such minor deflection is expected and does not cause the processor to run outside of specifications. We strongly recommend against any modifications to the socket or independent loading mechanism. Such modifications would result in the processor being run outside of specifications and may void any product warranties."_

Q&A:

*Are there any planned changes to the ILM design? This condition might only exist with certain versions of the ILM. Can you confirm that these ILM are to spec?*

_"Based on current data, we can’t attribute the IHS deflection variation to any specific vendor or socket mechanism. However, we are investigating any potential issues alongside our partners and customers, and we will provide further guidance on relevant solutions as appropriate."_

*Some users report reduced thermal transfer from the deflection issue, which makes sense as it clearly impacts the ability of the IHS to mate with the cooler. Would Intel RMA the chip if the mating was poor enough to lead to thermal throttling?*

_"Minor IHS deflection is expected and does not cause the processor to run outside of specifications or prevent the processor from meeting published frequencies under the proper operating conditions. We recommend users who observe any functional issues with their processors to contact Intel Customer Service."_

*The chip deflection issue also impacts motherboards – as a result of the deflection on the chip, the socket ends up bending the rear of the socket, and thus the motherboard. This raises the possibility of damage to the traces running through the motherboard PCB, etc. Is this condition also within spec?*

_"When there’s backplate bending occurring on the motherboard, the warping is being caused by the mechanical load being placed on the motherboard to make electrical contact between the CPU and the socket. There’s no direct correlation between IHS deflection and backplate bending, other than they can both be caused by the mechanical socket loading."_

Source.


----------



## Vario (Sep 24, 2022)

Are the CPU and RAM voltages in the attached image ideal? My goal is a 24/7 undervolt on air. The image depicts an R23 multi-core run, so it should cover the full range from idle to load. I have a number of settings left on auto on this board. Some motherboards have a reputation for aggressive overvolting, but I am not familiar with Alder Lake voltages. Furthermore, I am not familiar with EVGA's BIOS; it has a lot fewer settings than my ASRock Z370 Taichi. There appears to be no way to adjust PL1/PL2, for example (they are uncapped/4000 W by default).

Processor: 12900KS
Motherboard: EVGA Z690 Classified
RAM: F5-6000J3636F16GX2-TZ5RK
Power supply: Super Flower Leadex SE 1000 W
Air cooler: Noctua NH-D15S
Graphics: EVGA 3060 12GB

The settings I have changed from default/auto are:
50x all-P-core multiplier
1.275 V adaptive voltage
-0.100 V offset
RAM XMP profile
90°C TJ Max
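For anyone wondering what PL1/PL2 actually do when a board does enforce them: roughly, the package may draw up to PL2 while a running (exponentially weighted) average of package power stays below PL1, and once that average reaches PL1 it gets clamped there. The real governor is more involved and firmware defaults vary by board, but a toy sketch of the idea (125 W / 241 W are the 12900K's rated base/turbo power; the 56 s window is an assumed value for illustration):

```python
# Toy model of Intel's PL1/PL2/Tau turbo power budgeting.
# The package may draw up to PL2 while an exponentially weighted
# moving average (EWMA) of package power stays below PL1; once the
# average reaches PL1, sustained power is clamped to PL1.
# Numbers below are illustrative, not measurements.

def simulate(pl1, pl2, tau, seconds):
    """Per-second package power for a sustained all-core load."""
    avg = 0.0            # EWMA of package power, watts
    dt = 1.0             # time step, seconds
    alpha = dt / tau     # EWMA weight per step
    trace = []
    for _ in range(seconds):
        power = pl2 if avg < pl1 else pl1  # budget left -> boost to PL2
        avg += alpha * (power - avg)
        trace.append(power)
    return trace

# Assumed Tau of 56 s; PL1/PL2 per the 12900K's rated figures.
trace = simulate(pl1=125, pl2=241, tau=56, seconds=120)
boost_seconds = sum(1 for p in trace if p > 125)
print(f"boosted at PL2 for ~{boost_seconds} s, then settled at PL1")
```

With those numbers the model boosts for around 40 seconds before settling at PL1, which matches the "turbo applies for about a minute" behaviour people describe. With uncapped limits, as on this board, the chip simply never leaves "PL2".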


----------



## ShrimpBrime (Sep 24, 2022)

Vario said:


> Are the CPU and RAM voltages in the attached image ideal? My goal is a 24/7 undervolt on air. The image depicts an R23 multi-core run, so it should cover the full range from idle to load. I have a number of settings left on auto on this board. Some motherboards have a reputation for aggressive overvolting, but I am not familiar with Alder Lake voltages. Furthermore, I am not familiar with EVGA's BIOS; it has a lot fewer settings than my ASRock Z370 Taichi. There appears to be no way to adjust PL1/PL2, for example (they are uncapped/4000 W by default).
> 
> Processor: 12900KS
> Motherboard: EVGA Z690 Classified
> ...


Is the memory at default in this screenshot? It looks like VDDQ TX is reporting 1.1 V; I believe that's the VDIMM voltage, correct?


----------



## Vario (Sep 24, 2022)

ShrimpBrime said:


> Is the memory at default in this screenshot? It looks like VDDQ TX is reporting 1.1 V; I believe that's the VDIMM voltage, correct?


No, that should be VDD2, 1.350 V.


----------



## ShrimpBrime (Sep 24, 2022)

Vario said:


> No, that should be VDD2, 1.350 V.


That's interesting. Mine is listed under VDDQ TX; I don't even have VDD2 listed for my system. My version is different, 7.26-4800.

A lot of the SPD voltages (SWA, SWB, SWC, etc.) are way off, though, and no SPD hub temperature is given.

I'm wondering why this information is missing or not valid. Must be the board?


----------



## Vario (Sep 24, 2022)

ShrimpBrime said:


> That's interesting. Mine is listed under VDDQ TX; I don't even have VDD2 listed for my system. My version is different, 7.26-4800.
> 
> A lot of the SPD voltages (SWA, SWB, SWC, etc.) are way off, though, and no SPD hub temperature is given.
> 
> I'm wondering why this information is missing or not valid. Must be the board?


It has to be the board, because the numbers don't make sense: ~32 watts for DDR5? The sticks don't even feel warm.


----------



## ShrimpBrime (Sep 24, 2022)

Vario said:


> It has to be the board, because the numbers don't make sense: ~32 watts for DDR5? The sticks don't even feel warm.


They won't get warm until you use them intensively, but 32 W at idle would be way off, I think, too.

Well, unless you have another board to play with, maybe try a different version of HWiNFO64. I could update mine and see if anything changes, even though I have a different board.

Nope, no changes - here's a screenshot.
Same hardware as in the system specs under my avatar.


----------



## Vario (Sep 24, 2022)

ShrimpBrime said:


> They won't get warm until you use them intensively, but 32 W at idle would be way off, I think, too.
> 
> Well, unless you have another board to play with, maybe try a different version of HWiNFO64. I could update mine and see if anything changes, even though I have a different board.
> 
> ...


Yeah, I think it's erroneous reporting. Unfortunately the Z690 Classified does not report voltages at all to HWMonitor or AIDA64, so this is the best I've been able to find.

Edit: I just noticed that I was able to see it in HWiNFO64. Not sure what the trigger was, but I now have XTU, HWiNFO64, AIDA64, and HWMonitor all open at once, and that did the trick. My hunch is I can see the sensor now because XTU is open at the same time as HWiNFO64.


____
As an update, for whatever reason, it keeps working. I reformatted to rebuild the OS with less bloat, and it's still working; correct temps and volts are displayed. Odd, but I'm glad it is.


----------

