
Intel Alder Lake Doesn't Look Like an Overclocker's Dream

What it really lacked was a Windows scheduler that knew the FX was a hybrid 2+1 ALU/FPU design. Windows wouldn't dispatch an FPU thread to an idle FPU, but would instead assign it to one of the ALU cores that was already waiting on its shared FPU to complete a task. I think I got that right, anyway. Basically, the chips could have performed better in their day, but just didn't get the OS support. AMD was essentially broke at that time and couldn't get that kind of support. I bet Alder Lake will perform worse on Windows 10; it's just a question of how much, and that remains to be seen.
The "leaks" says Win10 scheduler would only use the E-cores in Alder Lake, and leave all the P-cores idle.
 
NGL, I already accepted that "CPU overclocking" is dead and buried for anything beyond merely demonstrating how far one can take a CPU, at least if you're not willing to invest in some very beefy cooling (step aside, 240 mm radiators, 360 mm is the new minimum for trying your hand at overclocking). Plus, with the way CPUs are getting smarter at handling their own clocks, it might as well become an exercise in futility soon.
Right there with you. For me, overclocking used to be all about grabbing cheaper parts and making them perform even better than the most expensive extreme edition parts. That died once Sandy Bridge hit, locking everyone out of overclocking options unless you paid for the best shit. Now even overclocking the best shit is dead because there's just no headroom left with the manufacturers pushing everything as far as they can and with advanced boosting algorithms.
 
The "leaks" says Win10 scheduler would only use the E-cores in Alder Lake, and leave all the P-cores idle.
Then I guess that would make performance pretty terrible!
 
Right there with you. For me, overclocking used to be all about grabbing cheaper parts and making them perform even better than the most expensive extreme edition parts. That died once Sandy Bridge hit, locking everyone out of overclocking options unless you paid for the best shit. Now even overclocking the best shit is dead because there's just no headroom left with the manufacturers pushing everything as far as they can and with advanced boosting algorithms.
If I recall correctly, C2D was the first generation where your CPU OC wasn't memory dependent.
 
I had tweaked PBO on my 5900X on an Asus X570 Dark Hero. A couple of cores would clock up to 5.150 GHz and the rest to 4.650 GHz, all very stable, with temps under 80 °C (triple 360 rads). Got Cinebench R23 to 24,200 and the CPU-Z single core to 688, and I was happy.

Then I made a big mistake...Windows 11

Even with today's fixes from MSFT and AMD, I can't get the clocks or the scores back to those happy numbers no matter how much tweaking I do.
 
I had tweaked PBO on my 5900X on an Asus X570 Dark Hero. A couple of cores would clock up to 5.150 GHz and the rest to 4.650 GHz, all very stable, with temps under 80 °C (triple 360 rads). Got Cinebench R23 to 24,200 and the CPU-Z single core to 688, and I was happy.

Then I made a big mistake...Windows 11

Even with today's fixes from MSFT and AMD, I can't get the clocks or the scores back to those happy numbers no matter how much tweaking I do.

Roll-back

Don't touch any new products from M$ in the very first year after launch.
We've had enough lessons (Vista / 7 / 8 / 10) for that...
 
The "leaks" says Win10 scheduler would only use the E-cores in Alder Lake, and leave all the P-cores idle.
@EarthDog Toldya it'd go poop on Win 10

Honestly, at 250-350 W, just buy a Threadripper system
 
Isn't it far better to fix a power level (say 200 watts for 16 cores) and try to maximize performance (through lithography and architecture) within that power limit?
I run my 10500H at a 100 mV undervolt offset, underclocked to 3200 MHz, and I'm pretty happy. I get decent performance for 35 watts. Fast enough and cool enough to actually be a laptop.

Sure it is, but then you can't fool the idiots with big numbers.

Intel and AMD have both abandoned the specced TDP as a number we can rely on. All the marketing is directed at it, yet in reality peak usage far exceeds it and is then managed on a different metric: thermals and power consumed over time. They push the burden onto the quality of the cooling to hide the lack of improvement in chip efficiency at high frequencies. So now you're left spending 3-4x the cash on cooling for an extra hundred or two MHz, so we undervolt (and keep an eye on what the package actually pulls; rough sketch further down) :)

Intel has been the worst offender though, what with all their murky turbo limits these days. Turbo used to have some headroom; now you're lucky if you even see it fully. The result of many generational baby steps to hide the lack of progress since Skylake.

I get that overclocking and tweaking is fun and many people enjoy it for multiple different reasons, so I certainly wouldn't want to take that away from anyone (I used to be one of those people long ago). But to be honest, the older I get, the more I just want the stuff to work to 98% of its potential out of the box with no fuss. I do like that Intel/AMD are attempting to maximize their silicon and not leave much on the table.

Sure, but being a bit more up front about what's really happening would be good too. We've all been discovering the hard way how these CPUs behave.
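Since the spec-sheet TDP doesn't tell you much anymore, the honest option is to measure what the package actually pulls. A rough sketch for Linux, reading the RAPL energy counter exposed under /sys/class/powercap (path, permissions, and availability vary by platform, so treat this as an assumption-laden example rather than a universal recipe):

```python
# Rough sketch: estimate average CPU package power from the RAPL energy counter.
# Assumes a Linux system exposing /sys/class/powercap/intel-rapl:0/energy_uj,
# which is usually only readable as root; the exact path can differ per platform.
import time

RAPL_PATH = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package energy, microjoules

def read_energy_uj() -> int:
    with open(RAPL_PATH) as f:
        return int(f.read().strip())

WINDOW_S = 5.0
start = read_energy_uj()
time.sleep(WINDOW_S)                        # sample under your normal workload
end = read_energy_uj()

# The counter eventually wraps around; this sketch ignores that edge case.
watts = (end - start) / 1e6 / WINDOW_S      # microjoules -> joules -> watts
print(f"average package power over {WINDOW_S:.0f} s: {watts:.1f} W")
```

Run it under a real load and compare the number against the box's rated TDP; the gap is what your cooling ends up having to soak.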
 
We came from a 75 W max peak to 400 W now... and guess what, the power bill increased by a fair amount too...
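Just to put rough numbers on it, here's a back-of-the-envelope sketch; the $0.15/kWh rate and 4 hours of load per day are made-up assumptions, plug in your own:

```python
# Back-of-the-envelope monthly cost of CPU package power alone.
# All inputs are illustrative assumptions, not anyone's real tariff or usage.
RATE_PER_KWH = 0.15    # $/kWh, assumed
HOURS_PER_DAY = 4      # hours at full load per day, assumed
DAYS_PER_MONTH = 30

def monthly_cost(watts: float) -> float:
    kwh = watts / 1000 * HOURS_PER_DAY * DAYS_PER_MONTH
    return kwh * RATE_PER_KWH

for w in (75, 250, 400):
    print(f"{w:>3} W -> ${monthly_cost(w):.2f}/month")
```

The CPU alone won't turn a bill into $150 a month, but stack a 400 W part on top of a GPU and rising rates and it adds up.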
 
Just buy a 1000-watt power supply... electricity is very cheap, the nuclear power plants are cooking
Tell that to my $150 electricity bill every month for the last 3 months. And I don't even have my computer going 24/7.
 
Tell that to my $150 electricity bill every month for the last 3 months. And I don't even have my computer going 24/7.
People forget that in some places (like Brazil) we're going through a massive drought; the cost of electricity went up more than 4 times in the last 6 months and is still rising. A lot of places in the world are going through the same shit, it sucks.
 
We've been wetter than normal just this past summer.
 