
Intel Core i5-12600K

I think people are forgetting that these are K-series chips. Those and AMD's X-series are the enthusiast chips: unlocked, customizable.

You can turn a K into a non-K if you so desire.

You can clock it until you can't cool it. Same on the AMD side: PBO2 is power-unlocked, so you can go as far as you can clock it and cool it.

Saw a Geekbench run for a 6.2 GHz 5950X this morning. Betcha it pulls way upwards of 500 W. I hear it's not unusual for them to pull 250-300 W when overclocking.

Again, I'm not sure why people are so worried about what these chips pull when they're stock and power-limited.
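To put rough numbers on why that kind of overclock gets so thirsty: dynamic CPU power scales roughly with voltage squared times frequency, so the voltage needed to hold 6.2 GHz is what really blows the budget. A quick sketch, with every number a hypothetical guess rather than a measurement:

```python
# Back-of-the-envelope: dynamic CPU power scales roughly as P ~ C * V^2 * f,
# i.e. linearly with clock but quadratically with the voltage needed to hold it.
# All numbers below are hypothetical guesses, not measurements.

base_power = 220.0  # W, assumed stock all-core package power for a 5950X
base_freq = 4.0     # GHz, assumed stock all-core clock
base_volt = 1.20    # V, assumed stock voltage

oc_freq = 6.2       # GHz, the clock from that Geekbench run
oc_volt = 1.55      # V, hypothetical voltage such a clock might need

oc_power = base_power * (oc_freq / base_freq) * (oc_volt / base_volt) ** 2
print(f"Estimated OC package power: {oc_power:.0f} W")  # ~569 W with these guesses
```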
 
There is no difference in power draw in-game. Stop making shit up.
Power draw absolutely matters.

Heat too high?
VRMs too hot?
PL2 active for too long?
Didn't buy that $500 Z-series motherboard? (Every single prebuilt/OEM owner goes here.)

Then you don't reach that review performance.

Don't look at it as "well, it's not ALWAYS there, so it's fine."
Look at it as "what if it wants that, and can't get it?"




A 5600X and a 12600K have my tick of approval, because no matter what, they should be able to reach their max performance with standard cooling, on a standard motherboard, in a standard system.

Even the 5950X can run off an air cooler (the Noctua NH-U14S is a popular example, but the U12S works too if you don't max out PBO) on the majority of B450/B550 motherboards (barring a few really bad examples like MSI, who have boards that are pure garbage for both brands).

Edit: We have an entire section of the forum dedicated to the poor sods with OEM systems and laptops who are fighting against all the arbitrary limits and locks set by Intel. 25 W CPU, 120 W power brick, low temps, all's good? Nope, 15 W power limit for you. And then after six months it's at 100°C and thermal throttling, because the cooling solution can't handle the heat long-term, or in summer.
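For anyone wondering how those limits actually bite, here's a minimal sketch of the PL1/PL2/tau idea, simplified to a fixed boost window (the real firmware tracks an exponentially weighted average of power, and the limits below are hypothetical OEM values, not from any specific machine):

```python
# Minimal sketch of Intel's PL1/PL2/tau turbo budget. Simplified: real
# firmware tracks an exponentially weighted moving average of power, not a
# fixed window. Limits are hypothetical OEM values for illustration.

def allowed_power(demand_w, pl1_w, pl2_w, tau_s, elapsed_s):
    """Package power the firmware permits at a given moment under load."""
    if elapsed_s <= tau_s:
        return min(demand_w, pl2_w)  # boost window: PL2 applies
    return min(demand_w, pl1_w)      # budget spent: clamp to PL1

# A 25 W-class CPU that an OEM has locked down to PL1 = 15 W:
for t in (1, 10, 30, 60):
    w = allowed_power(demand_w=40, pl1_w=15, pl2_w=35, tau_s=28, elapsed_s=t)
    print(f"{t:>2} s into the load: {w} W")
# -> 35 W while the boost budget lasts, then 15 W forever after,
#    no matter how good the cooling is.
```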
 
The 4790K is two generations newer, so it should.
You could buy an FX-8350 brand new in 2014. So if your friend chose the FX-8350 because it was cheaper (it was around 150€ at the time), then he was spending less money, but losing longevity. The same applies today to the 5600X vs the 12600K.
 
"Again, I'm not sure why people are so worried about what these chips pull when they're stock and power-limited."
I'm not sure about that; it seems like all they had was 11 on the power dial to get this leap in performance with all that new technology. Hopefully Raptor Lake is more on point.

@fevgatos Yeah, exactly... not. The 5600X is only one year older than the 12600K, not two.

Surprise: the competitor decided to show up at last. Now, about those GPUs.
 
"Power draw absolutely matters. [...] Then you don't reach that review performance."
I never said power draw doesn't matter. I'm saying that during gaming, or 99% of any other task, Alder Lake is extremely efficient, way more efficient than Zen 3 in most tasks. The only thing it's not efficient at is basically rendering, and that's because of the insanely high stock clock speeds. It's obvious that nobody should or would run a 20-hour Blender render at 5+ GHz; you either power limit it or downclock it. Igor's Lab has some interesting numbers: performance-normalized (basically matching the 5900X's performance), the 12900K completely evaporates it in terms of efficiency. The 5900X needs 40% more power to finish the same rendering workload.

So yeah, long story short, Alder Lake is extremely efficient in 99% of tasks at stock. If you work for Disney and do rendering 24/7, then power limiting it to 150 W would do wonders if you care about efficiency. You'll lose 5% performance, but you'll decrease the power draw by a truckload.
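To make the performance-normalized point concrete: if two chips finish the same job in the same time, energy per job scales directly with package power. The wattages below are made up for illustration, not Igor's numbers:

```python
# Performance-normalized efficiency is really energy per finished job:
# same job, same time, so energy scales directly with average power.
# Wattages are hypothetical, just to make the "40% more power" claim concrete.

render_hours = 20.0           # the long Blender job from the example above
power_12900k_limited = 150.0  # W, hypothetical power-limited 12900K
power_5900x = 210.0           # W, hypothetical: 40% more power, same runtime

energy_intel = power_12900k_limited * render_hours / 1000  # kWh per job
energy_amd = power_5900x * render_hours / 1000             # kWh per job

print(f"12900K: {energy_intel:.1f} kWh, 5900X: {energy_amd:.1f} kWh")
print(f"Ratio: {energy_amd / energy_intel:.2f}x")  # 1.40x with these numbers
```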

"The 5600X is only one year older than the 12600K, not two."
It doesn't matter how new or old it is. My point is that a faster CPU will last you longer, so comparing price to performance only in today's games, with today's graphics cards, at 1440p, is just the wrong way of doing it. Back in 2017 my R5 1600 had the same performance as the 8700K with a 1080 Ti at 1440p. Fast forward to today: you can easily use a modern graphics card like a 3080 on an 8700K. You can't do that on an R5 1600; even at 4K you'll be bottlenecked in some games. So spending the extra 150€ to get the 8700K instead of the R5 1600 would have been the better choice.
 
"I never said power draw doesn't matter. I'm saying that during gaming, or 99% of any other task, Alder Lake is extremely efficient."
It does matter, because it peaks to those values, and if it can't, the performance is lower.

Zen 2 began AMD's trend of polling 1,000 times a second, every 1 ms.
How much performance loss would you expect from one of these 300 W monsters if any of the performance criteria isn't met?
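In the spirit of that polling behaviour, here's a toy millisecond-scale boost check. To be clear, this is not AMD's actual Precision Boost logic, just an illustration that peak clocks are only granted while every limit has headroom; all the limit values are invented:

```python
# Toy boost governor: evaluated once per 1 ms tick. NOT AMD's real algorithm;
# it just illustrates that peak clocks require every criterion to pass.
# All limits are invented for illustration.

def boost_clock(temp_c, package_w, vrm_c,
                temp_limit=90.0, power_limit=142.0, vrm_limit=105.0,
                base_ghz=3.4, peak_ghz=4.9):
    """Pick a clock for this tick based on thermal, power and VRM headroom."""
    if temp_c < temp_limit and package_w < power_limit and vrm_c < vrm_limit:
        return peak_ghz  # every limit has headroom: full boost
    return base_ghz      # any limit tripped: clocks fall back

print(boost_clock(temp_c=70, package_w=120, vrm_c=80))  # 4.9 - all clear
print(boost_clock(temp_c=70, package_w=150, vrm_c=80))  # 3.4 - power limit hit
```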
 
"My point is that a faster CPU will last you longer, so comparing price to performance only in today's games, with today's graphics cards, at 1440p, is just the wrong way of doing it."
There's no wrong way, except every way other than yours?!

If you buy what you need and can afford, in CPU terms, it has to last ten years, because I said so! :p
 
"So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock"


... only vs Intel 11th gen, famously known as a waste of sand. Higher is *bad* here. The 12600K in particular? Middle of the pack at best.
The 5600X is energy efficient; AL is not.

[attached: two power efficiency charts]
 
"So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock"


... only vs Intel 11th gen, famously known as a waste of sand. Higher is *bad* here. The 12600K in particular? Middle of the pack at best.
The 5600x is energy efficient, AL is not.

View attachment 224068View attachment 224070

There isn't anything in the Alder Lake lineup yet that competes with the 5600X. In fact, the 5800X isn't really competitive vs the 12600K except in multi-core; it gets demolished in single-core.

So the proper comparison here would be the 12600K vs the 5800X and 5900X.

And the 12600K matches the 5800X on one chart and lands in between the 5800X and 5900X on the other.

Those charts really don't tell you what you want them to.

Let's redo that comparison when we have a 12400.
 
"So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock"


... only vs Intel 11th gen, famously known as a waste of sand. Higher is *bad* here. The 12600K in particular? Middle of the pack at best.
The 5600x is energy efficient, AL is not.

View attachment 224068View attachment 224070
Cause that's so many different tasks. Oh wait, it's just one: Cinebench. The amount of misinformation going around in this forum is sad.

Here you go: a bunch of different tasks, including gaming.


[attached: Igor's Lab charts: mixed-workload power efficiency, max-load power efficiency, 720p gaming efficiency]
 
I disagree; the 12400F is priced at $200 (6 P-cores, 0 E-cores), and as far as price/performance goes it's the real winner of this generation. Modern CPUs have slid way too high on prices, and according to the Steam hardware survey over 90% of people are still gaming at 1440p and below, which is perfect for a 6-core CPU.

Price citation: https://wccftech.com/canadian-retailer-lists-alder-lake-intel-core-i5-12400f-for-249-cad-200-usd/
While the price is attractive, we don't have the performance numbers for it yet. You can't call something a winner when it's not in the race.
 
These are starting to show up in major OEM rigs now. That means there's some volume behind Alder Lake.

[attached: screenshot of an OEM system listing with Alder Lake]
 
Oh, and for relevance to pricing:

AU launch prices say "shit no, stay Zen 3."
At least the top-tier models price-match for the first time in a while: a 5950X and a 12900K are only $50 apart (at $1050 and $1099).
[attached: screenshot of Australian launch pricing]
 
While the price is attractive, we don't have the performance numbers for it yet. You can't call something a winner when it's not in the race.
Although this isn't a good indicator to go by, it looks hopeful.

 
"Here you go: a bunch of different tasks, including gaming."
Is there another review to corroborate Igor's Lab's result? It's the only one I've seen quoted as proof of AL's low power draw.
 
lol, I feel kinda dumb realizing it this late, but it looks like for playing at 60 Hz with VSync, any i5/Ryzen 5 will do; the rest goes to GPU money once prices are no longer crazy.
But why would you game at 60 fps in 2021? That's something we did 20 years ago.

Oh yeah, the human eye can't see more than 4 GB of RAM anyway.
 
Is there another review to corroborate Igor's Lab's result? It's the only one I've seen quoted as proof of AL's low power draw.
I absolutely trust Igor, but yes, it's odd to see disagreeing information. Perhaps he tested something differently.
"Here you go: a bunch of different tasks, including gaming."
Look, here's a screenshot. It's of you. I want you to see the "less is better" part, and then look at the first thing you linked, with the Intel chips at the top with the biggest numbers...
[attached: screenshot of the quoted post's first chart]
 
"But why would you game at 60 fps in 2021? That's something we did 20 years ago."

What a dumb question. Because most people only have 60 Hz monitors, and gaming at >60 FPS requires exponentially more expensive hardware.

I absolutely trust Igor, but yes, it's odd to see disagreeing information. Perhaps he tested something differently.

"Look, here's a screenshot. It's of you. I want you to see the "less is better" part..."

Something is not right with that chart, because right above it he shows Intel using less power during the AutoCAD testing:

[attached: Igor's Lab mixed-workload power draw chart]


It looks like that particular power efficiency chart has the wrong CPUs listed. He even says that Intel is doing better:

"Once again, you can put the score in relation to the power consumption in order to map the efficiency. The Core i9-12900KF is even 71 percentage points more efficient than the Ryzen 9 5950X. I’d rather not even write anything about the Core i5-12600K."

Here's his other chart with correct CPU names listed next to the scores:

[attached: Igor's Lab max-load power efficiency chart]
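For what it's worth, charts like these boil down to score per watt indexed against a baseline CPU, which is how figures like "71 percentage points more efficient" fall out. The scores and wattages here are invented to show the arithmetic, not taken from the review:

```python
# Efficiency index = (score / average package power), normalized so the
# baseline CPU sits at 100. Scores and wattages below are made up.

def efficiency_index(score, avg_power_w, baseline_pts_per_watt):
    return 100.0 * (score / avg_power_w) / baseline_pts_per_watt

baseline = 10000 / 140  # hypothetical reference CPU: ~71.4 points per watt

print(f"{efficiency_index(12000, 95, baseline):.0f}")   # ~177: ~77 points more efficient
print(f"{efficiency_index(11000, 150, baseline):.0f}")  # ~103: roughly on par
```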
 
I absolutely trust Igor, but yes, it's odd to see disagreeing information. Perhaps he tested something differently.
His results don't actually seem to disagree with others'. AutoCAD is rather specific in its CPU usage: it does multithread, but generally poorly, and relies on single-thread performance, which is where Alder Lake excels. It's worth noting that games are quite similar to that usage pattern in many ways, and Alder Lake's quite good gaming efficiency is corroborated by a bunch of different sources.

While the AutoCAD one was the focus, Igor also has a Blender power efficiency chart in that review.

Edit:
"So yeah, long story short, alder lakes are extremely efficient in 99% of tasks at stock"
... only vs Intel 11th gen, famously known as a waste of sand. Higher is *bad* here. The 12600K in particular? Middle of the pack at best.
The 5600x is energy efficient, AL is not.
View attachment 224068View attachment 224070
As I noted earlier in the thread, Alder Lake does very badly in SuperPi. Using that as the benchmark for power efficiency is probably quite misleading. I'm not sure what the power consumption in the other tests was, but assuming similar power draw at noticeably better performance than the others, it would sit in a different place in the efficiency chart.

For example, SuperPi vs CB R23 ST:
[attached: TPU SuperPi and Cinebench R23 single-thread charts]
 
What? The 10400F has been around for two years at ~140€. What are you talking about?

Sorry for the misunderstanding. I thought you were writing about the upcoming 12400F, which could cost less than the 5600X. As for the 10400F, its upgrade path only reaches up to the 10900, whereas the AM4 platform will be compatible even with the upcoming Zen 3D. Big difference, methinks.

As for the efficiency topic: for heavy apps using all cores and threads, check the power draw below from GN. Double the power draw of the competition's same-performance-tier CPU isn't something to argue about for so long, methinks. Gaming never consumes much from the CPU, as we've all known for years, but good cooling is needed when you need the CPU to always perform at its max, and that costs more for AL CPUs.
[attached: Gamers Nexus all-core power draw chart]
 
What a dumb question. Because most people only have 60 Hz monitors, and gaming at >60 FPS requires exponentially more expensive hardware.
What a dumb reply. I obviously know that most people still have 60 Hz monitors. What I meant was: how can people still use 60 Hz monitors? I can't even stand moving my cursor around the desktop at 60 Hz anymore. It physically hurts me. It is not smooth.

And no, gaming at 75 Hz or 120 Hz doesn't require exponentially more expensive hardware, but it does require careful planning and selection of parts before building a PC, something most people don't know how to do. 1080p at 120 Hz is easily doable on a GTX 1070 or RTX 2060, cards that aren't even mid-range anymore. Yes, the shortage changes the equation a bit, but you shouldn't be building a PC right now anyway unless you already have a graphics card.
 
"What a dumb reply. I obviously know that most people still have 60 Hz monitors. What I meant was: how can people still use 60 Hz monitors?"
I was using 72-90 Hz back in the CRT days. LCDs sent us backwards for a while there.

As much as it may hurt people who are used to 1080p60, 4K60 and 4K120 are the literal new standards, thanks to TVs progressing and the new consoles.
 