Friday, December 27th 2019

Intel Enthusiast-Grade K Processors in the Comet Lake-S Family Rumored to Feature 125 W TDP

This piece of news shouldn't surprise anyone, except for the fact that Intel is apparently signing off on a 125 W TDP even for the K-series unlocked processors in its next-generation Comet Lake-S family. Intel's current Coffee Lake-based i9-9900K features a TDP of "only" 95 W compared to the rumored 125 W of the 10900K, whilst the company's current top offering, the i9-9900KS, carries a 127 W TDP. Remember that Intel's 10900K should feature 10 cores and 20 threads, two more cores than the current 9900K; this largely explains the increased TDP, a mathematical necessity given that Intel can only count on marginal improvements to its 14 nm fabrication process to improve the frequencies and power consumption of its CPUs.

A leaked slide shared by momomo on Twitter suggests, if genuine, that Intel's future enthusiast-grade CPUs (likely the i5-10600K, i7-10700K and i9-10900K) will feature this 125 W TDP, while other launches in the family will make do with the more traditional 65 W TDP (it is interesting to see Intel list some 10-core CPUs at a 65 W TDP, the same as the current 9900, despite the two extra cores). A footnote on the leaked slide indicates that these K processors can be configured down to a 95 W TDP, though this would likely come at a significant cost to operating frequency. Intel seems to be bringing a knife to a gunfight (in terms of core counts and TDP) against AMD's Ryzen 3000 and perhaps Ryzen 4000 CPUs, should those and Intel's future offerings actually coexist in the market.
Sources: user momomo @ Twitter, via Videocardz

96 Comments on Intel Enthusiast-Grade K Processors in the Comet Lake-S Family Rumored to Feature 125 W TDP

#26
Berfs1
cucker tarlsonit helps with noise levels

well, that's not true since boost is temp related.

so TPU can't make a cpu review :rolleyes:
Performance wise it doesn't matter. Fan speed, sure, but not necessarily for performance. That's an indirect comparison. Yes, boost is temp related, but I'm pretty sure you don't just get more than 200 MHz from a 20C drop. Depends on chip and cooling and a lot of stuff. And explain how a CPU designed for 210W, that has 2 extra cores and higher frequencies, RUNS ON THE SAME ARCHITECTURE (yes, 6th-9th gen is basically the same arch) as an 8700K (which is rated for I think 120-131W), runs cooler?
#27
cucker tarlson
Berfs1Performance wise it doesn't matter. Fan speed, sure, but not necessarily for performance. That's an indirect comparison. Yes, boost is temp related, but I'm pretty sure you don't just get more than 200 MHz from a 20C drop. Depends on chip and cooling and a lot of stuff. And explain how a CPU designed for 210W, that has 2 extra cores and higher frequencies, RUNS ON THE SAME ARCHITECTURE (yes, 6th-9th gen is basically the same arch) as an 8700K (which is rated for I think 120-131W), runs cooler?
cause 9900k is soldered,like every 9th gen K chip,while neither 9700k nor 8086k are.
#28
Berfs1
notbQuite a lot of people run these with no manual OC. The CPU takes care of itself pretty well.

No. TDP is given in specs and that's it. It's measured by Intel for a non-overclocked processor under some "real life scenario".
The 210W figure at 4.7 GHz all core is power consumption.
Yea 95W and 3.6 GHz is totally realistic for a stock 9900K running stock bios settings. You have to disable turbo boost in order to get the 95W limit working all the time.
cucker tarlsoncause 9900k is soldered,like every 9th gen K chip,while neither 9700k nor 8086k are.
9700K is soldered, 8th gen is not. 9th gen locked cpus are not soldered, and 9600K may depend on the revision, not sure.
#29
Dave65
JismHello Intel Defender. AMD's TDP is way more within specs compared to Intel. The 2700x for example is a 105W TDP chip but could exceed 140W once boost (unlimited) kicks in.
As an AMD fan I can defend him; AMD is just as guilty in this. Linus and Gamers Nexus have videos on this.
#30
Turmania
Let me be very clear, I don't care if it's Intel or AMD; if a processor is stated as a 95 W product, I expect that to be delivered and not exceeded. If the boost exceeds it, then the product is falsely advertised. I don't mind if they give TDP with its boost clocks, even if it's 150 W, as long as consumers know what they are getting into, and right now both companies are nowhere near innocent.
#31
Berfs1
TurmaniaLet me be very clear, I don't care if it's Intel or AMD; if a processor is stated as a 95 W product, I expect that to be delivered and not exceeded. If the boost exceeds it, then the product is falsely advertised. I don't mind if they give TDP with its boost clocks, even if it's 150 W, as long as consumers know what they are getting into, and right now both companies are nowhere near innocent.
I agree with that, however Intel DOES HAVE a turbo TDP, they just don't state it. With AMD, it's variable because it is a variable boost, while Intel CPUs have a boost table they follow. That's why AMD can't really say a boost TDP cus there isn't one. However, for EXPECTED USE, Intel's power consumption is much more than 95W. AMD's 3950X under expected loads is much closer to its TDP rating. Oh and let's not forget how many different times Intel changed their definition of TDP.
#32
cucker tarlson
Berfs1Yes, boost is temp related but, I’m pretty sure you don’t just get more than 200 MHz from a 20C drop.
Berfs1Run a 3700X at 60C or 80C, you get about the same performance.
pick one set of standards for every brand,will you.
#33
davideneco
Love seeing people argue about TDP

Its easy

TDP intel = base clock
TDP amd = all core boost

Simple
#34
Turmania
davidenecoLove seeing people argue about TDP

Its easy

TDP intel = base clock
TDP amd = all core boost

Simple
The 3600 or 3700X draws nowhere near its 65 W rating, so all I can say is you have an easy blind eye for AMD.
#35
Berfs1
cucker tarlsonpick one set of standards for every brand,will you.
I expected those Intel temps from a D14 or D15, not an NH-U12S. The NH-U12S isn't the world's strongest cooler that could cool a 9900K at turbo at 57C. If you can keep the core speeds constant at least, then it would be more consistent. Perhaps delete the 3.6/5.0, because that sounds like turbo was used, and disable turbo, and run 3.6 GHz constantly. At a constant voltage. That is a true test. And obviously lower temps are better, HOWEVER that only applies to the specific CPU. A 9700K at 75C doesn't make it better than a 3900X at 80C, because then you need to look at leakage, and leakage can vary from CPU to CPU. That can affect power consumption pretty substantially; it can potentially make the CPU take another 5-10W just from the extra heat (going from, say, 60C to 80C on a high core count CPU)
#36
cucker tarlson
Berfs1I expected those Intel temps from a D14 or D15, not an NH-U12S. The NH-U12S isn't the world's strongest cooler that could cool a 9900K at turbo at 57C. If you can keep the core speeds constant at least, then it would be more consistent. Perhaps delete the 3.6/5.0, because that sounds like turbo was used, and disable turbo, and run 3.6 GHz constantly. At a constant voltage. That is a true test. And obviously lower temps are better, HOWEVER that only applies to the specific CPU. A 9700K at 75C doesn't make it better than a 3900X at 80C, because then you need to look at leakage, and leakage can vary from CPU to CPU. That can affect power consumption pretty substantially; it can potentially make the CPU take another 5-10W just from the extra heat (going from, say, 60C to 80C on a high core count CPU)
i prefer to trust TPU in matters like these.

if they can't get temp testing right,what are we even talking about.
the difference between 9700k and 3900x is 26 degrees,not 5 degrees.
it will affect noise levels substantially.
#37
londiste
Berfs1they ALSO won’t mention the TDP that is in place when running all that turbo stuff. 210W. Versus AMD’s 105W all-time TDP. For intel, in turbo mode they have a different TDP that they never mention in public docs, and the only way to truly find out is through XTU, and it shows how much more power intel cpus take for those performance numbers.
Ryzen 3000 105W CPUs have a 142W power limit in usual circumstances. 65W CPU models will consume up to 88W.

Intel's K-models have motherboard-specific "optimized" settings that generally mean disabled power limits. Non-K models generally perform as per spec - 25% increased power limit for 8 seconds. For example, a 65W Intel non-K CPU will do 81W for 8 seconds at load.

Any Intel 14nm CPU will lose out to AMD's 7nm CPU in all-core loads, especially at 6+ cores. Frequencies in these circumstances will not favor Intel CPUs any more. Not because they are not capable of it but because they hit power limits.
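londiste's 25%-for-8-seconds figures can be sketched as a quick calculation. This is a rough model assuming the commonly documented Intel defaults (PL1 = TDP, PL2 = 1.25 × PL1, Tau = 8 s); as noted above, motherboard firmware often overrides these, and the function name here is purely illustrative.

```python
# Rough sketch of Intel's default power-limit behavior, assuming the
# commonly documented defaults: PL1 = TDP, PL2 = 1.25 * PL1, Tau = 8 s.
# Board firmware frequently overrides these, especially on K-series SKUs.

def intel_default_limits(tdp_watts):
    """Return default sustained/short-term power limits for a given TDP."""
    return {
        "PL1_W": tdp_watts,          # sustained limit equals rated TDP
        "PL2_W": tdp_watts * 1.25,   # short-term turbo limit (+25%)
        "Tau_s": 8.0,                # seconds the CPU may hold PL2
    }

print(intel_default_limits(65))  # a 65 W part may draw ~81 W for ~8 s
```

With 95 W parts the same defaults would allow roughly 119 W for the turbo window, which is why sustained all-core results diverge so much from the box TDP.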
#38
moproblems99
cucker tarlsoni prefer to trust TPU in matters like these.

if they can't get temp testing right,what are we even talking about.
the difference between 9700k and 3900x is 26 degrees,not 5 degrees.
it will affect noise levels substantially.
The 3rd gen temp numbers are slightly misleading. Average load temperatures are much lower than the momentary peak temps. I was super nervous until I looked at what the rolling load temp was.
#39
notb
Berfs1Yea 95W and 3.6 GHz is totally realistic for a stock 9900K running stock bios settings. You have to disable turbo boost in order to get the 95W limit working all the time.
Once again: TDP is a figure connected to heat dissipation, not power consumption. If you don't know this by now, there's no way I can convince you.
#40
londiste
notbOnce again: TDP is a figure connected to heat dissipation, not power consumption. If you don't know this by now, there's no way I can convince you.
That is a technicality and a matter of nitpicking on terminology. In practice, sticking to TDP would mean sticking to a power limit, as the vast majority of power consumed by a CPU is converted to heat. That is the entire point of being angry with Intel's (and current AMD's) idea of TDP - the stated amount does not match the actual heat output of the CPU.
#41
Darmok N Jalad
Don't Ryzen CPUs report their temps differently than Intel chips? I thought I recall Zen incorporating an array of temp sensors across the CPU as part of Precision Boost.
#42
notb
londisteThat is a technicality and a matter of nitpicking on terminology. In practice, sticking to TDP would mean sticking to a power limit, as the vast majority of power consumed by a CPU is converted to heat. That is the entire point of being angry with Intel's (and current AMD's) idea of TDP - the stated amount does not match the actual heat output of the CPU.
No and this is a very common misunderstanding.
But you're right in one thing: pretty much all of energy consumed is emitted as heat.

So, you have a consumer CPU with TDP=65W. You run it as intended. It stays near base clocks most of the time and boosts for a few seconds from time to time. During that boost it consumes 200W.
So, what cooler do you need? A 95W one. Someone tested it on a 95W CPU and it was fine.
What PSU do you need? A 200W one (just for the CPU).
That's the difference.

You should not look at it as if Intel or AMD deceived you. They've sold you a CPU with 65W or 95W TDP. You can buy a cooler based on that TDP.
But they've also given you a bonus (not a lie!). Because they've made their CPUs so rapid and flexible, they can boost instantly for a short time when you need it in your typical consumer-ish PC usage: to load a website, open a file, apply an effect in a photo editor etc. It's so short that the extra heat produced is tiny and your 95W cooler won't explode.
And if, during that short boost, your PSU can provide just 150W, not 200W? It won't explode either. The CPU knows when to stop pushing. It's all thought out really well.

We test consumer CPUs by running hours of 100% load benchmarks, which is not how these CPUs are used in real life. Of course that's how we learn their performance limits (which is good), but the resulting average power consumption figures are unrealistic.
This is exactly the reason why workstation/server CPUs turn out (in similar tests) to be very conservative when it comes to TDP. Because their purpose is exactly to run at 100% all the time. That's how their TDP was calculated.

And now moving to 7nm Zen2 issue, which I really can't pass over.
The coolers we have today were tested on CPUs available before 7nm arrived.
It turns out that these CPUs, despite consuming under 150W, are so tightly packed that the heat concentration is much larger than we've seen earlier. That's why 3900X and 3950X are so hot.
And that's why AMD recommends to pair a CPU with TDP=105W with coolers that have TDP ratings 2-3 times larger.
So suddenly the TDP stops making any sense at all. It's lower than what these CPUs actually pull (140-150W) and has absolutely no meaning when it comes to choosing a sufficient cooler.

When Intel joins with desktop 10nm CPUs, the whole TDP rating will have to be adjusted. Dark Rock 4 will not be a 200W cooler anymore. It'll be called a 100W cooler, maybe less.
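notb's burst-versus-sustained point can be made concrete with a toy duty-cycle calculation (the numbers are illustrative, not measurements): a CPU that spikes to 200 W for a couple of seconds per minute and sits near idle otherwise averages a heat load far below what a 95 W cooler handles.

```python
# Toy duty-cycle model of a bursty consumer workload (illustrative numbers).

def average_power(burst_w, burst_s, idle_w, idle_s):
    """Time-weighted average power over one burst/idle cycle."""
    energy_j = burst_w * burst_s + idle_w * idle_s  # joules per cycle
    return energy_j / (burst_s + idle_s)

# 2 s boost at 200 W, then 58 s near idle at 10 W:
print(f"{average_power(200, 2, 10, 58):.1f} W")  # ~16.3 W average heat load
```

The same arithmetic also shows why hours-long 100% loads are different: with no idle time the average equals the burst power, and the cooler has to handle all of it.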
#43
cdawall
where the hell are my stars
cucker tarlsonlol, another tdp debate.
why doesn't anyone pay attention to what actually matters

K-series will be enthusiast only, I like the mainstream ones though.
+4.5 GHz out of the box, HT on every chip. Fast, cool and quiet.
Load temps don't really matter though. CPU stability is stability.
davidenecoLove seeing people argue about TDP

Its easy

TDP intel = base clock
TDP amd = all core boost

Simple
AMD TDP is based off of P0, which is max clocks without boost, per AMD.

here is a link confirming that information

www.gamersnexus.net/guides/3525-amd-ryzen-tdp-explained-deep-dive-cooler-manufacturer-opinions
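For reference, the GamersNexus article linked above reports AMD's formula as TDP = (tCaseMax − tAmbient) / HSF θca. A quick sketch with the constants reported there for the 105 W Ryzen 3000 parts (61.8 °C case target, 42 °C ambient, 0.189 °C/W cooler thermal resistance; treat them as illustrative):

```python
# AMD's TDP formula as reported in the linked GamersNexus deep dive:
#   TDP (W) = (tCaseMax - tAmbient) / HSF_theta_ca
# Constants are the ones reported there for 105 W Ryzen 3000 SKUs.

def amd_tdp(t_case_max_c, t_ambient_c, theta_ca_c_per_w):
    """TDP derived from case temperature target and cooler resistance."""
    return (t_case_max_c - t_ambient_c) / theta_ca_c_per_w

print(round(amd_tdp(61.8, 42.0, 0.189)))  # ~105 W
```

Note that nothing in the formula is a measured power draw; it is a cooler spec, which is exactly why the rated 105 W and the ~142 W package power limit can coexist.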
#44
Tomorrow
notbIt looks acceptable. With good pricing this will make Intel competitive in the mass "up to 8 cores" market for another year. That's all they can hope for until 7/10nm arrive.
250W+ mainstream parts incoming. Yeah. Very "acceptable" indeed. My 3800X does not exceed 140W (according to HWiNFO) even when overclocked to 4.5 GHz. Comet Lake will probably double that for a 5 GHz+ overclock on the 10c/20t part and will lose to Ryzen 4000 in performance regardless.

Ah and don't forget to buy your new motherboards too...
#46
MrAMD
A 10-core 9900K running 5.2 GHz+? With an iGPU? (accelerated video rendering). Yes please. Sucks I'll have to change out the mobo though.
#47
Berfs1
notbNo and this is a very common misunderstanding.
But you're right in one thing: pretty much all of energy consumed is emitted as heat.

So, you have a consumer CPU with TDP=65W. You run it as intended. It stays near base clocks most of the time and boosts for a few seconds from time to time. During that boost it consumes 200W.
So, what cooler do you need? A 95W one. Someone tested it on a 95W CPU and it was fine.
What PSU do you need? A 200W one (just for the CPU).
That's the difference.

You should not look at it as if Intel or AMD deceived you. They've sold you a CPU with 65W or 95W TDP. You can buy a cooler based on that TDP.
But they've also given you a bonus (not a lie!). Because they've made their CPUs so rapid and flexible, they can boost instantly for a short time when you need it in your typical consumer-ish PC usage: to load a website, open a file, apply an effect in a photo editor etc. It's so short that the extra heat produced is tiny and your 95W cooler won't explode.
And if, during that short boost, your PSU can provide just 150W, not 200W? It won't explode either. The CPU knows when to stop pushing. It's all thought out really well.

We test consumer CPUs by running hours of 100% load benchmarks, which is not how these CPUs are used in real life. Of course that's how we learn their performance limits (which is good), but the resulting average power consumption figures are unrealistic.
This is exactly the reason why workstation/server CPUs turn out (in similar tests) to be very conservative when it comes to TDP. Because their purpose is exactly to run at 100% all the time. That's how their TDP was calculated.

And now moving to 7nm Zen2 issue, which I really can't pass over.
The coolers we have today were tested on CPUs available before 7nm arrived.
It turns out that these CPUs, despite consuming under 150W, are so tightly packed that the heat concentration is much larger than we've seen earlier. That's why 3900X and 3950X are so hot.
And that's why AMD recommends to pair a CPU with TDP=105W with coolers that have TDP ratings 2-3 times larger.
So suddenly the TDP stops making any sense at all. It's lower than what these CPUs actually pull (140-150W) and has absolutely no meaning when it comes to choosing a sufficient cooler.

When Intel joins with desktop 10nm CPUs, the whole TDP rating will have to be adjusted. Dark Rock 4 will not be a 200W cooler anymore. It'll be called a 100W cooler, maybe less.
So if I understand this correctly, a 95W cpu has more cooling efficiency than a 65W chip? If that is the case, let’s compare the temps of the 9900KS vs the 9900K, since one has 127W and the other has 95W TDP. The 9900KS theoretically should offer lower temps correct?
MrAMDA 10-core 9900K running 5.2 GHz+? With an iGPU? (accelerated video rendering). Yes please. Sucks I'll have to change out the mobo though.
I don’t think the 10 core chips were supposed to come with iGPUs, I believe those were all F/KF cpus, but I may be wrong. Then again, if you need 10 cores, ur probs gonna get urself a GPU anyways.
#48
eidairaman1
The Exiled Airman
Berfs1So if I understand this correctly, a 95W cpu has more cooling efficiency than a 65W chip? If that is the case, let’s compare the temps of the 9900KS vs the 9900K, since one has 127W and the other has 95W TDP. The 9900KS theoretically should offer lower temps correct?


I don’t think the 10 core chips were supposed to come with iGPUs, I believe those were all F/KF cpus, but I may be wrong. Then again, if you need 10 cores, ur probs gonna get urself a GPU anyways.
Think 5775...

Either way, TDP is a metric used by the manufacturers for coolers. It's not a viable determinant for us when buying a CPU.
The cpu by intel is a potato.
#49
Crackong
lol another TDP debate.

Gamersnexus already explained both Intel and AMD TDP.
They are either calculated from base clock or cooler thermal resistance.
They are both OFF.
#50
hat
Enthusiast
notbYou should not look at it as if Intel or AMD deceived you. They've sold you a CPU with 65W or 95W TDP. You can buy a cooler based on that TDP.
But they've also given you a bonus (not a lie!). Because they've made their CPUs so rapid and flexible, they can boost instantly for a short time when you need it in your typical consumer-ish PC using: to load a website, open a file, apply an effect in a photo editor etc. It's so short that the extra heat produced is tiny and your 95W cooler won't explode.
And if, during that short boost, your PSU can provide just 150W, not 200W? It won't explode either. The CPU knows when to stop pushing. It's all though out really well.

We test consumer CPUs by running hours of 100% load benchmarks, which is not how these CPUs are used in real life. Of course that's how we learn their performance limits (which is good), but the resulting average power consumption figures are unrealistic.
This is exactly the reason why workstation/server CPUs turn out (in similar tests) to be very conservative when it comes to TDP. Because their purpose is exactly to run at 100% all the time. That's how their TDP was calculated.
You... can't be serious. You expect me to look at a chip like the 9900k with a base clock of 3.6GHz and a turbo clock of 5.0GHz and take that 5GHz speed as a bonus for loading web pages faster? I can open web pages and open files pretty quickly with even a 15 year old computer. I don't need to be at 5GHz for a fraction of a second to do that only to slow down to 3.6GHz the moment I put any real load on the system. A lot of people are building systems with Intel processors for gaming, these days. It's one of their last few advantages, being able to get a few more FPS in games. As you know, running a game can be a pretty demanding task... and typically lasts for hours, not seconds. This is just one example of something very common that might happen with such a system. What about other examples of prolonged load? I hope you're not going to suggest we should be building workstations or servers to do transcoding, or running applications such as World Community Grid?