# Intel Core i9-10900K 10-core Processor and Z490 Chipset Arrive April 2020



## btarunr (Dec 10, 2019)

Intel is expected to finally refresh its mainstream desktop platform with the introduction of the 14 nm "Comet Lake-S" processors in Q2 2020. This brings the new LGA1200 socket and Intel 400-series chipsets, led by the Z490 Express at the top. Platform maps of these PCI-Express gen 3.0 based chipsets make them look largely similar to the current 300-series platform, with a few changes. For starters, Intel is introducing its biggest ACPI change since the C6/C7 power states that debuted with "Haswell": the C10 and S0ix Modern Standby power states, which give your PC iPad-like availability while sipping minimal power. This idea is slightly different from Smart Connect, in that your web-connected apps and processor work in an extremely low-power (fanless) state, rather than the machine waking up from time to time for the apps to refresh. 400-series chipset motherboards will also feature updated networking interfaces, such as support for 2.5 GbE wired LAN with an Intel i225-series PHY, 802.11ax Wi-Fi 6 WLAN, etc.

HyperThreading will play a big role in making Intel's processor lineup competitive with AMD's, given that the underlying microarchitecture offers an identical core design to "Skylake" circa 2015. The entry-level Core i3 chips will be 4-core/8-thread, Core i5 6-core/12-thread, and Core i7 8-core/16-thread; leading the pack will be the Core i9-10900K, a 10-core/20-thread processor. According to a WCCFTech report, this processor will debut in April 2020, which means at CES 2020 in January we'll get to see some of the first socket LGA1200 motherboards, some even based on the Z490. The report also mentions an interesting specification: "enhanced core and memory overclocking." This could be the secret ingredient that makes the i9-10900K competitive with the likes of the Ryzen 9 3900X. The LGA1200 platform could be forward-compatible with "Rocket Lake," which could bring IPC increases to the platform by implementing "Willow Cove" CPU cores.





*View at TechPowerUp Main Site*


----------



## TheLostSwede (Dec 10, 2019)

So two more cores and maybe integrated 802.11ax Wi-Fi...
Huge changes in the platform...


----------



## GoldenX (Dec 10, 2019)

So lame, it's almost not even worth a news post.


----------



## TheLostSwede (Dec 10, 2019)

GoldenX said:


> So lame, it's almost not even worth a news post.


I mean, there's the new socket as well...


----------



## btarunr (Dec 10, 2019)

TheLostSwede said:


> I mean, there's the new socket as well...



Modern Standby on desktop could be interesting. This was originally designed for Ultrabooks to match the iPad in terms of high availability, battery life, and background refresh of web-connected apps. The snappiness with which an iPad comes to life even after 2 weeks of sitting in your drawer, with all its apps and notifications, is something that suspend-to-RAM couldn't match (there's also no background app refresh with STR). So Microsoft created an ACPI standard to achieve that kind of device behavior.

Interested to see how it benefits home desktops that don't stay powered up all day long.
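If you're curious whether a given Windows box actually exposes this, `powercfg /a` lists the available sleep states. A minimal sketch that checks its output for the S0 Low Power Idle line (the exact output wording is an assumption of mine and varies by Windows build, so treat this as illustrative only):

```python
def modern_standby_supported(powercfg_output: str) -> bool:
    """True if the 'available' section of `powercfg /a` output
    lists Standby (S0 Low Power Idle), i.e. Modern Standby."""
    # Everything before the "are not available" section is the available list.
    available_section = powercfg_output.split("are not available")[0]
    return "S0 Low Power Idle" in available_section

# Abridged example of what `powercfg /a` prints on a Modern Standby machine:
sample = """The following sleep states are available on this system:
    Standby (S0 Low Power Idle) Network Connected
    Hibernate
    Fast Startup

The following sleep states are not available on this system:
    Standby (S1)
    Standby (S2)
    Standby (S3)"""

print(modern_standby_supported(sample))  # True
```

On a real machine you'd feed it the stdout of `subprocess.run(["powercfg", "/a"], capture_output=True, text=True)`.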


----------



## ZoneDymo (Dec 10, 2019)

Soooo this is another 7700k with again, 2 more cores added?


----------



## Mats (Dec 10, 2019)

ZoneDymo said:


> Soooo this is another 7700k with again, 2 more cores added?


6700K even.


----------



## madness777 (Dec 10, 2019)

If they can sell the i5-10500K (*whatever the bloody name is*) for ~$200-220, it's gonna be quite good for the gaming market.
The problem is you still need to get a new motherboard, and the way AMD is right now, you can get an absolutely solid B450 board for $120 or less. Keep in mind it has all the OC features unlocked.
I don't think Intel is gonna sell any decent Z490 boards for less than 200 bones. And you can't get any other chipset with OC support unless Intel decides otherwise.
They're struggling and I love it


----------



## Penev91 (Dec 10, 2019)

Oh look, a new socket from Intel!


----------



## Hyderz (Dec 10, 2019)

i9 - 10c/20t? 
i7 - 8c/16t?
i5 - 6c/12t?
i3 - 4c/8t? 

or maybe Intel might have some non-hyperthreaded 10-series CPUs,
e.g. the i5 might be a straight 8 cores and the i3 a pure 6 cores with no HyperThreading,
Pentium 4c/8t, Celeron 4c/4t.

as for pricing, I think Intel is gonna ask $649 or $699 for its 10-core i9, but it will be interesting to see what Intel prices them at.


----------



## Turmania (Dec 10, 2019)

Disappointed with Intel recently, but rest assured it will still easily best any Ryzen 4xxx CPUs launched next year, and by a bigger margin than this year as far as gaming is concerned, with no misleading advertisement on the product.


----------



## TheLostSwede (Dec 10, 2019)

btarunr said:


> Modern Standby on desktop could be interesting. This was originally designed for Ultrabooks to match the iPad in terms of high availability, battery life, and background refresh of web-connected apps. The snappiness with which an iPad comes to life even after 2 weeks of sitting in your drawer, with all its apps and notifications, is something that suspend-to-RAM couldn't match (there's also no background app refresh with STR). So Microsoft created an ACPI standard to achieve that kind of device behavior.
> 
> Interested to see how it benefits home desktops that don't stay powered up all day long.



Doesn't seem to be unique to this platform and it requires a compatible PSU...


----------



## R0H1T (Dec 10, 2019)

Turmania said:


> Disappointed with Intel recently, but rest assured it will still easily best any Ryzen 4xxx CPUs launched next year, and by a bigger margin than this year as far as gaming is concerned, with *no misleading advertisement on the product*.


Yeah all the while consuming just ***95W* of power, never mind the fact that it'll likely consume *250W* w/MCE & all cores loaded


----------



## The Quim Reaper (Dec 10, 2019)

Well if nothing else it will mean that Intel will have a $350-$400 8c16t CPU to go up against the 3700X/3800X, as price tiers will will no doubt drop down 1 notch, so as to keep the 10c20t part at the $500 mark.


----------



## Turmania (Dec 10, 2019)

R0H1T said:


> Yeah all the while consuming just ***95W* of power, never mind the fact that it'll likely consume *250W* w/MCE & all cores loaded



Who are you trying to fool? It's not as if AMD stays within their TDP specs; in fact they are worse than Intel in that respect. And there's the boost-speed fracas, the somehow slow boot-ups and countless BIOS headaches... I did not want to go over this, but since it has been brought up: there is this ill-fated turning of a blind eye to one company, and they get away with everything... AMD is no whiter than white; in fact they are darker than Intel is.


----------



## R0H1T (Dec 10, 2019)

Turmania said:


> in fact they are darker than Intel is.


Yeah sure, if you say so.


----------



## rawadinozor (Dec 10, 2019)

Turmania said:


> Who are you trying to fool? It's not as if AMD stays within their TDP specs; in fact they are worse than Intel in that respect. And there's the boost-speed fracas, the somehow slow boot-ups and countless BIOS headaches... I did not want to go over this, but since it has been brought up: there is this ill-fated turning of a blind eye to one company, and they get away with everything... AMD is no whiter than white; in fact they are darker than Intel is.



what bios headaches? i have a 3700x and i am very happy with it, no issues at all.
also did you compare how intel and AMD calculate tdp? can you maybe elaborate.


----------



## Melvis (Dec 10, 2019)

Turmania said:


> Disappointed with Intel recently, but rest assured it will still easily best any Ryzen 4xxx CPUs launched next year, and by a bigger margin than this year as far as gaming is concerned, with no misleading advertisement on the product.



Erm, the 3950X already beats the 18-core 10980XE, so how is this 10-core going to beat the 4000 series from AMD? And if you say "gaming", you're just an idiot....


----------



## Calmmo (Dec 10, 2019)

A 10th-gen 9900K equivalent at $300-350 to compete with the 3700X would save them some lost sales, but knowing Intel I doubt that; more like $400-450, and at that price range it won't be worth it.


----------



## springs113 (Dec 10, 2019)

Turmania said:


> Who are you trying to fool? It's not as if AMD stays within their TDP specs; in fact they are worse than Intel in that respect. And there's the boost-speed fracas, the somehow slow boot-ups and countless BIOS headaches... I did not want to go over this, but since it has been brought up: there is this ill-fated turning of a blind eye to one company, and they get away with everything... AMD is no whiter than white; in fact they are darker than Intel is.


What are you talking about? I have 3 Ryzen systems and they all work just fine. Boot time is the fastest on my Zen 2 system. It sounds to me like you're shilling for the other team. Now back to the subject matter: although I think this thing is DOA, its price will dictate how well it does. Unfortunately for Intel, Zen 3 is rumored to bring about 15+% improvements. If AMD can get some much-needed clock speed boosts, I don't see anyone in their right mind getting this.


----------



## Tomgang (Dec 10, 2019)

Wow exciting news... Glad I chose to go AMD this round.


----------



## john_ (Dec 10, 2019)

You know why April, right?


Intel: "And now we announce out first 10nm desktop processor, the Intel Core i9-10900K"

......audience goes silent......

Intel: Ahahahahahaha..... you are so easy to be fooled. LOOOOOOOOOOOOOOOLLLLLLLLLLLLLLLLLLLLLL! It's still 14nm++++++. April fool's day! 
Oh, you are so easy....... so easy to be fooled......

.....By the way. We have some new 32nm i5s if you are interested...."


----------



## notb (Dec 10, 2019)

GoldenX said:


> So lame, it's almost not even worth a news post.


What do you expect? It's just a new product. It's not meant to be "exciting" or "innovative".
They launch generations on a regular basis. It has the improvements they had at hand. That's it.

Big changes will come when possible. Intel wants that as well.

10900K itself doesn't look half bad if priced in line with AMD. At $500 it'll sell like hot cakes.
As usual, it's more important how the high volume, mid range models stack up ($150-300 range).


----------



## Tomgang (Dec 10, 2019)

biffzinker said:


> Why would you want existing news when you could have _exciting news_ instead.



Haha that's a typo. My bad.

Edit: corrected.


----------



## Zach_01 (Dec 10, 2019)

The days of big boost-clock jumps are over, people... I doubt you'll see a 5+ GHz clock any time soon after this "so-called" new series from Intel. Moving to smaller nodes, clocks will hit a wall. IPC gains and core counts are the way forward for the next few years.



Turmania said:


> Who are you trying to fool? It's not as if AMD stays within their TDP specs; in fact they are worse than Intel in that respect. And there's the boost-speed fracas, the somehow slow boot-ups and countless BIOS headaches... I did not want to go over this, but since it has been brought up: there is this ill-fated turning of a blind eye to one company, and they get away with everything... AMD is no whiter than white; in fact they are darker than Intel is.


Who are "you" trying to fool? TDP is about heat dissipation first of all and not power draw, and while Intel's numbers refering to base clocks, AMD's is for the average boost clocks/workloads. A stock 3900X/3950X will stay under 150W total power draw/consumption vs a stock 9900K that surpasses 170W easily. And even a 3900X will drive circles around a 9900K in 90+% of all core work loads.










For the slow boot times... you are about 2 months behind the news, because that was improved with a UEFI update at some point.

AMD did not lie about anything. Users assumed, through their own "Intel mindshare" (like you), that boost would work Intel-style...
The advertised performance is real. And boost is also real, just not in the way you wanted/expected it to be.
Don't talk about misleading advertisement, because Intel has a whole division for false, misleading and narrow-perspective advertising...


----------



## Kokotas (Dec 10, 2019)

I like this timing, since there will be enough benchmarks to show how well Intel's lineup fares against AMD before Ampere comes out in June. Sitting on a 6700K atm, I'm still excited about this release, but I get that people who've already moved to Intel's 9th gen or Zen 2 most likely won't have much interest in such an upgrade.


----------



## Zach_01 (Dec 10, 2019)

Kokotas said:


> I like this timing, since there will be enough benchmarks to show how well Intel's lineup fares against AMD before Ampere comes out in June. Sitting on a 6700K atm, I'm still excited about this release, but I get that people who've already moved to Intel's 9th gen or Zen 2 most likely won't have much interest in such an upgrade.


IMHO... this release is mostly Intel filling the gap (showing something) until they manage to get 10nm or even 7nm in line with an all-new architecture. And most of the new features the OP stated mainly benefit the mobile market, where Intel actually has something to show. The desktop/HEDT/server market is a lost cause for Intel for the next 1, maybe 2 years.


----------



## Object55 (Dec 10, 2019)

Should have kept it on z390, at least people with existing boards would have bought it. But now, nobody cares.


----------



## DeathtoGnomes (Dec 10, 2019)

Penev91 said:


> Oh look, a new socket from Intel!


and it hasn't even been 2 years!

What bothers me is that the chipset is still on PCIe 3.0. I wonder if a BIOS update will bring it to 4.0.


----------



## Metroid (Dec 10, 2019)

hahahahhahahahahhha, def 1st april hahahahahaha


----------



## ratirt (Dec 10, 2019)

Kokotas said:


> I like this timing, since there will be enough benchmarks to show how well Intel's lineup fares against AMD before Ampere comes out in June. Sitting on a 6700K atm, I'm still excited about this release, but I get that people who've already moved to Intel's 9th gen or Zen 2 most likely won't have much interest in such an upgrade.


Not sure what you are talking about. The 10900 (and other 10th-gen CPUs) have been benchmarked already. You know what these 10th-gen Intel CPUs can do. New chipset, new socket. No surprise here, though.
According to the OP, the new chipset will mostly have power-efficiency features in place and better OC for cores and memory. We will see how the last one pans out.
What is interesting is: why wait till April? Does Intel need to design the chipset first? I thought everything for the 10th gen was planned way in advance, but I guess it wasn't.



Metroid said:


> hahahahhahahahahhha, def 1st april hahahahahaha


I wonder what the April release refers to here: the release date of the new chipset as an April Fools' joke, or the chipset itself, including 10th gen, being the April Fools' joke for those who buy it.


----------



## GreiverBlade (Dec 10, 2019)

". The platform also mentions an interesting specification: "enhanced core and memory overclocking." This could be the secret ingredient that makes the i9-10900K competitive with the likes of the Ryzen 9 3900X. "
ohhhh, i read it like that 

"intel will make it compete with their own HEDT platform" ... since a R9 3900X/3950X already beat the I9-10980XE (yeah yeah i know i know .... Mainstream CPU are not HEDT CPU.... still does not change the fact that it can beat it soundly in most case scenario ... )

Intel is still sitting on their hands, it seems ...


----------



## laszlo (Dec 10, 2019)

Intel will sell no matter the node & performance vs. the competition; they can afford to pump a lot of money into marketing, discounts, etc., as usual.

The "Pentium is better" mindset is still present.


----------



## Basard (Dec 10, 2019)

Could be compatible with Rocket Lake..... who knows? Amazingly, the 9900K is compatible with Coffee Lake mobos. Then there was the whole 6700K-to-7700K nonsense.


----------



## Flanker (Dec 10, 2019)

"14nm"
"new socket"

I just threw up...


----------



## Vortigaunt (Dec 10, 2019)

So your i9 9900K/KF/KS will become i7. Bwahaha...


----------



## Zach_01 (Dec 10, 2019)

Oh, this is nothing. The one-time king of the hill (3 years ago), the i7 7700K, has become a humble i3...
If AMD had not released the Zen platform, Intel could still be marketing a 4c/8t as an i7 in 2020.


----------



## jgraham11 (Dec 10, 2019)

I wonder what CPU bugs lie waiting for unsuspecting customers, just like with Core 2 Duo, Sandy Bridge, Ivy Bridge, Haswell, Broadwell, Skylake, Coffee Lake, even the latest Cascade Lake.

Maybe we just say all of them!

From Skylake forward, Intel knew they were selling faulty CPUs... you might as well just disable Hyper-Threading now...


----------



## KarymidoN (Dec 10, 2019)

i wonder what AMD will bring next, looks like there's only one company trying to bring new stuff to the market anyways...


----------



## bonehead123 (Dec 10, 2019)

WORD:

f.A.i.L....

I'm still rockin a 6700k, and unless & until blue boys get their sh*t together & make something with 7nm++, pcie 4/5, AND decent priced mobo's to go with them, then I'm gonna stay put for now....


----------



## Darmok N Jalad (Dec 10, 2019)

I welcome these products. Whatever they are priced, AMD will probably counter with a price cut, so it’s a win for all. I got a 2700X for just $129 on Black Friday. Yes, 8C/16T for $129!

And this new standby feature reminds me of Apple’s Power Nap feature, which has been around since 2013.


----------



## KarymidoN (Dec 10, 2019)

bonehead123 said:


> WORD:
> 
> f.A.i.L....
> 
> I'm still rockin a 6700k, and unless & until blue boys get their sh*t together & make something with 7nm++, pcie 4/5, AND decent priced mobo's to go with them, then I'm gonna stay put for now....



"decent-priced mobos" and "new Intel CPUs" don't sound right in the same sentence... idk if it's my bad English, sorry


----------



## AeonMW2 (Dec 10, 2019)

Object55 said:


> Should have kept it on z390, at least people with existing boards would have bought it. But now, nobody cares.



I will buy this. I am sitting on a high-refresh-rate 1080p monitor and need 140+ FPS in every game. If the 10900K is priced at $500 it will be a killer for high-refresh-rate gaming.
The 9900K is fine too, but this has 2 more cores for probably the same price.


----------



## spnidel (Dec 10, 2019)

Turmania said:


> Who are you trying to fool? It's not as if AMD stays within their TDP specs; in fact they are worse than Intel in that respect. And there's the boost-speed fracas, the somehow slow boot-ups and countless BIOS headaches... I did not want to go over this, but since it has been brought up: there is this ill-fated turning of a blind eye to one company, and they get away with everything... AMD is no whiter than white; in fact they are darker than Intel is.



yeah dude, my 3900x (105w tdp, 12 cores) running at max load at 4.375ghz consuming only 130W is way worse than the 9900K (95w tdp, 8 cores) consuming 200W at 5ghz lmao
the 9900K stays 100% in spec, meanwhile the 3900x? absurdly out of spec
also, what bios headaches? so far I've had no problems with anything related to bios with the 3900x
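For what it's worth, the arithmetic behind that comparison, using the ballpark full-load figures quoted in this thread (measured by users, not official specs):

```python
# Full-load figures as quoted in this thread; ballpark, not official specs.
chips = {
    "Ryzen 9 3900X": {"tdp_w": 105, "cores": 12, "measured_w": 130},
    "Core i9-9900K": {"tdp_w": 95,  "cores": 8,  "measured_w": 200},
}

for name, c in chips.items():
    over_tdp = c["measured_w"] / c["tdp_w"] - 1   # excess over rated TDP
    per_core = c["measured_w"] / c["cores"]       # watts per core
    print(f"{name}: {over_tdp:+.0%} over TDP, {per_core:.1f} W/core")
```

With these numbers the 3900X lands about 24% over its TDP at roughly 10.8 W/core, while the 9900K is over 100% above TDP at 25 W/core, which is the point being made.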


----------



## neatfeatguy (Dec 10, 2019)

notb said:


> 10900K itself doesn't look half bad if priced in line with AMD. At $500 it'll sell like hot cakes.
> As usual, it's more important how the high volume, mid range models stack up ($150-300 range).



What Intel tells consumers:
"We put out the 10900K at $529 and she sold out like hot cakes! They're moving so fast off the shelves we can't keep them in stock!"

What Intel isn't telling consumers:
Those couple of wafers that yielded about 50 good CPUs for the 10900K - we sent one to each state in the US and they already sold. There's at least 1 rube in every state, let's try to push out more!

What Intel tells consumers:
"Our recent supply shortages has no impact on the availability of our 10th gen CPUs. We are working hard to keep up with demand and we are busy working on filling orders as fast as possible to get these amazing processors in everyone's hands."

What Intel isn't telling consumers:
*Frank*: Where'd you guys put the key to get into our fab? It's been 12 months and no one has found the key yet. Those ES we marked as legit CPUs are all sold out and we need to start mass producing. Someone please tell me you found the key.
*Bob*: (from the back of the crowd, you hear an excited yell) I have a key!
*Frank*: Is that Bob? Hurry up here and open the door so we can actually start to make more.
*Bob*: Key doesn't work....Hmmm. I'll try it on the side entrance. Wait here. (Bob runs off around the end of the building. A moment later you hear Bob yelling) Everyone! The key works on this door! Hurry up!
*Frank*: Good job, Bob. Let's get in and get to work....(Frank swings open the door and silence sweeps over the crowd of employees as they enter the building and turn on the lights)
*Bob*: Frank? What is this place? Doesn't look like it's been used for a few years...the dust on all the equipment is kind of thick.
*Frank*: Well....seems like 22nm is back on the plate, boys. Let's at least get some Haswell back out there since we seem to be locked out of our 14nm labs.


Hahaha....all kidding aside. Hopefully Intel and AMD keep up the competition. Financially I'm in no spot to upgrade anytime soon so neither Intel nor AMD are really on my radar. Maybe by the time AMD's next Zen generation comes out I will have some money to finally upgrade to something newer than my 4670k.


----------



## ppn (Dec 10, 2019)

AeonMW2 said:


> I will buy this. I am sitting on a high-refresh-rate 1080p monitor and need 140+ FPS in every game. If the 10900K is priced at $500 it will be a killer for high-refresh-rate gaming.
> The 9900K is fine too, but this has 2 more cores for probably the same price.



Unlikely that more than 6 cores makes a >10% difference. But considering the 6-core will be carved out of the 10-core die, such a waste; better to buy the 10-core. And then what? Willow/Golden Cove arrives with the 50% IPC uplift and the old CPU goes up for resale at half the price of an i3-11100. Better to wait for the chase to settle down.


----------



## fancucker (Dec 10, 2019)

Eagerly waiting to see if the 10900K can beat the vaunted 9350K on userbenchmark. They need to issue a press statement on the updated necessity for higher core counts.


----------



## TheDeeGee (Dec 10, 2019)

Sandy Bridge performance after all security fixes are applied i guess?


----------



## notb (Dec 10, 2019)

TheDeeGee said:


> Sandy Bridge performance after all security fixes are applied i guess?


Yeah, putting aside that Coffee Lake is already 50% faster than Sandy Bridge in single thread and you can get twice as many cores on mainstream platform, it's pretty much the same.


----------



## GoldenX (Dec 10, 2019)

TheDeeGee said:


> Sandy Bridge performance after all security fixes are applied i guess?








The Mitigation Impact Difference On AMD Ryzen 9 3900X vs. Intel Core i9 9900K Performance - Phoronix (www.phoronix.com)

In some areas, the difference is quite big.


----------



## Chrispy_ (Dec 10, 2019)

Three questions (well okay, four, because Q2 is a two-parter):

1. Does this finally have enough hardware mitigations for speculative-execution attacks? Without them, all these HyperThreading improvements are meaningless, because the vulnerability patches hurt performance and, for a secure system, HT needs to be disabled _in its entirety_.
2. Why do we need a new socket? It's still using DDR4, not DDR5 (that's AMD's excuse for changing to socket AM5), and it's still using PCIe 3.0. Is Intel unable to physically fit more than 8 cores on a package at 14 nm?
3. Is 10C the upper limit, or are there hints that Intel has 12C and 16C models in the pipeline for later in 2020?


----------



## The Egg (Dec 10, 2019)

Object55 said:


> Should have kept it on z390, at least people with existing boards would have bought it. But now, nobody cares.


Yeap.  Folks such as myself might have considered them, were they a drop-in on current boards.  Forget it now.


----------



## Patriot (Dec 10, 2019)

price it <$300 and I might be interested, otherwise fuck it.


----------



## notb (Dec 10, 2019)

Chrispy_ said:


> 1. Does this finally have enough hardware mitigations for specultaive-execution attacks? Without them, all these hyperthreading improvements are meaningless because the vulnerability patches hurt performance and for a secure system, HT needs to be disabled _in its entirety_.


It'll have more than earlier generations. Hard to say at the moment.
Disabling HT does NOT make a system secure; the CPU has other security flaws that haven't been found or revealed yet.


> 2. Why do we need a new socket? It's still using DDR4, not DDR5 (that's AMD's excuse for changing to socket AM5) and it's still using PCIe 3.0. Is Intel unable to physically fit more than 8 cores on a package at 14nm?
> 3. Is 10C the upper limit or are there hints that Intel have 12C and 16C models in the pipeline for later in 2020?


They're adding pins. This could be because of added PCIe, higher IGP transfers or other features unknown at this point.
Socket/package size is the same as in LGA1151 and LGA1150.

As for socket size being a limiting factor for core count...
A 4-core Sandy Bridge die was 24% larger than an 8-core Coffee Lake die (per WikiChip).
Without the GPU, 12 cores should fit in Sandy Bridge's 216 mm².
And of course both the cores' and the cache's shape/size can change, so only Intel and its partners know the answer at this point.
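A back-of-envelope version of that argument (the 216 mm² figure and the 24% ratio are from the post above; the iGPU and uncore areas are rough guesses of mine, not measured numbers):

```python
sb_4c_area = 216.0               # mm², 4-core Sandy Bridge die (WikiChip)
cl_8c_area = sb_4c_area / 1.24   # about 174 mm², since SB was 24% larger

gpu_area = 45.0                  # rough guess: GT2 iGPU block
uncore   = 35.0                  # rough guess: system agent, memory I/O, etc.
core_area = (cl_8c_area - gpu_area - uncore) / 8   # per core + L3 slice

budget = sb_4c_area - uncore     # an SB-sized die with the GPU dropped
print(f"~{core_area:.1f} mm2 per core, ~{int(budget // core_area)} cores fit")
```

Even with pessimistic guesses for the fixed blocks, well over 12 cores fit in a Sandy Bridge-sized die, which is the point being made above.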


----------



## ppn (Dec 10, 2019)

We can get 16 cores, but it would sacrifice the iGPU; it can't fit otherwise, it's outside the limits. And it would draw 500 W in Prime95 at 5.00 GHz; the 9900K already draws 250 W.

For now we only get 10 cores; next year 12. Every year it's +2, then +2.


----------



## torsoreaper (Dec 10, 2019)

Still on DMI 3.0, a 5 year old technology...  Great job Intel, way to push the envelope.
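For context, DMI 3.0 is electrically equivalent to a PCIe 3.0 x4 link, so everything hanging off the chipset shares roughly 3.9 GB/s per direction. The arithmetic (8 GT/s per lane, 128b/130b encoding):

```python
raw_rate   = 8e9        # transfers/s per lane (PCIe 3.0 / DMI 3.0)
efficiency = 128 / 130  # 128b/130b line-encoding overhead
lanes      = 4          # DMI 3.0 is a x4 link

bandwidth_gbps = raw_rate * efficiency * lanes / 8 / 1e9  # bytes, not bits
print(f"{bandwidth_gbps:.2f} GB/s per direction")  # 3.94 GB/s per direction
```

That uplink is shared by SATA, USB, chipset NVMe slots and the NIC, which is why people keep asking for something newer.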


----------



## GoldenX (Dec 10, 2019)

torsoreaper said:


> Still on DMI 3.0, a 5 year old technology...  Great job Intel, way to push the envelope.


Run, the blue squad is coming for you.


----------



## Steevo (Dec 10, 2019)

I figured they would glue some dies onto the NB for morez corez, but that would be different from their strategy of "let's make a new socket, and the fanbois will rejoice as they pay", or whatever it is they are doing now, since they have no new process and are also restarting production of older chips.


----------



## Nater (Dec 10, 2019)

Still on a 6700K at work.  Boss has green-lit a new build for the end of January.  I can't see waiting around for this.  Probably go 9900KS or maybe 3900X.  Lean Intel because SolidWorks wants clockspeed/IPC, not cores.

Not seeing any reason to wait till April to spend even MORE capital on 2 more cores I don't need.


----------



## chodaboy19 (Dec 10, 2019)

How many PCIe lanes?


----------



## Chrispy_ (Dec 10, 2019)

ppn said:


> We can get 16 cores, but it would sacrifice the iGPU; it can't fit otherwise, it's outside the limits. And it would draw 500 W in Prime95 at 5.00 GHz; the 9900K already draws 250 W.
> 
> For now we only get 10 cores; next year 12. Every year it's +2, then +2.



10nm would give them enough space
also, LOL; 10nm.


----------



## efikkan (Dec 10, 2019)

Chrispy_ said:


> Why do we need a new socket? It's still using DDR4, not DDR5 (that's AMD's excuse for changing to socket AM5) and it's still using PCIe 3.0. Is Intel unable to physically fit more than 8 cores on a package at 14nm?


Mostly to redesign it to be more suitable for higher power draw.
I don't think it will be much larger.



Chrispy_ said:


> Is 10C the upper limit or are there hints that Intel have 12C and 16C models in the pipeline for later in 2020?


Physically, Intel could easily have fitted 16 cores on this socket (in a 4x4 mesh), but it's not really practical nor is it really needed. On 14nm, their i9-9900K is already throttling unless the power limit is removed.

While it's good to have more than 4 cores on the mainstream platform, we really don't need 12-16 cores for non-server software (except for special use cases). Most applications will scale much better on a 30% faster CPU vs. a CPU with 30% more cores. Intel needs to focus on bringing their new architectures to the market rather than winning "the core race".
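Amdahl's law makes that concrete: extra cores only speed up the parallel fraction of a workload, while a clock/IPC bump speeds up everything. A quick sketch (the parallel fractions are illustrative, not measured):

```python
def speedup_from_cores(parallel_fraction: float, core_ratio: float) -> float:
    """Amdahl's law: the serial part gains nothing from extra cores."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / core_ratio)

clock_speedup = 1.30  # a 30% faster CPU speeds up the whole workload
for p in (0.5, 0.8, 0.95):
    core_speedup = speedup_from_cores(p, 1.30)  # 30% more cores instead
    print(f"parallel={p:.0%}: cores give {core_speedup:.2f}x vs {clock_speedup:.2f}x")
```

Only a workload that is 100% parallel ties the two, which is why "faster beats wider" holds for typical desktop software.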


----------



## medi01 (Dec 10, 2019)

btarunr said:


> to match iPad


Come on, which freaking iPad... Did you mean to say "a tablet"?


----------



## R0H1T (Dec 10, 2019)

notb said:


> *They're adding pins*. This could be because of *added PCIe*, higher* IGP transfers* or other *features unknown* at this point.
> Socket/package size is the same as in LGA1151 and LGA1150.


You know that's BS; a large number of OC records on the 8700K were set on previous-gen Zxxx boards.

For what exactly? They're still limited by DMI 3.0, *IIRC*.

Nope.

The only "unknown" feature they're adding is *CNVi*, and I question its much-touted *utility* on desktops.


----------



## Tomorrow (Dec 10, 2019)

Turmania said:


> Disappointed with Intel recently, but rest assured it will still easily best any Ryzen 4xxx CPUs launched next year, and by a bigger margin than this year as far as gaming is concerned, with no misleading advertisement on the product.


Ladies and gentlemen, we have a Nostradamus here. I seriously hope this is a deliberate troll post and you're actually not that stupid.


----------



## notb (Dec 10, 2019)

ppn said:


> We can get 16 cores, but it would sacrifice the iGPU; it can't fit otherwise, it's outside the limits. And it would draw 500 W in Prime95 at 5.00 GHz; the 9900K already draws 250 W.


It would not draw double what the 9900K does, in the same way the 3900X doesn't draw 250W at full load. It would be power-limited and use clever load allocation.

Also, why would anyone run a 9900K at a constant 5 GHz? It makes absolutely no sense. And that's exactly why people think Intel CPUs draw so much power.


----------



## scouserpcgamer (Dec 10, 2019)

I am hoping PCIe 4.0 is supported, same as on AMD boards; I think it will be a massive mistake if the new motherboards only support PCIe 3.0 for SSDs.


----------



## Darmok N Jalad (Dec 10, 2019)

notb said:


> It'll have more than earlier generations. Hard to say at the moment.
> Disabling HT does NOT make a system secure. CPU has other security flaws that haven't been found or revealed yet.
> 
> They're adding pins. This could be because of added PCIe, higher IGP transfers or other features unknown at this point.
> ...


Except Intel probably wants to produce today’s CPUs at less than Sandy Bridge sizes, as production costs and overhead costs increase over time. Not only that, Intel can’t price these like they want to, so the smaller the node, the better.


----------



## candle_86 (Dec 10, 2019)

Hmm, the i3 looks like the i7 from 2 years ago; this should tank the used market entirely.


----------



## EarthDog (Dec 10, 2019)

efikkan said:


> While it's good to have more than 4 cores on the mainstream platform, we really don't need 12-16 cores for non-server software (except for special use cases). Most applications will scale much better on a 30% faster CPU vs. a CPU with 30% more cores. Intel need to focus on bringing their new architectures to the market rather than winning "the core race".


Intel doesn't seem to be focused on a core race. They aren't stuffing 18c/36t CPUs down the mainstream pipe when 95% of consumers are good with half that (and will be for the next few years). I think their next gen will go up to 10c/20t on the mainstream platform. Who knows moving forward.

Blame AMD for starting a core race in the first place. They are the ones who brought this HCC crap to the mainstream segment.
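The quoted claim that most applications scale better on a 30% faster CPU than on a CPU with 30% more cores can be made concrete with Amdahl's law. A minimal sketch; the 40% serial fraction and the core counts are illustrative assumptions, not measured figures:

```python
def runtime(serial_frac, cores, clock=1.0):
    """Relative runtime under Amdahl's law: the serial part runs on one
    core, the parallel part is split across all cores; both scale with clock."""
    return (serial_frac + (1.0 - serial_frac) / cores) / clock

base = runtime(0.4, 10)                 # hypothetical 10-core baseline
more_cores = runtime(0.4, 13)           # +30% cores
faster_clock = runtime(0.4, 10, 1.3)    # +30% clock

print(f"+30% cores: {base / more_cores:.2f}x speedup")   # ~1.03x
print(f"+30% clock: {base / faster_clock:.2f}x speedup") # 1.30x
```

With a workload that is 40% serial, 30% more cores buys only about 3%, while a 30% higher clock buys the full 30%; the more serial the workload, the starker the gap.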


----------



## candle_86 (Dec 10, 2019)

So does this mean the Celeron can finally get an upgrade to 2c/4t, if the Pentium is 4c/4t and the i3 is 4c/8t?


----------



## notb (Dec 11, 2019)

candle_86 said:


> So does this mean the Celeron can finally get an upgrade to 2c/4t, if the Pentium is 4c/4t and the i3 is 4c/8t?


Celerons serve their purpose in cheap, low power applications (general servers, NAS, HTPC, NUC etc).
There's very little need for more multi-thread performance in this segment... and even less will to sacrifice single thread potential.

It's hard to say how Intel will segment their next generation into traditional product lines. They'll have to stretch something. But it's extremely unlikely that 2-core, non-HT CPUs won't be included.


----------



## Darmok N Jalad (Dec 11, 2019)

notb said:


> Celerons serve their purpose in cheap, low power applications (general servers, NAS, HTPC, NUC etc).
> There's very little need for more multi-thread performance in this segment... and even less will to sacrifice single thread potential.
> 
> It's hard to say how Intel will segment their next generation into traditional product lines. They'll have to stretch something. But it's extremely unlikely that 2-core, non-HT CPUs won't be included.


I dunno, once they drop so far, the Atom-based architecture is able to step in. I think a 2C/2T Core-based architecture is finally being eclipsed by Gemini Lake in most tasks. I had an Apollo Lake quad core that could handle a 4K stream. It’s a low bar, but we’re talking about really small dies and super cheap prices.


----------



## Prima.Vera (Dec 11, 2019)

14+(+) nm, PCIe 3.0, 16 PCIe lanes, new chipset.....


----------



## Zach_01 (Dec 11, 2019)

Oh come on, people... cut Intel some slack... Hasn't it taken enough "in the head" already, struggling to spit out a modern, working chip/platform? Does it need this kicking while it's down? _yes!_



medi01 said:


> Come on, which freaking ipad... Did you want to say "a tablet"?


No, he said and meant iPad... If you've never owned one, then you don't know how these "tablets" work and behave.


----------



## Prima.Vera (Dec 11, 2019)

No really, what would be the difference then between the *K* and the *X* series?!??
Please don't say the chipset and the number of PCIe lanes only...


----------



## viandyka (Dec 11, 2019)

AMD only needed three generations of Ryzen,

and Intel just screws around with four "NEW" Intel products xD, and soon a fifth Skylake refresh.


----------



## GoldenX (Dec 11, 2019)

Prima.Vera said:


> No really, what would be the difference then between the *K* and the *X* series?!??
> Please don't say the chipset and the number of PCIe lanes only...


Price.


----------



## Darmok N Jalad (Dec 11, 2019)

Prima.Vera said:


> No really, what would be the difference then between the *K* and the *X* series?!??
> Please don't say the chipset and the number of PCIe lanes only...


Don't the X series have quad channel memory? They are based on the Xeon line, not the desktop line.


----------



## Shatun_Bear (Dec 11, 2019)

This is sad from Intel.

April 2020, and it'll be a couple of months away from going up against Ryzen 4000, which is two generations ahead in terms of process node and, well, even performance, given the rumours.

- Fewer cores
- Lower IPC
- Much less efficient
- Slower multithreading
- Slower single core
- Loses in 720p gaming with 2080 Ti tests (likely, given the +10% IPC and +5% frequency rumours for the 4000 series).

But 5 GHz all-core drawing 450 W. Yay Intel!
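Whatever the true wattage turns out to be, the direction of claims like this follows from CMOS dynamic power scaling roughly as P ∝ C·V²·f: voltage has to rise with frequency, so all-core overclocks cost power superlinearly. A rough sketch with illustrative (not measured) voltage/frequency points:

```python
def rel_dynamic_power(freq_ghz, vcore):
    """Relative CMOS dynamic power: P ~ C * V^2 * f (the capacitance C
    cancels when comparing the same chip against itself)."""
    return vcore ** 2 * freq_ghz

stock = rel_dynamic_power(4.0, 1.00)  # hypothetical all-core stock point
oc = rel_dynamic_power(5.0, 1.30)     # +25% clock, but much higher Vcore

print(f"clock gain: {5.0 / 4.0:.2f}x")   # 1.25x
print(f"power cost: {oc / stock:.2f}x")  # ~2.11x
```

A 25% clock bump paired with a large voltage bump roughly doubles dynamic power, which is why fixed all-core 5 GHz figures look so much worse than stock boost behavior.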


----------



## Chrispy_ (Dec 11, 2019)

Shatun_Bear said:


> This is sad from Intel.
> 
> April 2020, and it'll be a couple of months away from going up against Ryzen 4000, which is two generations ahead in terms of process node and, well, even performance, given the rumours.
> 
> ...


Don't forget that Skylake, rebranded for the 5th time, is still a proven high-risk architecture with a fundamentally exploitable design and new exploits popping up to haunt it faster than they're patched.

On the other side, AMD's frequent architectural jumps mean that by the time hackers glean enough info about Zen 2, 3, or 4 to start exploiting it, AMD will have moved on to a newer architecture anyway.


----------



## notb (Dec 11, 2019)

Darmok N Jalad said:


> I dunno, once they drop so far, the Atom-based architecture is able to step in. I think a 2C/2T Core-based architecture is finally being eclipsed by Gemini Lake in most tasks. I had an Apollo Lake quad core that could handle a 4K stream. It’s a low bar, but we’re talking about really small dies and super cheap prices.


Actually, it's the opposite. Consumer Atoms aren't developed anymore. Intel replaced them with the Celeron lineup (e.g. this is what you get in ~$300 laptops right now).

Server Atoms (C-series) are still in production, but haven't been updated since 2018.

Ultimately the Atom lineup will be replaced by ARM.


----------



## Darmok N Jalad (Dec 11, 2019)

notb said:


> Actually, it's the opposite. Consumer Atoms aren't developed anymore. Intel replaced them with the Celeron lineup (e.g. this is what you get in ~$300 laptops right now).
> 
> Server Atoms (C-series) are still in production, but haven't been updated since 2018.
> 
> Ultimately the Atom lineup will be replaced by ARM.


I'm not talking about Atom the brand, but the architecture. Today, it is called Gemini Lake. Intel has been gradually adding IPC to the original architecture. I just can't recall the code name. Something that ends in "mont."


----------



## HenrySomeone (Dec 11, 2019)

ZoneDymo said:


> Soooo this is another 7700k with again, 2 more cores added?


And still faster than AMD in most things (when comparing same-core-count CPUs), but especially in gaming.


----------



## notb (Dec 11, 2019)

Darmok N Jalad said:


> I'm not talking about Atom the brand, but the architecture. Today, it is called Gemini Lake. Intel has been gradually adding IPC to the original architecture. I just can't recall the code name. Something that ends in "mont."


Atom and the Celeron/Pentium J- and N-series were all based on the same architecture (Goldmont): super small cores, up to 16C in a package smaller than LGA1151.
The consumer Atom lineup was dropped.

I'm not sure what will happen when Tremont arrives.
I've seen rumors that server chips (and everything with ECC) will be unified under the Xeon brand...


----------



## Steevo (Dec 11, 2019)

HenrySomeone said:


> And still faster than AMD in most things (when comparing same-core-count CPUs), but especially in gaming.


Wrong.

Intel is faster at out-of-order operations, due to AMD using a chiplet design.

When gaming at 1080p or above, AMD is neck and neck. At 720p or lower Intel wins, but who games at that resolution?

AMD is significantly faster at 80% of other actual work due to more cores and more cache.









Intel Core i9-9900KS Review - Impressive 5 GHz (www.techpowerup.com)

The Core i9-9900KS is Intel's new consumer flagship processor. It runs at 5 GHz boost no matter how many cores are active, which translates into 10% application performance gained over the 9900K. Gaming performance is improved too, but pricing is high, especially compared to what AMD is offering.




Do you even read reviews?


----------



## Shatun_Bear (Dec 11, 2019)

HenrySomeone said:


> And still faster than AMD in most things (when comparing same-core-count CPUs), but especially in gaming.



3.8% faster (using TPU's own numbers after the chipset update) at 1080p gaming using a 2080 Ti, whilst consuming more power, being far less efficient, and losing in multithreaded tasks by way more than 3.8%, right? I'll take the Ryzen, thanks.


----------



## John Naylor (Dec 11, 2019)

Shatun_Bear said:


> This is sad from Intel.
> 
> April 2020 and it'll be couple months away from going up against Ryzen 4000 which is two generations ahead in terms of process node and well, even performance given the rumours.
> 
> ...



I just picked a random post that centered on specs, so I'm not addressing you directly, but rather the mindset that puts specs ahead of performance.

Rumors - not relevant
Cores - don't care
IPC - don't care
Efficiency - don't care
Multithreading - don't care
Single core - don't care
720p gaming - who plays @ 720p? ... and "likely" has no place in real-world discussions.

All that is relevant is how fast a CPU runs the apps they use and the games they play. And I won't pay any attention to any new release till I see it tested here... and then only in the apps I actually use and the games I actually play... chest beating and benchmark scores are meaningless. Fanboys beating their chests about the chip they like being faster at tasks they never or rarely do isn't relevant. When I looked at the 9900KF versus the 3900X test results here on TPU... here's what I see:

3900X kicks tail in rendering, which would be relevant if my user did rendering.
3900X kicks tail in game and software development, which would be relevant if my user did those things.
3900X shares wins in browser performance, but the differences are too small to observe anyway.
3900X kicks tail in scientific applications, which would be relevant if my user did scientific work.
3900X shares wins in office apps, but the differences (0.05 seconds) are too small to affect user experience.
3900X loses in Photoshop by 10 seconds ... finally, something that matters to my user.
Skipping a few more things 99% of us don't ever do.
File compression / media encoding ... also not on the list.
Encoding ... the user does an occasional MP3 encode, and the 3900X trailing by 12 secs might be significant if it were more than an occasional thing.
3900X loses in overall 720p game performance by 7% ... as he plays at 1440p, it's entirely irrelevant.
3900X loses in 1440p overall game performance by 2.5% ... not a big deal, but 2.5% is a bigger advantage than most of what we have seen so far.
3900X loses all the power consumption comparisons ... by 29 watts in gaming.
3900X runs 22°C hotter.
3900X doesn't OC as well.
3900X is more expensive.

AMD did good w/ the 3900X ... but despite the differences in cores, IPC, die size, whatever ... the only thing that matters is performance. There are many things that the 3900X does better than the 9900KF, but most users aren't doing those things. You can look at a football player and how much he can bench or how fast he runs the 40 ... but none of those things determine his value to the team, his contribution to the score, or how much he gets paid.


----------



## r9 (Dec 11, 2019)

Those chiplets are what kicked Intel's ass.
With the positions reversed, AMD offered the Phenom I as the TRUE quad core vs. two C2Ds glued together into the Q6600.
Guess what: we don't care what's true and what's not, it's all about performance, and this time around AMD is the one who made those chiplets work.
And for Intel it's not just about going to 10/7 nm, because I bet they won't be able to hit 5 GHz on those nodes for years to come.
So for them it will be a step back at first, but I'm sure they will bounce back eventually, like they did with the Core 2 Duo by getting "inspired" by AMD's architecture.
And guys, don't forget ARM is coming: the 8cx...


----------



## Tomorrow (Dec 12, 2019)

John Naylor said:


> I just picked a random post that centered on specs, so I'm not addressing you directly, but rather the mindset that puts specs ahead of performance.
> 
> Rumors - not relevant
> Cores - Don't care
> ...


Let's put it this way. Would you buy a CPU that does one thing really well (gaming at lower resolutions), or would you buy a CPU that does gaming <10% worse but everything else 10-50% better than the other CPU?

The 3900X is more well-rounded for all tasks, not just one. And more secure. The difference in price is rather small; the 9900 variants should cost no more than the 3700X, not what they are now.
Heat, I would say, is more of an issue for Intel due to higher power consumption. The 3900X does not need to OC well, because out of the box it already boosts to its highest speed, and with higher IPC it can afford to run at lower clocks. People need to let go of the 5 GHz-or-bust mentality. Remember, Bulldozer was also 5 GHz. I hope no one is missing that.


----------



## notb (Dec 12, 2019)

Tomorrow said:


> Let's put it this way. Would you buy a CPU that does one thing really well (gaming at lower resolutions), or would you buy a CPU that does gaming <10% worse but everything else 10-50% better than the other CPU?


If I were buying a CPU for gaming, i.e. gaming was the only task where I really wanted my PC to shine?
Of course I would buy the CPU that wins in games, even if it gives just a few fps more. It doesn't matter if the other CPU can render 3D 50% faster. Why would it? Am I really building a PC for gaming, or am I more into screenshots from Cinebench?
This is exactly the problem with many people here. They buy the wrong CPU.

Let's say you're buying a GPU for gaming and you play just 3 games that TPU - luckily - tests in their reviews.
Would you buy the GPU that wins in these 3 games, or the one with the better average over 20 titles?

I think this is also why some people didn't understand the low popularity of the 1st EPYC CPUs. They had good value and looked very competitive in many benchmarks.
But they fell significantly behind in databases. So the typical comment on gaming forums was: but it's just one task. Yeah, but it's the task that 95% of real-life systems are bought for (or at least limited by).


> 3900X is more well rounded for all tasks. Not just one.


Whenever I see an argument like this one, I ask the same question.
I assume you game, right?
What are the 3 other tasks that you frequently do on your PC? But just those that you're really limited by performance.

Because having 56 vs 60 fps in games may not seem like a lot, but it's a real effect that could affect your comfort.

Most people can't name a single thing.
Some say crap like: they encode videos. A lot? Nah, a few times a year, just a few GB.


----------



## GoldenX (Dec 12, 2019)

Long live the i3 9350, why bother with anything else then?
I game, in emulators as well, and Skylake is starting to become useless there too.


----------



## Tomorrow (Dec 12, 2019)

notb said:


> If I were buying a CPU for gaming, i.e. gaming was the only task where I really wanted my PC to shine?
> Of course I would buy the CPU that wins in games, even if it gives just a few fps more. It doesn't matter if the other CPU can render 3D 50% faster. Why would it? Am I really building a PC for gaming, or am I more into screenshots from Cinebench?


The "PC for gaming" statement is funny to me every time. Like, really. You never open a browser? You never unpack anything game related (like mods)? You never run any other programs on this PC? Obviously you do, unless you run cracked games that do not require launchers.

And this gap between Intel and AMD is greatly exaggerated in reviews due to low resolutions and using the fastest GPU around. I bet most of these people buying an i7 or i9 for "only gaming" also do a bunch of other stuff (even if lightly threaded) and never notice the minuscule performance difference with the naked eye vs AMD. This is not an FX vs Sandy Bridge or Ryzen 1xxx vs 7700K situation any more, where you could easily tell the difference. Since AMD is so close in performance and much better in nearly everything else, people are buying AMD 9 to 1 compared to Intel.

Plus there is the matter of priorities. For gaming, the GPU is always #1. A person with a cheaper R7 3700X and an RTX 2080S will always achieve better performance than the next guy with an i9-9900KS and an RTX 2070S. The only case where getting the i9 for gaming makes any sense is when money is not a problem and the person already owns a 2080 Ti.



notb said:


> Let's say you're buying a GPU for gaming and you play just 3 games that TPU - luckily - tests in their reviews.
> Would you buy the GPU that wins in these 3 games, or the one with the better average over 20 titles?


The one that gets the better average, obviously. I won't be playing those 3 games forever. Case in point: AMD cards that are really fast in some games like Dirt Rally 4 or Forza Horizon. These are outlier results. I can't and won't base my purchase on one-off results that are not representative of overall performance.



notb said:


> Whenever I see an argument like this one, I ask the same question.
> I assume you game, right?
> What are the 3 other tasks that you frequently do on your PC? But just those that you're really limited by performance.
> Because having 56 vs 60 fps in games may not seem a lot, but it's a real effect that could affect your comfort.
> ...


Isn't everything performance limited?
I would say the web browser is very much performance limited. I noticed a massive speed boost when launching Firefox after upgrading from a 2500K to a 3800X. Going to a 3900X or 3950X, I would be able to give Firefox even more threads to work with.
Also, I feel like I'm I/O limited and need a faster PCIe 4.0 NVMe SSD to replace my SATA SSD. Just waiting on Samsung to announce their client drives next year. Current Phison-controller-based drives are just a stopgap and not very compelling.
Also, network speed is becoming a major bottleneck for me. What can I say - VDSL 20/5 just does not cut it for me. Ideally I would upgrade to symmetrical 300/300 or 500/500 speeds. 1G/1G sounds nice, but then the web itself would become the bottleneck.


----------



## Patriot (Dec 12, 2019)

Darmok N Jalad said:


> Don't the X series have quad channel memory? They are based on the Xeon line, not the desktop line.


lool, Xeons are not some special silicon... and they span the whole lineup.
There are Xeon-Ws that parallel the HEDT LGA2066 platform, yes.
There are also Xeon E's on the LGA1151 socket.
So, Intel's desktop platform has whatever-lake with 2-channel memory,
HEDT has refresh-lake with 4-channel memory,
LGA3647 has stillnotfrozen-lake with 6-channel memory,
and the BGA abomination Glued-lake has 12-channel memory.
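Channel counts like these translate directly into peak theoretical bandwidth: each DDR4 channel is 8 bytes wide per transfer. A quick sketch, assuming DDR4-3200 on every platform purely for illustration:

```python
def peak_bandwidth_gbs(channels, megatransfers):
    """Peak theoretical DRAM bandwidth: channels * MT/s * 8 bytes per transfer."""
    return channels * megatransfers * 8 / 1000  # GB/s

# From a dual-channel desktop up to a 12-channel part, all at DDR4-3200.
for ch in (2, 4, 6, 12):
    print(f"{ch:>2}-channel DDR4-3200: {peak_bandwidth_gbs(ch, 3200):.1f} GB/s")
```

Dual channel already gives ~51 GB/s on paper, which is part of why extra channels show up in synthetic benchmarks long before typical desktop applications notice them.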


----------



## Manoa (Dec 14, 2019)

That is a lot of channels, but do you really need so many?
I've seen many tests done on channel counts, and from 2 to 4 there is a difference in synthetics, but real applications don't really increase in speed.
I think if Intel had an unganged mode, more channels could be an improvement.


----------



## toyo (Dec 14, 2019)

"Modern Standby" is not new; it was already available on (some?) Z390 boards.

Other than that, a big yawn. More 14 nm. I assume more 200-300 W power consumption with AVX, and probably impossible to fully stress-test your OCs unless you hit the super golden sample stuff. It's kind of hilarious to see so many people on the web complaining about how their 9900K throttles at 5 GHz MCE/OC if they dare to try Prime95 or LinpackX.

The sad part is that I saved money for what I assumed would be an 8/16 9700K. When I saw it's now an i9, I kept the cash. Looks like I'll keep the cash even longer, which is fine, until we get a CPU that you can play with and tweak without the caveats of the Ryzen platform (like lower clocks to go with undervolting, no OC headroom) or those of Intel's (super high power consumption to the point where it's impossible to cool the beast and stress-test OCs properly).


----------



## Give.me.lanes (Dec 16, 2019)

If there are fewer lanes on the 10900K, I'm just going to stick with the 10900X and overclock that. I don't understand how everyone is jumping on AMD's dick at the loss of PCIe lanes; I'm pretty sure you aren't getting a chip this big unless you are running NVLink @ 4K or using it as a WORKSTATION. Or do people not know how to build PCs anymore?

And c'mon, my Sandy at 5 GHz still rocks. Don't bash Sandy.


----------

