Friday, February 14th 2020

Intel Core i9-10900 10-core CPU Pictured

Intel's desktop Comet Lake-S lineup is close to release, and we are getting more leaks about the CPU models it contains. Perhaps the most interesting point about the Comet Lake-S series is that it brings both a frequency boost and a core-count increase, with the highest-end Core i9 processors going up to 10 cores. Thanks to Xfastest, a Hong Kong-based media outlet, we have the first pictures of what appears to be an engineering sample of the upcoming Core i9-10900 processor.

Being a non-K version, this CPU cannot be overclocked and has a fixed TDP rating of 65 W. Compared to the 125 W of K models like the upcoming Core i9-10900K, this CPU will output almost half the heat, thus requiring a less capable cooling solution. The CPU is installed in the LGA1200 socket, which is the new home for Comet Lake-S CPUs and provides backward compatibility with coolers supporting LGA1151. On the sample processor pictured below, we can see a marking on the CPU that implies a 2.5 GHz base clock. Previous rumors suggested that this CPU version has a 2.8 GHz base clock; however, this may be an early engineering sample, given that no official imprints are found on the CPU heat spreader.
Source: VideoCardz

106 Comments on Intel Core i9-10900 10-core CPU Pictured

#76
TheoneandonlyMrK
heflys20: Reminds me of why I stopped participating in gaming/PC forums for the most part. I got dragged into that "brand-centric" lifestyle. It made me stop caring, particularly with how fast-paced this industry is. Performance per dollar is the name of the game these days. Someone told me yesterday that my thought process was the equivalent of driving a Civic vs. his Bugatti, computer-wise. He's right; I couldn't imagine the stress of owning a Bugatti or the money required.
Like cars, technology you can mess with and race will always garner fans and negative arguments. I don't anymore, though; I don't think it's helpful, especially with poor opinions injecting bad ideology into the debate.

Stress is relative. I find the money for tech I want to mess with far easier than the money to pay the electric bill :p:):D
#77
heflys20
theoneandonlymrk: Stress is relative. I find the money for tech I want to mess with far easier than the money to pay the electric bill :p:):D
It's still enjoyable; I just don't have a point to prove through purchases anymore (computer-wise). LOL.
#78
JackCarver
theoneandonlymrk: And? So that's an issue you can't have: three heavier-loaded threads, with superfluous activity distributed across the rest?

What's curious to me is people's theories of how processors work these days and why Intel are better.

The boost algorithm, which both use and which is designed per SKU, also has some say.

Some of the most vocal against multi-core and Ryzen in general are hell-bent on keeping their four-core hyperthreaded chips relevant by downplaying multi-core Ryzens.

Let's see where that gets them.

It's the Aliens, simple.
For me, the Wolfenstein load distribution doesn't seem that good on the Ryzen 3900X, but that's only my opinion. I'm not against multi-core or HT, but I mean load distribution should be even across all cores, not a few at high load and the rest at low. The Intel 9900K shows a better, more even distribution.
#79
TheoneandonlyMrK
JackCarver: For me, the Wolfenstein load distribution doesn't seem that good on the Ryzen 3900X, but that's only my opinion. I'm not against multi-core or HT, but I mean load distribution should be even across all cores, not a few at high load and the rest at low. The Intel 9900K shows a better, more even distribution.
I understand that, but Ryzen specifically (and likely Intel too) has a neural network optimizing the work and throughput of the processor, and it decided it was best to run like that.
There are a few scientists and researchers versus that opinion, then.
And this also demolishes the ideology (quite true though it is) that some workloads will always favor one thread running as fast as possible.
I have gamed while crunching on the same signature rig without initially realising it, just because of how good thread and load management is these days.
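
A quick way to sanity-check the load-distribution claims in this exchange is to watch per-core utilization yourself while the game runs. A minimal sketch, assuming the third-party psutil package is installed (pip install psutil):

```python
# Sample per-logical-CPU utilization a few times; lopsided columns
# show exactly the kind of uneven distribution discussed above.
import psutil

for _ in range(5):
    loads = psutil.cpu_percent(interval=1, percpu=True)  # one % per logical CPU
    print(" ".join(f"{pct:5.1f}" for pct in loads))
```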
#80
oxrufiioxo
phanbuey: The futureproofing thing almost never works out, IMO. Sometimes, but very rarely. The only time it was REALLY spot-on was when we first went dual-core and the game got its own core, but since then it's been taking forever for games to use more threads, and clocks/cache/memory have reigned supreme.

We said it about the Q6600 ("oh, get that quad, future games will use more cores"), but dual cores were still the best for gaming, then the 1060T Phenom, then the 2600K, R7 1700, etc., etc. Basically, 4c/4t CPUs only started to show real limits in 2015, with the 4t i5 still being the optimal gaming choice. Right now 6t is starting to show its age, but I think it will be another few years before 8t/10t/12t starts to really limit anything, and by then you're on to more cores and completely different tech anyway.

You may be right, though; the 9700K is a little weird with its frame pacing and sometimes has issues in some games due to the high-performing-core/low-thread-count combo (Red Dead, GTA 4, Far Cry 5).
I would say the 2700K has lasted much longer than I expected... Also, getting 4.5+ years out of a 6700K isn't terrible. Really, the only CPU I can think of in recent memory that now struggles is the 7600K, which is only three years old and now loses in a lot of games to a CPU it dominated at launch. So really, from 2011 to 2017, getting an i7 over an i5 sort of paid off for futureproofing.
#81
Ruru
S.T.A.R.S.
neatfeatguy: You have to remember that HT years ago actually hindered the performance of some games, so folks did indeed disable HT for that very specific reason.

Things have improved greatly over the years, whether it's Intel, MS, or game developers (or some combination of them all) writing better software, so issues like HT causing problems don't seem to come up anymore, at least not that I'm aware of.
Yeah, but everyone I knew got a 2600K exactly for gaming.
#82
svan71
Hey guys, get a load of this pic of a processor I'm not getting. Wow, it's cool :)
#83
Melvis
Chloe Price: More cores is better in the view of marketing. There are still so many people who don't understand that much about computers, so more is better, of course! :D

Brings me back to the days when more VRAM was better and the Pentium 4 was better than the Athlon XP/64 because of the higher clock speed... :laugh:
Haha, yeah, very true, good point! The number of people those P4s fooled back then was unreal, and they still fool people today.
#84
Jism
neatfeatguy: You have to remember that HT years ago actually hindered the performance of some games, so folks did indeed disable HT for that very specific reason.

Things have improved greatly over the years, whether it's Intel, MS, or game developers (or some combination of them all) writing better software, so issues like HT causing problems don't seem to come up anymore, at least not that I'm aware of.
Disabling HT these days will only result in an overall performance loss, as seen on Bulldozer (where, you could say, resources were shared within a module) and on AMD's 2000 series; the 3000 series is no different. There's no noticeable drop in power usage when you disable it either; the way AMD implements it simply does a far better job of keeping the cores busy.
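
For anyone who wants to experiment with enabling or disabling SMT/HT, a minimal, Linux-only sketch of how to see which logical CPUs are siblings of each other, reading the kernel's standard sysfs topology (no third-party packages):

```python
# Group logical CPUs into physical cores via sysfs; SMT/HT siblings
# share a thread_siblings_list entry (e.g. "0,8" or "0-1").
from pathlib import Path

seen = set()
for topo in sorted(Path("/sys/devices/system/cpu").glob("cpu[0-9]*/topology")):
    siblings = (topo / "thread_siblings_list").read_text().strip()
    if siblings not in seen:
        seen.add(siblings)
        print(f"physical core -> logical CPUs {siblings}")
```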
#85
biffzinker
Melvis: Haha, yeah, very true, good point! The number of people those P4s fooled back then was unreal, and they still fool people today.
The Pentium 4 wasn't a complete disaster for Intel. It had a rough start as Willamette, then took off in its Northwood form. Prescott was when the P4 went in the wrong direction to push clock speeds up.
#86
ppn
Mainstream CPUs have hit 300 watts; if this is not the wrong direction, what is? To reach 5.3 GHz they sacrificed 300 MHz worth of IPC, not to mention the hardware mitigation fixes, which are worse than the software ones and can't be disabled. Better release Willow Cove already. Comet is worse than Prescott, but at least it isn't chiplet-based like the first dual-core.
#87
Basard
If they could promise an upgrade path, it might be worth looking into. At least they gave us that with the 8 and 9 series; it should have gone all the way back to Z170, though, but maybe I'm reaching a little far.

For all the bugs and whatnot, at least AMD seems to be trying to offer a nice upgrade path.
#88
ppn
You can't do that; there must be a new chipset every year and a new socket every two years. They must have some sort of deal with the motherboard makers, or be really lazy, not thinking ahead and just adding the extra 50 pins.

Also, it offers at least one more upgrade possibility, with Rocket Lake later this year.
#89
JackCarver
An upgrade path to the next generation would be nice, but as new CPUs draw more and more power, you also need better and better VRMs. Just look at the B450/X470 boards and compare them to X570 boards; there's a huge difference in VRM quality. So an X570 is definitely the better base for a big Ryzen CPU. An upgrade path from Z170/Z270 to the first Coffee Lake would have been nice, but the new Coffee Lake, especially the 8-core parts, draws way more power than my 8700K, for example. So better boards are also needed here.
#90
Imsochobo
JackCarver: An upgrade path to the next generation would be nice, but as new CPUs draw more and more power, you also need better and better VRMs. Just look at the B450/X470 boards and compare them to X570 boards; there's a huge difference in VRM quality. So an X570 is definitely the better base for a big Ryzen CPU. An upgrade path from Z170/Z270 to the first Coffee Lake would have been nice, but the new Coffee Lake, especially the 8-core parts, draws way more power than my 8700K, for example. So better boards are also needed here.
If you look at X570, the VRM quality is better because the boards are more expensive.
B350 boards run a 3900X just fine, and as we've seen, a 3950X too!

From 2nd to 3rd gen the CPUs use less power, but AMD is THE platform for DIY, thus they get the most engineering time, the best designs, etc.
#91
JackCarver
Imsochobo: B350 boards run a 3900X just fine, and as we've seen, a 3950X too!
Yes, but does "just fine" mean that it merely worked, with minimal headroom, or that it runs with plenty of headroom? I wouldn't use a B350 board for a Ryzen 3900X/3950X, but that's only my opinion. If you look here, the Ryzen 2700X, a 105 W TDP CPU, draws at most 86 W.

The Ryzen 3700X, a 65 W TDP CPU, draws at most 74 W, so only 12 W below the 105 W TDP Ryzen 2700X.

The Ryzen 3950X, a 105 W TDP CPU, draws at most 144 W.

So a board designed for a Ryzen 2700X could be at its limit with a Ryzen 3950X, whereas an X570 board still has enough headroom.
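
For what it's worth, here is that arithmetic spelled out as a minimal sketch, using the figures quoted in this post (which a later reply disputes); the draw-to-TDP ratio is what a board's VRM actually has to be sized for:

```python
# Measured peak draw vs. rated TDP, per the numbers quoted above.
cpus = {
    "Ryzen 2700X": (105, 86),   # (TDP watts, measured peak watts)
    "Ryzen 3700X": (65, 74),
    "Ryzen 3950X": (105, 144),
}
for name, (tdp, draw) in cpus.items():
    print(f"{name}: {draw} W measured vs {tdp} W TDP -> {draw / tdp:.0%} of TDP")
```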
#92
TheinsanegamerN
JackCarver: Yes, but does "just fine" mean that it merely worked, with minimal headroom, or that it runs with plenty of headroom? I wouldn't use a B350 board for a Ryzen 3900X/3950X, but that's only my opinion. If you look here, the Ryzen 2700X, a 105 W TDP CPU, draws at most 86 W. The Ryzen 3700X, a 65 W TDP CPU, draws at most 74 W, so only 12 W below the 105 W TDP Ryzen 2700X. The Ryzen 3950X, a 105 W TDP CPU, draws at most 144 W. So a board designed for a Ryzen 2700X could be at its limit with a Ryzen 3950X, whereas an X570 board still has enough headroom.
Having an X570 chipset != enough headroom. There are plenty of X570 boards that will struggle to keep a 3950X running without overheating. The ASRock X570 Pro4 series is one example, where users report extremely high VRM temps and instability with a 3900X, let alone a 3950X.

Meanwhile, there are B450 boards, like the ASUS Strix-I B450, that can handle the output of a 3950X thanks to significantly higher VRM quality than some other B450 boards.
#93
londiste
JackCarver: here the Ryzen 2700X, a 105W TDP CPU, draws at max 86W. The Ryzen 3700X, a 65W TDP CPU, draws at max 74W. So only 12W below the 105W TDP Ryzen 2700X.
You meant to say the 3700X, a 65 W TDP CPU, draws at max 90 W, 4 W more than the 105 W TDP 2700X?
#94
MikeZTM
JackCarver: Depends on the game, as there are games out there which perform better on the 9900K. The 8086K is only 6 cores. I also say that it depends on the game, and for future use, 8 cores with HT will be better than without HT, in my opinion.
The problem for Intel is that more cores = fewer fps in games. The 9900K is 50% slower than the 8086K in PUBG, both overclocked.

6 cores = lower RAM latency and overall better performance in games.

Sure, if a game overwhelms a 6-core then the 9900K will be faster, but that's not happening now, and if it does happen, AMD will be much faster in that game.

Disabling HT will improve gaming performance, though not by much if you aren't playing older multi-threaded games like Battlefield 4. HT is not for gaming, as it is really hard to optimize a lightly threaded application like a game to utilize SMT/HT. But it's pretty easy to avoid using HT, so most devs went that way.

It's not impossible to optimize a game for SMT; it's just so hard that it economically doesn't make sense.
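
As an illustration of the "avoid HT" approach described above, a minimal sketch (assuming the third-party psutil package is installed) of sizing a worker pool to physical cores only, rather than logical ones:

```python
# Cap the worker pool at the physical-core count so two HT siblings
# don't contend for the same core's execution resources.
from concurrent.futures import ThreadPoolExecutor
import psutil

physical = psutil.cpu_count(logical=False) or 1  # e.g. 8 on a 9900K
logical = psutil.cpu_count(logical=True)         # e.g. 16 with HT on

pool = ThreadPoolExecutor(max_workers=physical)
print(f"{logical} logical CPUs, pool capped at {physical} workers")
```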
MrAMD: Reading comprehension problems? :laugh: I literally said I wish my Z390 mobo would work with it, obviously meaning it won't. The 9900KS > everything else in gaming. Yes, even the 8086K. The 9900K/KS have higher clock ceilings. I'm running a 9900K myself at 5.2 GHz all-core, 24/7.
Quick question: what's your RAM latency in the AIDA64 RAM test? (You can do this in the trial version.)

Do you know that most games currently are RAM-latency bound? Intel's gaming advantage mostly comes from a better IMC that can run 4000+ RAM and get under 40 ns latency.

The 9900KS is not faster than the 8086K in those games, as fewer cores = lower latency for Intel.
#95
JackCarver
MikeZTM: and if it does happen, AMD will be much faster in that game.
Nice conclusion :D
In most games Intel is clearly dominating AMD; read some benchmarks on this stuff. I don't think that will switch in the future, but let's see.

And here comes AMD:
#96
MikeZTM
JackCarver: Nice conclusion :D In most games Intel is clearly dominating AMD; read some benchmarks on this stuff. I don't think that will switch in the future, but let's see. And here comes AMD:
If you ever read my reply, you will know I claim this kind of game doesn't exist yet. Most games work perfectly fine with 6 cores.
If you know how to overclock, then a delidded and repasted 8086K is the best CPU for gaming. Much better than the 9900K.
#97
GlacierNine
MikeZTM: If you ever read my reply, you will know I claim this kind of game doesn't exist yet. Most games work perfectly fine with 6 cores. If you know how to overclock, then a delidded and repasted 8086K is the best CPU for gaming. Much better than the 9900K.
You keep saying that in a BUNCH of threads, but I have yet to see you cite an actual benchmark to prove it. I googled "8086K vs 9900K" and didn't find anything either, as so few outlets actually benchmarked the 8086K.

Can you provide any actual figures to back yourself up, or are you just going to keep asserting this everywhere you go and expect everyone to believe you?
#98
MikeZTM
GlacierNine: You keep saying that in a BUNCH of threads, but I have yet to see you cite an actual benchmark to prove it. I googled "8086K vs 9900K" and didn't find anything either, as so few outlets actually benchmarked the 8086K. Can you provide any actual figures to back yourself up, or are you just going to keep asserting this everywhere you go and expect everyone to believe you?
It was done by a friend of mine at a parts vendor, and they already have their hands on LGA1200 motherboards, so I trust their numbers.
I can't post the results, as I didn't get the full set, but anyone can test the 8086K vs. the 9900K. And it makes sense, as most esports titles rely heavily on RAM performance rather than core count.

I'm not saying the 8086K is better in all games, but the performance regression is indeed happening in some games, especially those high-frame-rate esports shooters that are poorly optimized.
#99
ThrashZone
Hi,
Did this turn into an AMD thread now? :p
#100
GlacierNine
MikeZTM: It was done by a friend of mine at a parts vendor, and they already have their hands on LGA1200 motherboards, so I trust their numbers.