# Intel Core i9-9900K



## W1zzard (Oct 19, 2018)

Today, Intel released their new flagship processor for the LGA 1151 platform. The Core i9-9900K finally comes with eight cores and 16 threads, reaching parity with AMD's Ryzen offerings. The maximum boost clock has been increased as well, now to a staggering 5 GHz.



----------



## R0H1T (Oct 19, 2018)

Great performance, relatively speaking, below-average temps, and worse power consumption. I think if you own this you may not want to upgrade your gaming rig for the foreseeable future.
Also, I don't see OCed temps; are they listed anywhere?


----------



## trparky (Oct 19, 2018)

W1zzard said:

> Intel's own marketing paraded the Core i9-9900K as the best processor for gamers. While that is technically true, the differences are rather slim in our own testing. For example, at 1080p resolution, with all games set to highest details, the difference between i7-8700K and i9-9900K is just 2%.


I'm going to go out on a limb here and say that Intel is not happy about this statement.


----------



## W1zzard (Oct 19, 2018)

R0H1T said:


> Also I don't see OCed temps, are they listed anywhere


OC temps on air at 5.0 GHz were right below the throttle point, with a medium-high-quality air cooler. Didn't measure temps for water; probably somewhere between 60 and 70.



trparky said:


> I'm going to go out on a limb here and say that Intel is not happy about this statement.


I know, but what else should I say, given the numbers? No, we're not gonna switch from "ultra" to "high" for our testing.


----------



## dj-electric (Oct 19, 2018)

The i9-9900K was a tough pill for the media to swallow.
A great piece of hardware at an almost criminally high price compared to the competition.


----------



## R0H1T (Oct 19, 2018)

trparky said:


> I'm going to go out on a limb here and say that *Intel is not happy* about this statement.


Next you'll tell us Intel wanted DDR4-2133 for the other processors in testing.


----------



## Vya Domus (Oct 19, 2018)

Power consumption is shockingly high once overclocked, I really did not expect that.


----------



## v12dock (Oct 19, 2018)

Unbelievable


----------



## Fleurious (Oct 19, 2018)

Wow, I only skimmed through the performance tables, but this CPU looks pretty damn good. Now to go look at pricing...

$660 on Newegg Canada, a little more than what I paid for the 3930K. I could live with that if I ever move on from my 4790K.


----------



## trparky (Oct 19, 2018)

W1zzard said:


> I know, but what else should I say, given the numbers.


Nor should you mince words. Now tell us how you really feel!


----------



## SIGSEGV (Oct 19, 2018)

5 GHz...


----------



## M2B (Oct 19, 2018)

It's not Intel's fault that today's games do not fully utilize an eight-core CPU; same for the 2700X.
There is nothing Intel can do about gaming performance at this point, but the 9900K will show its true benefits for gaming over the years.


----------



## dcf-joe (Oct 19, 2018)

For those of us who prefer high refresh rate gaming, is there any way you could include minimum frames or an average of the minimums in future benchmarks? I feel like this would further highlight the differences in CPUs for gaming.


----------



## trparky (Oct 19, 2018)

Vya Domus said:


> Power consumption is shockingly high once overclocked, I really did not expect that.


I'm not surprised; there are eight cores in there! That's a lot of processor to power. As for the heat, again... not surprised. Intel is most definitely pushing 14nm to the breaking point (perhaps past it) with the 9900K.


----------



## TheinsanegamerN (Oct 19, 2018)

W1zzard said:


> OC temps on air 5.0 were right below throttle, with a medium-high quality air cooler. *Didn't measure temps for water*, probably somewhere between 60 and 70
> 
> 
> I know, but what else should I say, given the numbers. No we're not gonna switch from "ultra" to "high" for our testing.


Is there any way you could? Other reviewers are finding that even with a water cooler the 9900K is just stupidly hot.


----------



## Shatun_Bear (Oct 19, 2018)

My prediction of an under-10% performance advantage over the 2700X in 1080p gaming was spot on. It's also only 1.9% faster than an 8700K, identical for all intents and purposes, showing that games do not scale well past 6 cores/12 threads, as predicted.

With these figures it's one of the most overpriced CPUs I can remember at £600 here in the UK.


----------



## Manu_PT (Oct 19, 2018)

Shatun_Bear said:


> My prediction of under 10% performance advantage over the 2700X in 1080p gaming was spot on. It's also only 1.9% faster or identical for all intents and purposes to an 8700K showing games do not scale well past 6-cores 12-threads as predicted.
> 
> With these figures it's one of the most overpriced CPUs I can remember at £600 here in the UK.



Yeah, sure. Average fps. If they had 1% lows at least, Ryzen would be crushed.


----------



## sergionography (Oct 19, 2018)

Solid CPU. But to market this for gaming is utterly dumb, in my opinion. As for price: I can see it making sense around $400, but at $530+ it's not justified.


----------



## Dante Uchiha (Oct 19, 2018)

Weird... how can the consumption of the 9900K be less than the 8700K's?


----------



## kastriot (Oct 19, 2018)

I think it would be a great hit at $299.


----------



## R0H1T (Oct 19, 2018)

Dante Uchiha said:


> Weird... how consumption of the 9900k can be less than the 8700k?


At stock it has lower base clocks, so not too surprising really.


----------



## Vya Domus (Oct 19, 2018)

v12dock said:


> Unbelievable



So let me get this right: even when they decided to solder the IHS, they still somehow managed to do a bad job?


----------



## Crowley (Oct 19, 2018)

I am going to say it up front: I prefer Intel over AMD. But looking at these benchmarks and going over the price difference, it makes me sad to admit that.

I play my games at 1440p, and when I compare the performance differences between the 9900K and 2700X, I just can't figure out the insane price difference.

At 1440p (i9-9900K vs. AMD 2700X):

| Game | i9-9900K | 2700X |
| --- | --- | --- |
| AC: Origins | 77.2 | 74.5 |
| Battlefield 1 | 134.4 | 126.1 |
| Civilization VI | 130.5 | 113.2 |
| F1 2017 | 138.5 | 133.3 |
| Far Cry 5 | 113.0 | 102.8 |
I could go on, but I think you get my point. These FPS are so high that the small difference makes no noticeable improvement during gameplay. But the 50-60% increase in price is just downright unbelievable.

Yes, I am only looking at this from the gamer's viewpoint, and I don't care about the other tests, because at the end of the day my machine is for gaming.


----------



## dirtyferret (Oct 19, 2018)

I'm sure some fanboy will come and say how buying this CPU is future-proofing (maybe Tom's will even write an article), but in the future I would expect more from a $500 CPU. Maybe there will be a rush of cheap used 8700Ks on eBay from this...


----------



## Mescalamba (Oct 19, 2018)

Yay, someone found out that more cores doesn't mean more of everything in games.

It would be a nice piece for OC, provided you get one that's soldered right. And probably wait until it gets cheaper (if it gets cheaper at all).


----------



## R0H1T (Oct 19, 2018)

dirtyferret said:


> I'm sure some fan boy will come and say how its future proofing to buy this CPU (maybe Toms will will even write an article) *but in the future I would expect more from $500 CPU*.  Maybe there will be a rush of used 8700k for cheap on ebay from this....


That was never in doubt; what's a little surprising is how Intel pushed this further than many of us expected.
They have literally left no room on the table, even for OCing enthusiasts; this is their FX-9590, except with a *fake TDP*.


----------



## TheOne (Oct 19, 2018)

It's a shame that stock of the 8700K is low and the price is inflated.


----------



## kajson (Oct 19, 2018)

And now it's all clear why the 8700K could not be allowed to be available in substantial quantities at a normal price: no one considering money a factor in the purchase decision would ever buy the 9900K. (Pretty sure the 9700K will not hold up against the 8700K in most scenarios, and thus would be unsellable at a price that is not at least $50 lower than the 8700K's.)


----------



## kings (Oct 19, 2018)

4-5% faster than the 2700X in 1440p gaming (the resolution I use).

Great deal for 600€.


----------



## Shatun_Bear (Oct 19, 2018)

> Here we reached 5.1 GHz all-core completely stable (even with +105 mV). While the system booted fine at 5.2 GHz, it had a tendency to crash when heavily loaded, no matter the voltage (we tried up to +250 mV), *so we settled for 5.1 GHz as our stable overclock*, and include that in all performance comparisons.



Those saying that 5.4 GHz overclocks on this thing would be possible "because these have improved silicon" over the 8700K were always living in fantasy land.

What's more, this tested 9900K is likely a golden sample, as per usual Intel shenanigans/marketing tactics, so retail chips might struggle to get past 5 GHz all-core. Intel really has pushed this old 14nm arch to its limit. Bring on 10nm, I say.


----------



## E-curbi (Oct 19, 2018)

Outstanding review. Yet stock-vs-stock reviews only tell us part of the story.

I realize the 9900K's 5.1 GHz OC values are included, but we simply don't know how high these 8C/16T chips can clock stably: 5.2 GHz? 5.3 GHz?

Results like an 8086K/8700K at 5.4 GHz or 5.5 GHz vs. a 9900K at 5.3 GHz (or whatever clock speed the high bin tops out at) would be even more exciting.

The Cinebench ST score in this review is interesting: 223 for the 9900K @ 5.1 GHz. Even if only a small percentage of 9900Ks do clock up to 5.3 GHz, I believe the single- and lightly-threaded performance will still only overlap, i.e., trade blows with an 8700K or 8086K also binned at 5.3 GHz. So if you already own a high-bin 8700K/8086K, there's no real need to upgrade; for me that means no new CPU until Ice Lake and 10nm, which is 14 months away.

Multithreaded performance with the 9900K high bins, on the other hand, should be greatly improved over the 8700K/8086K high bins, if that's something your work applications will benefit from.

Siliconlottery.com is estimating November 2nd for binned 9900K availability. That day we will know MUCH MORE about this processor. The only problem is that a high-bin 9900K on launch day will probably be priced at around $1,200. OUCH.

My 8086K binned at 5.3 GHz was $880, so yeah, I'm sitting out this launch completely. *Be happy with what you have. lol*

For those who are still running a 3770K or even a 6700K, WOW, your ship has come in with the 9900K for gaming and just about anything else you'd like to do with your PC.

Oh Happy Day!


----------



## S@LEM! (Oct 19, 2018)

Never thought this chip would be better than the Ryzen 2700X in efficiency. Quite a good result for an ageing, end-of-life architecture. It shows what Intel could still do with it.


----------



## Sandbo (Oct 19, 2018)

As someone who writes his own code and can do multithreading, there is no hesitation in choosing the 2700X over this.

And if I game, I can choose the 8700K or even the 8600K for their almost identical single-threaded performance to the 9900K.

It's a premium to pay to get the best of both worlds; more or less for the rich only.


----------



## E-curbi (Oct 19, 2018)

Sandbo said:


> As someone who writes his own code who can do multi threading, there is no hesitation choosing 2700x over this.
> 
> And then if I game, I can choose 8700k or even 8600k for their almost identical single threaded performance as 9900k.
> 
> It’s a premium to pay to get the best out of both world; more or less for the rich only.



NO need to be rich; just build your gaming rig from hand-me-down secondary components, plus an outstanding GPU.

"Incremental upgrades over time" is the way to do it; no need to shell out lots of cash all at once. Time is NEVER our friend in this world EXCEPT when upgrading your PC(s).

Two individual PCs on your work desk is the only way to live, brother.


----------



## R0H1T (Oct 19, 2018)

E-curbi said:


> Outstanding review.  Yet stock vs stock reviews only tell us part of the story.
> 
> Realize the 9900K 5.1Ghz OC values are included, but we simply don’t know how high these 8C/16T chips can clock stable, 5.2Ghz, 5.3Ghz?
> 
> ...


The biggest problem is temps; around 5 GHz this thing easily hits 70-80 degrees even with high-end liquid cooling, obviously depending on ambient temps & case airflow. Also, power consumption is just way OTT, so if you're looking for an *MT* beast, then *HEDT* (Intel or AMD) is a much better option.

Nah, the 8700K is still the gaming chip to get in the Intel lineup; the 9700K if you want 8 cores.


----------



## Dante Uchiha (Oct 19, 2018)

R0H1T said:


> At stock it has lower base clocks, so not too surprising really.


----------



## R0H1T (Oct 19, 2018)

Dante Uchiha said:


>


I'm not sure what you're trying to show me? That max power draw is probably AVX code with all cores loaded at 4.7 GHz, at which point the 9900K exceeds its PL2 & consumes an overwhelming 221 W, almost 60 W more than the 9700K all-core @ 4.6 GHz.

The 9900K will be horrible at full load wrt power consumption, but part of that can be mitigated with better cooling.


----------



## W1zzard (Oct 19, 2018)

E-curbi said:


> Realize the 9900K 5.1Ghz OC values are included, but we simply don’t know how high these 8C/16T chips can clock stable, 5.2Ghz, 5.3Ghz?


I reached 5.1 GHz stable with a 240 mm AIO, that's what's included in the data. 5.2 = unstable


----------



## Saxxter (Oct 19, 2018)

Long-time follower & system enthusiast, first-time poster! Just want to say thanks for the in-depth review. I think something that could help with assessing thermals would be to include various types of air & AIO coolers, to get an idea of what potential buyers may need to invest in, especially if they're in search of the best bang for the buck.

For gaming benchmarks... and don't shoot me: would it be possible to see Diablo 3 & World of Warcraft with these results? I actually know several raiders that are looking to upgrade from much older gear and have been waiting to choose AMD or Intel, and I think some benchmarks would be of benefit for them as well.


----------



## Vya Domus (Oct 19, 2018)

R0H1T said:


> but part of that can be mitigated using better cooling.



How much better, though? In the few reviews I have seen, people were already using fairly high-end coolers that were still not quite enough.


----------



## R0H1T (Oct 19, 2018)

Vya Domus said:


> *How much better though* ? Of the few reviews I have seen people were already using fairly high end coolers that were still not quite enough.


Depends on the chip & a whole host of other variables, including MB.


----------



## E-curbi (Oct 19, 2018)

R0H1T said:


> The biggest problem is temps, around 5GHz this thing is easily hitting 70~80 degrees even with high end liquid cooling, obviously depending on ambient temps & case airflow. Also power consumption is just way OTT, so if you're looking for *MT* beast then *HEDT* (Intel or AMD) is a much better option.
> 
> Nah 8700k is still the gaming chip to get in the Intel lineup, 9700k if you want 8 cores.



Yep, I agree 100%. The Intel 6-core/12-thread *may be the sweet spot for most of us* until 10 nanometer this time next year. I run my 8086K at 5.3/5.4 GHz all-core, all-thread with nice temps all day long using a Noctua C14S air cooler (pic attached). Can do the same with a 5.6 GHz single-core (per-core) boost configuration of 5.6/5.5/5.4/5.2/5.2/5.2 in the BIOS, still with nice temps ON AIR.

... and the Noctua Industrial 140mm PWM fan dialed down to 680-700 rpm is completely inaudible, with those nice temps and those sweet overclocks. I'm not certain that can also be achieved with this 9900K, and I need complete silence while working.

The 8700K and 8086K might be the best overall CPUs for some of us for the next 14 months.

*I only hope we have a 6C/12T option with Ice Lake (10nm). With much higher clock speeds, of course. I'll take a 5.8 GHz bin from SL.com, please.*


----------



## W1zzard (Oct 19, 2018)

Saxxter said:


> Diablo 3 & World of Warcraft with these results?


WoW is a huge pain to bench due to its always-online nature and patches. Never looked into Diablo 3, but it seems a bit old to be relevant.

Adding some lighter games could be a good idea for the future. Does anyone else have an opinion on that?


----------



## Agentbb007 (Oct 19, 2018)

I'm trying to find this CPU benched with a 2080 Ti; it seems most reviews are using a 1080, which is definitely going to be GPU-bound in all the gaming benchmarks.


----------



## HD64G (Oct 19, 2018)

So much talk about the gaming strength of this ultra-expensive CPU, as if it could give some excuse for that price. 7-8% more FPS at 1080p and 4-5% more FPS at 1440p vs. the 40-50% cheaper 2700X. And not that much faster in productivity, apart from software optimised for Intel. 7nm is on the way. If AMD is true to their promise of an IPC increase for Zen 2 vs. Zen 1, we will see the tables turned for the performance crown, even in gaming. Value-wise, AMD has been king by far since Ryzen launched, and their position has become even better lately with the increase in Intel's CPU pricing.


----------



## Space Lynx (Oct 19, 2018)

What is the temperature at 1.35 V and 5.1 GHz?


----------



## Saxxter (Oct 19, 2018)

W1zzard said:


> WOW is a huge pain to bench due to the always online nature and patches. Never looked into Diablo 3, but seems a bit old to be relevant.
> 
> Adding some lighter games could be a good idea for the future. Anyone else has an opinion on that?



I can understand the pains of benchmarking games built on spaghetti code, for sure. I think part of it also stems from comparing a 2700X vs. an 8700K with 32GB of DDR4-3200 RAM at home in WoW at 3440x1440 @ 100 Hz on a 1080 Ti FTW3 Elite (using Ultra level 10 settings).

I had severe raid stuttering on the 2700X running @ 4.4 GHz, and the FPS would jump all over the place, but with Intel it was pretty damn smooth.

So what's thrown me for a loop is that in most, if not a lot of, games, the 8700K and 2700X are relatively neck and neck, constantly trading blows. I guess I've come here to get others' opinions as well, so I can help my friends with their purchase; I just didn't want to be that guy and say "just buy this and be done."


----------



## mcraygsx (Oct 19, 2018)

"9900K, which will automatically drop clocks when it senses too much power draw. For example when set to 5 GHz all-core, with some extra voltage, as soon as you put a serious multi-core load on the CPU, the clocks will drop instantly to around 4 GHz. To raise this limit, you'll have to adjust the power limit in BIOS or XTU — a first for Intel, but no problem, as long as you are aware of it."

Do you think this will be worked out in future BIOS updates? My own 7700K and 8086K, running on Asus Maximus Hero boards, do not automatically throttle; I don't have to manually increase the TDP rating in the BIOS when I overclock. Any thoughts on why this default limitation was implemented here for the very first time?

Great review, as always, on par with what I just read at Techspot.com.


----------



## Bluescreendeath (Oct 19, 2018)

Sounds like you need a $100-$200 cooler rated for ~200 W to prevent the CPU from throttling itself at stock turbo boost clocks. So the total actual cost of this CPU is what, $700?



E-curbi said:


> NO need to be rich, just create your gaming rig from hand-me down 2ndary components. Plus an outstanding GPU.
> 
> "Incremental upgrades over time" is the way to do it, no need to shell out lots of cash all at once. Time is NEVER our friend in this world EXCEPT when upgrading your PC(s).
> 
> Two individual PCs on your work desk is the only way to live brother.



AMD's Zen 2 is only ~3 months away and is supposed to bring 13% IPC improvements plus a slight bump in clock speeds. If true, then we're basically going to get something that beats the Coffee Lake CPUs in both single- and multi-threaded applications... without Intel's price gouging. Those trying to buy the best might as well wait 3 months and see what AMD has to offer with Zen 2.


----------



## fabtech (Oct 19, 2018)

HD64G said:


> So much talk for the gaming strenght of this ultra-expensive cpu that could give some excuse to that price. 7-8% more FPS in 1080P and 4-5% more FPS in 1440P vs the 40-50% cheaper 2700X. And not that much faster in productivity apart from sw optimised for Intel. 7nm are on the way. If AMD is true to their promise for their IPC increase for Zen2 vs Zen1, we will see the tables turned upside down for the performance crown, even in gaming. VFM-wise, AMD is king by far since Ryzen launched and their place has become even better lately with the increase in intel's cpu pricing.


Yes indeed. Am I the only one disappointed by this CPU? The multithreaded scaling is very poor; some other tests have shown a huge decrease in speed under heavy multithreaded tasks. If OC is the only way to keep frequency high, a very expensive cooler and a powerful PSU will be needed; an increase of 140 W in power consumption when OCed to just 5.2 GHz is simply not workable for a little workstation. A 7900X will do much better; even my 6950X (got it at $500) performs better in multithreaded tasks at just 4.4 GHz (Cinebench score 2318).

Well, I am not a high-core-count gamer, but I do play on a 1440p monitor, and the 9900K gives no advantage at this resolution, so this CPU is not worth it for gaming here compared to a 2600X, 2700X, or 8700K...
So what is the real target of this CPU (sold at 700€ in the EU)? As a little workstation, a 1920X, which is way cheaper (220€ less than the 9900K in the EU), or a 1950X (sold for just a little more) would be a better choice.
Well, the 9900K does well in all kinds of tasks, but the 2700X does well too and is, in my opinion, the BEST CPU VALUE for all kinds of tasks; the 9900K is crucified by its high price.


----------



## muSPK (Oct 19, 2018)

The i7-8700K and i9-9900K are almost the same price now in Swedistan due to the supply shortage; only a 100 USD difference. So I guess I am sticking with the i9 and getting 2 extra cores.


----------



## W1zzard (Oct 19, 2018)

mcraygsx said:


> You think this will be worked out with future BIOS updates?


To me this looks like "working as intended". The CPU is rated for 95 W, so it'll run at 95 W max (it can go beyond that for a short duration).

If you have a better heatsink -> dial up the TDP. If Intel magically changed their 95 W parts to 150 W overnight via a BIOS update, all hell would break loose.
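The short-burst-then-clamp behavior described above can be sketched as a running-average power budget. This is only an illustrative model: the exponential window, its time constant, and the 1.25 × PL1 short-term limit are assumptions, not measured values for this chip.

```python
import math

# Hypothetical sketch of a PL1/PL2 turbo power budget: the CPU may draw PL2
# for a while, but once the running average power reaches PL1 (the rated
# 95 W TDP), the limiter clamps power back down and clocks drop.
PL1 = 95.0    # sustained power limit, watts (rated TDP)
PL2 = 118.75  # short-term power limit, watts (1.25 * PL1 assumed)
TAU = 28.0    # seconds; assumed time constant of the budget window

def step(avg_power, draw, dt=1.0):
    """Advance the exponentially weighted average power by dt seconds."""
    alpha = math.exp(-dt / TAU)
    return avg_power * alpha + draw * (1 - alpha)

# Simulate a heavy all-core load drawing PL2 from a near-idle start
avg, seconds = 40.0, 0
while avg < PL1:
    avg = step(avg, PL2)
    seconds += 1
# After `seconds` (tens of seconds with these numbers), the limiter kicks
# in and the CPU falls back toward base clocks unless the BIOS limit is raised.
```

Raising the BIOS power limit simply moves PL1 up, which is why a better heatsink and a "dialed up TDP" go together.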


----------



## v12dock (Oct 19, 2018)

Vya Domus said:


> So let me get this right, even when they decided to solder the IHS they still somehow managed to do a bad job ?



This is correct; Intel can't even solder.


----------



## PopcornMachine (Oct 19, 2018)

Another review where I am dumbfounded with the conclusion of "Editor's Choice".

Leaves me wondering if the reviewer read his own review.


----------



## Dante Uchiha (Oct 19, 2018)

R0H1T said:


> I'm not sure what you're trying to show me? That max power draw is probably using AVX code & all cores being loaded at 4.7GHz, at which point 9900k exceeds it's PL2 & consumes an overwhelming 221W & almost 60W more than 9700k all core @4.6 GHz.
> 
> The 9900k will be horrible at full load, wrt power consumption, but part of that can be mitigated using better cooling.



The disparity between the TPU tests and other sites? The TPU review is the only one that makes the 9900k seem efficient.


----------



## B-Real (Oct 19, 2018)

So it's actually 8% faster in FHD with a 1080 Ti compared to a 2700X, and costs $250 more. So much for the 30% average PT got. And it consumes more power than the 2700X. Intel is reaching new lows.



Crowley said:


> I am going to say it up front, I prefer Intel over AMD. But looking at these benchmarks and going over the price difference, it makes me sad to admit that
> 
> I play my games at 1440p and when I compare the performance differences between the 9900k and 2700X, I just can't figure out the insane price difference
> 
> ...



Hail for your unbiased approach!



Manu_PT said:


> Yeah sure. Average fps. If they had 1% low at least, ryzen would be crushed.



Haha, yes, for sure. You need to grasp at something.



Fleurious said:


> Wow, only skimmed through all the performance tables but this CPU looks pretty damn good.  Now to go look at pricing...
> 
> $660 on Newegg Canada, a little bit more than what I paid for the 3930k.  I could live with that if I ever move on from my 4790k.


LOL, trying to justify the nearly $600 price...



fabtech said:


> Yes indeed, am I the only one to be disappointed by this CPU ? The scaling in Multithread is very poor, some other tests have shown a huge decrease in speed when used with heavy Multithreaded tasks... If OC is the only way to keep high frequency, a very expensive cooler will be needed and a powerful PSU, getting an increase of 140W in power consumption when OC only at 5.2 GHz is just not possible for a little workstation, a 7900x will do much better, even my 6950x (got it at $500) performs better in Multithread tasks @ just 4.4GHz (Cinebench score 2318).
> 
> Well, I am not a high core gamer, but I do play with a 1440p monitor, this 9900K gives no advantages at this resolution, so this cpu is not worth for gaming at this resolution in front of a 2600x, 2700x, 8700K...
> So what is the real target of this CPU(sold 700€ in EU)  ? As little workstation, a 1920x way cheapper (220€ less expensive than the 9900K in EU), a 1950x (sold just a little bit more expensive) will be better choice for workstation.
> Well the 9900K do well in all kind of task but the 2700x do well too and i IS for my opinion, the BEST CPU VALUE for all kind of tasks, the 9900K is crucified by its high price.



Yes, 1440p and above nearly wipes out all the small FHD differences, down to 1-5%. And most 1080 Ti users do not use their cards for FHD gaming. And if you play at FHD, you will usually use a card up to a 1070, which also gives zero upgrade benefit from a 1500X/2600/2700X or any i5 or i7 from the past 2 or 3 gens compared to a 9900K.


----------



## Bluescreendeath (Oct 19, 2018)

muSPK said:


> i7 8700K and i9 9900K is almost at same price now in Swedistan due to supply shortage,  only 100 USD difference in price. So I guess I am sticking with the i9 and get 2 extra cores.



The i9-9900K needs a cooler that can handle ~200 W to run at its stock turbo boost speeds properly, or it'll throttle itself. That's another $100-$200.

Why not get a Ryzen 2000 now, or wait for Zen 2 in early 2019?


----------



## B-Real (Oct 19, 2018)

muSPK said:


> i7 8700K and i9 9900K is almost at same price now in Swedistan due to supply shortage,  only 100 USD difference in price. So I guess I am sticking with the i9 and get 2 extra cores.


Why can't you jump off the blue train, mate?


----------



## ShurikN (Oct 19, 2018)

I think Steve from Hardware Unboxed said it best. "For whom was this chip made?!"
It's marginally better than 8700K in gaming while costing $150-200 more.
It's marginally better than 2700X in productivity while costing almost two times more.

Not to mention you can get an entire 2700X system with mobo and 16GB of ram for the same preorder/inflated price


Not only that, but you can't hit the same clocks as a delidded 8700K. Plus, you need a monster of a cooler for any kind of OC. And even with it, it'll still hit 90°C.

Gotta hand it to Intel, they made both 2700X and original Coffee Lake look amazing.


----------



## Zubasa (Oct 19, 2018)

v12dock said:


> Unbelievable





Vya Domus said:


> So let me get this right, even when they decided to solder the IHS they still somehow managed to do a bad job ?


IMO that solder job looks like something you would expect from 2011, not 2018.
I guess this is where all the micro-cracks / thermal-expansion BS that has been spewed around the internet came from.


----------



## EatingDirt (Oct 19, 2018)

From the review:


> The Core i9-9900K is currently listed on Amazon for $530. For some reason the Newegg price that we usually use is $580.




I don't see the 9900K for $530 anywhere; it's $580 on both Amazon & Newegg. You may want to adjust the performance-per-dollar graph to reflect that (or add another bar for the current $580 price).

Otherwise a good review. As expected, the 9900K is fast, but it doesn't really fit anywhere besides the consumer-PC "performance king" slot.

If you only game or need single-threaded speed, you could currently save $200 with an 8700K and give up less than 2% performance; and if you're looking for value, the 2700X is around 15% slower in the majority of tasks but 48% cheaper.

If you need threads and don't care all that much about value, you're generally better off with the X399/X299 platforms.
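The value comparison above can be sanity-checked with quick arithmetic. The figures below are the approximate ones from the post (relative performance and street prices), not measurements:

```python
# Back-of-the-envelope perf-per-dollar using the approximate figures above
perf_9900k, price_9900k = 100.0, 580.0  # relative perf, current USD price
perf_2700x, price_2700x = 85.0, 300.0   # ~15% slower, ~48% cheaper

ppd_9900k = perf_9900k / price_9900k    # ~0.17 perf points per dollar
ppd_2700x = perf_2700x / price_2700x    # ~0.28 perf points per dollar
advantage = ppd_2700x / ppd_9900k - 1   # 2700X's perf-per-dollar edge
```

With these inputs the 2700X comes out around 60-65% ahead on performance per dollar, which is the core of the value argument.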


----------



## Hossein Almet (Oct 19, 2018)

If you could, try exporting 200 or 300 photos from Lightroom at 5.1 GHz and 4.8 GHz and see which one takes less time to finish the job.


----------



## Mighty-Lu-Bu (Oct 19, 2018)

AMD will finally beat Intel in gaming with Zen 2 and it will be glorious!


----------



## W1zzard (Oct 19, 2018)

Hossein Almet said:


> If you could, try exporting 200 or 300 photos from Lightroom at 5.1Ghz and  4.8Ghz and see which one takes less time to finish to the job.


We'll add some Lightroom tests in our next CPU benchmark system revision (after new Windows update, probably in November)

Could you share some info on your workflow, so I can reproduce it as closely as possible?



EatingDirt said:


> I don't see the 9900k for $530 anywhere. $580 on Amazon & Newegg.


Looks like Amazon jacked up their prices; one of my staff members ordered today for $530. Let's give it one more day to see where prices end up, and then I'll adjust.


----------



## Hossein Almet (Oct 19, 2018)

W1zzard said:


> We'll add some Lightroom tests in our next CPU benchmark system revision (after new Windows update, probably in November)
> 
> Could you share some info on your workflow, so I can reproduce it as closely as possible?
> 
> ...



Well, my i7-6800K benchmarks at incredible results at 4.3 GHz, but when I exported 100 photos from Lightroom in the middle of Melbourne winter and timed it, it took more time to finish the job than at 4.1 GHz. Lightroom is a very taxing application; when exporting photos it loads the CPU between 92% and 98%.


----------



## 2big2fail (Oct 19, 2018)

Hi, @W1zzard ,

Could you provide some details on the Tensorflow benchmark? I'm a developer who uses Tensorflow. I'd also be interested to know if you did any combination of CPU-only and CPU+GPU for Tensorflow.

Thanks


----------



## Octopuss (Oct 19, 2018)

_Once you excavate your product out of the packaging_



----------



## Shatun_Bear (Oct 19, 2018)

ShurikN said:


> I think Steve from Hardware Unboxed said it best. "For whom was this chip made?!"
> It's marginally better than 8700K in gaming while costing $150-200 more.
> It's marginally better than 2700X in productivity while costing almost two times more.
> 
> ...



I don't understand the point of this CPU; it was a surprise they rushed it to market. Perhaps they got spooked by the 2700X and TR2 and needed to scramble something onto the market to combat them.

But at $550 it's DOA in terms of gaining any market share back. Even at $400 I could not recommend one, as the 2700X can be had for $290 and doesn't need a $150 cooler. But at $550!? Lol.



muSPK said:


> i7 8700K and i9 9900K is *almost at same price *now in Swedistan due to supply shortage,  *only 100 USD difference in price*. So I guess I am sticking with the i9 and get 2 extra cores.



Come off the Intel Kool-Aid, dude, you're not thinking straight.


----------



## Frick (Oct 19, 2018)

W1zzard said:


> Adding some lighter games could be a good idea for the future. Does anyone else have an opinion on that?



Dwarf Fortress would be an outstanding addition, and honestly shouldn't be hard to achieve. Ask on their forums for an advanced save and just use that, or possibly measure world generation. That game will suffer the FPS death if you go on for long enough. I don't know how big the difference will be between CPUs though...


dcf-joe said:


> For those of us who prefer high refresh rate gaming, is there any way you could include minimum frames or an average of the minimums in future benchmarks? I feel like this would further highlight the differences in CPUs for gaming.



Not just high refresh gamers, I've been wanting that number for a while now, it's important on the lower end of things too. This is like the last site that doesn't include it tbh.



muSPK said:


> i7 8700K and i9 9900K are almost the same price now in Swedistan due to a supply shortage, only a 100 USD difference. So I guess I am sticking with the i9 and getting 2 extra cores.



That ain't close dude, that's like several plattor öl.


----------



## Recus (Oct 19, 2018)

Crowley said:


> I am going to say it up front, I prefer Intel over AMD. But looking at these benchmarks and going over the price difference, it makes me sad to admit that
> 
> I play my games at 1440p and when I compare the performance differences between the 9900k and 2700X, I just can't figure out the insane price difference
> 
> ...



Because competition is increasing prices.


----------



## Mighty-Lu-Bu (Oct 19, 2018)

Shatun_Bear said:


> I don't understand the point of this CPU, it was a surprise they rushed it to market. Perhaps they got spooked by the 2700X and TR2 so needed to scramble something onto the market to try and combat those.
> 
> But at $550 it's DOA in terms of gaining any marketshare back. Even at $400 I could not recommend one as the 2700X can be had for $290 and doesn't need a $150 cooler. But at $550!?? Lol.
> 
> ...



People don't understand the point of this CPU, just like they don't understand the point of RTX. It seems like Intel rushed out this CPU because they were worried about both the 2700X (which is currently $304.99 and is expected to drop well below the $300 mark by the end of the year) and Zen 2, which will probably be in the $350-$380 range for the flagship model.

This CPU is not impressive at all: worse power draw than Threadripper, only minimal performance gains in gaming over the i7-8700K, and it is expensive. Why shell out $500 for this CPU when you can get the 2700X for $200 less, with similar performance and better power consumption? Zen 2 is going to ruin Intel's 2019.


----------



## mastershake575 (Oct 19, 2018)

Mighty-Lu-Bu said:


> Zen 2 is going to ruin Intel's 2019 year.


I have a feeling this is going to be the case.

Zen 2 doesn't even have to be groundbreaking to ruin Intel.

A 2700 successor with a 300-350 MHz increase on all cores plus a 5-7% increase in IPC for near $300, and it's probably all over, to be honest.


----------



## JRMBelgium (Oct 19, 2018)

For all visitors from the Netherlands & Belgium. Chart is based on this review + prices on Tweakers pricewatch:


----------



## Captain_Tom (Oct 19, 2018)

I just cannot conceive of an intelligent human who would buy this for any good reason.  It is only about 8% better in *1080p* gaming than a 2700X that costs $300 less!

*$300* price difference + *$100* for an AIO cooler + *$50* more for the expensive motherboards.  That's paying $450 for 8% better low-res gaming performance, and that *is nearly the difference between buying a 1080 Ti and a 2080 Ti!!!!*

There is no world where buying this makes sense.
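For what it's worth, the arithmetic above can be laid out explicitly (all figures are the poster's estimates, not measured prices):

```python
# Back-of-the-envelope version of the platform-cost math above; every
# number is the poster's estimate (price premiums, 8% gaming delta),
# not measured data.
cpu_premium = 300      # 9900K over a 2700X, USD
cooler_premium = 100   # AIO vs. the 2700X's bundled cooler
board_premium = 50     # pricier Z390 boards

total_premium = cpu_premium + cooler_premium + board_premium
perf_gain_pct = 8      # claimed 1080p gaming advantage

print(total_premium)                         # 450
print(round(total_premium / perf_gain_pct))  # 56 dollars per percent
```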


----------



## xorbe (Oct 19, 2018)

Don't see a reason to upgrade my 5GHz 8086K for 4K/VR gaming.


----------



## Captain_Tom (Oct 19, 2018)

M2B said:


> It's not intel's fault that today's games do not fully utilize an eight core CPU.
> same for the 2700X.
> there is nothing intel can do about gaming perfomance at this point, but 9900K will show its true benefits for gaming over years.



It actually kind of _is_ Intel's fault.  They were the ones that continued to pressure devs to focus on quad-cores for gaming.



dirtyferret said:


> I'm sure some fan boy will come and say how its future proofing to buy this CPU (maybe Toms will will even write an article) but in the future I would expect more from $500 CPU.  Maybe there will be a rush of used 8700k for cheap on ebay from this....



If you want to "future proof", then get a 2700X since it is on the "future-proofed" AM4 socket and has the same thread count.



mastershake575 said:


> I have a feeling this is going to be the case.
> 
> Zen 2 doesn't even have to be groundbreaking to ruin Intel .
> 
> 2700 successor with 300-350mhz increase on all cores + 5-7% increase in IPC for near $300 and it's probably all over to be honest



We already know an early sample of  Zen 2 is _at least _ 4.5GHz + 13% IPC increase.  That alone will beat the 9900K and could be sold for $499 in massively higher numbers.  This is before we confirm if there will even be 12/16-core models or clocks above 4.8GHz...



W1zzard said:


> I reached 5.1 GHz stable with a 240 mm AIO, that's what's included in the data. 5.2 = unstable



Steve at Techspot was also limited to 5.1 GHz on BOTH of his 9900Ks.


----------



## ShurikN (Oct 19, 2018)

Captain_Tom said:


> We already know an early sample of  Zen 2 is _at least _ 4.5GHz


I don't remember seeing this info anywhere. Can you provide a source, leak, or rumor? I'd like to check it out.


Captain_Tom said:


> Steve at Techspot also was limited to 5.1GHz on BOTH of his 9900K's.


As far as I can see from various reviews, no one can hit more than 5.1 stable. And even then the temps get insane.


----------



## Captain_Tom (Oct 19, 2018)

ShurikN said:


> I don't remember seeing this info anywhere. Can you provide a source, leak, rumor, would like to check it out.
> 
> As far as I can see from various reviews, no one can hit more than 5.1 stable. And even then the temps get insane.



https://wccftech.com/amd-zen-2-7nm-cpu-13-percent-ipc-increase-rumor/

https://wccftech.com/amd-zen-2-ryzen-8-core-16-thread-cpu-leak/


I suspect the 8-core model RTG has is just one CCX as well (since RTG would want a single CCX for an APU).  We could be looking at an R9 3800X that has 16 cores, 4.8 GHz clock speeds, and 15% higher IPC...


----------



## trog100 (Oct 19, 2018)

it doesn't deserve the score it got.. doesn't clock as high as expected.. runs too hot and still needs delidding.. bit of a fail if you ask me..

the video about the soldered TIM and thicker PCB was fascinating..


trog


----------



## ShurikN (Oct 19, 2018)

Captain_Tom said:


> https://wccftech.com/amd-zen-2-ryzen-8-core-16-thread-cpu-leak/
> 
> 
> I suspect the 8-core model RTG has is just 1-CCX as well (Since RTG would want a single CCX for an APU).  We could be looking at an R9 3800X that has 16 cores, 4.8GHz clockspeeds, and 15% higher IPC...


Oh yeah that one, I remember now.
I would personally like to see them focus on everything other than more cores. 8 is enough for now on the mainstream desktop platform.


----------



## Captain_Tom (Oct 19, 2018)

ShurikN said:


> Oh yeah that one, I remember now.
> I would personally like to see them focus on everything other than more cores. 8 is enough for now on the mainstream desktop platform.



I think it would be wise for AMD to _just_ keep a core-count advantage if they can, simply because Infinity Fabric makes it so easy for them to win in that department and it is just good for marketing.  However, I do agree that 16 cores is not needed on mainstream desktop.  I would prefer if they delivered 5- or 6-core CCXs so they can offer a 5 GHz 10- or 12-core instead of a 16-core @ 4.6 GHz.  But again, I would trade a little bit of IPC to get 10 or 12 cores instead of just 8.

TBH though, I am starting to think that while it would be easy for them to give us a 16-core on AM4 next year, they may intentionally hold it back for Zen 2+ so they can make a big deal out of it generation-to-generation.



trog100 said:


> it dosnt deserve the score it got.. dosnt clock as high as expected.. runs too hot and still needs deliding .. bit of a fail if you ask me..
> 
> the video about the soldered tim and thicker pcb was fascinated..
> 
> ...



I actually think the 9900K is more efficient than expected.   It does seem to be able to beat the 2700X by 10-20% while using nearly the same energy.   But yes it also doesn't overclock well, and it seems to require very expensive motherboards.


----------



## laszlo (Oct 19, 2018)

Captain_Tom said:


> I just cannot conceive of an intelligent human who would buy this for any good reason.



after the Principled Technologies testing, a lot of "intelligent humans" jumped and pre-ordered them fast so as not to lose the opportunity to have the "best gaming processor" asap... marketing and preconception dictate herd behavior, and in this case intelligence is unquantifiable


----------



## GoldenX (Oct 19, 2018)

Hot and expensive. All hail the new Pentium EE.


----------



## moob (Oct 19, 2018)

E-curbi said:


> For those who are still running a 3770K or even a 6700K, WOW, your ship has come in with the 9900K for gaming and just about anything else you’d like to perform with your PC.


Lmao. I'm still running a 3770K and there's no chance in hell I'd get this CPU. Like *ShurikN* mentioned, I could literally buy a 2700X + motherboard + 16GB RAM for the same price as this CPU alone, with only a small performance decrease across the board. But I've held off on that for this long so I can wait a few more months for Zen 2. The 9900K isn't even a consideration with that price/performance.


----------



## Shatun_Bear (Oct 19, 2018)

laszlo said:


> after Principled Tech testing a lot of "intelligent human" jumped and per-ordered them fast not to loose the opportunity ho have the "best gaming processor" asap...marketing and preconception dictate the herd behavior and in this case intelligence is unquantifiable



That's a good point.

(Un)Principled Technology: '9900K up to 50% faster in 1080p gaming vs 2700X'
Reality: under 10% faster.

What a load of BS.


----------



## Captain_Tom (Oct 19, 2018)

moob said:


> Lmao. I'm still running a 3770K and there's no chance in hell I'd get this CPU. Like *ShurikN* mentioned, I could literally buy a 2700X + motherboard + 16GB RAM for the same price as this CPU alone, with only a small performance decrease across the board. But I've held off on that for this long so I can wait a few more months for Zen 2. The 9900K isn't even a consideration with that price/performance.



My favorite comparison is that you can choose between a 2700X + 2080 Ti, or a 9900K + 1080 Ti!    Thus unless you already have a 2080 Ti, your money is being thrown away on a 9900K.


----------



## GoldenX (Oct 19, 2018)

I think we can finally say that 14nm++ is dead.


----------



## HTC (Oct 19, 2018)

@W1zzard : would you consider including benches of multiple simultaneous things?

- a whole bunch of "one bench with something else in the background"
- one or two of "one bench with several things in the background"
- perhaps even 2 simultaneous benches with moderate to high CPU intensity

Obviously, what's in the background needs to be intensive enough to affect the bench run, plus it also needs to attempt to "copy" a "normal user".


----------



## 1d10t (Oct 19, 2018)

Great feat, Intel, for pulling 5 GHz from your hat. But as the charts tell, those clocks don't make a clear winner; I think we can safely assume we've reached a barrier.
As for the CPU itself, I don't see any reason why they sell this at such an obnoxious price. If they're targeting the 1080p gamer, where it truly shines, why should people take this over a $500 GPU? Sadly, the same scenario holds at every resolution.
"The best CPU for gaming", duh!



HTC said:


> @W1zzard : would you consider including benches of multiple simultaneous things?
> 
> - a whole bunch of "one bench with something else in the background"
> - one or two of "one bench with several things in the background"
> ...



Ahh, I see what you did there. Busting the myth that "SINGLE THREAD IS ALL THAT MATTERS"?


----------



## dicktracy (Oct 19, 2018)

I don't understand why reviewers are using the old 1080 Ti to test brand new CPUs... Tom's Hardware is even worse... a GTX 1080... ROFL. GPU bottleneck ensues.


----------



## Robcostyle (Oct 19, 2018)

Actually, for me, it's kind of a huge disappointment (as is the entire hardware market this year). I mean, yeah, great! The latest Intel arch + 8c/16t should crush everything, literally.
But...
A doubtful 5 GHz OC (many reviewers got stuck at 4.8-5.1 GHz all-core, with awful temps) - that means my 8700K at 5 GHz, 1.35-1.4 V, with temps around 60-80 under water, is not so bad. And I'm not even talking about golden chips with a 1.2 V requirement. Thus, since I don't have any solid opportunity to bin CPUs myself (and I don't want to pay Silicon Lottery three times the price for 5.3 GHz silicon), I have a decently high chance of ending up with a worse CPU for $600.
*A 100% guaranteed 5-5.3 GHz overclock with low temps thanks to STIM and 9900K binning* - that's the first thing I hoped for. Unfortunately, it's waaaay off from reality.

Second - any improvement over the 8700K in games - like, more cores/threads available for the game itself plus more cores/threads for background tasks = nano-latency and mind-BLOWING 0.1% lows. It did happen, but not how I imagined it.

So, in the end: +61% price for +1.8% performance. Nah, even slick Jensen has more to offer - for example, godlike rays and Drake's DLSS aliasing.
They want too much for so little.


----------



## newtekie1 (Oct 19, 2018)

I think the price is justified.  It is trading blows with $900+ processors.  The problem, IMO, is people judging this just as a gaming chip.  This processor, IMO, is not a gaming chip.  If you want to game, and that's your only concern, buy an i7-9700K or even the i5-9600K.  This processor is for people that do more with the CPU than just play games, and for those of us that do that, this is an absolute bargain compared to going with a HEDT platform.



dicktracy said:


> I don't understand why reviewers are using the old 1080 Ti to test brand new CPUs... Tom's Hardware is even worse... a GTX 1080... ROFL. GPU bottleneck ensues.



It doesn't really matter if you are including 720p results.  If you prefer a 2080 Ti be used, just use a sharpie to mark out the 720p on the graphs and put 1080p; the results will likely be pretty close to the same.

The reality is, in games you are usually going to be GPU bound.  You aren't going to buy a 2080 Ti and then play at 1080p, just like no one with a 1080 Ti plays at 720p.  You crank the settings until the GPU can't handle it anymore.  At that point, with any reasonably powerful CPU, it's not the CPU that makes the game unplayable; your GPU is.


----------



## kajson (Oct 19, 2018)

I'd love to see someone delid one of these new CPUs, apply Intel's old TIM style to it, and see what the actual difference is in temps. Not sure if the new STIM is even removable, though. I just want to know if it is a "feature" or if the chips would be severely crippled had they gone with the old TIM.


----------



## Kissamies (Oct 19, 2018)

dicktracy said:


> I don't understand why reviewers are using the old 1080 Ti to test brand new CPUs... Tom's Hardware is even worse... a GTX 1080... ROFL. GPU bottleneck ensues.


The 1080 Ti is still powerful, and I have a strong feeling more people have them than those overpriced RTX cards..

That "IHS is now soldered again" is just a joke.


----------



## XiGMAKiD (Oct 19, 2018)

9900K and RTX 2080 Ti are a match made in Heaven, both are the best for playing games with RTRT at 720p60 

But seriously, for $500 I expect no less than the best performance


----------



## R0H1T (Oct 19, 2018)

kajson said:


> I'd love to see someone delid one of these new cpu's and use Intels old TIM style on it, and see what the actual difference is in temps..  Not sure if the new STIM is even removable though.. Just want to know if it is a "feature" or if the chips would be severely crippled had they gone with the old TIM.


Removing solder is like playing poker with your eyes closed, that is to say it's simply not worth the gamble especially for such an expensive chip.


----------



## dicktracy (Oct 19, 2018)

Chloe Price said:


> 1080 Ti is still powerful and I have a strong feeling that people have more them, than those overpriced RTX cards..
> 
> That "IHS is now soldered again" is just a joke.


People who want to buy a high-end CPU right now are most likely interested in the 20 series, such as the 2080 Ti. I can see the gap between the 9900K and 2700X increasing further when using a 2080 Ti, let alone with DLSS (AMD's nightmare).


----------



## Octopuss (Oct 19, 2018)

moob said:


> Lmao. I'm still running a 3770K and there's no chance in hell I'd get this CPU. Like *ShurikN* mentioned, I could literally buy a 2700X + motherboard + 16GB RAM for the same price as this CPU alone, with only a small performance decrease across the board. But I've held off on that for this long so I can wait a few more months for Zen 2. The 9900K isn't even a consideration with that price/performance.


I've just checked prices.
This ridiculous crap costs *more than twice* as much as a 2700X. I'd have to be retarded to buy this.

Still running a 3770K here. I don't like it anymore and I really need some more cores, but I'll wait for Zen 2. There's no way in hell I'd go with Intel again (the last time I had an AMD CPU was in 2003, I believe, an Athlon XP).


----------



## ShurikN (Oct 19, 2018)

dicktracy said:


> People who wants to buy a highend CPU right now are most likely interested in the 20 series such as 2080 Ti. I can see the gap between the 9900k and 2700x increase further when using a 2080 Ti, let alone with DLSS (AMD's nightmare).


I believe Hardware Unboxed used a 2080ti, and the percentage gap was still around 12%


----------



## EatingDirt (Oct 19, 2018)

newtekie1 said:


> I think the price is justified.  It is trading blows with $900+ processors.  The problem, IMO, is people judging this just as a gaming chip.  This processor, IMO, is not a gaming chip.  If you want to game, and that's your only concern, buy an i7-9700K or even the i5-9600K.  This processor is for people that do more with the CPU than just play games, and for those of us that do that, this is an absolute bargain compared to going with a HEDT platform.



It's not comparable to a HEDT CPU. It doesn't have quad-channel RAM or an acceptable number of PCIe lanes to be serious HEDT.


----------



## dicktracy (Oct 19, 2018)

ShurikN said:


> I believe Hardware Unboxed used a 2080ti, and the percentage gap was still around 12%


On average with a 2080 Ti. O.O AMD has a lot of catching up to do.
https://www.computerbase.de/2018-10/intel-core-i9-9900k-i7-9700k-cpu-test/


----------



## E-curbi (Oct 19, 2018)

Sorry if I'm a little late to the party. I just saw this Der8auer video.

Now we have to pay for a delid and a grind? 


"The 9900K is the fastest gaming CPU on the planet!"

*(IFF - you are playing two games simultaneously, and streaming both games, also simultaneously.) *

IFF, from calculus courses if I remember correctly, means *IF and only IF.*  Oh Intel, what are we gonna do with you?


----------



## newtekie1 (Oct 19, 2018)

EatingDirt said:


> It's not comparable to HEDT CPU. It doesn't have is Quad Channel+ RAM or an acceptable amount of PCIe Lanes to be a serious HEDT.



Yes, that would be why it is still $400 cheaper.  But if you don't need the Quad-Channel RAM or extra PCI-E lanes, it would be a hard sell to go with a 7900X over a 9900K.


----------



## Kissamies (Oct 19, 2018)

dicktracy said:


> People who wants to buy a highend CPU right now are most likely interested in the 20 series such as 2080 Ti. I can see the gap between the 9900k and 2700x increase further when using a 2080 Ti, let alone with DLSS (AMD's nightmare).


If I'd get a 1000+ eur graphics card, I'd go for HEDT platform.


----------



## GoldenX (Oct 19, 2018)

newtekie1 said:


> Yes, that would be why it is still $400 cheaper.  But if you don't need the Quad-Channel RAM or extra PCI-E lanes, it would be a hard sell to go with a 7900X over a 9900K.


I would go for a 2700X then; it has similar multithreaded performance and saves money. Getting a used Threadripper + GT 1030 is also an option.

Man good thing it's soldered, it would meltdown otherwise.


----------



## Kissamies (Oct 19, 2018)

GoldenX said:


> I would go for a 2700X then, have similar multi thread preformance and save money. Getting a used Threadripper + GT1030 is also an option.
> 
> Man good thing it's soldered, it would meltdown otherwise.


The soldering is a joke. Watch the video by der8auer.


----------



## EatingDirt (Oct 19, 2018)

newtekie1 said:


> Yes, that would be why it is still $400 cheaper.  But if you don't need the Quad-Channel RAM or extra PCI-E lanes, it would be a hard sell to go with a 7900X over a 9900K.



The 7900X has simply always had awful value. In fact, the entirety of Intel's current HEDT platform has been pretty awful compared to AMD's Threadripper, with only a few reasons to go Intel's HEDT over AMD's, the main ones being clock speed and AVX-sensitive workloads. (For value comparison as of this thread: the 1920X, a 12-core/24-thread CPU, is currently $434 on Newegg and $394 on Amazon.)

The 9900K is simply a bad HEDT CPU, and it's an awful-value consumer CPU. Its niche is so small it's almost non-existent.


----------



## John Naylor (Oct 19, 2018)

dj-electric said:


> A great piece of hardware for an almost criminally high price compered to the competition.



As compared to what ?

"A surprising result is that Core i9-9900K matches performance of AMD's 16-core / 32-thread Threadripper 2950X processor when averaged over our whole CPU test suite. "   Can't quite understand why $530 is "criminally" high and $800 is not ?   If you are like 98% of PC users, then don't buy either.

"The new Intel Core i9-9900K finally puts eight-cores and 16-threads into the hands of gamers, consumers and enthusiasts" ... "Intel's own marketing paraded the Core i9-9900K as the best processor for gamers.   While that is technically true, the differences are rather slim in our own testing."

Goes to what I was saying all along ... the focus on cores is meaningless for most of what 98% of PC users do every day.  So yes, like the $900 Threadripper, it's great at stuff most folks never do, so why those folks would spend more on a processor that doesn't help them in any way is kind of puzzling.

The more appropriate comparison is the $399 9700K ... just $20 more than the 8700K and sure to drop as supplies increase and the "I must be first on the block with the new stuff" crowd has filled its quota, or the $289 9600K ... again just $20 more than the 8600K.   Also note that the wholesale price is $488 ($42 markup) for the Core i9-9900K, $374 ($25 markup) for the Core i7-9700K, and $262 ($27 markup) for the Core i5-9600K.

Are we gonna see any significant increase in gaming performance?  When was the last time that happened? ... Sandy Bridge.  But the soldered IHS is easily worth $20 on its own, and the chipset's support for 10 Gbps and integrated 802.11ac Wi-Fi will be of value to some. Personally, I see no value in Wi-Fi for desktops ... apartment dwellers obviously will, and it saves some money over upgrading to a MoBo with "on-board" Wi-Fi as a separate chip that must be added.




> It's not intel's fault that today's games do not fully utilize an eight core CPU.   same for the 2700X.  There is nothing intel can do about gaming perfomance at this point, but 9900K will show its true benefits for gaming over years.



It's not Florida's fault that it doesn't snow there, so residents can't take advantage of 4WD.   I have been hearing about the advantage of more than 4/8 cores for 2+ years now and have not seen anything yet.




Crowley said:


> I am going to say it up front, I prefer Intel over AMD. But looking at these benchmarks and going over the price difference, it makes me sad to admit that
> 
> I play my games at 1440p and when I compare the performance differences between the 9900k and 2700X, I just can't figure out the insane price difference



So why are you comparing the 9900k to the 2700x ?     Why not the $900 2950X for which the 9900k has comparable performance ?

"A surprising result is that Core i9-9900K matches performance of AMD's 16-core / 32-thread Threadripper 2950X processor when averaged over our whole CPU test suite. "

The proper comparison would be the $399 9700K and the $305 2700X ... still a significant price difference, but by no means insane, especially considering the difference in OC potential.




dirtyferret said:


> I'm sure some fan boy will come and say how its future proofing to buy this CPU (maybe Toms will will even write an article) but in the future I would expect more from $500 CPU.  Maybe there will be a rush of used 8700k for cheap on ebay from this....



If you need an 8/16-core CPU for workstation apps, then you buy a $500 CPU ... if you're gaming, why wouldn't you just buy the 9700K or 9600K?




E-curbi said:


> NO need to be rich, just create your gaming rig from hand-me down 2ndary components. Plus an outstanding GPU.
> 
> "Incremental upgrades over time" is the way to do it, no need to shell out lots of cash all at once. Time is NEVER our friend in this world EXCEPT when upgrading your PC(s).
> 
> Two individual PCs on your work desk is the only way to live brother.



I prefer to pass the old PC on and build new... sure, the cost is bigger doing it all at once... but if "time is money", then it's a losing proposition, especially on a WC build, and there's also "can you afford to have the PC down while you do it?"... many make a living on their PCs ... so what works for anyone depends on their individual situation.

As for two PCs on ya desk, it's a bit space-limiting, and the option exists to put two in one box ....


----------



## Fluffmeister (Oct 19, 2018)

dicktracy said:


> On average with a 2080 Ti. O.O AMD has a lot of catching up to do.
> https://www.computerbase.de/2018-10/intel-core-i9-9900k-i7-9700k-cpu-test/



Once again the i5 8400 continues to shine!


----------



## GoldenX (Oct 19, 2018)

Fluffmeister said:


> Once again the i5 8400 continues to shine!


The list is still the same, G4560, R3 1200, i5 8400. Those are the best gaming CPUs.

Get that 600 bucks nuclear reactor Skylake out of my face.


----------



## John Naylor (Oct 19, 2018)

fabtech said:


> Yes indeed, am I the only one to be disappointed by this CPU? The multithreaded scaling is very poor; some other tests have shown a huge decrease in speed with heavy multithreaded tasks... If OC is the only way to keep the frequency high, a very expensive cooler will be needed, and a powerful PSU; an increase of 140 W in power consumption when OC'd to only 5.2 GHz is just not workable for a little workstation. A 7900X will do much better; even my 6950X (got it at $500) performs better in multithreaded tasks @ just 4.4 GHz (Cinebench score 2318).
> 
> Well, I am not a high-core gamer, but I do play on a 1440p monitor; this 9900K gives no advantage at this resolution, so this CPU is not worth it for gaming at this resolution compared to a 2600X, 2700X, 8700K...
> So what is the real target of this CPU (sold at 700€ in the EU)? As a little workstation, a 1920X, way cheaper (220€ less expensive than the 9900K in the EU), or a 1950X (sold just a little bit more expensive) will be the better choice.
> Well, the 9900K does well in all kinds of tasks, but the 2700X does well too and is, in my opinion, the BEST CPU VALUE for all kinds of tasks; the 9900K is crucified by its high price.



I don't understand the comparisons ... as the review  states, the 9900k has comparable CPU performance as the $900 2950x so why is it being compared to the 2700x ?    Wouldn't the proper comparison be the 9700k ?




PopcornMachine said:


> Another review where I am dumbfounded with the conclusion of "Editor's Choice".
> 
> Leaves me wondering if the reviewer read his own review.



As above .... the 9900K's average CPU performance is the same as the 2950X's ... 20% better gaming performance than the Ryzen 2700X.   I don't "get" the more-cores thing, as only 2% of PC users will see any benefit from "more cores", but then again, all we've been hearing for 2 years has been "OK, Ryzen can't catch Intel in gaming, but it has more cores if ya want to do anything else".  Has this ceased to be true literally overnight?  It's never been true in my book, but I don't understand the sudden reversal.  If you don't need an 8/16-core CPU, then there's no need to buy one.




ShurikN said:


> I think Steve from Hardware Unboxed said it best. "For whom was this chip made?!"
> It's marginally better than 8700K in gaming while costing $150-200 more.
> It's marginally better than 2700X in productivity while costing almost two times more.



How about the potential 2950X customer who wants better gaming and a couple of $100 bills left in his pocket?


----------



## GoldenX (Oct 19, 2018)

AMD has SMT on the 8 core models.


----------



## R-T-B (Oct 20, 2018)

E-curbi said:


> Now we have to pay for a delid and a grind?



According to a guy who literally makes a living selling those types of services, yes.  Personally I doubt you'll see much difference.  Solder is more or less solder and thickness of the bond won't account for much.


----------



## Tom_ (Oct 20, 2018)

Dante Uchiha said:


> Weird... how consumption of the 9900k can be less than the 8700k?



Are you drunk?
The 9900K draws over 200W under load, even without overclocking.


----------



## hat (Oct 20, 2018)

So, we're back to Pentium vs Athlon? Intel's hot, power hungry chips with lots of gigglehurtz are slightly faster than AMD's cheaper, more efficient offerings.


----------



## trparky (Oct 20, 2018)

ShurikN said:


> I think Steve from Hardware Unboxed said it best. "For whom was this chip made?!"
> It's marginally better than 8700K in gaming while costing $150-200 more.
> It's marginally better than 2700X in productivity while costing almost two times more.
> 
> Not to mention you can get an entire 2700X system with mobo and 16GB of ram for the same preorder/inflated price


It sure looks like Intel lost their way. Epic fail.


ShurikN said:


> Not only that, but you can't hit the same clocks as a delided 8700K. Plus, you need a monster of a cooler for any type of OC. And even with it, it'll still hit 90C.
> 
> Gotta hand it to Intel, they made both 2700X and original Coffee Lake look amazing.


Further proof that Intel forgot how to compete. Now that they have an AMD that's actually competitive nipping at their heels they don't know how to react.


----------



## mcraygsx (Oct 20, 2018)

W1zzard said:


> To me this looks like "working as intended". The CPU is rated for 95 W, so it'll run at 95 W max (it can go beyond that for a short duration).
> 
> If you have a better heatsink -> dial up the TDP. If Intel magically changed their 95 W parts to 150 W overnight via BIOS update, all hell would break loose



Excellent point, thank you for reply.
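For readers following along, W1zzard's point maps onto Intel's two-level power-limit scheme: a sustained limit (PL1, the TDP) and a short-term boost limit (PL2) that only lasts for a set window. A toy model of that behavior, with illustrative numbers rather than this board's actual firmware values:

```python
# Toy model of Intel's two-level power limit that W1zzard's comment
# describes. PL2 and TAU here are illustrative assumptions, not the
# actual firmware values of any particular board.
PL1 = 95.0    # sustained limit, W (the rated "TDP")
PL2 = 119.0   # short-term boost limit, W (assumed)
TAU = 28.0    # seconds the CPU may stay at PL2 (assumed)

def allowed_power(seconds_under_load: float) -> float:
    """Return the package power cap after a given time at full load."""
    return PL2 if seconds_under_load < TAU else PL1

print(allowed_power(5))    # 119.0 -> brief boost above TDP
print(allowed_power(60))   # 95.0  -> settles back to the 95 W rating
```

Raising the configured PL1 on a board with better cooling is exactly the "dial up the TDP" step W1zzard mentions.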


E-curbi said:


> Sorry if I'm a little late to the party. I just saw this Der8auer video.
> 
> Now we have to pay for a delid and a grind?
> 
> ...



At this point Intel could take a lesson from AMD's gold-plated solder used in Ryzen.


----------



## E-curbi (Oct 20, 2018)

Now it’s a delidding plus a sanding.

Would you like your 9900K quarter-sanded, half-sanded or fully-sanded? 

We charge $50 for every 0.2mm.

Don’t worry, we have a special this week on a quarter sanding…And if you can’t afford that, well there’s always my daughter … Sandy.


----------



## GoldenX (Oct 20, 2018)

What about good olde lapping after the delidding and sanding?


----------



## EatingDirt (Oct 20, 2018)

John Naylor said:


> I don't understand the comparisons ... as the review  states, the 9900k has comparable CPU performance as the $900 2950x so why is it being compared to the 2700x ?    Wouldn't the proper comparison be the 9700k ?
> 
> How about the potential 2950x customer who wants better gaming and a couple $100 bills left in his pocket.



First off, saying the "9900k is comparable to the 2950x" is misleading. In well-threaded workloads that utilize all the cores, the 2950x is _well_ ahead of the 9900k, as seen in these benchmarks:
https://tpucdn.com/reviews/Intel/Core_i9_9900K/images/wprime.png
https://tpucdn.com/reviews/Intel/Core_i9_9900K/images/veracrypt.png
https://tpucdn.com/reviews/Intel/Core_i9_9900K/images/7zip-pack.png
https://tpucdn.com/reviews/Intel/Core_i9_9900K/images/7zip-unpack.png

Where the 9900k excels in the CPU comparison, the 8700k also often excels, and that is in single-threaded workloads such as Microsoft Office & photo editing (activities where the difference between the fastest & slowest results is less than 1 second).

Someone properly utilizing the 2950x (or even all the way down to a 1920x, which is currently cheaper than the 9900k, btw) probably wouldn't even consider a 9900k as an option.


----------



## HTC (Oct 20, 2018)

The price / performance metric should be amended when comparing against CPUs that come with a cooler, regardless of manufacturer, I think:

- if the test is @ stock *and the CPU does not come with a stock cooler*, then add to the CPU's price the price of whatever cooler is deemed necessary for it to run @ stock
- if the test is overclocked, then try to have all CPUs use the same cooler, regardless of whether they came with a stock cooler. Obviously this isn't always possible due to socket incompatibility, such as TR4 vs. AM4, for example: *in all of these cases*, the price of the cooler should be added to the CPUs from both camps, and *only then* should the price / performance ratio be calculated

In the case of this particular review, only the 9900K was overclocked, so only it needs the price of the cooler that was used added, while the CPUs it's compared against add zero if they came with a stock cooler, or the price of whatever cooler was needed to run @ stock, and only then do the price / performance ratio.
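HTC's rule amounts to a one-line adjustment. A quick sketch with made-up prices (the function name and every number below are illustrative, not the figures TPU used):

```python
def adjusted_perf_per_dollar(perf, cpu_price, cooler_price=0.0):
    """Relative performance per dollar, counting the cooler's price when
    the CPU doesn't ship with one (or needed an aftermarket cooler)."""
    return perf / (cpu_price + cooler_price)

# Hypothetical example: the 9900K needs an aftermarket cooler,
# while the 2700X ships with a usable stock cooler.
i9 = adjusted_perf_per_dollar(perf=100.0, cpu_price=530.0, cooler_price=90.0)
r7 = adjusted_perf_per_dollar(perf=88.0, cpu_price=320.0)

print(round(i9, 3), round(r7, 3))  # 0.161 0.275
```

With these assumed numbers the cooler-adjusted gap widens noticeably, which is exactly the effect HTC is arguing the review's ratio should capture.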


----------



## 1d10t (Oct 20, 2018)

E-curbi said:


> "The 9900K is the fastest gaming CPU on the planet!"
> *(IFF - you are playing two games simultaneously, and streaming both games, also simultaneously.) *
> IFF from Calculus courses if I remember correctly means *IF and only IF.*  Oh Intel, what are we gonna do with you?



Well, I *do* play games (Battlefield 1, ultra settings + HDR) and streamed them while running a Linux VM in the background, all said and done on just a crappy Ryzen 5 1600. Did I overdo it? 

Something that hasn't been brought up yet: the larger gap between base and turbo clocks tends to cause spikes in games. I don't know if anyone has noticed it or, worse, is reluctant to admit it. My observation is that AMD CPUs are lower in fps, but they stay stable for as long as you play, with barely noticeable impact. Intel CPUs, on the other hand, are higher in fps but decline over time; it happened on both the 7700K and, worst of all, the 8700K, which reached "visible" spikes.


----------



## fabtech (Oct 20, 2018)

Hardware Unboxed: 9900K OC at 85C with a CUSTOM 360 LOOP! So exit the Kraken X72 and other AIOs for 5 GHz and more... well done, Intel!


----------



## GoldenX (Oct 20, 2018)

1d10t said:


> Something that hasn't been brought up yet: the larger gap between base and turbo clocks tends to cause spikes in games. I don't know if anyone has noticed it or, worse, is reluctant to admit it. My observation is that AMD CPUs are lower in fps, but they stay stable for as long as you play, with barely noticeable impact. Intel CPUs, on the other hand, are higher in fps but decline over time; it happened on both the 7700K and, worst of all, the 8700K, which reached "visible" spikes.



Shh, let them dream. "Best gaming CPU".


----------



## Captain_Tom (Oct 20, 2018)

dicktracy said:


> On average with a 2080 Ti. O.O AMD has a lot of catching up to do.
> https://www.computerbase.de/2018-10/intel-core-i9-9900k-i7-9700k-cpu-test/
> View attachment 108998



The 9900K is literally the most powerful gaming chip Intel can make now, and will be for the next 2 years.  Period.

Meanwhile AMD didn't even bother to make an R7 2800X because they knew Intel would have to spend twice as much money to win by 10%; might as well let them, and sell the best yields as EPYC.  *Intel has a lot of catching up to do... although they probably won't catch up till 2022.*


----------



## Melvis (Oct 20, 2018)

hat said:


> So, we're back to Pentium vs Athlon? Intel's hot, power hungry chips with lots of gigglehurtz are slightly faster than AMD's cheaper, more efficient offerings.



Not exactly. Yes, Intel's chips are hot and hungry, but back then their CPUs were also slower; today they're hot and hungry but faster.


----------



## Captain_Tom (Oct 20, 2018)

Melvis said:


> Not exactly. Yes, Intel's chips are hot and hungry, but back then their CPUs were also slower; today they're hot and hungry but faster.



Also back then Intel's chips cost the same amount of money to make as AMD's.... Now they cost substantially more.

AMD is making chips 80% as powerful as Intel's, and they use almost half the energy... and they cost 30%+ less to produce.  Intel is in worse shape than before overall.

P.S.  Oh, and that is just talking about desktop.  On server, AMD is wiping the floor with Intel in top "halo" performance, price/perf, efficiency, AND security.  It's a bloodbath.


----------



## GoldenX (Oct 20, 2018)

Captain_Tom said:


> Also back then Intel's chips cost the same amount of money to make as AMD's.... Now they cost substantially more.
> 
> AMD is making chips 80% as powerful as Intel's, and they use almost half the energy... and they cost 30%+ less to produce.  Intel is in worse shape than before overall.
> 
> P.S.  Oh, and that is just talking about desktop.  On server, AMD is wiping the floor with Intel in top "halo" performance, price/perf, efficiency, AND security.  It's a bloodbath.


That's the best part of this: Intel's only possible answer is to fight back by making cheaper and better products. Dead are the days of selling the same quad core over and over again.


----------



## Tsukiyomi91 (Oct 20, 2018)

at least Intel sold an 8-core, 16-thread monster for the mainstream market. Highly unlikely that new owners of this part will need an upgrade for the next 3-5 years, considering this beast clocks at 5 GHz comfortably on air, and probably will sustain that with water cooling.


----------



## R0H1T (Oct 20, 2018)

Tsukiyomi91 said:


> at least Intel sold an 8-core, 16-thread monster for the mainstream market. Highly unlikely that new owners of this part will need an upgrade for the next 3-5 years, considering this beast clocks at *5 GHz comfortably on air*, and probably will sustain that with water cooling.


Nope, to keep it cool you will need water cooling; an air cooler only works when the ambient temps are really low & the case airflow is good.



The 9900k is good till 4.7-4.8 GHz all-core on air; the 9700k, though, does seem to be much better, good till 5 GHz on all cores. HT really limits the 9900k's max core clocks & OC, not to mention the temps & power consumption run away past 4.7 GHz.


----------



## loki1944 (Oct 20, 2018)

Very underwhelming, especially for gaming, but then it's been that way for a while now when it comes to CPUs. Even Bloomfield with a 980 Ti or 1070 is plenty for 1080p/1440p gaming @ very high or ultra settings. The one comfort I take in this is that I can still get by on 4C/8T for a very, very, very long time.


----------



## Shatun_Bear (Oct 20, 2018)

Tsukiyomi91 said:


> at least Intel sold an 8-core, 16-thread monster for the mainstream market. Highly unlikely that new owners of this part will need an upgrade for the next 3-5 years, considering this beast clocks at 5 GHz comfortably on air, and probably will sustain that with water cooling.



Well, it's technically 'mainstream', but at £600 it's priced in the ultra high-end range, like they used to price their HEDT CPUs (Haswell-E).


----------



## 1d10t (Oct 20, 2018)

Tsukiyomi91 said:


> at least Intel sold an 8-core, 16-thread monster for the mainstream market. Highly unlikely that new owners of this part will need an upgrade for the next 3-5 years, considering this beast clocks at 5 GHz comfortably on air, and probably will sustain that with water cooling.



It's the FX 9590 all over again. Back when I used that chip, AMD stated the TDP was *220* W. Turns out that was a lie: power draw skyrocketed to *225* W in Prime95 Small FFT, and I had trouble keeping the Crosshair V's VRM temps at bay, though a mere Corsair H50 was more than sufficient to handle the CPU itself. Long story short, my system constantly showed BSODs after a year of usage, even though neither the CPU nor the motherboard had any physical damage.  
Now let's flip the story: *if* someone makes a "95 W" CPU whose full-load draw is anyone's guess and which needs a hefty 360 mm or 420 mm cooler, do you believe that chip will last longer than a year?


----------



## efikkan (Oct 20, 2018)

M2B said:


> It's not intel's fault that today's games do not fully utilize an eight core CPU.
> 
> same for the 2700X.
> 
> there is nothing intel can do about gaming perfomance at this point, but 9900K will show its true benefits for gaming over years.





Captain_Tom said:


> It actually kind of is Intel's fault.  They were the ones that continued to pressure devs to focus on quad-cores for gaming.


To both of you, let's put this one to rest once and for all.
Multithreading for games doesn't work that way at all. We will not get games which fully utilize 6+ cores for rendering. The direction in game development is less CPU overhead and more of the heavy lifting on the GPU.

Most people misunderstand the features of Direct3D 12. While it is technically possible to have multiple CPU threads build a single queue, the added synchronization and overhead in the driver would be enormous. For this reason, we're not going to see more than 1 thread per workload that can be parallelized, which means separate rendering passes, particle simulation, etc. So games having 6+ threads for rendering is unlikely, and even for games having 2-3, all the main rendering will be done by the main rendering thread.

Neither Intel nor AMD is to blame here, nor the developers; just forum posters and tech journalists driving up expectations without any technical expertise.


----------



## Liviu Cojocaru (Oct 20, 2018)

Great review. Even though this is a bad buy, the fanboys will love it anyway


----------



## R0H1T (Oct 20, 2018)

Captain_Tom said:


> Also back then Intel's chips cost the same amount of money to make as AMD's.... *Now they cost substantially more*.
> 
> AMD is making chips 80% as powerful as Intel's, and they use almost half the energy... and they cost 30%+ less to produce.  Intel is in worse shape than before overall.
> 
> P.S.  Oh, and that is just talking about Desktop.  On Server AMD is whipping the floor with Intel in top "halo" performance, price/perf, efficiency, AND security.  It's a bloodbath.


I doubt that's the case, Intel owns their fabs while AMD uses TSMC & GF along with Sammy. If you're talking about operational costs, or MCM approach by AMD, then that's a separate issue.
I don't believe though that Intel chips cost more to produce, in fact it might well be the exact opposite.


----------



## Tsukiyomi91 (Oct 20, 2018)

@R0H1T yep. Still, reaching 5 GHz on all cores is a feat, considering the R7 2700X still struggles to reach such clocks.
@Shatun_Bear to be precise, it's a high-end mainstream SKU. Dunno whether that description fits it or not xD
@1d10t with a 240 mm rad in push-pull config, fans set to a mild profile & a really good thermal paste, I think the load temps for the i9 part would hover around the mid-70s C, depending heavily on ambient room temps.


----------



## Durvelle27 (Oct 20, 2018)

Wow

Exactly what I expected from the i9


----------



## EarthDog (Oct 20, 2018)

efikkan said:


> To the both of you, let's put this one to rest once and for all.
> Multithreading for games doesn't work that way at all. We will not get games which fully utilizes 6+ cores for rendering. The direction in game development is less CPU overhead and more of the heavy lifting on the GPU.
> 
> Most people misunderstand the features of Direct3D 12. While it is technically possible to have multiple CPU threads build a single queue, the added synchronization and overhead in the driver would be enormous. For this reason, we're not going to see more than 1 thread per workload that can be parallelized, which means separate rendering passes, particle simulation, etc. So games having 6+ threads for rendering is unlikely, and even for games having 2-3, all the main rendering will be done by the main rendering thread.
> ...


 I'm using a 16c/32t cpu with HT off (16c/16t) playing COD BLOPS 4. According to task manager, I'm using all cores incredibly evenly.

I assume this is not rendering on more than 2-3 cores? What are we seeing? What are they all doing? 

I agree with what you say, just asking what is going on in that title.


----------



## neomoco (Oct 20, 2018)

Well, it seems I will stick with my trusty 2500K for another year. Next spring, when Zen 2 appears, maybe I will buy it; that will be 8 years on the 2500K. Who knows, maybe I will extend it to 10 years if there isn't a 2x performance gain, since for now I can do anything with it. These 3-5% generational performance gains are getting really boring.


----------



## efikkan (Oct 20, 2018)

EarthDog said:


> I'm using a 16c/32t cpu with HT off (16c/16t) playing COD BLOPS 4. According to task manager, I'm using all cores incredibly evenly.
> 
> I assume this is not rendering on more than 2-3 cores? What are we seeing? What are they all doing?
> 
> I agree with what you say, just asking what is going on in that title.


Those are actually very good questions.

Firstly, it's important to understand that utilization in Windows Task Manager is not actual CPU load, but rather how much of the scheduling interval threads have been allocated. Games usually have multiple threads waiting for events or queues; these usually run in a loop constantly checking for work, but to the OS these will seem to have 100% core utilization. There are several reasons to code this way: firstly, to reduce latency and increase precision; secondly, Windows is not a realtime OS, so the best way to ensure a thread gets priority is to make sure it never sleeps. Thirdly, any thread waiting for IO (HDD, SSD, etc.) will usually show 100% utilization while waiting. It's important to understand that the "100% utilization" of these threads is not a sign of a CPU bottleneck.

Secondly, game engines do a lot of things that are strictly not rendering or don't impact rendering performance unless they "disturb" the rendering thread(s).
This is a rough illustration I made in 5 min: (I apologize for my poor drawing)

Some of these tasks may be executed by the same thread, and some advanced game engines scale this dynamically. Even if a game uses 8 threads on one machine and 5 on a different one, that doesn't mean it will have an impact on performance. Don't forget the driver itself can add up to ~four threads on top of this.

Most decent games these days have at least a dedicated rendering thread; many also have dedicated ones for the game loop and event loop. These usually show 100% utilization, even though the _true_ load of the event loop is usually ~1%. Modern games may spawn a number of "worker threads" for asset loading; this doesn't mean you should have a dedicated core for each, since these are usually just IO wait. I could go on, but you should get the point.
There are exceptions to this, like "cheaply made" games such as Euro Truck Simulator 2, which does rendering, game loop, event loop and asset loading in the same thread, which of course gives terrible stutter during gameplay.

So you might think it's advantageous to have as many threads as possible? Well, it depends. Adding more threads that need synchronization causes latency, so a thread should only be given a workload it can do independently and then sync back up, or even better, feed an async queue. At 60 FPS we're talking about a frame window of 16.67 ms, and in compute time that's not a lot if most of it is spent on synchronization.
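efikkan's first point, that a spin-waiting thread reads as 100% busy without doing real work, is easy to reproduce. A minimal sketch (the names `busy_poll_worker`, `work_q` etc. are made up for illustration, not from any real engine):

```python
import queue
import threading
import time

work_q = queue.Queue()
results = []
stop = threading.Event()

def busy_poll_worker():
    # Spin-waits on the queue: Task Manager would show this core near 100%
    # "utilization" even while the queue is empty and no real work happens.
    while not stop.is_set():
        try:
            item = work_q.get_nowait()  # non-blocking poll
            results.append(item * 2)    # the actual (tiny) workload
        except queue.Empty:
            pass                        # spin and try again immediately

t = threading.Thread(target=busy_poll_worker)
t.start()
for i in range(5):
    work_q.put(i)
time.sleep(0.2)   # give the spinner time to drain the queue
stop.set()
t.join()
print(sorted(results))  # [0, 2, 4, 6, 8]
```

Swap `get_nowait()` for a blocking `get()` and the same thread drops to ~0% reported utilization while idle, with the same throughput: that is the gap between displayed utilization and true load.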


----------



## Assimilator (Oct 20, 2018)

This processor wattage limit thing in the BIOS... haven't Intel boards always had something equivalent? For example, my IVB motherboard has a setting called "Core Current Limit" which I understood to be the maximum amperes that would be allowed to be drawn by the CPU; multiply that by the vCore and you get the maximum wattage the CPU may draw. Or is this something different?
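If the setting works the way described, the implied ceiling is just P = I × V. A trivial check with hypothetical numbers (the 150 A / 1.30 V figures are examples for illustration, not IVB defaults):

```python
def max_cpu_watts(current_limit_a, vcore_v):
    """P = I * V: the wattage ceiling implied by a core current limit."""
    return current_limit_a * vcore_v

# e.g. a 150 A limit at 1.30 V vCore caps the CPU at ~195 W
print(round(max_cpu_watts(150.0, 1.30), 1))  # 195.0
```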


----------



## edbe (Oct 20, 2018)

a. Intel is flooding the market with patched and IMMATURE power-hungry CPUs to f*** this planet.
b. When AMD brings a *10 watt*! 7nm APU with an iGPU equal to an NVIDIA GTX 1060 mobile,
    then bye bye NVIDIA and Intel!


----------



## EarthDog (Oct 20, 2018)

efikkan said:


> You have actually very good questions.
> 
> Firstly, it's important to understand that utilization in Windows Task Manager is not actual CPU load, but rather how much of the scheduling interval threads have been allocated. Games usually have multiple threads waiting for events or queues; these usually run in a loop constantly checking for work, but to the OS these will seem to have 100% core utilization. There are several reasons to code this way: firstly, to reduce latency and increase precision; secondly, Windows is not a realtime OS, so the best way to ensure a thread gets priority is to make sure it never sleeps. Thirdly, any thread waiting for IO (HDD, SSD, etc.) will usually show 100% utilization while waiting. It's important to understand that the "100% utilization" of these threads is not a sign of a CPU bottleneck.
> 
> ...


Thanks!


This is the first time I've seen this CPU even tickled, and I was surprised to see that much activity across all cores from a game.

Typically we see exactly what you're saying... 1/2/3 threads pegged and the others tickled, with maybe 1 or 2 at 50%. It just floored me to see usage that high... first title that has done so.


----------



## Dante Uchiha (Oct 20, 2018)

Tom_ said:


> Are you drunk?
> The 9900K draws over 200W under load, even without overclocking.



I'm just contesting the tests; I didn't run them. I think they're not stressing the CPU enough:


----------



## Captain_Tom (Oct 20, 2018)

GoldenX said:


> That's the best of this, Intel's only possible answer is to fight back, making cheaper and better products. Dead are the days of selling the same quad core over and over again.



I am not sure what Intel can even do, though.  The 9900K is the best we will see till 2021!  10nm will not be fit for high-end gaming till 2020, and even then it's likely inferior to 7nm and certainly to 7nm+.  

Furthermore, their 14nm capacity problems are a result of Intel's 6- and 8-core chips taking up twice the space on wafers, and of mobile/server buyers only wanting Intel's best yields (no one wants Intel's 25 W mobile i3s, lol).  They can't lower prices on their good products because they can't even make enough of them.  Intel is going to be forced to carry a ton of overpriced $400-$600 good chips, and then a mountain of <$100 quad-cores they try to sell almost at cost just to get rid of them.

In hindsight I wonder if Intel would have done things differently.  I wonder if, instead of making 6- and 8-cores ASAP, they would have focused on a 5.2 GHz quad-core with sTIM.  I mean, my 4.5 GHz 6700K games as well as the 9900K most of the time.  Nobody has a use for a $600 8-core.


----------



## Rahmat Sofyan (Oct 20, 2018)

adding more money on cooler, best gaming processor ever .. nice ..


----------



## GoldenX (Oct 20, 2018)

Captain_Tom said:


> I am not sure what Intel can even do though.  The 9900K is the best we will see till 2021!  10nm will not be fit for high-end gaming till 2020, and even then it's likely inferior to 7nm and certainly 7nm+.
> 
> Furthermore their 14nm capacity problems are a result of Intel's 6 and 8 core chips taking up twice the space on wafers to produce, and also mobile/server buyers only wanting Intel's best yields (No one wants Intel's 25w mobile i3's lol).  They can't lower prices on their good products because they can't even make enough of them.  Intel is going to be forced to have a ton of $400-$600 good chips that are overpriced, and then a mountain of <$100 quad-cores they try to sell almost at cost just to get rid of them.
> 
> In hindsight I wonder if Intel would have done things differently.  I wonder if instead of making 6 and 8 cores ASAP, they would have focused on making 5.2GHz quad-core with sTIM.   I mean my 4.5GHz 6700K games as well as the 9900K most of the time.  Nobody has a use for a $600 8-core.


They will have to go to the drawing board. The monolithic design can't continue without a smaller production process.


----------



## Captain_Tom (Oct 20, 2018)

GoldenX said:


> They will have to go to the drawing board. The monolithic design can't continue without a smaller production process.



Which is exactly my point when you say "Intel will have to fight back."  Intel has nothing to fight back with... until 2022, when they have a brand-new arch on 7nm.


----------



## GoldenX (Oct 20, 2018)

Captain_Tom said:


> Which is what my point is when you say "Intel will have to fight back."  Intel has nothing to fight back with... until 2022 when they have a brand new arch on 7nm.


Well, their fault for being lazy bastards all these years.


----------



## hat (Oct 20, 2018)

GoldenX said:


> Well, their fault for being lazy bastards all these years.


I'm not so sure anymore. The 9900K, while being the best they can produce at the moment (for a desktop chip), is a dud because it crams too many cores at too high clocks into too big a process, even though they know 14nm very well at this point. Sure, they stagnated on core count, but how much more would they have been able to do? This can probably be blamed on 10nm issues; no manufacturer would hold on to an older process that is more expensive and less efficient, no matter how far in the lead they may be. I mean, even if AMD were still messing around with Bulldozer, Intel could still be cranking out more quad-core chips per wafer on 10nm than on 14nm. There's literally no reason not to do that.


----------



## fabtech (Oct 20, 2018)

Rahmat Sofyan said:


> adding more money on cooler, best gaming processor ever .. nice ..



Here we have an ambient temp of 21-22C. Just imagine the temps in summer, when most people don't have air conditioning; what would the results and consequences be?
I live in Asia. I have aircon and the ambient temp is set to 25-26C, but many people here don't have aircon, so the usual ambient temp is around 30-33C: an average of 10 extra degrees to consider. So this CPU is not made to be used in countries where temps are high... This CPU runs way hotter than my 6950X OC'd to 4.4 GHz. I really believe there is a problem with this 9900K, and we may hear a lot about it from the first buyers as soon as the climate gets warmer.


----------



## GoldenX (Oct 20, 2018)

This is the Pentium D all over again.


----------



## Rahmat Sofyan (Oct 20, 2018)

Same here, bro. I live in Indonesia, really near the equator.

I've tried an 86K + Raijintek Triton; the temps were bad indeed, especially in summer. Maybe 75-100C is still safe for the CPU, but I really don't feel safe or comfortable with it; my limit since the Athlon XP 2000 has been 65C, and above that... nope.

I'm sticking with my 3770K right now; my next upgrade is almost definitely a 26X, or waiting for Ryzen 3000.

The big issue, I think, is that almost all tech reviews recommend these Lake CPUs, or call them great, blah blah blah. Maybe yes, in terms of performance, purely... but how about overall? The cost you may add for a decent cooler, price, temps, performance and price per core, etc.


fabtech said:


> Here we have an ambient temp of 21-22C. Just imagine the temps in summer, when most people don't have air conditioning; what would the results and consequences be?
> I live in Asia. I have aircon and the ambient temp is set to 25-26C, but many people here don't have aircon, so the usual ambient temp is around 30-33C: an average of 10 extra degrees to consider. So this CPU is not made to be used in countries where temps are high... This CPU runs way hotter than my 6950X OC'd to 4.4 GHz. I really believe there is a problem with this 9900K, and we may hear a lot about it from the first buyers as soon as the climate gets warmer.


----------



## trparky (Oct 20, 2018)

Captain_Tom said:


> Which is what my point is when you say "Intel will have to fight back."  Intel has nothing to fight back with... until 2022 when they have a brand new arch on 7nm.


They're going to have to take a page out of AMD's CCX playbook if they hope to go any further. There are too many issues with the monolithic approach.


----------



## hat (Oct 20, 2018)

trparky said:


> They're going to have to take a page out of AMD's CCX playbook if they hope to go any further. There's too many issues with the monolithic production process.


That monolithic design is probably the only thing giving Intel an advantage right now. The CCX design is great in a lot of areas... it's cheap, gives good yields, and is easily scalable, but it's a bit slower than a monolithic design due to higher latency. So, while the monolithic design is better for performance, it's also expensive and doesn't scale easily.


----------



## dalekdukesboy (Oct 20, 2018)

trparky said:


> Nor should you mince words. Now tell us how you really feel!



Wizzard is wise to just put the numbers out there, comment moderately, and not piss all over Intel or anyone else he does reviews/business with/for... he can leave it to us to say this processor SUCKS.


----------



## hat (Oct 20, 2018)

dalekdukesboy said:


> Wizzard is wise to just put the numbers out there comment moderately and not piss all over Intel or anyone else he does reviews/business with/for....he can leave it to us to say this processor SUCKS.


There's a way to do things professionally and then there's the unprofessional way. w1zzard is a pro.


----------



## GoldenX (Oct 20, 2018)

Wizzard is the best, thanks to him and this site I started overclocking on my own.


----------



## 1d10t (Oct 21, 2018)

Tsukiyomi91 said:


> @1d10t I think a 240mm rad in push-pull config with fans set to mild profile & using a really good thermal paste, I think the load temps for the i9 part would/may hover round the mid 70C,
> depending heavily on ambient room temps.



You didn't get the point, did you? I think most of the time the CPU itself is gonna be fine, but will the supporting parts, aka the motherboard, last as long? How will the motherboard deal with such a high TDP, which pretty much pushes the VRMs to or beyond their rated operating conditions while the PCB withstands all that heat? Not to mention we rely on "software sensors" embedded in the CPU itself to read temperatures; gone are the days when thermistors were soldered onto the motherboard. That, and the VRMs on new motherboards are almost a joke: they slap 10+ chokes on but cheap out on the high-side and low-side FETs. Did I mention the PWM controller?
In general, all the motherboard makers make me sick; they give you plastic shrouds and flashy lighting while cheaping out on the power delivery system.



efikkan said:


> To the both of you, let's put this one to rest once and for all.
> Multithreading for games doesn't work that way at all. We will not get games which fully utilizes 6+ cores for rendering. The direction in game development is less CPU overhead and more of the heavy lifting on the GPU.
> Most people misunderstand the features of Direct3D 12. While it is technically possible to have multiple CPU threads build a single queue, the added synchronization and overhead in the driver would be enormous. For this reason, we're not going to see more than 1 thread per workload that can be parallelized, which means separate rendering passes, particle simulation, etc. So games having 6+ threads for rendering is unlikely, and even for games having 2-3, all the main rendering will be done by the main rendering thread.
> Intel or AMD is not to blame here, not the developers either, just forum posters and tech journalists driving up expectations without any technical expertise.





efikkan said:


> You have actually very good questions.
> Firstly, it's important to understand that utilization in Windows Task Manager is not actual CPU load, but rather how much of the scheduling interval threads have been allocated. Games usually have multiple threads waiting for events or queues; these usually run in a loop constantly checking for work, but to the OS these will seem to have 100% core utilization. There are several reasons to code this way: firstly, to reduce latency and increase precision; secondly, Windows is not a realtime OS, so the best way to ensure a thread gets priority is to make sure it never sleeps. Thirdly, any thread waiting for IO (HDD, SSD, etc.) will usually show 100% utilization while waiting. It's important to understand that the "100% utilization" of these threads is not a sign of a CPU bottleneck.
> Secondly, game engines do a lot of things that are strictly not rendering or don't impact rendering performance unless they "disturb" the rendering thread(s).
> This is a rough illustration I made in 5 min: (I apologize for my poor drawing)
> ...



We're kind of in the same boat here. The introduction of D3D12 features, such as reduced overhead and thus more draw calls, gives developers access to hardware at the driver level, eliminating the necessity of a HAL. But there is still a catch in the arithmetic code, the OS, and the architecture itself.

Many developers are still reluctant to use small-integer code, and don't forget that the nature of a CPU core is serial processing, so "parallelizing" threads within a single execution stream poses quite a challenge in Windows itself. If we spawn many threads, the CPU won't mark them, and the OS will decide that the first query executes first before moving to the next cycle, leaving them uncached unless the instructions say to cache them. Lengthy instructions, and Microsoft seems to cap their desktop OS at a mere 128 KB read-ahead, so we have to add wait states between intervals or we'll face cache incoherency.

Man, I'm mumbling too much; what do I know, I'm just a hardware guy 
My point is that neither the CPU makers, the developers, nor the OS are to blame; they just guarantee their product is widely adopted.


----------



## dalekdukesboy (Oct 21, 2018)

hat said:


> There's a way to do things professionally and then there's the unprofessional way. w1zzard is a pro.



Exactly what I was saying, but more succinct and professional. However, I'm not a professional; I'm a hack who tortures hardware and shitposts on boards like this, saying what I mean with no filter... pure bliss.


----------



## Shatun_Bear (Oct 21, 2018)

hat said:


> That monolithic design process is probably the only thing giving Intel an advantage right now. The CCX design is great in a lot of areas... it's cheap and gives good yields, and it's easily scaleable, but the design is a bit slower than a monolithic design due to higher latency. So, while the monolithic design is better for performance, it's also expensive and doesn't scale easily.



But we've got an interesting market shake-up coming when 7nm Ryzen launches in 1H next year, since I just don't know what Intel is going to counter it with. They'll lose the performance crown, and lose on efficiency and price too. It's a triple whammy.

Like I said before, the only realistic counter to 7nm Ryzen is another 14nm refresh. That will be seriously humiliating, as it will lose on every metric. And since the hot and power-hungry 9900K is at the limit already, they're backed into a corner. They can't push clocks any further; anything past 4.7 GHz produces serious heat and power draw if we're talking 8 cores.

Their 10nm, when it does arrive in late 2019 best case, won't come in the form of a 10nm 8-core to counter the 3700X. A 10nm 8-core from Intel is more likely to arrive in 2020! By that time AMD will be about to release 7nm+ or the 4700X.


----------



## ppn (Oct 21, 2018)

The integrated graphics takes up space that could be populated by four more cores. And yet at 178 mm² the 9900K is not big at all; it's smaller than the 192 mm² 8-core Zen die, and the 2600K Sandy Bridge was 216 mm².

Zen 2 can at best offer 8 cores in around 100 mm², with no integrated GPU. The final optimization of 7nm+ is 4x density, but that's something else; this is 2x the density of 12nm GloFo/Samsung and TSMC, which are not as dense as Intel's 14nm++.


----------



## R0H1T (Oct 21, 2018)

hat said:


> That monolithic design process is probably the only thing giving Intel an advantage right now. The CCX design is great in a lot of areas... it's cheap and gives good yields, and it's easily scaleable, but the design is a bit slower than a monolithic design due to higher latency. So, while the monolithic design is better for performance, it's also expensive and doesn't scale easily.


Intel is going the MCM route after the *lakes* dry out; that's why they hired Keller & bought this company recently. The ring bus will die a slow death, unless Intel decides that MSDT & the ring bus will coexist with their MCM solutions. I think their dGPU & next major uarch change may also land around the same time. It could be a perfect storm, or quite likely the opposite, one that catapults AMD into the lead if things don't pan out the way Intel's planning.


----------



## trparky (Oct 21, 2018)

hat said:


> That monolithic design process is probably the only thing giving Intel an advantage right now. The CCX design is great in a lot of areas... it's cheap and gives good yields, and it's easily scaleable, but the design is a bit slower than a monolithic design due to higher latency. So, while the monolithic design is better for performance, it's also expensive and doesn't scale easily.


Which is what Intel is running into. Why do you think these chips are so expensive? I'm betting they're having yield issues where (due to how complex these chips are) many chips don't come out quite good enough to be a Core i9. One way or another they will have to adopt a CCX-like design at some point; the more cores you try to pack onto a single die, the more complex things get. This is simply the nature of the beast. Not everything can be perfect.


----------



## GoldenX (Oct 21, 2018)

trparky said:


> Which is what Intel is running into. Why do you think these chips are so expensive? I'm betting that they're having yield issues in which (due to how complex these chips are) many chips don't come out quite good enough to be a Core i9. One way or another they will have to adopt a CCX-like design at some point, the more cores you try to pack onto a single die/chip the more complex things get. This is simply the nature of the beast. Not everything can be perfect.


And we will remember this time as the "highest IPC, no matter the cost".


----------



## R0H1T (Oct 21, 2018)

GoldenX said:


> And we will remember this time as the "highest IPC, *no matter the cost*".


You mean (*ST*) performance, because if this is Intel's best "ever" then AMD's already won.


----------



## Mats (Oct 22, 2018)

If you're disappointed then I guess you had hopes, and if you had hopes then I don't understand how you got them in the first place. Intel is stuck at 14 nm and we all knew that;
how much magic could Intel put into this chip without changing the process? AND IT IS STILL CALLED COFFEE LAKE.

The 9900K is amazing for what it is given the 4+ year old (but improved) process that's being used, and it runs hot for the same reason. It's overpriced, but that's nothing new for the fastest of its kind.
If you thought it would cost $100 less, just be happy that you didn't buy it, as it barely makes sense today, and will make even less sense whenever the successor shows up.
If you don't mind the actual price then I guess you upgrade every year or so, so yeah, you can afford it.

I won't buy it, still happy with my 2600K, and next time I'll probably go back to AMD.


----------



## dalekdukesboy (Oct 22, 2018)

Mats said:


> If you're disappointed then I guess you had hopes, and if you had hopes then I don't understand how you got them in the first place. Intel is stuck at 14 nm and we all knew that,
> how much magic could Intel put into this chip without changing the process? AND IT IS STILL CALLED COFFEE LAKE.
> 
> The 9900K is amazing for what it is given the 4+ year old (but improved) process that's being used, and it runs hot for the same reason. It's overpriced, but that's nothing new for the fastest of its kind.
> ...



This!... Yes, the 2600K is still pretty darn good for most things, I'd guess. I've got an E5-1680 v2 8-core Ivy Bridge and I absolutely love it! It's 22nm but has the same core count as this chip, plus quad-channel memory. It doesn't OC as high without extreme cooling, but 4.4-4.6 GHz is doable on moderate water cooling, and it performs better, particularly on memory bandwidth, while surprisingly using about the same or fewer watts to do it! So this processor wouldn't be so disappointing in 2018, except that I have a processor from 2013, bought for a couple hundred bucks less than this thing, that equals or outperforms it on every metric... with half a decade of Intel advancement, that's pretty sad. Boo, Intel.


----------



## HTC (Oct 22, 2018)

There have been "wild" variations in power usage across reviews for this chip.

Upon "investigation", Hardware Unboxed found the cause: depending on the board's VRM capabilities and whether or not the CPU has been pre-heated before a given benchmark runs, performance drops significantly versus testing on a board that can actually deliver everything the CPU can draw, so long as the cooler manages to keep up without forcing throttling.
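The board-to-board variation comes down to whether the firmware enforces Intel's turbo power budget (PL1/PL2/Tau) or lifts it. A minimal sketch of how that scheme behaves, with illustrative numbers rather than measured values from the review:

```python
# Toy model of Intel's PL1/PL2/tau turbo budget: the CPU may draw up to
# PL2 watts only while its moving-average power stays below PL1.
# Numbers are hypothetical, chosen only to illustrate the shape.
PL1, PL2, TAU = 95.0, 119.0, 28.0   # sustained W, burst W, time constant s

def allowed_power(avg_power):
    """Burst power is permitted until the average catches up to PL1."""
    return PL2 if avg_power < PL1 else PL1

avg, dt = 0.0, 1.0
samples = []
for _ in range(120):            # two simulated minutes of full load
    draw = allowed_power(avg)
    avg += (dt / TAU) * (draw - avg)   # exponential moving average
    samples.append(draw)

# Early on the chip sustains PL2; once the average reaches PL1 it settles
# down to PL1 and stays there, which is the performance drop reviewers saw.
print(samples[0], samples[-1])
```

A board that simply sets PL1 = PL2 (or Tau to a huge value) never takes the second branch, which is why "unlimited" boards score higher and draw far more power than ones enforcing the 95 W spec.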


----------



## Metroid (Oct 22, 2018)

Point taken. _I would never buy the 9900K over the 2700X, it just makes no sense, but I would think about the 9700K for a proper upgrade._


----------



## GreiverBlade (Oct 22, 2018)

meh... oh well, thanks for the review. It confirms my path toward a 2600/2700 (or the X variant, though sometimes the non-X performs better) as my next upgrade, especially given the numbers at 1440p.



Metroid said:


> Point taken,  _I would never buy the 9900k over 2700x, it just makes no sense but I would think about 9700k for a proper upgrade._


Well... I have a 6600K, but I don't even consider a 9600K/9700K a proper upgrade, nor would I consider either of the two over a 2700/X (or even a 2600/X).


----------



## trog100 (Oct 22, 2018)

HTC said:


> There have been "wild" variations in power usage across reviews for this chip.
> 
> Upon "investigation", Hardware Unboxed found the issue: depending on the board used, it's VRM capabilities and whether or not the CPU has "been heated" prior to whatever bench is being tested, the performance drops significantly VS testing with a board that can actually use everything the CPU can "throw @ it", so long as the cooler used manages to perform without forcing throttling.



We have a chip with a claimed boost of 5 GHz, but only if two cores or fewer are in use; with more than two it only boosts to 4.7 GHz. It would also seem that after 30 minutes of continuous use it all drops down to 4.4 GHz.

None of this is made clear to the average user. Having said that, the average user probably doesn't care anyway. I still reckon this thing comes close to being sold under false pretenses; it ain't what it claims to be.

trog
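The behavior described above can be written down as a tiny lookup (a rough model of only what the post claims: 5.0 GHz at up to two active cores, 4.7 GHz all-core, sagging to ~4.4 GHz under sustained load; the intermediate bins and the 30-minute window are the assumptions here):

```python
def boost_ghz(active_cores, sustained_minutes=0.0):
    """Rough model of the reported 9900K boost behavior: 5.0 GHz with
    one or two active cores, 4.7 GHz all-core, and ~4.4 GHz once a
    sustained load exhausts the turbo window (illustrative only)."""
    if sustained_minutes >= 30:
        return 4.4          # turbo budget spent, sustained clock
    return 5.0 if active_cores <= 2 else 4.7

print(boost_ghz(2))        # 5.0  -> the advertised headline number
print(boost_ghz(8))        # 4.7  -> what an all-core game load gets
print(boost_ghz(8, 45))    # 4.4  -> after half an hour of rendering
```

The point being: "5 GHz" is only the first branch of that function, and nothing on the box tells the buyer about the other two.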


----------



## GreiverBlade (Oct 22, 2018)

trog100 said:


> we have a chip with a claimed boost of 5g but only if two cores or less are in use.. more than two it only boosts to 4.7g.. it would also seem that after 30 minutes continuous use  it all drops down to 4.4g..
> 
> none of this is made clear to the average user.. having said that the average user probably dosnt care anyways.. i still recon this thing comes close to to being sold under false pretenses.. it aint what it claims to be..
> 
> trog


Well, still better than a K chip locked to default frequencies that BSODs once you apply a single extra hertz of OC... like my 6600K.


----------



## caleb (Oct 23, 2018)

9600K review? Any ETA on that?


----------



## Vario (Oct 23, 2018)

In future reviews for the 9600K, I am interested to see how the 8600K's thermals compare to the 9600K's at the same turbo multiplier. That would show the difference made by the thermal interface material.


----------



## xorbe (Oct 23, 2018)

1d10t said:


> Something that hasn't been exposed yet: that larger gap between base and turbo clocks tends to make spikes in games.



No, this is well known. This is why you make sure your clock is pinned at max frequency for gaming. Same as NVIDIA's "prefer maximum performance" setting, so the clock doesn't accidentally drop during light loads, leading to crap framerates for a bit when the load increases. Good for laptop battery life, bad for performance desktops.


----------



## R-T-B (Oct 24, 2018)

R0H1T said:


> You mean (*ST*) performance, because if this is Intel's best "ever" then AMD's already won.



He meant what he said: IPC. AMD has not already won in that department. IPC is irrelevant to ST vs MT debates; it's measured on one core by its nature.


----------



## Prima.Vera (Oct 24, 2018)

So I'm guessing there is *ZERO need to upgrade* my good ol' 3770K, especially since I'm gaming at 3440x1440 @ 100 Hz on a GTX 1080. Correct me if I'm wrong, but just to see a frame increase of 2-3%, I'd need ~$1500 in hardware upgrades (CPU + cooler + mobo + RAM); it's *absolutely not worth it at all*.


----------



## trparky (Oct 24, 2018)

Prima.Vera said:


> So I'm guessing there is *ZERO need to upgrade* my god ol' 3770K CPU, especially since I am gaming on 3440 x 1440 @ 100Hz Monitor on a GTX 1080. Correct me if I am wrong, but just to see a frame increase of up to 2, 3%, I need ~1500$ in hardware upgrades (CPU+Cooler+Mobo+RAM) ; it's *absolutely not worth it at all*.


I don't know about that. A friend of mine upgraded his older 4000-series Intel system to a new 8700K-based system and he's reporting amazing results. He's loving the fact that his 1080 Ti is finally being used to its full potential; he could never get it to run at 100%, which indicated the processor was the bottleneck.


----------



## Prima.Vera (Oct 25, 2018)

trparky said:


> I don't know about that, a friend of mine upgraded this older 4000-series Intel system to a new 8700K-based system and he's reporting to me amazing results. He's loving the fact that his 1080Ti is finally being used to its full potential, he never could get it to work at 100% which indicated that the processor was the bottleneck.


If he's playing at 1080p, then yeah. But at 3440x1440 the GPU is already the bottleneck. Just see the reviews here or at AnandTech...


----------



## R0H1T (Oct 25, 2018)

R-T-B said:


> He meant what he said:  *IPC*.  AMD has not already won in that department.  IPC is irrelevant to ST vs MT debates.  It's measured on one core by it's nature.


It also varies by application & task; for instance, AES & SHA benchmarks are where AMD regularly matches or (in SHA) beats Intel. ST performance is a function of IPC x clock speed, which is what I was referring to, & if it weren't for Intel's super-high clock speeds, there's no way Intel could price the 9900K anywhere near $500 or more, for HEDT.
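That relationship is worth making concrete, with made-up numbers purely for illustration (neither core below is a real measured part):

```python
# Single-thread throughput as IPC x clock: instructions per nanosecond.
# A modest IPC deficit can be papered over by a big enough clock lead.
def st_perf(ipc, ghz):
    """Relative single-thread performance for hypothetical cores."""
    return ipc * ghz

core_a = st_perf(ipc=1.00, ghz=5.0)   # lower IPC, higher clock
core_b = st_perf(ipc=1.10, ghz=4.3)   # higher IPC, lower clock

print(core_a > core_b)  # here the clock advantage wins
```

Which is the point: take away the clock-speed lead and the ST gap (and the pricing power that comes with it) largely evaporates.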


----------



## GoldenX (Oct 25, 2018)

It's not a problem of clock speed; it's the cost of producing that many great bins, and their chairmen not realizing that their products are now shi... obsolete.


----------



## hat (Oct 25, 2018)

The 9900K is a bold move, even as a last-ditch effort at making the most of what they have (a skabfee lake refresh on 14nm). Though they do manage to stay on top as far as performance goes, it comes at an arrogantly high price and sucks down power like a V12 engine. It's like Intel's Fermi. They're at the end of the line now, though, and need to make some big changes.


----------



## E-curbi (Oct 30, 2018)

I'm quoting the Siliconlottery.com guy, from email:

"Got a few more 9900Ks in, it does look like they're going to clock better than 9700Ks this time.

Due to Intel hard-binning."

...end of quote.

Breaking News!


----------



## HD64G (Nov 10, 2018)

Some benchmarks and comparisons of it with the 95W TDP limit enforced show a lot of performance lost, dropping it to almost 2700X levels. Intel (with the help of their board partners) cheating again, as usual. Zen 2 will demolish them in all market segments, both in raw performance and in value for money, methinks.


----------



## EarthDog (Nov 11, 2018)

HD64G said:


> Some benchmarks and comparisons of it with the 95TDP limit forced. Much performance loss leading to lamost 2700X levels. Intel (with the help of their board partners) cheating again as usual. Zen2 will demolish them in all market segments both in raw performance and in VFM me thinks.


https://www.anandtech.com/show/13544/why-intel-processors-draw-more-power-than-expected-tdp-turbo


----------



## HD64G (Nov 11, 2018)

EarthDog said:


> https://www.anandtech.com/show/13544/why-intel-processors-draw-more-power-than-expected-tdp-turbo


I know that BIOS settings are the cause of this controversy, but in Ryzen CPU reviews none of the reviewers present results with limits raised above the base BIOS settings as "stock". That alone is the scandal here.


----------

