# Intel Core i9-10900K



## W1zzard (May 25, 2020)

Intel's Core i9-10900K achieves highly impressive gaming performance thanks to its 10-core/20-thread design with up to 5.3 GHz. We compare three configurations in our 10900K review: all stock, boost limits removed, and a manual 5.1 GHz all-core overclock.



----------



## Br0nnOfTheBlackwater (May 25, 2020)

What's the difference between 3.7/5.3 and max turbo?


----------



## Vayra86 (May 25, 2020)

310W peak. Something about comets and craters, funny twist, obligatory lol

Other than that, yeah. Nice heater, pretty useless otherwise. It's really only usable at stock, and you do get most of the performance that way. It is competitive on price if you consider the competition.
But... 310W peak. Hell no. The first sunny day your OC falls apart; the first heat wave, you've got a cataclysmic event in your case. Staying FAR away.

This does confirm Comet Lake is only competitive if priced right. Might as well buy previous gen if cheaper.

Turbo Boost Max... Oh Intel... please, just don't add Ultra come gen 11 please. We get it, its a boost.


----------



## KarymidoN (May 25, 2020)

Thanks for the review @W1zzard. Amazing work as always.
After reading it, some things come to mind... this is clearly a stopgap for Intel, cause in a few months AMD is gonna trounce it with Zen 3 and Ryzen 4000 desktop chips... it already looks bad cause AMD is already giving discounts on the 3900X (USD 389.99 at Micro Center)... on a cheaper platform, with an upgrade path guaranteed to next gen, while consuming less power... sheesh.
Once you factor in the CPU price + cooler + new motherboard, you're already in the 800~1000 USD range... that's insane.


----------



## bug (May 25, 2020)

In some ways, this is Prescott 2.0.


----------



## W1zzard (May 25, 2020)

Br0nnOfTheBlackwater said:


> What's the difference between 3.7/5.3 and max turbo?


3.7/5.3 is with the default power limits active (PL1 = 125 W, PL2 = 250 W, etc).
"Max Turbo" is with those limits removed.
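For readers curious how those limits interact, here's a toy simulation of the mechanism. It's a sketch under assumptions: real silicon tracks an exponentially weighted average of package power over the Tau window, and the 56 s Tau and the simple EWMA below are illustrative, not Intel's exact algorithm.

```python
# Toy simulation of how PL1 / PL2 / Tau interact under a sustained
# all-core load. PL1/PL2 are the 10900K defaults mentioned above;
# the EWMA model is an illustrative assumption, not Intel's exact
# implementation.

PL1 = 125.0   # long-term sustained power limit (W)
PL2 = 250.0   # short-term boost power limit (W)
TAU = 56.0    # boost time window (s), a typical default
DT  = 1.0     # simulation time step (s)

def simulate(seconds):
    """Power (W) the CPU may draw each second under full load."""
    avg = 0.0                    # exponentially weighted average draw
    alpha = DT / TAU
    trace = []
    for _ in range(int(seconds / DT)):
        # Boost at PL2 only while the average draw is still below PL1
        power = PL2 if avg < PL1 else PL1
        avg += alpha * (power - avg)
        trace.append(power)
    return trace

trace = simulate(120)
boost_seconds = trace.count(PL2)   # how long the chip held PL2
```

In this toy model the chip holds 250 W for roughly 40 seconds before the moving average catches up, then settles at 125 W, which matches the shape (not the exact numbers) of stock behavior with limits enforced.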


----------



## Cobain (May 25, 2020)

Amazing job, was waiting for this review! It seems the i5-10600K is the real winner for high-end gaming rigs? 260€ and almost the same performance as the other top-of-the-line Intel CPUs.

Was also shocked to see differences *up to 30%* in some gaming scenarios, compared to AMD, damn that's a lot imo.

The total cost is too much tho, I would get a 10600K for a pure high-refresh gaming machine, a 3900X for serious productivity, a 3600 for a good all-arounder budget machine.


----------



## Vayra86 (May 25, 2020)

Cobain said:


> Amazing job, was waiting for this review! It seems the i5-10600K is the real winner for high-end gaming rigs? 260€ and almost the same performance as the other top-of-the-line Intel CPUs.
> 
> Was also shocked to see differences *up to 30%* in some gaming scenarios, compared to AMD, damn that's a lot imo.
> 
> The total cost is too much tho, I would get a 10600K for a pure high-refresh gaming machine, a 3900X for serious productivity, a 3600 for a good all-arounder budget machine.



The only game that is really behind the curve for AMD is Far Cry 5, or did you see any others with substantial gaps? Most others seem to be closing in on margin of error territory. 10 FPS on >150 FPS averages is negligible and you will _never_ see any return of that in a real life scenario. Not in the least because there are other performance limiters even besides the GPU; the new boost tech will quickly destroy the Intel advantage you see here. More load = lower ST perf and clock for clock Ryzen is easily as fast. On the previous gen there were too many 'Far Cry 5'-like examples around. But now? They seem all but gone. So I'm genuinely curious what other games you perceive as problematic / shocking.

I am a 'gaming perf' advocate myself, but nuance must be applied here. It is highly doubtful you will be getting noticeably better gaming perf out of this gen with Intel. And if you do, it will be at increased cost, power usage, heat. Intel is trying hard to top benchmarks here, not create the most usable CPU. Something to keep in mind.


----------



## Cobain (May 25, 2020)

Vayra86 said:


> The only game that is really behind the curve for AMD is Far Cry 5, or did you see any others with substantial gaps? Most others seem to be closing in on margin of error territory. 10 FPS on >150 FPS averages is negligible and you will _never_ see any return of that in a real life scenario. Not in the least because there are other performance limiters even besides the GPU; the new boost tech will quickly destroy the Intel advantage you see here. More load = lower ST perf and clock for clock Ryzen is easily as fast.



Sekiro also. And I said up to, which means in some cases the difference is lower, of course. Let me assure you that on a 165 Hz monitor I really do notice the difference between 140 fps and 110 fps. Easily.

Just to be clear, I would never buy 10900k, it has no place imo, but the 10600k is an interesting CPU imo


----------



## mrthanhnguyen (May 25, 2020)

Out of stock everywhere and price gouging on eBay. Mobos are around $100 more expensive. Luumi has OC'd his 10900K to 5.5 GHz on the Dark Z490 with a MO-RA3, so expect daily binned chips can hit 5.4-5.5.


----------



## W1zzard (May 25, 2020)

mrthanhnguyen said:


> so expect daily binned chips can hit 5.4-5.5.


if my chip can barely hit 5.2 with 1.4 V, then I doubt you'll see that many CPUs with 5.4/5.5


----------



## kapone32 (May 25, 2020)

mrthanhnguyen said:


> Out of stock everywhere and price gouging on eBay. Mobos are around $100 more expensive. Luumi has OC'd his 10900K to 5.5 GHz on the Dark Z490 with a MO-RA3, so expect daily binned chips can hit 5.4-5.5.


 Haha for 5 seconds on 1 core.


----------



## Vayra86 (May 25, 2020)

Cobain said:


> Sekiro also. And I said up to, which means in some cases the difference is lower, of course. Let me assure you that on a 165 Hz monitor I really do notice the difference between 140 fps and 110 fps. Easily.
> 
> Just to be clear, I would never buy 10900k, it has no place imo, but the 10600k is an interesting CPU imo



Absolutely! But it's nothing new, is it? It's just a new type number with some tweaks and high peak temps. I mean, this CPU was essentially already available since the 8700K.


----------



## mrthanhnguyen (May 25, 2020)

Binned chip.


----------



## Vayra86 (May 25, 2020)

mrthanhnguyen said:


> Binned chip.



Hilarious. Needs the best bin and special gear to hit 200 MHz over stock turbo. Why even bother... I'll take stock any day of the week; funny how that is the same between Intel and AMD now all of a sudden.


----------



## W1zzard (May 25, 2020)

mrthanhnguyen said:


> Binned chip.



ah I thought on air cooling 



Vayra86 said:


> I'll take stock any day of the week, funny how that is the same between Intel and AMD now all of a sudden


indeed


----------



## dgianstefani (May 25, 2020)

You can get pretty insane speeds with this chip.

The cooling really isn't that bad as long as you keep the voltage under 1.35 V.

Der8auer took another 5-10 °C off each core by using liquid metal too. So there's potential.


----------



## dirtyferret (May 25, 2020)

Am I the only one who's not impressed with either AMD/Intel sticking as many cores as they can on their CPUs?


----------



## Radium69 (May 25, 2020)

What amazes me is that TPU doesn't test gaming with YouTube or other apps on the side, e.g. streaming content. Intel falls flat on its face in this scenario.
I mean, really, you guys review with, and I quote: "applications as that better reflects real life".
It's time to send some old metrics into retirement.

I love the attention to detail TPU has. But in the end, does the 2.4% improvement really matter when tested with "only" gaming? A lot of people use their computer for more than just gaming, e.g. gaming with YouTube + Twitch + Spotify etc...

Also I didn't see the comparison about €€€ buying a new platform again and a new cooler (again), and a beefy one at that... idle performance is better due to PCIe 3, I presume.
Load however... seems bonkers. Do we really need to give Intel such big praise for this new processor when it is, in fact, Intel pushed into a corner by fierce competition?

cheers,
A critical reader who has lurked for years.
Keep up the good work.


----------



## theGryphon (May 25, 2020)

Vayra86 said:


> 310W peak. Something about comets and craters, funny twist, obligatory lol
> 
> Other than that, yeah. Nice heater, pretty useless otherwise. It's really only usable at stock, and you do get most of the performance that way. It is competitive on price if you consider the competition.
> But... 310W peak. Hell no. The first sunny day your OC falls apart; the first heat wave, you've got a cataclysmic event in your case. Staying FAR away.
> ...




How on earth is this "competitive on price if you consider the competition"??  

$500 is the tray price and it will not retail at this price. $530 if you're lucky. That makes it 25-30% more expensive than a 3900X! 

And that's just the CPU! Add on top the mobo differences plus the cost of extra watts... I mean, come on.

This CPU is for 1) Intel fanbois and 2) gamers with absolutely no concern for money. That's it.


----------



## GreiverBlade (May 25, 2020)

mhhh, interesting review, thanks.

Too close to the 3900X, less value... "king of gaming" indeed... well, if 0.2 to ~13 fps when it's already above 100 fps for both CPUs were a huge gap, I would say "yes", but right now I would just say "meh".
Perf-per-dollar wise I would not even take the OC'd 10900K into account; only stock and max turbo matter for me, given how my 6600K treated me (aka: rental OC).

I'd wait for the XT refresh and the 4X00 line before deciding.
Or, since both upgrades would need a full platform refresh, I could take a 3600X (the 10600K will be 300 chf+) plus an X570 for the price of that 10900K alone, and upgrade later to an XT, or to a 3900X if they lower its price, because for now it is still 498 chf rather than 350 chf (funny how everywhere else the price goes down but not where I live... ahhhh, bummer), and then wait for the 4X00 line.

Of the pros... "Beats AMD 12-core in many lightly threaded apps" is a bit hilarious... "many" is rather "few"... in most of the apps where it is ahead, the gap is ridiculous at best, although being a 10-core does add a bit of value there.

Still funny how they compete with the previous gen. The 3900X puts up more than a fight (granted... it has 2 more cores) and is "previous" gen (well... technically the 3X00 line was the 9X00's competitor). I see them as equals, and at review value (aka the price I will never see where I live, and the OC that will never be, since review and "OMG [not really] WORLD RECORD!" chips are hand-picked), the 3900X is the winner in that case...



dirtyferret said:


> Am I the only one who's not impressed with either AMD/Intel sticking as many cores as they can on their CPUs?


It is needed now... a while ago it was not really... why is it an issue for you? Streamers need the cores, and more and more games and software benefit from them. Even for a game that only uses 4 cores, having more is useful, although 6C/12T would probably be enough for me.


----------



## W1zzard (May 25, 2020)

Radium69 said:


> that TPU doesn't test gaming with YouTube or other apps on the side, e.g. streaming content. Intel falls flat on its face in this scenario.


AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?


----------



## WeeRab (May 25, 2020)

The test setup makes no mention of the cooler used on the test systems.
Was it the Noctua NH-U12 used for the Blender temp test? And was the test conducted on an open test bench?
Also... the 10900K is selling for around the £540-£635 mark (but out of stock) here in the UK. Quite the mark-up.
I have to admit though... powerful chip, even if it makes no sense at all.


----------



## dgianstefani (May 25, 2020)

Testing still doesn't include 1% and 0.1% lows. A shame really, since that's where you really see the difference between a CPU-throttled game and one that is smooth, regardless of averages.

It's noticeable when you do include these results. E.g. the stock 10900K has higher 1% lows than the AMD 3900X's average FPS.


----------



## W1zzard (May 25, 2020)

WeeRab said:


> Was it the Noctua NH-U12 used for the Blender temp test...And was the test conducted on an open test bench?


yes to both


----------



## dgianstefani (May 25, 2020)

So you could look at an averages chart and say: well, my 3700X/3900X gets 120 fps and the 10900K gets 160, but it doesn't matter to me since I only have a 120 Hz monitor. But you'd be wrong, and wouldn't know it unless the review you looked at had 1%/0.1% lows.

@W1zzard is there some particular reason you don't test for the min framerates? Such good reviews otherwise.
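To make the metric concrete, here's a minimal sketch of how 1%/0.1% lows can be computed from per-frame times. The "average of the worst N% of frames" convention and the sample numbers are illustrative assumptions; review sites differ in the exact method.

```python
# Average FPS vs 1% / 0.1% lows from a list of per-frame times (ms).
# The percentile convention used here is one common approach,
# not the only one.

def fps_stats(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000.0 * n / sum(frametimes_ms)
    worst_first = sorted(frametimes_ms, reverse=True)
    def low(pct):
        k = max(1, int(n * pct / 100))       # worst pct% of frames
        return 1000.0 / (sum(worst_first[:k]) / k)
    return avg_fps, low(1.0), low(0.1)

# Made-up trace: 990 smooth frames at 8 ms plus 10 stutter spikes at 40 ms
frames = [8.0] * 990 + [40.0] * 10
avg, low1, low01 = fps_stats(frames)
```

In this made-up trace the average still reads about 120 fps, but the 1% low collapses to 25 fps, which is exactly the stutter an averages-only chart hides.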


----------



## Vayra86 (May 25, 2020)

theGryphon said:


> How on earth is this "competitive on price if you consider the competition"??
> 
> $500 is the tray price and it will not retail at this price. $530 if you're lucky. That makes it 25-30% more expensive than a 3900X!
> 
> ...



Price changes according to market demand, and above-MSRP is simply inflated. Launch price isn't a great indicator. I did say 'IF priced right'.

Look where Ryzen 9 is sitting and what you trade: 2 cores for some ST performance. So there is a choice here, and if priced right (not saying this stays at 500 bucks going down the line; Ryzen 9 can be found 100 bucks below MSRP), the product is competitive.

It still does offer high perf, after all.


----------



## dirtyferret (May 25, 2020)

GreiverBlade said:


> It is needed now... a while ago it was not really... why is it an issue for you? Streamers need the cores, and more and more games and software benefit from them. Even for a game that only uses 4 cores, having more is useful, although 6C/12T would probably be enough for me.



It's not an issue but once you move away from something like an i7-8700/ryzen 3600x then performance improvements seem more niche or benchmark related but not necessarily real world jumps that you can see with the naked eye.


----------



## Vayra86 (May 25, 2020)

W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?



Streaming-related topics do pop up left and right. It's not my cup of tea either, but besides streaming...

people have a browser full of tabs, social media push services, some Discord or the like, overlays... there is a shit ton of background and out-of-focus applications in use while gaming. And OS services too. Windows even advertises the use of idle CPU cycles...

Again, it ain't my thing to do all that while gaming... but the demand for 1% lows is real, and it surprises me you still don't have it. Especially as parts become so much more dynamic in how they deliver performance... it's a must, really.


----------



## TheLostSwede (May 25, 2020)

Radium69 said:


> What amazes me is that TPU doesnt test gaming with youtube or other apps on the side, eg streaming content. Intel falls flat on its face with this scenario.
> I mean, really, you guys review with, and i quote: “application as that better reflects real life”
> It’s time to send some old metrics with retirement.
> 
> ...


I guess you haven't done any benchmarking yourself?
The problem is that as soon as you introduce more variables, there's a huge risk of the testing environment throwing off the normally reproducible numbers. Then we end up in a situation where a lot of people start questioning the benchmark results every time they fall outside the normal 1-2% (or less) variance. Benchmarking for review purposes has to deliver reproducible results across several platforms, so you need to minimise variables that might affect performance, regardless of what you're testing.

Yes, a lot of people run at least something like Discord alongside their gaming, but live streaming? Not sure how many people really do that. That said, most of us probably don't turn off all the background services and whatnot when we game either, so you lose a few percent of performance there too.

I doubt PCIe 3.0 vs PCIe 4.0 makes any difference to idle power, at least not if no PCIe 4.0 devices are being used. It's more likely that Intel is simply better than AMD at the whole idle-power end of things, and has been for quite some time. AMD seems to be starting to catch up on the mobile side though, so maybe that'll translate over to the next set of desktop CPUs as well.


----------



## W1zzard (May 25, 2020)

dgianstefani said:


> is there some particular reason you don't test for the min framerates? Such good reviews otherwise.


The current bench system doesn't have the infrastructure to measure, analyze and plot frametimes. This is something I'll work on for next rebench, based on the work I've done with the new GPU bench


----------



## E-curbi (May 25, 2020)

Vayra86 said:


> Hilarious. Needs the best bin and special gear to hit 200 MHz over stock turbo. Why even bother... I'll take stock any day of the week; funny how that is the same between Intel and AMD now all of a sudden.



Probably worth it to Luumi, since he's promoting a motherboard for EVGA, i.e. he gets a paycheck in the mail every month.

Yeah, same old Skylake architecture. I remember the days when a binned Skylake chip could run 600 MHz over stock turbo... on air... on inaudible air... Ahh, those were the days.

Rocket Lake is rumored to offer +20% IPC; that will be nice as long as they don't run thermals like fireballs.


----------



## ppn (May 25, 2020)

I would be happy if the 10700F can do all-core 4.6 GHz at 1.2 V / 175 W AVX. And the 11700F too...

Rocket Lake better deliver at least 40% to be viable; initial tests show 0%.


----------



## Vayra86 (May 25, 2020)

W1zzard said:


> The current bench system doesn't have the infrastructure to measure, analyze and plot frametimes. This is something I'll work on for next rebench, based on the work I've done with the new GPU bench



Those GPU reviews are much improved by it! Awesome


----------



## randomUser (May 25, 2020)

W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?



While I don't stream, there are always Steam, Epic, Core Temp, and keyboard apps in the background; drive meter / core temp gadgets. Sometimes there is a browser open (so there might be some ads running).
I would really hate closing everything down every time I want to play something.
That might be the reason why my 9900K never hits the 4.8 GHz-and-up frequencies, even though all the mentioned stuff only takes like 1-3% at best.


----------



## E-curbi (May 25, 2020)

ppn said:


> I would be happy if the 10700F can do all-core 4.6 GHz at 1.2 V / 175 W AVX. And the 11700F too...
> 
> Rocket Lake better deliver at least 40% to be viable; initial tests show 0%.



Agree. AMD is going to be tough to beat in Single Thread.


----------



## cueman (May 25, 2020)

Well, I see Intel CPUs as having only one problem, and that's the 14nm process tech.

Wow, but even on the big old 14nm tech it can beat AMD CPUs that have more cores and a much better 7nm process... hmm... and that's why the 10900K does good work.

Anyway, Intel badly needs 10nm and 7nm process CPUs, and they should release those soon; it looks like Intel wants to do it well and carefully.

I give the 10900K a score of 9.

In summary:

When Intel releases even its first 10nm CPU, we'll see a bit of how its strength holds up in reality, and how well Intel can build CPUs.
But when Intel releases its first 7nm CPU, we'll finally see the battle without any handicap, meaning how good/bad AMD's 7nm CPUs really are, and whether Intel can take the top in both performance and efficiency.
I guess when that happens, many people's eyes will be opened...

Hmm, we'll see a bit of this when NVIDIA releases its 7nm GPUs and we compare them to AMD's 7nm GPUs.

If I get it right, this is the way we go:

14nm 10900 vs AMD 7nm (Zen 2)
10nm 11900 vs AMD 7nm (Zen 3)
7nm 12900 vs AMD 7nm (Zen 4)
5nm 13900 vs AMD 5nm (Zen 5)

By 2021 at the earliest, both are on the same line, without a process tech handicap.

And, as everyone knows, saying it again: both AMD and Intel get 5nm process tech at the same time, in 2022 or late 2021.

Exciting!


----------



## X71200 (May 25, 2020)

Why did this CPU get an "Editor's Choice" award? Let's bring these back up:

It costs $500, the boards seem to cost no less than $200-300, and the IPC improvements aren't really there, as was mentioned in the review - it's all in the clocks... and those clocks result in an outrageous 305 W / 99 °C.

Then it's been talked about as if buying this CPU actually gets you a noticeable performance increase in games, when it is so minimal at such high framerates, or simply GPU-limited around 80 FPS. You'll get A LOT more FPS with a stronger-GPU-plus-AMD combo. Not the same. The FPS you'll gain from the GPU you can buy with the price difference will be significantly more than what this CPU offers you.

I'd have simply tossed it without an award and left it at that.


----------



## E-curbi (May 25, 2020)

X71200 said:


> *Why did this CPU get an "Editor's Choice" award?* Let's bring these back up:
> 
> It costs $500 and the boards seem to cost no less than $200-300, IPC improvements aren't really there as was mentioned in the review - it's all in the clocks... and you have those clocks resulting in an outrageous 305W / 99C.
> 
> ...



The thermals are pretty amazing for the 10900K considering 10 14nm cores, although anyone could argue the reduced die thickness and thicker IHS should already have been designed that way for the previous 9th gen - which seems as much a FUBAR today as it did at launch in 2018. Remember enthusiasts sanding down the die? Why, Intel, just why?


----------



## Vayra86 (May 25, 2020)

E-curbi said:


> The thermals are pretty amazing for the 10900K considering 10 14nm cores, although anyone could argue the reduced die thickness and thicker IHS should already have been designed that way for the previous 9th gen - which seems as much a FUBAR today as it did at launch in 2018. Remember enthusiasts sanding down the die? Why, Intel, just why?



In spec is in spec. All the moaning and heat 'problems' aside... even my 'hot' 8700K under toothpaste still runs within spec. And quite well at that. It just doesn't OC for shit.

Intel changing TIM means their CPUs just got a bit hotter and they give you just enough to compensate for that. After all, this way they could make _two_ baby steps from 8th gen onward 



X71200 said:


> Why did this CPU get an "Editor's Choice" award? Let's bring these back up:
> 
> It costs $500 and the boards seem to cost no less than $200-300, IPC improvements aren't really there as was mentioned in the review - it's all in the clocks... and you have those clocks resulting in an outrageous 305W / 99C.
> 
> ...



Editor's choice to make this review, that must be it


----------



## MrAMD (May 25, 2020)

X71200 said:


> You'll get A LOT more FPS with a stronger GPU & AMD combo.



No. Just no. I know you're saying budget-wise, but the balls-to-the-wall top-end setup is an Intel CPU and an NV RTX 2080 Ti. Nothing will beat it in gaming.


----------



## Cobain (May 25, 2020)

Vayra86 said:


> Absolutely! But it's nothing new, is it? It's just a new type number with some tweaks and high peak temps. I mean, this CPU was essentially already available since the 8700K.



Yes, maybe you are right. For someone who has an 8700K there's no news here, but we should point out that the 10600K costs 260€ and is not as hard to cool, with lower voltage too. Surely it doesn't justify an upgrade unless someone is still on a really old CPU.


----------



## Gameslove (May 25, 2020)

"The World's Fastest Gaming CPU."
4K gamers: really? Why should I upgrade my PC? Going by price/performance versus my Ryzen 5 1600, I haven't seen noticeable gains in 4K (the Core i9-10900K is only ~3% faster than the Ryzen 5 1600).


----------



## GreiverBlade (May 25, 2020)

dirtyferret said:


> It's not an issue but once you move away from something like an i7-8700/ryzen 3600x then performance improvements seem more niche or benchmark related but not necessarily real world jumps that you can see with the naked eye.


Well, as software and games advance... it will be less niche... For myself, I prefer having more cores even when they're not needed than needing more and not having them.

But you are right, the sweet spot, as I mentioned too, is 6C/12T, like a 3600X.

Everything I see about Intel is painting my next upgrade path in red...




MrAMD said:


> No. Just no. I know you're saying budget-wise, but the balls-to-the-wall top-end setup is an Intel CPU and an NV RTX 2080 Ti. Nothing will beat it in gaming.


But again, the margin will not be that high...

(yeah yeah, I know... +0.2-10 fps = "not equal! I am gaming king!")

(disclaimer: I am a 1620p/1440p 75 Hz kind...)



Cobain said:


> we should point out that 10600k costs 260€


For me, I think it will be more like 320+; at that rate even a 3700X would be a better option.

Oh, I just checked prices... confirmed.
10600K around 319 chf (I was 99% spot on) ahahahahaha
10900K around 600 chf ahahahahahahahaha... no...

aye...
3700X/3800X/3900X are 120 to 200 chf cheaper than the 10900K
3600X is almost 100 chf less than the 10600K...


----------



## jsteinhauer (May 25, 2020)

W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?


Almost always have something running or downstreaming on the other display while gaming on the first (not livestreaming myself), so this would be valuable information.  Given the vast numbers of wannabe streamers-for-a-living, a marketing department would love to use the label "Best processor for streamers."  Personally, I see no reason to upgrade from my 8700K, yet.


----------



## coozie78 (May 25, 2020)

I can get a much better room heater for $500 ;-)
Interesting that it can best the 3900X in the scientific benchmarks; looks like its higher sustained clocks beat out the extra cores/threads of the AMD part. Run at stock with a half-decent air cooler, it's a nice alternative to a Xeon/Threadripper build.
Those gaming benchmarks are as expected, although I'm a little surprised to see AMD fall so far behind in Sekiro at 1080p; maybe it'll be patched out later, although the entire Far Cry series has always strongly favoured Intel with no signs of improvement to date. :-(
For the huge majority of users 8C/16T is plenty; even streamers shouldn't find themselves running into issues with that kind of silicon, and TBH most wouldn't suffer too much with the far cheaper i5s out there.
This is a halo part for halo users who demand the very highest gaming performance and will back it up with a high-refresh display and an RTX 2080 Ti. For the rest of us mortals it's a Bugatti Veyron or a Learjet: great to dream about, but realistically just too expensive when the kids need new shoes and the rent is due.


----------



## INSTG8R (May 25, 2020)

W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?


Well, also because AMD has integrated the ability to stream from the Adrenalin drivers pretty easily now, so it makes sense they'd want that showcased.


----------



## Gameslove (May 25, 2020)

jsteinhauer said:


> ..... "Best processor for streamers."  Personally, I see no reason to upgrade from my 8700K, yet.


Yeah, you're absolutely right.

A new CPU should bring new instructions (for example SSE4.2 or AVX). Looking at how CPUs and GPUs have evolved since the year 2000, I expect to see future motherboards with only a GPU, with the CPU disappearing completely.


----------



## cucker tarlson (May 25, 2020)

Not really an upgrade over the 9900K.
It's not that the 10900K is bad, but for gaming the 9900K was already enough to push the fastest cards to extreme framerates.
Technically it is the fastest, but realistically, who does this even exist for?



dirtyferret said:


> not necessarily real world jumps that you can see with the naked eye.


so glad I got glasses



W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?


I would really like to see a review with a smaller number of games and resolutions but one that includes avg. and peak core/thread usage.


----------



## mohammed2006 (May 25, 2020)

The 8% gaming performance lead will shrink with the XT chips' higher frequencies.


----------



## X71200 (May 25, 2020)

MrAMD said:


> No. Just no. I know you're saying budget-wise, but the balls-to-the-wall top-end setup is an Intel CPU and an NV RTX 2080 Ti. Nothing will beat it in gaming.



The point was buying a 2080 or similar with an AMD CPU for the same money as, let's say, a 2070 with the Intel. In that case, you will get better performance with the AMD setup.

One should buy what they need. I could have gotten the Ti at the time but I didn't, because the 2080 was sufficient for my needs, and still is. I can play Warzone with most things cranked, pretty much gloriously, at 3440x1440.


----------



## W1zzard (May 25, 2020)

cucker tarlson said:


> avg. and peak core/thread usage


Elaborate please


----------



## cucker tarlson (May 25, 2020)

W1zzard said:


> Elaborate please


What max usage the cores are hitting.
If you analyze YouTube videos you can see that AMD and Intel manage usage in a different way.
AMD's 3000 series tends to hit higher usage on some cores and leave others working very lightly.
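The kind of per-core logging being asked for is easy to prototype. A sketch under assumptions: it uses the third-party psutil library, and the function names and sampling interval are my own choices, not anything TPU uses.

```python
# Sample per-core utilization during a benchmark run, then report
# average and peak per core. Requires psutil (pip install psutil)
# only when the sampler is actually run.
import time

def summarize(samples):
    """samples: list of per-core percent lists -> [(avg, peak)] per core."""
    per_core = zip(*samples)                 # transpose to per-core series
    return [(sum(s) / len(s), max(s)) for s in per_core]

def log_core_usage(duration_s=30.0, interval_s=0.5):
    import psutil                            # third-party, imported lazily
    psutil.cpu_percent(percpu=True)          # prime counters; first call is meaningless
    samples = []
    end = time.time() + duration_s
    while time.time() < end:
        time.sleep(interval_s)
        samples.append(psutil.cpu_percent(percpu=True))
    return summarize(samples)

# e.g. run alongside a game:
# for i, (avg, peak) in enumerate(log_core_usage(duration_s=60)):
#     print(f"core {i}: avg {avg:5.1f}%  peak {peak:5.1f}%")
```

A log like this would show exactly the pattern described above: some cores pinned high while others idle along.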


----------



## Vayra86 (May 25, 2020)

cucker tarlson said:


> What max usage the cores are hitting.
> If you analyze YouTube videos you can see that AMD and Intel manage usage in a different way.
> AMD's 3000 series tends to hit higher usage on some cores and leave others working very lightly.



The usage stat is highly misleading. I wouldn't bother much with it, to be fair. I've seen games saturate all cores with no problems whatsoever, yet another game saturates all cores and it's a stutterfest.


----------



## cucker tarlson (May 25, 2020)

Vayra86 said:


> The usage stat is highly misleading. I wouldn't bother much with it, to be fair. I've seen games saturate all cores with no problems whatsoever, yet another game saturates all cores and it's a stutterfest.


I would prefer to stay on the safe side, if that's possible.


----------



## Houd.ini (May 25, 2020)

TheLostSwede said:


> I doubt PCIe 3.0 vs PCIe 4.0 makes any difference to idle power, at least not if no PCIe 4.0 devices are being used. It's more likely that Intel is simply better than AMD at the whole idle-power end of things, and has been for quite some time. AMD seems to be starting to catch up on the mobile side though, so maybe that'll translate over to the next set of desktop CPUs as well.


You're right, PCIe is throttled at idle, so it should not make much of a difference. The Infinity Fabric uses quite a bit of power at idle because it apparently does not clock down; that was something AMD worked on in Renoir (asynchronous IF/memory speed, if I am not mistaken).


----------



## Vayra86 (May 25, 2020)

cucker tarlson said:


> I would prefer to stay on the safe side, if that's possible.



That's the thing; it's not possible going by the usage stat. Game load varies highly per engine, whether online or not, and even with how many players (remember Battlefield)... The safe side is getting all the CPU you can afford, and then just upgrading when you feel the current one has become a problem.

I mean, isn't that what you've done last time and since?


----------



## cucker tarlson (May 25, 2020)

Vayra86 said:


> That's the thing; it's not possible going by the usage stat. The safe side is getting all the CPU you can afford, and then just upgrading when you feel the current one has become a problem.
> 
> I mean isn't that what you've done last time and since?


I'm way past due upgrading


----------



## MrAMD (May 25, 2020)

X71200 said:


> The point was, buying a 2080 or similar with an AMD CPU, for the same money as, let's say, 2070 with the Intel. In such case, you will get better performance with the AMD setup.



I already said that's true, but the point isn't entirely correct anymore once you factor in the new i5-10600K with cache-ratio and RAM overclocking. It even beats the 10900K in most cases. You'll get the best value and top gaming performance with that chip plus an RTX GPU at any given budget.


----------



## X71200 (May 25, 2020)

MrAMD said:


> I already said that's true, but the point isn't entirely correct anymore once you factor in the new i5-10600K with cache-ratio and RAM overclocking. It even beats the 10900K in most cases. You'll get the best value and top gaming performance with that chip plus an RTX GPU at any given budget.



Yeah, a 6-core against an 8-core and a 12-core; sure, I'm getting the best value for any said usage. I want to be able to stream off the CPU as well when I do, not just NVENC.


----------



## cucker tarlson (May 25, 2020)

X71200 said:


> The point was, buying a 2080 or similar with an AMD CPU, for the same money as, let's say, 2070 with the Intel. In such case, you will get better performance with the AMD setup.
> 
> One should buy what they needs. I could have gotten the Ti at the time but I didn't, because the 2080 was suffice for my needs, and still is. I can play Warzone most things cranked pretty much gloriously at 3440x1440.


seriously,intel has a cpu to beat or match amd in value for gaming in every segment.
10400f for 3600,10600kf for 3700x,10700f/kf for 3900x
they go back and forth,win some lose some,but I don't know where you're getting a whole $200 difference between them.


----------



## X71200 (May 25, 2020)

cucker tarlson said:


> seriously,intel has a cpu to beat or match amd in value for gaming in every segment.
> 10400f for 3600,10600kf for 3700x,10700f/kf for 3900x
> I don't know where you're getting a whole $200 difference between them.



Look at the thread title and you might see it.


----------



## cucker tarlson (May 25, 2020)

X71200 said:


> Look at the thread title and you might see it.


that'd be true if intel's own 10600kf/10700f hadn't offered that already.
amd's r5/7 aren't really more budget friendly gaming than 10400f,10600kf or 10700f


----------



## X71200 (May 25, 2020)

cucker tarlson said:


> that'd be true if intel's own 10600kf/10700f hadn't offered that already.
> amd's r5/7 aren't really more budget friendly gaming than 10400f,10600kf or 10700f



See I don't want a 6 core or locked CPUs for my main. So that rules them all out for me.


----------



## cucker tarlson (May 25, 2020)

X71200 said:


> See I don't want a 6 core or locked CPUs for my main. So that rules them all out for me.


I guess it's good for you to have a choice then.


----------



## X71200 (May 25, 2020)

cucker tarlson said:


> I guess it's good for you to have a choice then.



It indeed is, at $500 and with $300 boards, I actually found a deal on a 14 core Cascade Lake and messaged a guy about it. That should make quite a machine with an Ampere.


----------



## cucker tarlson (May 25, 2020)

X71200 said:


> It indeed is, at $500 and with $300 boards, I actually found a deal on a 14 core Cascade Lake and messaged a guy about it. That should make quite a machine with an Ampere.


Jesus how come there's yet another Lake I haven't heard of.

oh,they're 10th gen 2066.heard of them.They're mediocre at stock but get them to run 4.7GHz core,OC the mesh and get some quad channel +3000 sticks and it absolutely wrecks the competition.


----------



## MrAMD (May 25, 2020)

X71200 said:


> Yeah, 6 core against 8 and 12, sure I'm getting the best value for any said usage. I want to be able to stream off the CPU as well when I do it, not just NVENC.



You're moving the goalposts now. Obviously, if said user wanted to seriously stream every gaming session, you would pick an appropriate chip. In this scenario, we would still have to compare the i7-10700K (a value 9900KS), as 8 cores are perfectly able to stream and handle stupid amounts of background tasks.


----------



## cucker tarlson (May 25, 2020)

MrAMD said:


> You're moving the goalposts now. Obviously if said user wanted to seriously stream games every session you would pick an appropriate chip. In this scenario we would still have to compare the i7-10700k (value 9900ks) as 8 cores are perfectly able to stream and have stupid amounts of background tasks.


they're a fair bit more pricey than 3700x.
I'd go for a 10700f frankly.yeah it's locked but look at the actual boost values.what are you seriously going to achieve with a K-series over what it offers.200MHz maybe 300MHz.


----------



## MrAMD (May 25, 2020)

cucker tarlson said:


> they're a fair bit more pricey than 3700x.
> I'd go for a 10700f frankly.yeah it's locked but look at the actual boost values.what are you seriously going to achieve with a K-series over what it offers.200MHz maybe 300MHz.



Yes, true, it's slightly more expensive compared to the 3700X, but it's also considerably faster. The downside is no included cooler, which is where Intel loses out. You probably can't find a meaningful GPU upgrade for only $80-odd, though. A better GPU cooler model? That's about it, all things being equal.


----------



## X71200 (May 25, 2020)

MrAMD said:


> Yes true it's slightly more expensive compared to the 3700x but it's also considerably faster. And you probably can't find a huge GPU upgrade for only $80+ odd dollars. Better GPU model cooler? That's about it all things being equal



Actually, you can with so many series out in the market. Like all those Nvidia Super GPUs.


----------



## MrAMD (May 25, 2020)

X71200 said:


> Actually, you can with so many series out in the market. Like all those Nvidia Super GPUs.



Looking at current market trends via PCPartPicker, you would need to spend at least $110+ to upgrade a 2060 to a 2070. With Intel not including a cooler, that would make up the difference. I concede my previous point.


----------



## cucker tarlson (May 25, 2020)

MrAMD said:


> Looking at current market trends via PCPartPicker, you would need to spend at least $110+ to upgrade a 2060 to a 2070. With Intel not including a cooler, that would make up the difference. I concede my previous point.


what about 2060 to 2060s ?


----------



## HenrySomeone (May 25, 2020)

cucker tarlson said:


> they're a fair bit more pricey than 3700x.
> I'd go for a 10700f frankly.yeah it's locked but look at the actual boost values.what are you seriously going to achieve with a K-series over what it offers.200MHz maybe 300MHz.


So are you going to finally pull the trigger? I remember you were first itching for an 8700k, then later 8086k then 9900ks... Or are you waiting for Rocket Lake (or maybe Zen3 which although certainly won't beat 10th gen in gaming, might cause the latter's price drops if it gets close enough...)


----------



## MrAMD (May 25, 2020)

cucker tarlson said:


> what about 2060 to 2060s ?



Looks like the highest-end 2060 is priced the same as the entry-level 2060S ($369).


----------



## cucker tarlson (May 25, 2020)

frankly I don't feel like pulling the trigger now.
zen 4000 is not going to disappoint,and b550 is the only way to get pcie4 without ruining the budget.


----------



## HenrySomeone (May 25, 2020)

Yeah, I sort of feel your predicament. It is getting clearer and clearer that an 8700K bought around the time Zen+ launched (when you could get it for less than 300$, or Euros for that matter; I have a buddy who was living in Germany at the time and got his for either 285 or 289) was the best gaming-oriented CPU purchase of recent years...


----------



## cucker tarlson (May 25, 2020)

HenrySomeone said:


> Yeah, I sort of feel your predicament. It is getting more and more clear that 8700k bought at around the time when Zen+ launched (when you could get it for even less than 300$ or Euros for that matter - I have a buddy who was living in Germany at the time and got his for either 285 or 289) was the best gaming oriented cpu purchase of recent years...


true.now getting comet lake-s equivalent seems meh and as far as ryzen 3000..... 4000 seems worth the wait.


----------



## mrthanhnguyen (May 25, 2020)

Sold my binned 9900KS for $1100, and it is not enough money to buy the new gen: Apex XII $450, 10900K $800. Should I wait for Ryzen 4000? Gaming only.


----------



## cucker tarlson (May 25, 2020)

mrthanhnguyen said:


> Sold my binned 9900ks for $1100 and It is not enough money to buy new gen. Apex xii $450, 10900k $800. Should I wait for Ryzen 4000 ? Gaming only.


you had a 5.3GHz 9900ks that you sold to get a better gaming cpu.
good luck.


----------



## X71200 (May 25, 2020)

mrthanhnguyen said:


> Sold my binned 9900ks for $1100 and It is not enough money to buy new gen. Apex xii $450, 10900k $800. Should I wait for Ryzen 4000 ? Gaming only.



Yes.

Off-topic but you have / had all that with an old 24 inch TN panel? Ditch that and get a real monitor first.


----------



## cucker tarlson (May 25, 2020)

X71200 said:


> Off-topic but you have / had all that with an old 24 inch TN panel? Ditch that and get a real monitor first.


pardon me,but monitors are use specific.


----------



## KarymidoN (May 25, 2020)

mrthanhnguyen said:


> Sold my binned 9900ks for $1100 and It is not enough money to buy new gen. Apex xii $450, 10900k $800. Should I wait for Ryzen 4000 ? Gaming only.



for "gaming only", the 10600K looks really promising tbh. Maybe wait a bit for the AMD price drops and the B550 boards.


----------



## cucker tarlson (May 25, 2020)

KarymidoN said:


> for "gaming only", the 10600K looks really promising tbh. Maybe wait a bit for the AMD price drops and the B550 boards.


xt versions coming.
the prices are fine,it's a bit more performance they need.


----------



## HenrySomeone (May 25, 2020)

Yup, this will be a tough one... I see no way any 4000 chip is going to be faster than the 10900K (in gaming at least), but that also won't really be any faster than a 5.3 GHz 9900KS... except, once again, a highly binned and overclocked one like the one shown earlier in this thread. Honestly, it would be best to wait for Rocket Lake, but that depends on what your placeholder CPU is.


----------



## BArms (May 25, 2020)

So essentially the i9-10900K gives you the best gaming CPU at stock speeds, but little to no headroom for overclocking due to heat.

I think if the tables were turned, e.g. AMD had produced this CPU on 14 nm, it would have been seen as a huge success. I'm not saying go buy one; I want to replace my 7700K, but I'll wait for Zen 3 (more so because of PCIe 4.0). Still, if you just want a very solid gaming CPU to play the latest titles and you aren't going to overclock much, it's not a terrible purchase.


----------



## HenrySomeone (May 25, 2020)

If Pcie 4 is your main concern, then just sell your 7700k + mobo while they are still fetching such great prices and get a Z490 + 10600k at almost no extra cost. The board should then support it with 11th gen cpus when it actually becomes important...


----------



## LTUGamer (May 25, 2020)

Now I see why it is called Ice Lake: Intel just specified what type of liquid cooler is needed.


----------



## cucker tarlson (May 25, 2020)

HenrySomeone said:


> If Pcie 4 is your main concern, then just sell your 7700k + mobo while they are still fetching such great prices and get a Z490 + 10600k at almost no extra cost. The board should then support it with 11th gen cpus when it actually becomes important...


or get b550 w. zen 2/3 and don't upgrade


----------



## BArms (May 25, 2020)

HenrySomeone said:


> If Pcie 4 is your main concern, then just sell your 7700k + mobo while they are still fetching such great prices and get a Z490 + 10600k at almost no extra cost. The board should then support it with 11th gen cpus when it actually becomes important...



It's not bad advice, thanks. I'll probably wait for Zen 3 though first, since a new build won't really help that much until the next gfx cards are out anyway.


----------



## HenrySomeone (May 25, 2020)

cucker tarlson said:


> or get b550 w. zen 2/3 and don't upgrade


Yes, but the 10600K will likely at least match the best Zen 3 chips in games, and while the latter will be the end of the road for AM4, you will have another gen available on LGA 1200 (an interesting reversal of team red's perpetual "better upgradeability" motto). Although the 7700K should last another year or so even for high-end gaming, its second-hand market value will eventually plummet, so it is actually a good idea to sell now.


----------



## GeorgeMan (May 25, 2020)

Very good review, as always! 
Sorry if it was already mentioned in the comments, but on page 20 I don't see it hitting 5.3 GHz even on a 1-thread generic FP load. Isn't it advertised as able to hit 5.3 GHz on 1-2 thread loads, or am I missing something?


----------



## zaku49 (May 25, 2020)

I game at 4K, and wow, is the 10900K a horrible value compared to the 3900X: a 2 FPS difference, but a MAJOR loss of 2 cores. Even at 1440p there's an extremely small difference, not worth the loss of 2 cores. Unless you're still gaming at 1080p like a peasant, this CPU is a horrible value compared to the 3900X.

Also, Zen 3 is just 4 months away, and it's going to make you ask yourself why the 10900K even exists.

This review is also very one-sided; it's strictly targeted at one type of gamer. If you game at 1080p, this review is for you; if you game at anything higher, which most of us do these days, it will not give you an accurate representation of value to performance.


----------



## AnarchoPrimitiv (May 25, 2020)

Does anybody else think it would be both informative and beneficial (and also finally put some potentially false arguments "to bed") if TPU, some larger tech YouTube channel, or anyone with the necessary resources did a controlled, blind experiment to definitively determine at what percentage delta a human can *actually differentiate* an FPS difference correctly more than 50% of the time (specifically for frame rates above 100 FPS, since, whether explicitly or implicitly, it's high-FPS 1080p performance that's given the most weight in determining gaming performance)?

Personally, I feel it's impossible for a human being to discern a 5% difference in FPS (above 100 FPS), which equates to, for example, 128 FPS vs 134 FPS, but I'd like to know for sure, especially if my assumption is incorrect. Does anyone here honestly think that a *majority of people can correctly identify a difference this small better than random guessing?*

The trend in PC hardware reviews has been to increasingly incorporate more "real world" tests that are more meaningful to the user, just like how most of us know, for example, that sequential r/w's for an SSD mean very little for your day to day experience compared to random read and writes at low queue depths, and most of the better reviewers weight such performance more heavily than sequential performance. 

I just feel as though scientifically determining at what point a difference in Fps and gaming performance is actually meaningful and detectable by a human being would give less knowledgeable consumers (as well as the community as a whole) the tools they need to make a more informed decision, and in the end, isn't that the overarching purpose of reviews?
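For what it's worth, the deltas in question get very small when expressed as frame times, which may be part of why they are hard to perceive. A quick back-of-the-envelope sketch (using the 128 vs 134 FPS example from the post above; whether such a delta is detectable is exactly the open question being raised):

```python
def frametime_ms(fps: float) -> float:
    """Average time per frame in milliseconds at a given frame rate."""
    return 1000.0 / fps

# 128 FPS vs 134 FPS, the ~5% example from the post.
delta = frametime_ms(128) - frametime_ms(134)
print(f"128 vs 134 FPS = {delta:.2f} ms per frame")  # ~0.35 ms
```

A 5% FPS gap above 100 FPS works out to about a third of a millisecond per frame, which is the quantity a blind test would actually be probing.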


----------



## david0852 (May 26, 2020)

Why is PL2 (250 W) at the +300 W mark and PL1 (125 W) at the 200 W mark? Is the vertical axis wrong?


----------



## GeorgeMan (May 26, 2020)

david0852 said:


> Why is PL2 (250w) in the +300W mark and the PL1 (125w) in the 200w ? Is the vertical axis wrong?


No, it's total system power consumption.


----------



## watzupken (May 26, 2020)

dgianstefani said:


> Testing still doesn't include 1% and 0.1% lows. Shame really since that's where you really see the differences in a CPU throttled game and one that is smooth, regardless of averages.
> 
> View attachment 156632
> 
> It's noticable when you do include these results. E.g. the stock 10900k has higher 1% lows than the AMD 3900x average FPS.


Who buys an i9 processor with an RTX 2080 Ti and games at 1080p?



david0852 said:


> Why is PL2 (250w) in the +300W mark and the PL1 (125w) in the 200w ? Is the vertical axis wrong?
> View attachment 156674


It's telling you that under load the total system power draw is around 320 W, of which the CPU should be taking 250 W to achieve the boost clocks. After running for what looks like 50 seconds, it drops back to the PL1 state (base clock speed). The boost (PL2) duration may differ from motherboard to motherboard, subject to having sufficient cooling and power. Most reviews use a high-end cooler (generally a 360 mm AIO from what I've observed), so you are unlikely to see throttling or critically high temps.
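The PL1/PL2/tau mechanics described above can be sketched as a toy model. Note this is a simplification: real silicon budgets turbo via an exponentially weighted moving average of package power against PL1 rather than a hard cutover, and board vendors frequently override all three values; 125 W / 250 W / 56 s are Intel's published defaults for this part.

```python
# Toy model of the 10900K's default turbo power-limit behavior.
# PL1 = 125 W (sustained), PL2 = 250 W (boost), tau ~= 56 s (boost window).

def package_power_limit(t_seconds: float,
                        pl1: float = 125.0,
                        pl2: float = 250.0,
                        tau: float = 56.0) -> float:
    """Power limit in watts t seconds into a sustained all-core load,
    assuming the turbo budget starts full."""
    return pl2 if t_seconds < tau else pl1

# While PL2 is active the CPU holds boost clocks; once tau is exhausted
# it falls back to PL1 and roughly base clocks.
print(package_power_limit(10))   # boost phase: 250.0
print(package_power_limit(120))  # sustained phase: 125.0
```

This also explains the chart being asked about: the ~320 W reading is total system draw during the PL2 window, not the CPU package limit itself.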


----------



## B-Real (May 26, 2020)

Cobain said:


> Amazing job, was waiting for this review ! It seems the i5 10600k is the real winner for high end gaming rigs? 260€ and almost same performance as the other top of the line Intel CPUs.
> 
> Was also shocked to see differences *up to 30% *on some gaming scenarios, compared to AMD, damn that's a lot imo.
> 
> The total cost is too much tho, I would get a 10600k for a pure high-refresh gaming machine, a 3900x for serious productivity, a 3600 for a good all-arounder budget machine.


Yes, there is a game where the difference is bigger (actually more than 30%, in FC5), but in the end, averaged over 10 games, it's only 8% faster, at 1080p, with a 2080 Ti. However, nobody uses a 2080 Ti at 1080p, or maybe 1% of 2080 Ti owners do. At 1440p, the 8% shrinks to 4%. You can't even see that 8%, never mind the 4%. And you have to pay $100-120 extra for the CPU, plus the cooling, plus the (maybe more expensive?) motherboard. Not to speak of the 50-70 W extra power consumption and the extra heat. Man, that 100°C temperature when overclocking...

Comet Lake is a huge NO for every PC builder, just as refreshed Coffee Lake was: clearly slower in productivity, slightly faster in games, with significantly higher power consumption and temperatures, all paired with a significantly higher price (+25%, reaching more than +35% at Microcenter) and no stock cooler (yes, you can do better than the stock 3900X cooler, but hey, it can cope with the CPU), so count on an extra $50-100 for that with a 10900K. And that's not to mention the probably higher mobo prices.


----------



## Ravenas (May 26, 2020)

Thanks for this review W1zzard.

Good read. Overall I agree, however, I just don't see the benefit. For an extra ~$220 you could jump in


W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?



All the time.


----------



## EarthDog (May 26, 2020)

Vayra86 said:


> 10 FPS on >150 FPS averages is negligible and you will _never_ see any return of that in a real life scenario


That's 7.5%... the difference between high and ultra or 2xAA and 4xAA...or pegged at 144 FPS versus not. It's something...and in some cases, a bump in the class of card!


W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?


A handful surely (all the time, no)... I think it's novel to have this type of testing, but in its own article, kind of like the PCIe bandwidth or memory scaling articles. Having to do that for every new CPU review... ick.


Vayra86 said:


> Hilarious. Needs best bin and special gear to hit 200mhz over stock turbo. Why even bother... I'll take stock any day of the week, funny how that is the same between Intel and AMD now all of a sudden


Stock all-core turbo is 4.8 GHz (4.9 GHz TVB). Stock single-core TVB is 5.3 GHz. So that 5.5 GHz is likely a great bin... and is technically 600 MHz over stock all-core TVB, 700 MHz over stock all-core turbo.

As a side note, I've had two of these so far. Both have run 5.2 GHz at 1.35 V on all cores/threads using the AIDA64 default stress test. I run out of cooling there (Corsair H150i).


----------



## Gmr_Chick (May 26, 2020)

theGryphon said:


> How on earth is this "competitive on price if you consider the competition"??
> 
> $500 is the tray price and it will not retail at this price. $530 if you're lucky. That makes it 25-30% more expensive than a 3900X!
> 
> ...



Or in my case: 1) not an Intel fangirl; 2) a gamer who has had abysmal luck with AMD in recent months and had to switch to Team Blue out of necessity. I didn't get the 10900K, though. Too expensive. I got the 10700K for near MSRP thanks to the wonderful @oxrufiioxo, plus the rather snazzy Asus Strix Z490-G Gaming (WiFi) board for a rather reasonable $250. Should be a great combo.



W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. *Do really that many people game and stream at the same time?*



While I avoid doing any kind of "multitasking" while gaming, gaming + streaming has become increasingly popular for whatever reason. I find watching somebody record themselves playing a game about as exciting as watching paint dry, but I guess I'm just way crustier than my 33 years would suggest, lol.


----------



## ppn (May 26, 2020)

The purpose of the 10-core's existence is to harvest 8-core parts from the many dies with 2 defective cores and drive prices down, not to compete with the 3900X.

The 10700K is the best choice (well, the 10700F at $300, to be precise), not the soon-to-be-bottlenecking 10600K, and not the 10-core.

A 10-core sounds as absurd as a 5-core or something.


----------



## WeeRab (May 26, 2020)

W1zzard said:


> yes to both


Thanks for that, W1z.
My mate is thinking about building a rig based on the 10600K but is worried about the temps in an enclosed case. It shouldn't be any worse than similar-tier chips, though.


----------



## John Naylor (May 26, 2020)

Vayra86 said:


> 310W peak. Something about comets and craters, funny twist, obligatory lol



Gaming is 395 watts... the 3900X is 385 watts... what are we all missing? There must be 15 posts commenting on the "heaters"... reading comprehension issues?

3900X = 385 watts / 79°C
10900K = 295 watts / 54°C ... this "heater" is 25°C cooler

The 3900X vs 9900KF comparison was interesting: looking at the review pages, the two traded wins, the difference being that the 3900X excelled in brain simulation and things most folks never do. While this tweaks the interests of the nerd inside me, it does nothing for the enthusiast and engineering-business owner... and this is **my build**. This time around, it's almost a complete sweep, but the categories the 10900K picked up over the 9900KF we never cared about in the 1st place.

In January the 3900X and 9900KF were the same price... in February it was $415. That had nothing to do with the 10900K or Ryzen 4000, but with the perceived value of the 9900KF vs the 3900X.




> The World's Fastest Gaming CPU.
> 4K gamers: Really? Why should I upgrade my PC? Going by price / performance, coming from a Ryzen 5 1600 I haven't seen a noticeable performance difference in 4K (the Core i9-10900K is only 3% faster than the Ryzen 5 1600).



You shouldn't. But when you are gaming at 144 Hz under ULMB, you will know the answer. I'll think about 4K when there's a GFX card powerful enough to drive it at 144 Hz.

One thing that kinda took me off guard ...

_"Using a 240 mm AIO I could get 5.2 GHz stable, but with even more voltage, which causes CPU temperatures to reach over 95°C, right at the throttling point—despite watercooling. Definitely not worth it."_ That sounds like going to extraordinary lengths for cooling... the Scythe Fuma does as well or better than most 240 mm coolers, and at $42, the cooling isn't exactly a big deal. Not that I think there's significant value in pushing the CPU that far... impressive that ya can, of course, but I'd probably stop at 5.0.

Here's how I'm looking at this CPU *** from my own PERSONAL *** perspective... so what changed since the 9900K?

Synthetic Benchmarks - AMD / Intel still split wins here; we don't run benchmarks even on an infrequent basis, so don't care, irrelevant.

Rendering - Intel has moved up quite a bit; but as we don't do rendering even on an infrequent basis, irrelevant.

Software / Game Development - Intel has again moved up quite a bit, taking leads; but as we don't do development even on an infrequent basis, irrelevant.

Web Stuff - Intel has again moved up quite a bit, taking leads; but as the differences are so small, still irrelevant.

Machine Learning, Brain Simulation, Physics - This is where "more cores" mattered, and if I ever dip my toes into this kind of thing, maybe I'll look at AMD; until then, still irrelevant.

Office Suites - The silliest of tests, scripting a bunch of tasks together, each of which would require user input (1-2 seconds each) in between, and these benchmarks finish 1 or 2 tenths of a second apart. The equivalent of racing through Manhattan during rush hour and hitting a light at every corner.

Photo Editing - OK, Intel continuing with most of the wins... but 0.1 seconds? Who cares.

Video Editing - Intel with a 25-second win; OK, we're all paying attention.

Photogrammetry - We send that work out, but otherwise I'd be impressed with Intel's win here.

Text Recognition - Still something we do frequently, and yet another Intel win; color me interested.

Server / Workstation - Teeny wins for both sides here, but of no interest, as that stuff ain't done here.

Compression - I made my 1st RAR files this year 2 days ago... not something I'm gonna base a CPU selection on, even though Intel finished in 3/4 of the time.

Encryption - We don't do that here, though grats to Intel for a big win, 60% faster.

Encoding - Intel came within a hair of a sweep here, with a huge win on the audio side... but who cares? Not me, don't do that stuff.

Gaming (1080p) - OK, an 8% win for Intel is significant... expected much less. Did expect that to drop in half at 1440p, though, due to GPU bottlenecking having more of an impact at higher res.

Power Consumption - A 10-watt difference at stock when gaming (385 / 395). Would be interesting to compare AMD vs Intel if the 3900X could overclock.

Temperature - 54°C for Intel vs 79°C for the 3900X at stock... wow.

Overclocking - Would like to see something used here like RoG RealBench, much more useful and something we can compare against our own builds at home. Would love to see how it stacks up against the 420 + 280 in push / pull used here.

So, with the 3900X out either way, it's between the 9900KF and the 10900K?

1. Handling that "potential" power, mobo makers have to pay attention... that's gonna cost money... cooling is gonna cost money.
2. Too few mobos have been reviewed.
3. We're still on the 1st stepping.
4. What's the F version like?

If a box went down to a catastrophic failure, I'd do a 9900KF build... in 6 months it might be different. But if I'm choosing, I'm waiting for late fall when there will be way more options.


----------



## mrthanhnguyen (May 26, 2020)

John Naylor said:


> Gaming is 395 watts... the 3900X is 385 watts... what are we all missing? Must be 15 posts commenting on the "heaters"... reading comprehension issues?
> 
> 3900X = 385 watts / 79°C
> 10900K = 295 watts / 54°C ... this "heater" is 25°C cooler
> ...


Do you want to buy my 9900kf? 5.3ghz on apex xi, 1.24v llc8.


----------



## ThrashZone (May 26, 2020)

Hi,
Nice review.
Yeah, I got a 10900K from the local Micro Center; could have knocked me over with a feather when I saw it for 529.99.
Returned a borked X299 Mark 2 board for 260.00 off the chip, so all in all 313.00 for the chip.
Paired it with the Asus XII Formula, since the Apex was going to take way too long to show up.
Have another set of 3600C16 for now; tearing down my X99 rig this weekend, or as soon as the board shows up.
Should be fun to see what the Optimus blocks can do on it, since they did so well on HEDT.


----------



## lesovers (May 26, 2020)

Great review as always, W1zzard. TechPowerUp GPU reviews are the best and have the most useful information of any reviews on the web; this CPU review isn't quite there, however!

The only issue I have with it is that it should have included overclocking details and test results for the 3900X, as that is the most direct competitor to the 10900K. The specifications for my AMD computer and its single-core and multi-core Cinebench R20 results are shown below. My 3900X is not a greatly binned chip at all, but it still posted some good results!


*Test System "Zen 2"*
Processor: AMD Ryzen 3900X, manual OC to 4.475 GHz all cores, Noctua NH-D15 air cooler
Motherboard: Asus X470 Prime Pro with 5406 BIOS
Memory: 2x 8 GB G.SKILL Trident Z DDR4-3000 15-15-15-35 (clocked to DDR4-3200)
Graphics: MSI 5700XT Gaming X
Storage: 2 TB Samsung M.2 SSD
Power Supply: Corsair AX-860 860 W
Software: Windows 10 Professional 64-bit, Version 1903 (May 2019 Update)
Drivers: AMD 20.4.2 Graphics Driver


----------



## Makaveli (May 26, 2020)

W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?



Twitch Streamers?


----------



## lesovers (May 26, 2020)

John Naylor said:


> Gaming is 395 watts... the 3900X is 385 watts... what are we all missing? Must be 15 posts commenting on the "heaters"... reading comprehension issues?
> 
> 3900X = 385 watts / 79°C
> 10900K = 295 watts / 54°C ... this "heater" is 25°C cooler
> ...




Total system gaming power consumption ranges from the Core i5-9600K @ 356 W to the Core i9-10900K @ 395 W: a spread of only 39 watts across all 14 CPUs.
Methinks this is mostly GPU power, not CPU; the Core i5-9600K (not overclocked), for example, does not draw anything like 356 watts by itself!


----------



## Hossein Almet (May 26, 2020)

When talking about performance, most (if not all) reviewers simply refuse to put a cost-benefit analysis in the same sentence. That is to say, once you do put the cost-benefit analysis in the same sentence, the 10900K becomes much less attractive.


----------



## Vya Domus (May 26, 2020)

Vayra86 said:


> That's the thing; its not possible going by the usage stat.



Usage figures are worthless anyway; they reflect almost nothing. They are calculated from deltas of total process CPU time over a fixed sampling interval, so they don't actually measure anything happening in hardware; in fact, "CPU usage" is a totally misleading term. For instance, there is no way to know whether a core is running mostly integer/float/double/SIMD/etc. work or a combination of them all, so you could have two workloads both showing "100% usage" where one causes the core to consume 5 W and the other 15 W.
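As a rough illustration, this sampling scheme is essentially all a task-manager-style usage counter does (a minimal Python sketch of the general idea, not any particular monitor's implementation):

```python
import time

def process_cpu_usage_percent(interval: float = 0.5) -> float:
    """Sample this process's accumulated CPU time twice, 'interval'
    seconds apart, and report the delta as a percentage of wall time.
    Note it says nothing about WHAT the core executed (integer, SIMD,
    stalls waiting on memory); only that the process wasn't idle."""
    cpu0, wall0 = time.process_time(), time.monotonic()
    time.sleep(interval)            # stand-in for the sampling period
    cpu1, wall1 = time.process_time(), time.monotonic()
    return 100.0 * (cpu1 - cpu0) / (wall1 - wall0)

# A process that only sleeps shows ~0% usage, whatever the figure
# looks like, power draw per "100%" can differ wildly by workload.
print(f"{process_cpu_usage_percent():.1f}%")
```

The counter is blind to instruction mix, which is exactly why two "100% usage" workloads can have very different power and performance profiles.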


----------



## AddSub (May 26, 2020)

@W1zzard! Thumbs all the way up for still doing SuperPi front and center. Keeping everyone honest. Makes me smile every time I see it. I'm sure "they" are loving it, that you are still using something like that.

...
..
.


----------



## W1zzard (May 26, 2020)

GeorgeMan said:


> but on page 20 I don't see it hitting 5.3GHz even on 1 thread generic FP load. Isn't it advertised to be able to hit 5.3GHz on 1-2 thread loads or am I missing something?


Correct, it will not hit TVB with that load. It seems you need very lightweight activity for it to activate, like a human clicking around, not a calculation (which puts a single core at 100% for several seconds).


----------



## Hossein Almet (May 26, 2020)

TheLostSwede said:


> I guess you haven't done any benchmarking yourself?
> The problem is that as soon as you introduce more variables, there's a huge risk of the testing environment throwing off the normally reproducible numbers. Then we're ending up in a situation where a lot of people are going to start questioning the benchmark results, as every time they come outside the normal 1-2% variance or less. Benchmarking for review purposes has to deliver reproducible results across several platforms, so you need to try to minimise variables that might affect the performance in a negative way, regardless of what you're testing.
> 
> Yes, a lot of people run at least something like discord alongside their gaming, but live streaming, not sure how many people really does that. That said, most of us probably don't turn off all the background services and what not when we game either, so you lose out a few percents performance there too.
> ...


Intel is better than AMD at idle because Intel's base clocks are generally much lower than AMD's, not to mention fewer cores as well. In this case, 10 cores with a base clock of 3.7 GHz vs. 12 cores with a base clock of 3.8 GHz.


----------



## SIGSEGV (May 26, 2020)

Thanks for the review. From my point of view, AMD Ryzen is still the winner.


----------



## vlad.coolish (May 26, 2020)

Please add the Civilization VI AI benchmark (average turn time, s) to the CPU test suite.


----------



## Vya Domus (May 26, 2020)

Hossein Almet said:


> Intel is better than AMD idle, because generally Intel's CPUs base clocks are much lower than AMD's base clock, not to mention fewer cores as well.  In this case, 10 cores with the base clock of 3.7 GHz  vs 12 cores  with the base clock of 3.8 GHz.



These CPUs idle at much lower clocks than the base frequencies.


----------



## cucker tarlson (May 26, 2020)

B-Real said:


> Comet Lake is a huge NO for every PC builder


Yeah, right. Typical AMD PR talk: "no Intel CPU is even remotely good." If you have more such ridiculous statements, please keep them to yourself.
The 10900K is not a good value option, but there are many SKUs in the 10th gen that make a very good proposition for the majority of users, and I'm saying this as someone who has absolutely no interest in 10th gen at this point.
The 10400F is a very good budget chip; considering the CPU alone, better than the 3600 for a home/gaming rig. So is the 10600KF for an enthusiast gamer, and the 10700F for, well, frankly everything: it's a 9900 non-K at $350 with boost clocks just shy of the K SKU.
I wouldn't buy them with B550 coming, but still, your point is just laughable.


----------



## laszlo (May 26, 2020)

It is the fastest gaming CPU, no contest; I call it an "ego CPU", as it's all they can do for now. The Eskimos will be happy to have it, btw.


----------



## Vayra86 (May 26, 2020)

John Naylor said:


> Gaming is 395 watts  .... the 3900X is 385 watts ... what are  we all  missing ?  Must be 15 posts commenting on the "heaters" .... reading comprehension issues ?
> 
> 3900X = 385 watts / 79C
> 10800K = 295 watts / 54 C ... this "heater" is 25C cooler
> ...



Those temps might be a tad misleading. But 54 °C on this CPU at stock is a fairy tale; you can safely forget about that. It sure as hell never happened on my 8700K, and this one has additional cores and a higher TDP. And definitely not on a Scythe Fuma, lol. Dual-stack air on this sort of CPU easily produces load temps of 70 °C and above. But it's not relevant. Both are very much in the safe zone, and no throttling happens at stock. That is, they aren't clocking back. Whether they reach advertised boost all the time is another story... That goes for both camps, too.

Performance-wise, my opinion here is that it's all the same anyway. For gaming... anything 8700K and up will do fine; it was like that 2 years ago and has not changed. Not for high-refresh gaming either, really. The differences we see here are in single-digit percentages most of the time, and the base level of performance is great. If you do some content production and/or heavy multitasking as a home user, some 8-core version will suit you better.

Beyond that... I think every other metric for a CPU choice comes into play in a big way, more so than performance. The biggest one is obviously price, as the platforms are rather similar now. But Intel's aggressive boost and power limiting is not really something to get excited about; in that sense, Ryzen still has the definite edge.



EarthDog said:


> That's 7.5%... the difference between high and ultra or 2xAA and 4xAA...or pegged at 144 FPS versus not. It's something...and in some cases, a bump in the class of card!



Correction... 7.5% on averages, not minimums, which is what it's all about. Nobody cares (read: should care) if games can peak at 240 FPS when the average is 150, even though it still pushes averages up. Even with those monster scores, all you _really_ need is the CPU that will hold at least 120. And even then you will be dropping below it from time to time. It's a CPU. The load will vary and is still limited by the GPU, and most people don't push 2080 Tis. Come next gen they might, but even then, you're in the safe zone with all of these CPUs. None of them will be holding anything back in any notable way.


----------



## W1zzard (May 26, 2020)

Vayra86 said:


> But 54 C on this CPU going at stock is a fairy tale, you can safely forget about that.


That's what I measured, and yes, my 8700K runs warmer than that, too. Both are sitting at PL1 = 125 W in that test, which is the definition of "stock".
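For readers wondering what PL1/PL2 "stock" behavior means in practice: the CPU may draw up to PL2 for short bursts, and is pulled back once a running average of package power reaches PL1. A deliberately simplified sketch of that scheme (the EWMA model and the tau value are my assumptions, not Intel's exact algorithm; PL1/PL2 are the review's stated limits):

```python
def simulate_power_limit(draw_w, pl1=125.0, pl2=250.0, tau=56.0, dt=1.0):
    """Crude EWMA-based model of PL1/PL2 turbo power limiting.

    draw_w: requested package power per time step (watts).
    tau:    assumed averaging time constant in seconds (hypothetical).
    Returns the power actually allowed at each step.
    """
    avg = 0.0            # running average of package power
    alpha = dt / tau     # EWMA weight for one time step
    allowed = []
    for want in draw_w:
        # Bursts are capped at PL2; once the average hits PL1, clamp to PL1.
        cap = pl2 if avg < pl1 else pl1
        p = min(want, cap)
        avg += alpha * (p - avg)
        allowed.append(p)
    return allowed

# A sustained 250 W request starts at PL2 and settles at PL1 = 125 W.
trace = simulate_power_limit([250.0] * 300)
```

This matches the review's three-configuration framing: removing the boost limits is equivalent to raising PL1/PL2 so the clamp never engages.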


----------



## KLMR (May 26, 2020)

I feel unimpressed by the numbers between the last two generations. IMHO it's an unnecessary product for the market, only necessary for "the press", to say "hello, we're still here".
Given the good performance of the i5-9600, the trick will be to position the i7/i5 price points to simulate "improved performance" over the previous generation.

This is my last day with an X5690. What is the point of this? Overclocking, for me, isn't worth it anymore: a 5-10% improvement for a 70% power increase. OC is for the fun of OCing, not for a real, palpable benefit anymore. Going from 60 to 64 fps at 4K in a bench, or from 150 to 170 fps at 1080p, is pointless, as is rendering something in 48 seconds instead of 56. And if you're a heavy renderer, content creator, or simulation user, you already have an sTR4 or 2066 platform.

To compete with AMD they should offer more PCIe lanes, or PCIe 4.0, Thunderbolt on all mobos, 2-3 CPU generations per socket, etc.; you know what I mean.

It would be interesting to have an 8-core-limited comparison between the last 4 generations of top-of-the-line desktop CPUs.


----------



## Shatun_Bear (May 26, 2020)

*'No cooler included in the box'.*

More needs to be made of this omission; in this case, you'll need a beefy, expensive cooler (a 240 mm AIO is optimal). The 3900X and upcoming 3900XT include coolers that you don't need to upgrade. So that adds at least $100 to the total price. Then there is the added expense of more expensive mobos: only high-end boards for optimal performance.

So it's about $150-200 extra for 3.6% more fps at 1440p (a realistic resolution) if you happen to have a 2080 Ti. No thanks.

EDIT: Actually, the listed price is the tray price. Online, prices are already at least $50 higher. So you're paying $200-250 more for a 10-core instead of a 12-core, and I'm not exaggerating. Terrible deal.


----------



## cucker tarlson (May 26, 2020)

Shatun_Bear said:


> The 3900X and upcoming 3900XT include coolers that you don't need to upgrade.So that adds *at least $100* to the total price.


but frankly, they're ones that you want to upgrade.
They get decimated by $30 coolers; let's not make a big deal out of the Wraith Prism. It looks super dope and is a decent stopgap solution.


----------



## BSim500 (May 26, 2020)

W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?





TheLostSwede said:


> The problem is that as soon as you introduce more variables, there's a huge risk of the testing environment throwing off the normally reproducible numbers. Then we're ending up in a situation where a lot of people are going to start questioning the benchmark results, as every time they come outside the normal 1-2% variance or less. Benchmarking for review purposes has to deliver reproducible results across several platforms, so you need to try to minimise variables that might affect the performance in a negative way, regardless of what you're testing.


^ This. The other problem with benchmarking game streaming as a CPU test is that in the real world, the vast majority don't brute-force software-encode on one CPU on one PC. Serious streamers usually have a second PC dedicated to capture/encoding, for reasons beyond mere CPU offload (e.g., some games just don't play well with video broadcast software and show weird stutter unrelated to performance; you can ALT-TAB out without fear of breaking something, versus broadcast software capturing exclusive fullscreen; a crash on the gaming PC doesn't bring down the stream; better ergonomics with separate broadcast controls on another PC; etc.). It's just more professional all round. Likewise, budget streamers don't sit there saying _"I only have a quad-core. Let's inflict stutter on my audience by trying to brute-force CPU encoding."_ Instead they just enable NVENC/ShadowPlay in OBS (fixed-function encoders on the dGPU) and enjoy "good enough" quality for most games (especially on Turing) with <5% FPS impact and hardly any extra CPU load. Others might use an external capture box (a common option for console game streaming, too). Real-world streaming CPU usage is clearly going to be vastly different between these methods and CPU encoding on the same PC, and the latter is not actually the most popular way of doing it.

Same goes for _"well, let's try benchmarking this game with a web browser running 200 tabs to find something to fill up those cores"_. Any half-decent modern browser should be intelligent enough to both lazy-load tabs and force-suspend idle background tabs (a highly desirable feature anyway against hostile background mining scripts, plus reduced memory usage). Many browsers now block tracking scripts natively, and anyone with half a brain has been running uBlock Origin since 2014. There are options for telling W10 not to download updates during certain time periods/game sessions. All of a sudden, supposedly FPS-crippling levels of background CPU usage drop to barely a couple of percent on a quad-core and become a non-issue for gaming with a bit of common sense.

So rather than waste time on inconsistent and non-repeatable "background benchmarking" just to find something to fill up unused cores for the 1% of gamers who stream, I'd personally rather see games benchmarked normally, and here's another vote for including 1% / 0.1% lows.


----------



## Radium69 (May 26, 2020)

W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?



That is a good question. I'll respond to that in a moment.
The question I ask myself is: which way is the CPU industry heading, and in what context does gaming fit into this?

PS5 development shows that we have moved to 8 Zen 2 cores, maybe even with SMT enabled, who knows. So they are aiming for 3700X-level performance (assumption).
Prediction and personal experience: game development shows that games are getting more complex and use more cores. Going from 2 to 4 to 6 cores shows improvement in the latest titles (e.g., Borderlands 3).
So with Intel upping the core count together with AMD, applications will probably show more scaling; this has been a trend for some time. Even Windows 10 shows improvement with a core-count increase.

Keeping in mind that people tend to use their PC for more than 5 years, it becomes pretty logical (to my understanding) to invest in 'moar corez'.

So, to (sort of) answer your question
(and this example is N=1):

People tend to use all kinds of apps during gaming. Like I said, Spotify for music and Discord for user-to-user/community interactivity.
^ that is based on only having one monitor.

With two monitors becoming more popular these days (sorry, N=1), I myself use YouTube/Twitch to watch other players or view some hardware footage, browse the web with multiple tabs open, etc.









Linked review (Dutch, via translate.google.com): "Intel Core i5 10600K & Core i9 10900K review: the fastest gaming CPU you shouldn't buy." In this review of the Intel Core i5 10600K and Core i9 10900K, codename Comet Lake, you can read all about the 10th-generation Intel Core processors.

See for yourself, and this is only with a 6 Mbit stream (8 Mbit or higher is becoming the norm these days).
It's quite funny to see the 3600 'overtaking' the i9-10900K on that graph, by 3%.

I think AMD is on the right track. Let's hope it stays that way.

I've been using Intel and Nvidia hardware for almost all of my systems, but for some reason (FineWine™) AMD hardware seems to have more longevity.
PS: Intel got me good with the 6600K, a processor which is just worthless for gaming nowadays. My next system will be AM4-based.
PS2: I can't predict the future, so my post is based upon the good ol' glass orb.


----------



## Shatun_Bear (May 26, 2020)

cucker tarlson said:


> but frankly ones that you want to upgrade.
> they get decimated by $30 dollar coolers,let's not make a big deal out of wraith prism.it looks super dope and is a decent stop gap solution.



OK, but you can cool a 3900X with one of these; no way can you tame a 10900K that hits 85 degrees even with a Noctua NH-U12.

The overall point is that total system cost is prohibitively expensive, much more so than a 3900X that can run on a cheap X470/B450 or even X370 board with a $30-50 cooler or the one included in the box. On top of that, the review price of $500 for the CPU itself is a unicorn, i.e., it doesn't exist. Here in the UK, OverclockersUK has the price at £540. The 3900X is £440.


----------



## cucker tarlson (May 26, 2020)

Shatun_Bear said:


> Ok but you can cool a 3900X with one of these but no way can you tame a 10900K that gets to 85 degrees even with a Noctua NH-U12.
> 
> Overrall point is total system cost is prohibitively expensive, much more so than a 3900X that can run on a cheap X470/B450 even X370 board with a $30-50 cooler or the one included in the box. On top of that the review price of $500 for CPU itself is a unicorn i.e it doesn't exist. Here in the UK OverclockersUK has the price as £540. The 3900X is £440.


Point taken; I wasn't debating whether you need a custom cooler to run a 3900X. You don't.
I said you want one, because the Wraith Prism is going to be warm and loud, and frankly there are low-end coolers that would run circles around it.


----------



## Shatun_Bear (May 26, 2020)

cucker tarlson said:


> point taken,I wasn't debating if you need a custom cooler to run 3900x,you don't.
> I said you want one cause wraith prism is gonna be warm,loud and frankly there's *low end coolers that would run circles around it*.



Oh sure.


----------



## cucker tarlson (May 26, 2020)

Shatun_Bear said:


> Oh sure.


Consult the chart I posted.
That's why I posted it: for you to read it.
Look at the price of the Arctic Freezer 34 and what it does to the Wraith Prism temperature/noise-wise.


----------



## Radium69 (May 26, 2020)

TheLostSwede said:


> I guess you haven't done any benchmarking yourself?
> The problem is that as soon as you introduce more variables, there's a huge risk of the testing environment throwing off the normally reproducible numbers. Then we're ending up in a situation where a lot of people are going to start questioning the benchmark results, as every time they come outside the normal 1-2% variance or less. Benchmarking for review purposes has to deliver reproducible results across several platforms, so you need to try to minimise variables that might affect the performance in a negative way, regardless of what you're testing.
> 
> Yes, a lot of people run at least something like discord alongside their gaming, but live streaming, not sure how many people really does that. That said, most of us probably don't turn off all the background services and what not when we game either, so you lose out a few percents performance there too.
> ...



Thanks for taking the time to respond.



TheLostSwede said:


> The problem is that as soon as you introduce more variables, there's a huge risk of the testing environment throwing off the normally reproducible numbers.



You say "huge risk", I say more 'real world' results. I'm not saying that TPU is obligated to test these scenarios, but rather to expand their testing, or at least shed some light on possible future testing.



TheLostSwede said:


> That said, most of us probably don't turn off all the background services and what not when we game either, so you lose out a few percents performance there too.



Fair enough, and to add to your list: think about AV software, the Chrome background app, the Adobe Reader background app, and Windows updates running in the background. A lot of these things are already present in the background these days...



TheLostSwede said:


> I doubt PCIe 3.0 vs PCIe 4.0 would make any difference to idle power, at least not if no PCIe 4.0 devices are being used.



Like you said, and to expand on your point: Ryzen's mobile chips use a different I/O die (no PCIe 4.0), and with additional optimizations they have made pretty big strides so far. It's quite astonishing, really...

It's just a reminder not to stare blindly at the FPS results; a lot more has changed.


----------



## my_name_is_earl (May 26, 2020)

I'm sure some fanboi will plug their ears and say "lala lala la allalaaalala" .


----------



## ratirt (May 26, 2020)

cucker tarlson said:


> consult the chart I posted.
> that's why I posted it - for you to read it.
> look at the price of arctic freezer 34 and what it does to wraith prims temps/noise wise


It is better, but that is also an additional cost. The Wraith Prism is not that bad. I had it for a while with my 2700X and it was pretty OK. The fact that you can run a 3900X with it is a solid achievement.
No wonder Intel didn't ship a stock cooler with this 10900K. What would be the point, if a Noctua can barely keep up? If you consider running this Intel CPU at stock, I'd suggest an entry-level liquid solution.


----------



## cucker tarlson (May 26, 2020)

ratirt said:


> It is better but that is also an additional cost. Wraith prism is not that bad. I had it for a while with my 2700x and it was pretty ok. The fact that you can go 3900x with it is a solid achievement.
> No wonder, Intel didn't go with stock cooler with this 10900K. What would be the point if Noctua can barely keep up. If you consider stock for this Intel CPU , I'd suggest entry liquid solution.


It's an achievement for the 3900X, tbh, not the cooler.


----------



## ratirt (May 26, 2020)

cucker tarlson said:


> Its an achievement for the 3900x tbh.not the cooler.


Yeah, but my point was that you can fit that kind of cooler and sell it in the box. No additional costs involved. You can always go better, that's for sure.


----------



## EarthDog (May 26, 2020)

Vayra86 said:


> Correction... 7.5% averages; not minimums which is what its all about. Nobody cares (read: should care) if games can peak to 240 FPS if the average is 150, but it still pushes averages up. Even with those monster scores all you _really _ need is the CPU that will hold at least 120. And even then you will be dropping below it from time to time. Its a CPU. The load will vary and is still then limited by the GPU, and most people don't push 2080tis. Come next gen they might, but even then, you're in the safe zone with all of these cpus. None of them will be holding anything back in any notable way.


Correction? I never specified mins/avg. I went off your post (which specifically stated average). But let me see if I understand this (I may need coffee, it's early here, lol)...

The top 1% is irrelevant because it pushes the average UP, but poor 1% lows are relevant because they push it down?? Wouldn't it carry similar levels of importance/difference to the average? I mean, I get the point: nobody cares about peak FPS as much as average and mins, but... did a soul here mention peaks? I thought this was about averages and mins.

EDIT: 1% lows aren't The Gospel (as some feel), but they do give users an idea of what the 'worst' can look like. The bigger the difference, the more potential there is to 'notice' it. As I said, a 7.5% average is almost an entire tier of card, and surely the difference between being able to raise multiple settings for higher IQ.



ratirt said:


> No wonder, Intel didn't go with stock cooler with this 10900K.


What I believe a lot of people seem to be forgetting is that Intel hasn't included a stock cooler with their K/X-series CPUs in several years... That said, their locked CPUs do come with a cooler.


----------



## Vayra86 (May 26, 2020)

EarthDog said:


> Correction? I never specified mins/avg. I went off of your post (which specifically stated average). But let me see if I understand this (I may need coffee, it's early here, lol)....
> 
> The top 1% is irrelevant because it pushes the average UP... but poor 1% lows is relevant because it pushes it down?? Wouldn't it carry similar levels of importance/difference to the average? I mean, I get the point...nobody cares about peak FPS as much as average and mins, but... did a soul here mention peaks? I thought this was average and mins?
> 
> ...



Maybe I should word it better... It's just hard to, really. In my personal experience, above a certain threshold you simply don't suffer anything from losing 10 FPS. For me, that limit is around the 100 FPS mark. If it goes substantially below that, I will 'feel' it in normal gaming. The difference in feeling becomes progressively smaller the higher you go above 100-120 FPS.

In comparison, the 10 FPS gap between 60 and 50 FPS is huge. But 20 FPS above 100... I couldn't care less. In addition, this is a CPU and not a GPU. It's not the primary performance limiter, although it might become one over time, across several GPU upgrades.

Percentages don't tell the whole story; that was my point. Above a certain level it's just not really relevant, in the same way that 240 Hz isn't really relevant but high-refresh-rate gaming as a whole certainly is. It's also good to consider what kind of gamer you are and what niche you're truly buying into. The vast majority of people parroting 'must have the best CPU for gaming' are not even chasing that top-end FPS; they chase IQ before FPS above a certain threshold.


----------



## cucker tarlson (May 26, 2020)

Vayra86 said:


> Maybe I should word it better... its just hard to, really. In my personal experience, above a certain threshold you simply don't really suffer anything from losing 10 FPS. For me personally that limit is around the 100 FPS mark. If it goes substantially below, I will 'feel' it in normal gaming. The difference in feeling, the higher you go above 100~120 FPS, becomes progressively lower.
> 
> In comparison, the 10 FPS gap between 60 and 50 FPS is huge. But 20 FPS above 100... I could care less. In addition, this is a CPU and not a GPU. Its not the primary performance limiter, although it might be over time going past several GPU upgrades.
> 
> Percentages don't tell the whole story that was my point. Above a certain level its just not really relevant, as much as 240hz isn't really relevant but high refresh rate gaming as a whole certainly is.


That is how percentage differences work: 60 to 50 is a bigger change than 120 to 110. Literally.
But a product isn't bad because most people can't tell the difference; it's niche.
The 10900K is what a CPU audiophile headphone would look like:
a whole bunch of money for a tiny difference, but still the best.
I think most PC enthusiasts would be fine with a 3600, and those who aren't have options like the 10600KF or 10700F.
It's ridiculous how people assume there's a tiny difference between the first and the second best, then between the third and the fourth, and so on, and therefore conclude that the difference between, say, the 10900K and the 3700X is small. It's actually pretty darn big when you add it up, regardless of whether the individual claims to see it.


----------



## Vayra86 (May 26, 2020)

cucker tarlson said:


> That is how percentage differences work: 60 to 50 is a bigger change than 120 to 110. Literally.



I spoke of 20 FPS above 100. Same percentage.
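Frame times make the same-percentage point concrete: an identical 20% relative change costs far fewer milliseconds per frame at high frame rates, which is why it is felt less. A quick sanity check (my own numbers, not from the review):

```python
def frame_time_ms(fps):
    """Time spent rendering each frame, in milliseconds."""
    return 1000.0 / fps

# 50 -> 60 FPS and 100 -> 120 FPS are both +20% relative,
# but the absolute per-frame improvement is twice as large at the low end:
low = frame_time_ms(50) - frame_time_ms(60)     # ~3.33 ms saved per frame
high = frame_time_ms(100) - frame_time_ms(120)  # ~1.67 ms saved per frame
```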


----------



## ratirt (May 26, 2020)

EarthDog said:


> What I believe a lot of people are seemingly forgetting is that they haven't had a stock cooler included with thier K/X series CPUs in several years......that said, their locked CPUs do come with a cooler.


So, because there was no cooler included for several years, in your eyes it's OK, and it's actually added value that these don't include one either, at a higher price? A stock cooler is good to start with, and after some time you buy something better.
Either way, I do get why Intel doesn't include one now. Which one would it be? Only liquid can fit the purpose.


----------



## EarthDog (May 26, 2020)

Vayra86 said:


> Maybe I should word it better... its just hard to, really. In my personal experience, above a certain threshold you simply don't really suffer anything from losing 10 FPS. For me personally that limit is around the 100 FPS mark. If it goes substantially below, I will 'feel' it in normal gaming. The difference in feeling, the higher you go above 100~120 FPS, becomes progressively lower.
> 
> In comparison, the 10 FPS gap between 60 and 50 FPS is huge. But 20 FPS above 100... I could care less. In addition, this is a CPU and not a GPU. Its not the primary performance limiter, although it might be over time going past several GPU upgrades.
> 
> Percentages don't tell the whole story that was my point. Above a certain level its just not really relevant, as much as 240hz isn't really relevant but high refresh rate gaming as a whole certainly is. Its also good to consider what kind of gamer you are, and what niche you're truly buying into. The vast majority of people parroting 'must have best cpu for gaming' are not even chasing that top end FPS, they chase IQ before FPS above a certain threshold.


I get ya... but as I have described already, it's almost a whole tier of card, or raising the settings for higher IQ. It is tangible. Obviously, when talking lower FPS it doesn't matter as much by count, but you can still raise settings with that 'buffer' or reach your FPS goals.



ratirt said:


> So, because there was never any cooler included for several years, in your eyes it's OK and actually it is an added value that these don't have one included either for higher price?
> Either way I do get why Intel doesn't include one now.


Don't put words in my mouth. I didn't mention or allude to the value it adds (or doesn't).

I replied to you because you seem to believe they don't include a cooler because of the heat output... which is patently FALSE. So, again, Intel has, for several generations now, NOT included a cooler with their unlocked chips. This is nothing new. The i9-10900 (non-K) includes a heatsink, right?

EDIT: Generally, people who buy an unlocked Intel processor are overclocking in the first place. And for that, you need cooling that is notably better than Intel's stock heatsink in every case... I get why Intel did this so many years ago. Cheers to AMD for handing one out, though.


----------



## cucker tarlson (May 26, 2020)

ratirt said:


> So, because there was never any cooler included for several years, in your eyes it's OK and actually it is an added value that these don't have one included either for higher price? Stock cooler is good to start and after some time buy something better.
> Either way I do get why Intel doesn't include one now. Which one would it be? Only liquid can fit the purpose.


Jesus, the cooler argument is fine for a budget PC; can we not hear it again in the $400+ range?
You need about $25 to match the value of a new Wraith cooler temps/noise-wise; how much do you need to match the value of a new iGPU?


----------



## Shatun_Bear (May 26, 2020)

cucker tarlson said:


> consult the chart I posted.
> that's why I posted it - for you to read it.
> look at the price of arctic freezer 34 and what it does to wraith prims temps/noise wise



No, I meant literally 'sure'; I was agreeing with you, lol.


----------



## cucker tarlson (May 26, 2020)

Shatun_Bear said:


> No I meant literally 'sure' I was agreeing with you lol.


lol, sorry, I thought you were being sarcastic


----------



## Frick (May 26, 2020)

W1zzard said:


> AMD marketing is pushing reviewers hard for that, because it's the only way their cores don't sit idle in games. Do really that many people game and stream at the same time?





Obviously the number of people streaming is smaller than the number of gamers in the world, but many, many people are into it. Twitch apparently had over 5 million streamers in April 2020. Obviously they're not all AAA game streamers, but it's still a big market.

I'd be interested in those numbers, if nothing else because you can't get enough data when reading reviews.


----------



## cucker tarlson (May 26, 2020)

Frick said:


> Obviously the number of people streaming is smaller than the number of gamers in the world, but many, many people are into it. Twitch apparently had over 5 million streamers in April 2020. Obviously they're not all AAA game streamers, but it's still a big market.
> 
> I'd be interested in those numbers, if nothing else because you can't get enough data when reading reviews.


Yup, it's a fair test. 100%.
I heard from a person who reviews CPUs/GPUs that AMD insisted on reviewers using IMC-specified RAM for 3300X testing, e.g., testing the 7700K with 2400 MHz RAM and the 3300X with 3200 MHz RAM. And apparently many sites did, though the one I'm speaking of refused. Now that is something that is just not cool.


----------



## EarthDog (May 26, 2020)

Frick said:


> Twitch apparently had over 5 million streamers in April 2020.


Wow... really? Link to that? I'm interested in seeing some numbers there! 

Like I said earlier, I think this type of article is maybe good annually or something. But to test for it in each review sounds like a monumental PITA and not worth the time.


----------



## ratirt (May 26, 2020)

EarthDog said:


> I get ya... but as I have described already, it's almost a whole tier of card or raising the settings for higher IQ. It is tangible. Obviously when talking lower FPS it doesn't matter as much by count, but you can still raise settings with that 'buffer' or reach your FPS goals.
> 
> Don't put words in my mouth. I didn't mention nor allude to talking about value it adds (or doesn't).
> 
> ...


I'm not pushing words into your mouth. Stop with this stupid phrase. I simply asked why there shouldn't be a stock cooler. Because there never was one? I simply disagree with what you said: since there was never a stock cooler, nobody should expect one, or maybe it's because one isn't needed.
I never said the cooler is missing due to heat output. Maybe Intel doesn't see fit to bundle a cooler with K-series CPUs, knowing people will OC them and the stock cooler won't keep up.
What I said is, I'd see more value if a stock cooler were included anyway.



cucker tarlson said:


> Jesus,the cooler argument is fine for budget pc,can we not hear it again in the +$400 range ?
> you need about $25 to match the value of a new wraith cooler temps/noise wise,how much do you need to match the value of a new igpu ?


Great. A budget PC is a budget PC, and you can still get a better cooler for it. I still think a stock cooler would have been a nice gesture for any CPU, no matter that the K line never had one (BTW, the 3770K came with a stock cooler, and so did the 4770K if I'm not mistaken about the last one).


----------



## cucker tarlson (May 26, 2020)

EarthDog said:


> Wow... really? Link to that? I'm interested in seeing some numbers there!
> 
> Like I said earlier, I think this type of article is maybe good annually or something. But to test for it in each review sounds like a monumental PITA and not worth the time.


Lockdown is probably responsible for 70% of that.



ratirt said:


> Great. Budget PC is a budget pc and you can get a better cooler despite a budget pc purchase. I still think a stock cooler would have been nice gesture for any CPU. No matter the K line of CPUs never had one (BTW 3770K had a stock cooler so as 4770K if I'm not mistaken about the last one).


For the 10900K it's just plain stupid to include one. That would have to raise the cost; better not to include it and let people decide what kind of cooling they wanna run with it.

BTW, the 4790K and 5775C both had a cooler. I have both sitting in the cabinet; they smell brand new. Also: they're bad.


----------



## EarthDog (May 26, 2020)

ratirt said:


> I'm not pushing nor putting words in your mouth. Stop with this stupid phrase. I simply ask why there shouldn't be a stock cooler. Because there never was one? I simply disagree with you about what you said. Since there was never a stock cooler, meaning nobody should expect one or maybe because it is not needed.
> I never said, cooler is not there due to heat output. Maybe Intel doesn't seem fit or has a reason to get a cooler for K version CPUs knowing people will OC those and the stock cooler will not make it.
> What I said is, I'd see more value if the stock cooler was included anyway.
> 
> ...


You did seem to put words in my mouth. I told you why; it's there in plain English (though I believe you are ESL, sorry). No matter, though. I was just clarifying why Intel hasn't included a cooler with their unlocked chips for generations.

Maybe I was confused by these lines......





ratirt said:


> No wonder, Intel didn't go with stock cooler with this 10900K. What would be the point if Noctua can barely keep up





ratirt said:


> Either way I do get why Intel doesn't include one now. Which one would it be? Only liquid can fit the purpose.


This tells users that Intel doesn't include a stock cooler because a certain heatsink/the Intel stock cooler can't keep up with the HEAT.

Mean what you say... say what you mean, please. 

Who in their right mind would think a cooler isn't needed because it isn't included? lolololololol



A cooler can add more value, but again, Intel and AMD clearly do things differently. All AMD chips are unlocked and come with coolers (the 9590 didn't though, did it?). On Intel, few chips are unlocked, and those that are don't include a cooler. It is what it is. If you don't plan on buying an aftermarket cooler to overclock an Intel, you're in the wrong line.


----------



## Frick (May 26, 2020)

EarthDog said:


> Wow... really? Link to that? I'm interested in seeing some numbers there!
> 
> Like I said earlier, I think this type of article is maybe good annually or something. But to test for it in each review sounds like a monumental PITA and not worth the time.



twitchtracker.com/statistics

I saw another page (statista.com) but that's behind a paywall, I think. Still several million streamers/month before COVID-19.

And yeah, annually or biannually would be a good thing. Or whenever both AMD and Intel have released new flagships.


----------



## ratirt (May 26, 2020)

EarthDog said:


> This tells users that because a certain heatsink/intel stock can't keep up with the HEAT, Intel doesn't include a stock cooler.
> 
> Mean what you say... say what you mean, please.
> 
> ...


Because they are overclocking. Great. I said it would have more value, and do you know why? Of course you'll be overclocking the CPU mostly for games, and you still need a GPU (other components don't play much of a role here, so I'll skip them).
You don't need to OC right from the start, but what the lack of a cooler (certified by Intel for the processor) means is spending a lot of cash up front just to make the computer functional. $100-$150 is the cost of any noteworthy liquid cooler (assuming you'd go with one to get the best out of the CPU; I'm sure you would, though maybe not right from the start). That means you're robbed of, let's say, $120, because buying a mid-range cooler just for the time being is simply stupid: you're going to OC, so you need the best there is, or at least a decent one, and that costs. You could have spent that cash on a better GPU; with a stock cooler you can get the computer working right away, then after two or three months buy an awesome liquid cooler (when you have the cash) and OC. You get value because you already have a 2080 Super instead of a 2070 Super, and you did it for games (that's where the Intel CPU shines, isn't it?). No certified stock cooler forces you to buy the best there is up front just to make the computer work and OC, and you may have to tone down the GPU purchase, which is stupid because you're buying this PC mostly for GAMES.
That's how I see it, and for me a stock cooler is very important. I'm sure there would have been way more buyers for this or other "K" CPUs from Intel if they had that option. I would at least consider a purchase.
I don't care how Intel and AMD do things. I said that for me (and maybe for others as well) it has value, and saying "there never was one" or "you'll OC, so a stock cooler is useless" is nowhere near the answer I've been expecting from someone who knows about CPUs.



cucker tarlson said:


> btw 4790k and 5775c both had a cooler,I have both sitting in the cabinet,they smell brand new.also-they're bad.


Great, and this one doesn't, so... "never had one for years" makes no sense, and you still wouldn't OC a 4790K, for instance, with that stock cooler.


----------



## Vayra86 (May 26, 2020)

cucker tarlson said:


> lockdown is responsible for 70% of that probably
> 
> 
> for 10900k it's just plain stupid to include one.that'd have to raise the cost,better not to include it and let ppl decide what kind of cooling they wanna run with it.
> ...



Well, the copper-core stock Intel cooler wasn't bad at all for a top-flow, and not overly loud either. Still got the one that came off a 3570K; keeping it as a spare, emergency cooler for whenever a heatsink magically blows up or something.

But overall I do agree: just include something in the lower range. Anything i7 and up should not have it IMO, and K definitely doesn't need it; it's a waste of metal. Similar story on Ryzen from the 5 onwards, really. If it's more than a quad, stick something cheap and decent on it, even if just for quality of life and getting the advertised perf out of it.



Shatun_Bear said:


> *'No cooler included in the box'.*
> 
> More needs to be made of this omission and in this case, you'll need a beefy, expensive cooler (240mm AIO optimal). The 3900X and upcoming 3900XT include coolers that you don't need to upgrade. So that adds at least $100 to the total price. Then there is added expense of more expensive mobos, only high-end for optimal performance.
> 
> ...



You don't need a 240 AIO to extract the advertised performance from this CPU, and any overclocking is worthless anyway; the review literally spells that out. So where is the cooler markup in this situation? You put some 30-40 bucks worth of air on it and you're done. End result: you lose maybe 3-5% of the top perf you can get out of this chip. OTOH, why not just get a non-K version then?


----------



## Jism (May 26, 2020)

dirtyferret said:


> Am I the only one who's not impressed with either AMD/Intel sticking as many cores as they can on their CPUs?



CPUs are hitting a speed wall beyond 5 GHz; it takes immense power and cooling to push past it. So if you can't go any faster, simply go wider. More and more games/programs can use multiple threads instead of just the traditional one.

The race for the fastest CPU on clocks alone is long over. It's all down to optimizing current designs for even more IPC.


----------



## ThrashZone (May 26, 2020)

Hi,
A lot of fuss over Intel not including air coolers with their chips.
Boxes are getting big enough to include a D15, lol. Not sure why they don't; surely there are some cheap China copies running around they could stuff into the box for pennies on the yen.

I have nothing against AMD; I'm just waiting for Ryzen 4000 and Threadripper 4000 to see what they go for.
As for the price difference between the 3950X and the 3960X: if that difference hadn't doubled I would have gotten a 3960X, so too bad; I was all set until that happened.

Instead I now have a 10-core gaming chip that is slaughtering delidded HEDT 7900X/9900X and 10900X chips for less money (in the first two cases a lot less $$$$), so it's all good.


----------



## Rahmat Sofyan (May 26, 2020)

Comet Lake... no wonder these CPUs run really hot... made from a comet...

mehhh


----------



## gamefoo21 (May 26, 2020)

Well it looks like the leap frogging continues.

Zen 3 should pull it back with AMD on top. If the rumours of the XT series pan out, then AMD should be able to narrow the gap.

Too bad we didn't get to see the 3950x in the benches.

Per-core HTT sounds like a good time. Only turn it on for the best cores and use the extra power budget for more clocks.


----------



## cucker tarlson (May 26, 2020)

gamefoo21 said:


> The per core HTT sounds like a good time. Only turn it on with the best cores and use the extra power budget for more clocks.


da wat?


----------



## mrthanhnguyen (May 26, 2020)

Another silicon lottery win by Dancop


----------



## gamefoo21 (May 26, 2020)

cucker tarlson said:


> da wat?



On the 10 series you can disable HTT on a per-core basis, so you can run anything from 10c/10t to 10c/20t... like if you wanted 10c/16t for benching certain apps or whatever.


----------



## cucker tarlson (May 26, 2020)

Really?


----------



## gamefoo21 (May 26, 2020)

cucker tarlson said:


> really ?



That was something Buildzoid geeked out over.


----------



## cucker tarlson (May 26, 2020)

gamefoo21 said:


> That was something Buildzoid geeked out over.


Useless IMO, but yeah, the way HT is implemented, why not.
It'd be interesting to see HT scaling in games.
Not really practical though.


----------



## gamefoo21 (May 26, 2020)

cucker tarlson said:


> useless imo
> but yeah,they way HT is implemented,why not.
> it'd be interesting to see HT scaling in games.



Well the thing is if you are concerned with power usage, running HTT on 2 or 4 cores will get you most of the benefit.

Look at the power usage differences between the 9700K and 9900K: when unlocked and overclocked on air, you are looking at a 70 W increase from HTT.

On mobile it's definitely something I would appreciate. On desktop you could get quite a bit more performance, because the CPU can clock higher for longer without burning the power budget and overloading the cooling. For overclocking, you disable HTT on the weaker cores and you can push harder clocks.


----------



## W1zzard (May 26, 2020)

I played with selective HT disablement today.

Three runs:

- HT off on the first 2 cores
- HT off on the middle 2 cores
- HT off on the last 2 cores

No significant differences, no magical performance gains.

The theoretical value proposition sounds nice: set aside some high-performance cores without HT for "special" loads, and leave HT enabled on the rest so you can run more threads. Also, no response from Intel after asking "does disabling HT for specific cores change their CPPC2 core ranking?"
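For anyone who wants to poke at this on Linux without a per-core BIOS toggle, offlining one SMT sibling per chosen core approximates per-core HT disablement. A minimal sketch of the sibling-picking logic, assuming a hypothetical 10c/20t layout where core n pairs with thread n+10 (on real hardware you would read each core's `thread_siblings_list` from sysfs and write `0` to the sibling's `online` file):

```python
# Pick which logical CPUs to offline so that the chosen physical cores
# run single-threaded while the rest keep HT. Input mimics the contents
# of /sys/devices/system/cpu/cpuN/topology/thread_siblings_list.

def siblings_to_offline(siblings, cores_to_strip):
    """siblings: {cpu_id: "a,b" or "a-b"} for primary threads.
    cores_to_strip: set of primary cpu ids that should lose HT.
    Returns the set of sibling thread ids to take offline."""
    offline = set()
    for cpu, pair in siblings.items():
        ids = sorted(int(x) for x in pair.replace("-", ",").split(","))
        if len(ids) == 2 and cpu == ids[0] and ids[0] in cores_to_strip:
            offline.add(ids[1])  # offline the HT sibling, keep the primary
    return offline

# Hypothetical 10c/20t mapping: core n pairs with thread n + 10
topo = {n: f"{n},{n + 10}" for n in range(10)}
print(siblings_to_offline(topo, {0, 1}))  # {10, 11}
```

Offlining a sibling isn't guaranteed to behave identically to a firmware HT toggle, so treat results from this approach as an approximation.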


----------



## dgianstefani (May 26, 2020)

W1zzard said:


> I played with selective HT disablement today
> 
> three runs:
> 
> ...



Obviously you won't see any performance gains simply from turning off HT; you need to follow that up with an OC. The point is to do it either on weak cores that can't otherwise hold an all-core OC, or on your strongest cores so you can squeeze out an extra 100 MHz or so.

Some workloads and games don't scale past, say, 8 threads. So 16 threads, e.g. from your 10c/20t CPU with HT disabled on 4 cores that are OC'd 100 or 200 MHz higher, will give you much better gains than either 10c/10t or 10c/20t.

Say, for example, 4 cores at 5.3 GHz without HT and 6 cores at 5.2 with HT. Or, for a CPU with 4 cores that can't do 5.2 with HT on: instead of 20 threads at 5.1 GHz you have 16 at 5.2.

It's a bit more involved than your run-of-the-mill "set the all-core multiplier to xx", but there are gains to be had there.

In the Batman build a while back, it was very successful at having some very fast ST/MT benches, due to per core OC optimization https://www.techpowerup.com/forums/threads/batmans-caselabs-mercury-s8-work-computer.247314/. This just takes it further.
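To put rough numbers on that trade-off, here is a toy throughput model; the ~30% uplift HT gives a saturated core is an assumed ballpark, not a measured figure, and the clocks are the hypothetical ones from the post:

```python
# Toy model: each physical core contributes clock_ghz * (1.3 if HT else 1.0)
# to aggregate throughput. The 30% HT uplift is an assumption, not data.

def est_throughput(cores):
    """cores: list of (clock_ghz, ht_enabled) tuples, one per physical core."""
    HT_UPLIFT = 1.3
    return sum(clk * (HT_UPLIFT if ht else 1.0) for clk, ht in cores)

all_ht = [(5.1, True)] * 10                      # 10c/20t, all at 5.1 GHz
mixed  = [(5.3, False)] * 4 + [(5.2, True)] * 6  # 4c no-HT @ 5.3 + 6c HT @ 5.2

print(round(est_throughput(all_ht), 2))  # 66.3
print(round(est_throughput(mixed), 2))   # 61.76
```

Under these assumptions the mixed config trades a few percent of aggregate throughput for higher clocks on the threads a lightly threaded game actually uses, which is exactly the point being argued.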


----------



## birdie (May 26, 2020)

Thank you @W1zzard for staying factual in your review and conclusions - very precise, very unbiased unlike most of the comments here.

I'd like to remind people that:

Single-threaded performance is _always_ preferable to MOAR cores, because 1) some workloads don't scale, 2) some algorithms cannot be parallelized no matter how hard you try, 3) in games there will always be one thread managing all the other threads' loads, which means fewer but faster cores will beat slower but MOAR cores, and 4) most desktop applications are "spiky" in terms of CPU load, so a CPU that can boost high means fewer delays and faster response times for the user. Again, even 128 cores will not help at all in such scenarios.
Comet Lake, even if it's the fifth generation of Skylake from 2015, is still faster than Zen 2 from 2019 at the resolution most competitive gamers play at, i.e. 1080p, because FPS does matter even if you don't like it or like to downplay it. I would love to see your comments again once Zen 3/Ryzen 4000 has been released (hopefully it'll *reliably* boost to 4.8 GHz and have at least a 15% IPC increase, which would let AMD take the 1080p gaming crown and AMD fans finally claim victory).
Yes, PL1 power consumption is quite high, however at least Intel is *not* deceiving its customers about it. The 10900K's sustained TDP is exactly 125 W, vs. e.g. the Ryzen 7 3700X, whose sustained draw is around 90 W, a far cry from the advertised 65 W. The 3800X is a similar story: 90 W advertised, around 120 W real-life power consumption.
AMD's much-touted advantage in power consumption/efficiency is kind of insincere, to say the least. Last time I checked, AMD had sold off GloFo, which is stuck at 12 nm (a lot worse than Intel's 14 nm), and has been using TSMC for the past several years. Intel, on the other hand, doesn't have this luxury; they shot too high with their initial 10 nm plans, whose "benefits" they have been reaping for the past two years, starting with a failure called Cannon Lake.
Speaking of power consumption: AMD *recommends* water cooling for its 3900X and 3950X CPUs, so it makes perfect sense for the 10900K as well. Also, people who buy high-end CPUs can normally afford good cooling solutions, and they couldn't care less whether their CPU consumes 150 or 250 W under load. In no way do I excuse Intel for dragging their 14 nm node into its sixth year straight, but given the circumstances I don't think it's such a big issue.
The waiting game of AMD fans continues: "Zen 3 is around the corner," "this year/next year/soon AMD will become the indisputable leader in performance." Aren't you tired of it? It's been like this for the past 10 years already, if not more.
Now I'm going to disagree with @W1zzard on the cost of motherboards. X570 mobos were and remain quite expensive, and some cost up to $1000, e.g. the ASRock X570 Aqua. At the same time, Intel has already released four more chipsets, which will manifest in cheaper motherboards. B550 mobos have only just gone on sale, *almost a year after* Ryzen 3000 CPUs were released. Neither company is a stranger to gouging.
Kudos to AMD for retaining socket compatibility for so many years; this is where Intel just doesn't want to yield. But then again, I wonder who on Earth upgrades their CPU every year. I'm now rocking a Ryzen 7 3700X, and my previous purchase (a Core i5-2500) was almost ten years ago. Intergenerational CPU socket compatibility matters to maybe less than 3% of customers, yet in every review you'll find dozens of people who are extremely upset about it. But then you realize that most people never leave comments, and the ones you see commenting do not represent the status quo.
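The "sustained TDP is exactly 125 W" behaviour comes from Intel's PL1/PL2/Tau mechanism: the chip may draw up to PL2 until a moving average of package power catches up with PL1, then it clamps to PL1. A toy simulation using the 10900K defaults quoted in the review (PL1 = 125 W, PL2 = 250 W, Tau = 56 s); the EWMA here is a simplification of Intel's actual accounting, not its spec:

```python
# Simplified PL1/PL2/Tau turbo-budget model. Constants match the 10900K
# defaults; the EWMA is an approximation of Intel's real algorithm.

PL1, PL2, TAU = 125.0, 250.0, 56.0  # watts, watts, seconds

def simulate(seconds):
    """Per-second permitted package power under a sustained heavy load."""
    ewma, draw = 0.0, []
    for _ in range(seconds):
        power = PL2 if ewma < PL1 else PL1  # burst until the average catches up
        ewma += (power - ewma) / TAU        # moving average, time constant Tau
        draw.append(power)
    return draw

d = simulate(120)
print(d[0], d[-1])   # 250.0 125.0 (bursts at PL2, settles at PL1)
print(d.index(PL1))  # seconds of full PL2 burst before the clamp (~39 here)
```

This is why the stock configuration settles at its advertised sustained power while still spiking far higher for the first half-minute or so of load.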


----------



## Shatun_Bear (May 26, 2020)

birdie said:


> Thank you @W1zzard for staying factual in your review and conclusions - very precise, very unbiased unlike most of the comments here.
> 
> I'd like to remind people that:
> 
> ...



Lucky, then, that CPUs are not just used for 1080p gaming... we have graphics cards for that stuff; the hint is in the name.


----------



## N3M3515 (May 27, 2020)

cucker tarlson said:


> for 10900k it's just plain stupid to include one.that'd have to raise the cost,better not to include it and let ppl decide what kind of cooling they wanna run with it.



Well, overclocking this CPU is worth shit anyway, so whatever cooler is decent enough is fine. That said, $540 for the CPU + $50 cooler + $100 mobo = $690 vs. $410. So 50%+ more for ten extra frames at 110 FPS, not to mention it runs hotter and consumes more power? WTF? Not even an Intel fan would buy this over a 3900X.


----------



## chrcoluk (May 27, 2020)

Looks like OC isn't worth it on Intel anymore; those temp figures are horrible for the ASUS BIOS profile and for the manual OC.

Similar on my 9900K as well; Intel now gets these chips close to a reasonable maximum efficiency out of the box.

On my 9900K I basically reduced the vcore, SA, and IO voltages, slightly boosted the all-core overclock by 100 MHz, and left it at that.



Shatun_Bear said:


> Lucky then that CPUs are not just used for 1080p gaming...and we have graphics cards for that stuff, the hint is in the name.



You will see in many of the tests (decompression, math calculation, etc.) that Intel is still a fair bit faster than AMD; even a 4.3 GHz 8600K beats a 4.4 GHz AMD chip.

Most people have been influenced by AMD doing well in Cinebench, which, to be frank, is not a realistic test: it only represents encoding and nothing else, and the vast majority of consumers don't encode at all, never mind on their CPU.

If you want the best bang for the buck, honestly most people won't need more than a quad-core chip, whether Intel or AMD. The hype about more cores has led people to pay for cores they don't need; because a few people (namely reviewers) encode videos for a living, they push it as some kind of need for everyone. 8 cores will be a bit better for optimal smoothness, but after that it's ego and niche purposes only. Anyone buying a 3900X just for gaming is wasting money. The 9900K is poor value for money and deserves the same criticism, but at least with a 9900K the extra single-core performance will mean something to more people, even if only a little.

We can clearly see the impact of per-core performance when comparing the 9900K to the 9900KS; the 9900KS is probably what the 9900K should have been, and it's a shame it was limited-supply only. The 9900KS mixes with these 10xxx chips in many of the tests.


----------



## Caring1 (May 27, 2020)

@W1zzard  Any chance of a review of the 10900X?


----------



## Kissamies (May 27, 2020)

The good old FX-9590 has been beaten in power draw, finally.


----------



## W1zzard (May 27, 2020)

Caring1 said:


> @W1zzard  Any chance of a review of the 10900X?


No plans for HEDT reviews


----------



## cmaxvt (May 28, 2020)

The real question is... when are these gonna be back in stock? They're sold out everywhere... and now that I'm finally not just playing Rocket League, my OC'd 3770K is really starting to show its age xD


----------



## Elysium (May 28, 2020)

chrcoluk said:


> We can clearly see the impact of per core performance when comparing the 9900k to a 9900ks, a 9900ks is probably what the 9900k should have been and its a shame it was limited supply only.  The 9900ks in many of the tests mixes with these 10xxx chips.


Man I thought the exact same thing when Coffee Lake refresh emerged. If they had so much of this 14nm silicon lying around, why did they not just make the 9900KS a not-so-special-edition chip and bump everything that didn't make the bin down to 9700K? I mean, it's not much of a self-inflicted gunshot wound all by itself but now that we see what Comet Lake is made of, hindsight is really showing how bad of a strategic decision that was.


----------



## Kissamies (May 28, 2020)

cmaxvt said:


> The real question is.. when are these gonna be back in stock. They're sold out everywhere..and now that I'm finally not just playing Rocket League, my OC'd 3770k is starting to really show it's age xD


The new Intels are always sold out immediately (which I don't understand, since this is Skylake 5.0, but I guess Intel's milking works); it takes a while before you can actually buy them.

I'm getting by with a 2600K since I sold my R5 2600 + B450; it runs the RE games and that's fair enough.


----------



## N3M3515 (May 28, 2020)

Chloe Price said:


> The new Intels are always sold out immediately (which I don't understand since this is Skylake 5.0, but I guess Intel's milking works), takes a while that you can really buy them.
> 
> I'm kicking with 2600K as I sold my R5 2600 + B450, this runs RE games and that's fair enough.



Low supply


----------



## Kissamies (May 28, 2020)

N3M3515 said:


> Low supply


Exactly, there's more demand than they can manufacture.


----------



## theGryphon (May 28, 2020)

Chloe Price said:


> Exactly, there's more demand than they can manufacture.




Well, you can manufacture 1,000 units and make that claim (as many companies have, many times), or you can manufacture 10,000,000 and say the same. Which one is it now? No telling without actual data...


----------



## Kissamies (May 28, 2020)

theGryphon said:


> Well, you can manufacture 1000 units and claim that statement (as many companies did many times). Or, you can manufacture 10,000,000 and state it. Which one is it now? No telling without actual data...


They need to put a number on those, like limited editions do.


----------



## cmaxvt (May 28, 2020)

You are right that on release Intel usually sells out quickly; I'm just wondering what the normal cadence is for stock coming back. I was about to pull the trigger on a new build and they sold out everywhere.

Just for kicks: anyone have an opinion on the i7 vs. the i9? I'm leaning i7 right now, as I'm almost entirely using this for gaming and light workloads. Not sure the i9's premium is worth it without needing it for things like encoding/etc.


----------



## N3M3515 (May 28, 2020)

cmaxvt said:


> You are right that on release intel usually sells out quickly. I'm just wondering what the normal cadence is for stuff being back in stock. Was about to pull the trigger on a new build and they sold out everywhere.
> 
> Just for kicks - anyone have opinion on i7 vs i9? I'm leaning i7 right now as I am almost entirely using this for gaming and light workloads. Not sure it's worth the premium for the i9 without needing it for things like encoding/etc.


Obviously the i5-10600K, the only one worth anything.


----------



## cmaxvt (May 28, 2020)

N3M3515 said:


> Obviously I5 10600k, the only one worth anything.



Nah, I prefer the i7, honestly. Last gen I originally bought the i5-3570K and upgraded to the i7-3770K, and it made a significant difference in minimum FPS. Part of it was a few hundred more MHz of overclock, and part of it was possibly more threads. All I know is that in all the games I tested I was seeing 20-30 FPS higher minimums, and that's the whole reason I'm upgrading the CPU.


----------



## N3M3515 (May 28, 2020)

cmaxvt said:


> Nah I prefer the i7 honestly. Last gen I bought I originally bought the i5 3570k and upgraded to the i7-3770k and it made a significant difference in minimum FPS. part of it was a few 100 more mhz of overclock, and part of it was possibly more threads. All I know is in all the games I tested, I was seeing 20-30 FPS higher on the minimum end, and that's the whole reason I'm upgrading CPU.



The i5 is the same performance in games as the 10900K, rofl...



cmaxvt said:


> Nah I prefer the i7 honestly. Last gen I bought I originally bought the i5 3570k and upgraded to the i7-3770k and it made a significant difference in minimum FPS. part of it was a few 100 more mhz of overclock, and part of it was possibly more threads. All I know is in all the games I tested, I was seeing 20-30 FPS higher on the minimum end, and that's the whole reason I'm upgrading CPU.



There you go


----------



## cmaxvt (May 28, 2020)

N3M3515 said:


> i5 is the same performance in games than 10900k rofl....
> 
> 
> 
> There you go


i7 isn't on there!


----------



## N3M3515 (May 28, 2020)

cmaxvt said:


> i7 isn't on there!



Sorry, but I don't get it: why would you need an i7 if the i5 is on par with even the i9 in gaming (which is your main focus)?


----------



## cmaxvt (May 28, 2020)

N3M3515 said:


> Sorry but i don't get it, why would you need an i7 if the i5 is on par with even the i9 on gaming (which is you main focus)



I currently have the 4-core/8-thread 3770K. Although my primary use is gaming, I still run multiple apps and have streamed at times, plus I run a virtual desktop for my work now with WFH. It's worth having more cores and a little more horsepower, but probably not worth going all the way up to an i9, especially considering the power consumption. The i7 seems like a nice compromise without being SOLELY for gaming.


----------



## dgianstefani (May 28, 2020)

The way TechPowerUp tests, the i5 and the i9 don't really show much difference, but other sites with more comprehensive testing show better 1% lows for the i9 and i7.

The i9 also has consistently higher HFR gaming results than any other CPU.


----------



## Caring1 (May 28, 2020)

cmaxvt said:


> Nah I prefer the i7 honestly. Last gen I bought I originally bought the i5 3570k and upgraded to the i7-3770k and it made a significant difference in minimum FPS. part of it was a few 100 more mhz of overclock, and part of it was possibly more threads. All I know is in all the games I tested, I was seeing 20-30 FPS higher on the minimum end, and that's the whole reason I'm upgrading CPU.


Comparing apples and oranges there.
The i5-3570K was good, but not capable of competing with the best i7 of its day, which is exactly what the i5-10600K is doing, as shown in tests.


----------



## cmaxvt (May 28, 2020)

Caring1 said:


> Comparing Apples and Oranges there.
> The i5-3570k was good, but not capable of competing with the best i7 of the day, which is exactly what the i5-10600k is doing as shown in tests.


Hmm. I've got time to wait since none of them are in stock, but I still feel like the i7 is the best option.

What do people think about overclocking now? BACK IN MY DAY... there wasn't all this turbo boosting and stuff. Per the review, it seems like this vague second tier of turbo boost does the work and you see very minimal gains from a manual OC? When I overclocked my current chips it was a massive increase in the FPS floor in games.


----------



## N3M3515 (May 29, 2020)

cmaxvt said:


> Hmm. I've got time to wait since none of them are in stock, but I still feel like the i7 is the best option.
> 
> What do people think about overclocking now? BACK IN MY DAY... there wasn't all this turbo boosting and stuff. Per the review, it seems like this vague 2nd tier of turbo boost does the work, and you see very minimal gains from OC? When I overclocked my current chips it was just a massive increase in the floor for FPS on games.



Intel has practically squeezed all the performance they can out of 14 nm by now, I guess.


----------



## JRMBelgium (May 30, 2020)

dgianstefani said:


> Testing still doesn't include 1% and 0.1% lows. Shame really since that's where you really see the differences in a CPU throttled game and one that is smooth, regardless of averages.
> 
> View attachment 156632
> 
> It's noticable when you do include these results. E.g. the stock 10900k has higher 1% lows than the AMD 3900x average FPS.



You are 10000% correct!


----------



## HenrySomeone (May 30, 2020)

LOL, that chart is just... embarrassing for AMD. Yes, I know Far Cry 5 is pretty much their worst-case scenario, but still: best chip vs. best chip, Intel is over 40% ahead on average and 50(!!!) in the 1% lows, lmao. Sandy vs. Bulldozer all over again...


----------



## N3M3515 (May 30, 2020)

HenrySomeone said:


> LOL, that chart is just ... embarassing for AMD; yes, I know Far Cry 5 is pretty much their worst case scenario, but still - best vs best chip, Intel is over 40% better in average and 50(!!!) for the 1% lows, lmao. Sandy vs. Buldy all over again...


More like $400 chip vs $600 chip


----------



## dgianstefani (May 31, 2020)

Do you mean the 10900K, a sub-$500 chip, or the 10600K, a sub-$300 chip, both of which beat the 3950X, a $750 chip?

But yeah, AMD sure is the value gaming proposition, right.


----------



## N3M3515 (May 31, 2020)

dgianstefani said:


> Do you mean the 10900k, a sub $500 chip, or the 10600k a sub $300 chip, both of which beat the 3950x, which is a $750 chip.
> 
> But yeah, AMD sure is the value gaming proposition right



That's right, $150 more for 10 FPS. That's some beating!


----------



## dgianstefani (Jun 1, 2020)

It's not 10 FPS though, is it? Learn to read a graph. And $150 more for what? What are you comparing?


----------



## N3M3515 (Jun 1, 2020)

dgianstefani said:


> It's not 10fps though is it? Learn to read a graph. And $150 more for what? What are you comparing?



Look at that DEMOLISHING!! OMG! That's 1440p WITH a 2080 Ti, btw...


----------



## ThrashZone (Jun 1, 2020)

N3M3515 said:


> That's right, $150 more for 10 fps, that's some beating!


Hi,
No offense intended, but you seem butthurt over this platform.


----------



## dgianstefani (Jun 1, 2020)

Oh right, you're linking non-1%/0.1%-low CPU testing at 1440p... Right...

Wow, what a surprise: high-resolution testing means you're actually testing the GPU far more than the CPU. Big think.

So in your mind a chart showing the R3 3300X within 5% of the gaming performance of a 10900K is accurate and not misleading at all. Disregard an average/1% low discrepancy of almost 90 FPS when actually in a CPU-limited scenario.


----------



## HenrySomeone (Jun 1, 2020)

AMD fans are just butthurt because, for all the hum-hum about Ryzens, they still trail behind in gaming, in many cases very significantly.


----------



## ThrashZone (Jun 1, 2020)

HenrySomeone said:


> AMD fans are just butthurt because for all the hum-hum about Ryzens, they still trail behind in gaming, in many cases very significantly


Hi,
Yeah, maybe so.
I was bummed when I saw that the 3960X price was double the 3950X's.
I didn't blow a gasket in the AMD party threads about it, though; I just didn't buy a 3960X.


----------



## HenrySomeone (Jun 1, 2020)

Yup, the 3960X and 3970X are great examples that AMD isn't doing consumers any favors, or at least no more than Intel does. These are more or less their first two very good CPUs in a long, long time (for their use cases, of course), and immediately they are priced accordingly (= quite high). Ryzens are cheap (or at least cheaper than Intel's chips) because they are still inferior core for core. AMD knows very well that if, say, the 3800X were priced the same as the 9900K, nobody would buy it (except perhaps some of their most hardcore fanboys, but there wouldn't be many, especially not ones willing to dish out $500).


----------



## chrcoluk (Jun 3, 2020)

Elysium said:


> Man I thought the exact same thing when Coffee Lake refresh emerged. If they had so much of this 14nm silicon lying around, why did they not just make the 9900KS a not-so-special-edition chip and bump everything that didn't make the bin down to 9700K? I mean, it's not much of a self-inflicted gunshot wound all by itself but now that we see what Comet Lake is made of, hindsight is really showing how bad of a strategic decision that was.



They should have carried it on until the 9 series was EOL. By the time I got my 9900K, the 9900KS was out of supply in most places, and the one store that had it gouged the price up to Mount Everest levels. I did get my 9900K super cheap though, so I can't complain that much. But I think, the way it was priced, people were probably snapping them up vs. basic 9900Ks, so I expect 9900K sales nosedived while it was available.



Chloe Price said:


> The new Intels are always sold out immediately (which I don't understand since this is Skylake 5.0, but I guess Intel's milking works), takes a while that you can really buy them.
> 
> I'm kicking with 2600K as I sold my R5 2600 + B450, this runs RE games and that's fair enough.



Easy to understand: the supply numbers are really low. It's happened in so many generations now that I believe it to be deliberate, to keep prices high.

If you short supply, it can make consumers think demand is high rather than supply being low, and with how the human mind works it also makes the chips more sought after.



cmaxvt said:


> Hmm. I've got time to wait since none of them are in stock, but I still feel like the i7 is the best option.
> 
> What do people think about overclocking now? BACK IN MY DAY... there wasn't all this turbo boosting and stuff. Per the review, it seems like this vague 2nd tier of turbo boost does the work, and you see very minimal gains from OC? When I overclocked my current chips it was just a massive increase in the floor for FPS on games.



These new chips have "preferred cores" enabled, which to me means that if you're running a new OS, the value of overclocking is now very low.

What this feature does: if the chip can only run top clocks on 1-2 cores, which is the case with Intel's turbo clocks, the OS will make sure lightly threaded apps "always" use those cores. So the benefit of pushing all cores to top clocks (which was previously the only way to ensure that) is now more limited.
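As an aside, the "preferred cores" idea is easy to sketch: the scheduler simply steers lightly threaded work onto whichever cores are binned for the highest boost. Here's a hypothetical Python illustration (the frequencies are made up; on Linux the real per-core maximums are exposed under /sys/devices/system/cpu/cpu*/cpufreq/cpuinfo_max_freq):

```python
# Hypothetical illustration of "preferred cores": given each core's maximum
# boost frequency (in kHz), find the cores binned for the highest clock --
# the ones the OS scheduler steers lightly threaded apps onto.
def preferred_cores(max_freqs):
    top = max(max_freqs)
    return [core for core, freq in enumerate(max_freqs) if freq == top]

# Made-up example: a 10-core chip where cores 2 and 4 reach 5.3 GHz
# via Turbo Boost Max 3.0, while the rest top out at 5.1 GHz.
freqs = [5100000, 5100000, 5300000, 5100000, 5300000,
         5100000, 5100000, 5100000, 5100000, 5100000]
print(preferred_cores(freqs))  # -> [2, 4]
```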

On top of this, high-end Intel chips are pushed close to their limit out of the box in terms of power and heat. You can see in the OC and Asus tuned-BIOS results that the temperatures were stupidly high and not practical, all for a mere 100-200 MHz boost.

On my 9900K, which is previous gen, I decided to stop pushing vcore up to silly levels for small gains; instead I actually reduced the voltage and still managed an all-core 4.8 GHz. My 8600K needed 1.37 V for an all-core 4.7; my 9900K runs an all-core 4.8 at 1.25 V. That feels much better than running 5 GHz at 1.4 V would. The temps and power draw are now very respectable.

So my suggestion for 10-series chips is to either not tinker, or just reduce the voltage. I would also disable any board-manufacturer tuned modes, as they're likely just giving minor improvements for huge heat gains.

The K chips will still have some value, as they are better-binned and still come with higher out-of-the-box specifications than non-K chips.


----------



## newtekie1 (Jun 5, 2020)

@W1zzard I just noticed on the 2nd page, it is referred to as the i5-10900K twice.


----------



## W1zzard (Jun 5, 2020)

newtekie1 said:


> @W1zzard I just noticed on the 2nd page, it is referred to as the i5-10900K twice.


Fixed, thanks!


----------



## gregorypeck (Apr 9, 2021)

The Core i9-10900K is the fastest gaming chip available, but not by much considering the cost and platform trade-offs. Intel's own $370 Core i7-9700K is plenty for most gamers.


----------



## Kissamies (Apr 9, 2021)

gregorypeck said:


> Intel's own $370 Core i7-9700K is plenty for most gamers.


Getting a chip for an EOL socket is pointless (unless it's cheap and you already have a motherboard). I made that mistake with the 7700K about three years ago.


----------



## Makaveli (Apr 9, 2021)

Chloe Price said:


> Getting a chip for an EOL socket is pointless (unless it's cheap and you already have a motherboard). I made that mistake with the 7700K about three years ago.


I agree, but if they don't upgrade often and don't mind sitting on older hardware for a couple of years, it may still be worth it. It really comes down to your needs for the machine; however, if the budget is there, I always say go brand new.


----------



## Vayra86 (Apr 9, 2021)

dgianstefani said:


> Oh right - you're linking 1440p testing with no 1%/0.1% lows as CPU testing... Right...
> 
> Wow, what a surprise: high-resolution testing results in you actually testing the GPU far more than the CPU. Big think.
> 
> So in your mind a chart showing the R3 3300X as being within 5% of the gaming performance of a 10900K is accurate and not misleading at all. Disregard an average/1% low discrepancy of almost 90 FPS when actually in a CPU-limited scenario.



Nah, you have to understand: game testing that fully loads the CPU and doesn't fully load the GPU is not actually testing the game or the CPU performance, see. Real gamers load the GPU in such a way that they never feel CPU bottlenecking. You can also disregard every topic about stutter; we barely have them on TPU anyway as it is. Competitive gaming also doesn't exist; people clearly prefer 30 FPS over 60, and definitely over 120.

It's not a real-life situation, so it's not relevant. Rather, it's much easier to have colored bars force-fed to your brain and stop thinking. This goes especially if the CPU in question is the one you want or have bought.

(I reckon you get the /s here)


----------

