Monday, April 20th 2020

Intel Core i7-10700K and i5-10600K Geekbenched, Inch Ahead of 3800X and 3600X

The week has begun with sporadic leaks about Intel's upcoming 10th generation Core "Comet Lake-S" desktop processor family, be it pictures of various socket LGA1200 motherboards or leaked performance scores. Thai PC enthusiast TUM_APISAK posted links to Geekbench V4 entries of a handful of 10th gen Core processors, including the Core i7-10700K (8-core/16-thread) and the Core i5-10600K (6-core/12-thread). Comparisons with incumbent AMD offerings are inescapable: the i7-10700K locks horns with the Ryzen 7 3800X, while the i5-10600K takes the battle to the Ryzen 5 3600X.

The Core i7-10700K scores 34133 points in the multi-core test and 5989 in the single-core one. The i5-10600K, on the other hand, puts out 28523 points in the multi-threaded test and 6081 points in the single-core test. Both multi-threaded scores appear to be a single-digit percentage ahead of their AMD rivals. The Intel chips also appear to offer slightly better performance in single-threaded or lightly parallelized workloads, owing to higher boost frequencies: an impressive 5.10 GHz maximum boost for the i7-10700K, and 4.80 GHz for the i5-10600K. APISAK also posted scores of the iGPU-disabled Core i5-10600KF, which is roughly on par with the i5-10600K since it's basically the same chip with its eyes poked out.
Source: TUM_APISAK (Twitter)

80 Comments on Intel Core i7-10700K and i5-10600K Geekbenched, Inch Ahead of 3800X and 3600X

#52
TheinsanegamerN
kapone32Well this is not going to convince anyone who bought into Ryzen to change back or over to Intel. Not that the CPU(s) are not impressive (for gaming) but that continued conceit that you must buy a new MB is even more pronounced in a world where you could use (for the most part) a MB from 2017 with a CPU from 2019 or vice versa.



Well, it is a 12-core processor... I guess.
The i9-10xxx series are 10-core parts being compared to AMD's 8-core chips.
#53
Tsukiyomi91
Core count is no longer relevant in 2020. It's all about efficiency, IPC & all-core boost duration (in regards to thermals, workloads, etc.). Also, I wouldn't want to spend $1k or more on just the processor & mobo alone each time a new generation is released, leaving my RAM, SSD, cooler & GPU out in the cold. No freaking way. Sad to say, but Intel has come to the point where I see AMD as the better option, because I know I don't need to spend a ton of money each time a new processor is released that requires a new board. Also, Intel's top-of-the-line Z490 is still using PCIe Gen 3 while AMD's mid-range B550 has PCIe Gen 4 support.
#54
RandallFlagg
Not mentioned here seems to be price. It looks like it is significantly faster than a 3600X, but if it's priced like a 3700X or 3800X that would be the proper comparison.
#55
Th3pwn3r
RandallFlaggNot mentioned here seems to be price. It looks like it is significantly faster than a 3600X, but if it's priced like a 3700X or 3800X that would be the proper comparison.
Lol, exactly. The performance doesn't mean anything without dollar amounts. Price to performance ratio is what most people care about.
#56
RandallFlagg
Th3pwn3rLol, exactly. The performance doesn't mean anything without dollar amounts. Price to performance ratio is what most people care about.
That's exactly what I care about. I love AMD for the competition they've provided, hence driving that ratio.

But that said, look at most benchmarks and Intel sits at the top. I don't buy top-end CPUs, but it's a fact. With 6 cores / 12 threads the gen 10 Intel i5 chips do look promising, in particular the i5-10400 through 10600K. And I couldn't care a rat's about an extra 30 or 50 W of power draw on a desktop chip.
#57
mahoney
theoneandonlymrkIf so, it'll be with considerably more power used, a point that's missing from this but wasn't during similar reports of the FX series back in the day.
If you care for the earth you can't buy Intel :p

No, I jest; at least Intel are competing on some points.
We wouldn't want a one-horse race again.
Use your head for a bit
Because FX was proper shiett!!
#59
TheoneandonlyMrK
mahoneyUse your head for a bit
Because FX was proper shiett!!
Consider that there was sarcasm in my post, while I'm not sure there is in yours; yours sounds like baiting.
To me it was £159 over five years, and when retired it was within a margin of the same FPS as an i7 at 4K.
Some upgraded from peasant 1080p years ago :p :D.

You like hot and hungry? Then go for it.
Buy Intel, save on heating this Christmas.
#60
EarthDog
theoneandonlymrkTo me it was £159 over five years, and when retired it was within a margin of the same FPS as an i7 at 4K.
Some upgraded from peasant 1080p years ago :p :D.
less than 1% of ya at 4k... :p
#61
TheoneandonlyMrK
EarthDogless than 1% of ya at 4k... :p
At 4K most CPUs are still equal.
Yeah, indeed. And don't tell him my Intel/Nvidia laptop is 1080p, please :D
#62
EarthDog
theoneandonlymrkAt 4K most CPUs are still equal.
Yep... like a stock Civic and a Ferrari both rolling down a hill. :p

It's too bad an overwhelming majority are at 1080p and less.
#63
TheoneandonlyMrK
EarthDogYep... like a stock Civic and a Ferrari both rolling down a hill. :p

It's too bad an overwhelming majority are at 1080p and less.
Less? No one seriously games on a fat-back TV anymore, and any laptop with less than 1080p isn't really a gaming laptop.

You did have to tune the snot out of that Civic, to be fair. :)
#64
EarthDog
theoneandonlymrkLess? No one seriously games on a fat-back TV anymore, and any laptop with less than 1080p isn't really a gaming laptop.

You did have to tune the snot out of that Civic, to be fair. :)
Steam stats, my homie. You'd be shocked what it pulls... (please spare me the disclaimers, we all know :) ).

Tuned? No. I said rolling down a hill. The point was that it takes the engine out of the equation, so they're similar in speed... akin to running that potato of a CPU at 4K... it works, but it's still a Civic. :p

Though it's funny, because few can afford the graphical horsepower required in the first place... typically a 4K gamer isn't going to have a 2080+ and a Bulldozer. :)
#65
TheoneandonlyMrK
EarthDogSteam stats, my homie. You'd be shocked what it pulls... (please spare me the disclaimers, we all know :) ).

Tuned? No. I said rolling down a hill. The point was that it takes the engine out of the equation, so they're similar in speed... akin to running that potato of a CPU at 4K... it works, but it's still a Civic. :p

Though it's funny, because few can afford the graphical horsepower required in the first place... typically a 4K gamer isn't going to have a 2080+ and a Bulldozer. :)
His point was that FX was shit. It alludes to the fact that, as I was pointing out, the FX series was initially hammered on release for its performance per watt; its actual performance against its contemporaries wasn't that bad, just at way more power. Roll on to nowadays and, as I implied, power draw is glossed over (hypocrisy by some, not all), so his 'FX is shit' statement is laughable, as he seems to believe these chips won't be seen the same way in ten years.

I'd say 5.
#66
ARF
theoneandonlymrkHis point was that FX was shit. It alludes to the fact that, as I was pointing out, the FX series was initially hammered on release for its performance per watt; its actual performance against its contemporaries wasn't that bad, just at way more power. Roll on to nowadays and, as I implied, power draw is glossed over (hypocrisy by some, not all), so his 'FX is shit' statement is laughable, as he seems to believe these chips won't be seen the same way in ten years.

I'd say 5.
FX was at least 10 years ahead of its time. What would you think if right now AMD released a CPU with 64 small cores in place of the Ryzen 7 3700X ?
It only needs developer support and will be top notch.
#67
TheoneandonlyMrK
ARFFX was at least 10 years ahead of its time. What would you think if right now AMD released a CPU with 64 small cores in place of the Ryzen 7 3700X ?
It only needs developer support and will be top notch.
The conversation should really be brought to bear on this chip, not AMD's. I was comparing the presentation; I would prefer the same level of attention on its power use, is all.

We already have up to 64 big cores, thanks; I'm fine with those. All the small-core stuff I've used was very limited in use, and remains so.

But we'll see how it pans out.
#68
EarthDog
theoneandonlymrkHis point was that FX was shit. It alludes to the fact that, as I was pointing out, the FX series was initially hammered on release for its performance per watt; its actual performance against its contemporaries wasn't that bad, just at way more power. Roll on to nowadays and, as I implied, power draw is glossed over (hypocrisy by some, not all), so his 'FX is shit' statement is laughable, as he seems to believe these chips won't be seen the same way in ten years.

I'd say 5.
That's a rosy outlook. These things didn't compare favorably to Sandy Bridge in a lot of tests (and that's a 2500K vs an 8350, mind you), never mind Ivy Bridge... and that is raw performance across everything, not just zip files, x264 and Cinebench multi, lol.

www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/2

All Piledriver/BD/Vishera had going for them was price.
#69
TheoneandonlyMrK
EarthDogThat's a rosy outlook. These things didn't compare favorably to Sandy Bridge in a lot of tests (and that's a 2500K vs an 8350, mind you), never mind Ivy Bridge... and that is raw performance across everything, not just zip files, x264 and Cinebench multi, lol.

www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/2

All Piledriver/BD/Vishera had going for them was price.
I agree, but as I said, back then power use and heat were its main detractors in forums etc., which is not the case with these new chips; it should be, at least more than it is. No mega drama required, though.

Poorly optimised hardware with poorly optimised software will do that, and that's mostly what it was back then, IMHO.
#70
ARF
theoneandonlymrkThe conversation should really be brought to bear on this chip, not AMD's. I was comparing the presentation; I would prefer the same level of attention on its power use, is all.

We already have up to 64 big cores, thanks; I'm fine with those. All the small-core stuff I've used was very limited in use, and remains so.

But we'll see how it pans out.
We have, but most applications are still limited to using 4 or 6 of them, and rarely more.
EarthDogThat's a rosy outlook. These things didn't compare favorably to Sandy Bridge in a lot of tests (and that's a 2500K vs an 8350, mind you), never mind Ivy Bridge... and that is raw performance across everything, not just zip files, x264 and Cinebench multi, lol.

www.anandtech.com/show/6396/the-vishera-review-amd-fx8350-fx8320-fx6300-and-fx4300-tested/2

All Piledriver/BD/Vishera had going for them was price.
Back in 2011 most applications used only one or two cores at most. Because of this, most of the 8-thread FX was not utilised optimally.
#71
EarthDog
theoneandonlymrkI agree, but as I said, back then power use and heat were its main detractors in forums etc., which is not the case with these new chips; it should be, at least more than it is. No mega drama required, though.

Poorly optimised hardware with poorly optimised software will do that, and that's mostly what it was back then, IMHO.
ARFBack in 2011 most applications used only one or two cores at most. Because of this, most of the 8-thread FX was not utilised optimally.
True... but that bar was equal for both Intel and AMD CPUs; both had 8 threads. ;)

There are newer tests that show the same story. It didn't age well either. Slow is slow, my dude. What made these so attractive was the price... that's about it. It surely wasn't single-threaded performance or IPC, nor did it do well gaming at resolutions even more common back then (and less).

Edit: but I digress, this thread isn't about Piledriver/BD/Vishera. ;)
#72
RandallFlagg
theoneandonlymrkI agree, but as I said, back then power use and heat were its main detractors in forums etc., which is not the case with these new chips; it should be, at least more than it is. No mega drama required, though.

Poorly optimised hardware with poorly optimised software will do that, and that's mostly what it was back then, IMHO.
FWIW, I had an FX (8300 I think). I did not care about power use on a desktop then, nor do I now.

I have never understood the focus on that aspect of desktop chips for typical users at home. For mobile, insofar as it affects thermals and battery life, yes; for business-class PCs (usually these are SFF), for servers, or for workstation farms where the system remains under high load much of the time, sure.

But at home, which is what 99% of the folks here are talking about?

Doing the math on typical workloads, and based on my own experience with kilowatt measurements, you might be talking about +40 W for 4 or 5 hours a day from a high-power-draw CPU, all else being equal, and assuming you put your system under heavy load for 4-5 hours per day, 365 days a year (which is a lot to average, even for power users and the most avid of gamers). That comes out to about 200 Wh per day, or 1.4 kWh per week. Over 52 weeks that's roughly 72 kWh per year.
The average electricity price in the USA is about $0.12/kWh, so the cost here is 72 kWh × $0.12/kWh ≈ $8.64 per year.
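
For anyone who wants to sanity-check that back-of-the-envelope estimate, here is a minimal sketch in Python using the same hypothetical figures from the post above (40 W of extra draw, 5 hours of heavy load per day, $0.12/kWh); the values are illustrative assumptions, not measurements, and the result differs from the $8.64 above only by rounding (365 days vs. 52 weeks of 7 days).

```python
# Back-of-the-envelope annual cost of a CPU's extra power draw.
# All figures are the hypothetical assumptions from the post above.
extra_watts = 40          # extra draw under load vs. the frugal chip, in W
hours_per_day = 5         # hours of heavy load per day
price_per_kwh = 0.12      # assumed average US electricity price, in $/kWh

daily_wh = extra_watts * hours_per_day      # 200 Wh per day
yearly_kwh = daily_wh * 365 / 1000          # ~73 kWh per year
yearly_cost = yearly_kwh * price_per_kwh    # ~$8.76 per year

print(f"{daily_wh} Wh/day, {yearly_kwh:.1f} kWh/year, ${yearly_cost:.2f}/year")
```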

That isn't even worth anyone's time to discuss.

I mean, if you tell me chip A draws 250W vs chip B drawing 65W with the same performance, I might listen just a little. But 95W vs 135W? 65W vs 95W? No man, who gives a rat?

Now if I were buying 3000 PCs for my employer, I would care, but I'm not doing that and very few here are.
#73
TheoneandonlyMrK
RandallFlaggFWIW, I had an FX (8300 I think). I did not care about power use on a desktop then, nor do I now.

I have never understood the focus on that aspect of desktop chips for typical users at home. For mobile, insofar as it affects thermals and battery life, yes; for business-class PCs (usually these are SFF), for servers, or for workstation farms where the system remains under high load much of the time, sure.

But at home, which is what 99% of the folks here are talking about?

Doing the math on typical workloads, and based on my own experience with kilowatt measurements, you might be talking about +40 W for 4 or 5 hours a day from a high-power-draw CPU, all else being equal, and assuming you put your system under heavy load for 4-5 hours per day, 365 days a year (which is a lot to average, even for power users and the most avid of gamers). That comes out to about 200 Wh per day, or 1.4 kWh per week. Over 52 weeks that's roughly 72 kWh per year.
The average electricity price in the USA is about $0.12/kWh, so the cost here is 72 kWh × $0.12/kWh ≈ $8.64 per year.

That isn't even worth anyone's time to discuss.

I mean, if you tell me chip A draws 250W vs chip B drawing 65W with the same performance, I might listen just a little. But 95W vs 135W? 65W vs 95W? No man, who gives a rat?

Now if I were buying 3000 PCs for my employer, I would care, but I'm not doing that and very few here are.
We're all different, and so are our uses and reasons. My PC is on all day, every day, for example, with as high a load as my cooling system will support, and others are similar. I agree the large majority of PC users don't care.
But as far as simplifying everyone into one bracket, that's a stretch IMHO.

I've paid about £30-50 for pc power alone for years and at one point 20 times that amount.

But some do not care indeed.
#74
RandallFlagg
theoneandonlymrkWe're all different, and so are our uses and reasons. My PC is on all day, every day, for example, with as high a load as my cooling system will support, and others are similar. I agree the large majority of PC users don't care.
But as far as simplifying everyone into one bracket, that's a stretch IMHO.

I've paid about £30-50 for pc power alone for years and at one point 20 times that amount.

But some do not care indeed.
Did you just virtue signal?

In my experience the type of people who legitimately care about power draw do not hang out here, nor on other similar sites. You'll find them in the forums for the specific applications they are using, because they are pros in those areas (rendering, video editing, AI/distributed computing, etc.) and their ability to make a living is directly tied to their skill in those apps, not to perusing hardware tech forums. They also don't use consumer-grade midrange desktop chips in their workstations.

Saving time is always more important to someone using their PC to make a living, which is why I question the veracity of someone who cares how much power an i5/i7 or similar desktop chip draws.
#75
TheoneandonlyMrK
RandallFlaggDid you just virtue signal?

In my experience the type of people who legitimately care about power draw do not hang out here, nor on other similar sites. You'll find them in the forums for the specific applications they are using, because they are pros in those areas (rendering, video editing, AI/distributed computing, etc.) and their ability to make a living is directly tied to their skill in those apps, not to perusing hardware tech forums. They also don't use consumer-grade midrange desktop chips in their workstations.

Saving time is always more important to someone using their PC to make a living, which is why I question the veracity of someone who cares how much power an i5/i7 or similar desktop chip draws.
As I implied previously, it's not wise to assume you know how I and others use hardware. I have no workstation; see my labels below for a clue.
As for your slur, if you want to discuss virtue signalling with me, PM me for the verbose, offensive reply. I was merely defending my opinion by describing my thoughts on costs.

And also, I said plainly enough that I agree the majority don't care. That's OK, but that's not everyone, like you said.

I care what I spend on anything I buy. Am I weird? I doubt I'm alone.

There's an awful lot of folders new to this due to COVID who are about to understand their computer's power use per billing period :).

If they were, as you say, unconcerned before.