# 10700 vs 5600X



## wheresmycar (Nov 6, 2020)

AMD Ryzen 5 5600X Review

Six Zen 3 cores beating eight Zen 2 cores? That's exactly what's happening with the Ryzen 5 5600X. AMD's massive IPC gain helped it overcome a two-core deficit, even in productivity tests. The Ryzen 5 5600X redefines what you really need for a high-end gaming PC.

www.techpowerup.com




According to the above 5600X review, the non-K 10700 comes out ahead at 1080p (in most games). It also beats the 10700K and 10900K (stock).

I'm just trying to wrap my head around how this is possible. I've always been led to believe the K-series CPUs are "marginally faster" out of the box, plus the added benefit of overclocking. My Kaby Lake 7700K sees pretty wide single-threaded performance gains over the non-K 7700... _"so vat da hell iz goin on with 10th Gen?"_

Is the 10700 referenced in these reviews based on a premium Z-series board, more expensive lower-latency memory, and overclocking (as I understand some OC is possible with non-K chips)? Or was it based on a stock configuration with a B/H-series board and not-so-expensive RAM? Also, any thoughts on how the non-K manages to beat Intel's flagship i9 SKU?


----------



## phanbuey (Nov 6, 2020)

That's probably well within the margin of error: 0.6%.
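For perspective, a gap like that is a simple relative difference. A quick sketch with hypothetical averaged-FPS numbers (illustrative placeholders, not figures from the review):

```python
# Relative performance difference between two benchmark results, in percent.
def rel_diff_pct(a: float, b: float) -> float:
    return (a - b) / b * 100.0

# Hypothetical 1080p averages, made up purely for illustration:
fps_10700, fps_5600x = 144.9, 144.0
print(f"{rel_diff_pct(fps_10700, fps_5600x):.2f}%")  # roughly 0.6%
```

A sub-1% gap like this is smaller than typical run-to-run variance in game benchmarks, which is why it reads as noise rather than a real win.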


----------



## ExcuseMeWtf (Nov 6, 2020)

Yeah, those kinds of differences are blown way out of proportion on social media in general...


----------



## X71200 (Nov 6, 2020)

It's some kind of 0.1% difference... you can see that these chips were indeed getting decent results in the tests, so the better one was either capped by something else, or the extra headroom simply doesn't show at FHD.

As for overclocking locked SKUs, you're going to have to work with the BCLK, which can lead to anything from USB device unreliability to outright failure. It's not something I'd suggest.


----------



## John Naylor (Nov 6, 2020)

The $324 10700 is 3.6% faster in gaming than the 5600X, but costs $24 more than the 5600X's MSRP.

Keep in mind that this will change somewhat at higher resolutions. If you see yourself upgrading to 1440p/2160p, take a look at those results.

The surprise to me was at 1440p, where, in gaming, the $174 10400F outperformed every one of the new Ryzen CPUs, including the $550 5900X.


----------



## wheresmycar (Nov 6, 2020)

X71200 said:


> It's some kind of 0.1% difference... you can see that these chips were indeed getting decent results in the tests, so the better one was either capped by something else, or the extra headroom simply doesn't show at FHD.
> 
> As for overclocking locked SKUs, you're going to have to work with the BCLK, which can lead to anything from USB device unreliability to outright failure. It's not something I'd suggest.



Do the 10700 results in the charts include overclocking gains, e.g. a raised BCLK and lifted power limits? For me the 10700 is only a desirable option if it runs at stock with auto-boost enabled, without having to invest in a premium Z-series board and CL14 3200MHz kits. Alternatively, I'm more than happy to go with the more power-efficient 5600X.


----------



## Zach_01 (Nov 6, 2020)

wheresmycar said:


> AMD Ryzen 5 5600X Review
> 
> 
> Six Zen 3 cores beating eight Zen 2 cores? That's exactly what's happening with the Ryzen 5 5600X. AMD's massive IPC gain helped it overcome a two-core deficit, even in productivity tests. The Ryzen 5 5600X redefines what you really need for a high-end gaming PC.
> ...


Both platforms are at the end of their lifetime. The 5600X and 10700 are really close in performance; 0.5~3% is identical in real life.
In gaming the 5600X does it with a little less power than both the 10700 and 10700K, and in multi-threaded work it sits between them in performance while drawing a little less power than the non-K and a lot less than the K.
With the 5600X you also get PCIe 4.0, and something tells me that with the RX 6000 series it's going to be a lot more relevant than fast NVMe drives, if you're also interested in a new GPU.
_(see the SAM feature)_


----------



## X71200 (Nov 6, 2020)

Get the 5600X, you could use it with the stock cooler for your needs.


----------



## TheLostSwede (Nov 6, 2020)

Keep in mind that the tests performed here were with an RTX 2080 Ti; the recently launched and announced GPUs will shift things around somewhat.


----------



## Max(IT) (Nov 6, 2020)

X71200 said:


> Get the 5600X, you could use it with the stock cooler for your needs.


The Wraith Stealth? Garbage for gaming...
I would buy a 30-40€ cooler like the Arctic Freezer 34 at least.


----------



## X71200 (Nov 6, 2020)

Max(IT) said:


> The Wraith Stealth? Garbage for gaming...
> I would buy a 30/40€ cooler like the Arctic Freezer 34 at least



Gaming does not load the CPU to its limits; if you had used an example like rendering, where the whole chip is loaded, that would have made more sense.

I have a couple of those and they work just fine when you don't push the CPU, which is what it sounds like the OP is going to do. Especially now that winter is coming.


----------



## wheresmycar (Nov 6, 2020)

John Naylor said:


> The $324 10700 is 3.6% faster in gaming than the 5600X, but costs $24 more than the 5600X's MSRP.
> 
> Keep in mind that this will change somewhat at higher resolutions. If you see yourself upgrading to 1440p/2160p, take a look at those results.
> 
> The surprise to me was at 1440p, where, in gaming, the $174 10400F outperformed every one of the new Ryzen CPUs, including the $550 5900X.



My thoughts too. I've been looking at all three (10400/10600K/10700) to compare with Zen 3 (5600X). For me the 10400F only demands £150 (GBP), which is a steal!! For gaming at 1440p I wouldn't have looked past a 10400/10400F/3600 - the problem being that I'm looking for an 8-core chip with growing thread-utilisation demands (multiple day-trading applications, occasional rendering and video transcoding; I guess games may also benefit to some extent in the long run, etc.).

I'm in no rush though - my 7700K is holding up to anything and everything listed in the review charts, and multi-threaded inferiority isn't a big deal for my "current" requirements... which brings me to: what are your thoughts on waiting for Rocket Lake for a more meaningful ST advantage over the 7700K, or for slashed Zen 3 prices? Money isn't a problem and I don't mind over-spending a little, but I'm 50/50 on whether it's worth upgrading now. My next upgrade will be put to use for 4+ years, or replaced earlier if DDR5 platforms see larger-than-life performance uplifts. So many options... and I already know I'm over-thinking... (SOMEBODY HOLD MY HAND!! hehe)


----------



## Zach_01 (Nov 6, 2020)

In that case, if I didn't have any serious gaming issues, I would keep it for another year at least - until after Alder Lake, or even after Zen 4.


----------



## Vya Domus (Nov 6, 2020)

In terms of gaming the differences are practically zero; however, despite having two fewer cores, it outperforms the 10700 in just about every other task. I'd get the 5600X.


----------



## wheresmycar (Nov 6, 2020)

Zach_01 said:


> Both platforms are at the end of their lifetime. The 5600X and 10700 are really close in performance; 0.5~3% is identical in real life.
> In gaming the 5600X does it with a little less power than both the 10700 and 10700K, and in multi-threaded work it sits between them in performance while drawing a little less power than the non-K and a lot less than the K.
> With the 5600X you also get PCIe 4.0, and something tells me that with the RX 6000 series it's going to be a lot more relevant than fast NVMe drives, if you're also interested in a new GPU.
> _(see the SAM feature)_



Excellent point!! I forgot about the announcement notes on smart cache and the 4-11% performance gains (I think) when pairing Zen 3 with an AMD GPU. Yes, I also have a GPU upgrade in mind. I'm definitely open to RDNA 2 but don't see myself pulling the trigger at launch. I've got a 1080 Ti, hence I don't mind waiting a lot longer for Big Navi to hit the shelves, for Nvidia to respond with Ti/Super variants, and, who knows, RDNA 2 counterattacks too (I don't need the absolute best, but hopefully there'll be some price-slash action). I'll wait until the dust settles. I definitely want to see how well the smart cache feature performs in a select number of games I play.



X71200 said:


> Get the 5600X, you could use it with the stock cooler for your needs.



Honestly, I don't like stock coolers. Even if it came with a Wraith Prism I'd get something better. Anyway, cooling isn't a problem... I've got a couple of monster-like air coolers already.


----------



## Zach_01 (Nov 6, 2020)

wheresmycar said:


> Excellent point!! I forgot about the announcement notes on smart cache and the 4-11% performance gains (I think) when pairing Zen 3 with an AMD GPU. Yes, I also have a GPU upgrade in mind. I'm definitely open to RDNA 2 but don't see myself pulling the trigger at launch. I've got a 1080 Ti, hence I don't mind waiting a lot longer for Big Navi to hit the shelves, for Nvidia to respond with Ti/Super variants, and, who knows, RDNA 2 counterattacks too (I don't need the absolute best, but hopefully there'll be some price-slash action). I'll wait until the dust settles. I definitely want to see how well the smart cache feature performs in a select number of games I play.


Just a clarification: it's Smart Access Memory, not smart cache. The large Infinity Cache that all RX 6000 cards have is something different and unrelated to SAM.
IC is just an enhancement of GPU<->VRAM communication; SAM is CPU access to the full VRAM.


----------



## wheresmycar (Nov 6, 2020)

Zach_01 said:


> In that case, if I didn't have any serious gaming issues, I would keep it for another year at least - until after Alder Lake, or even after Zen 4.



I agree. Even my multi-threaded workloads don't urgently require a 6-core+ option, although it would be nice. My only worry is running multiple applications whilst day trading - a whole bunch of side-charts, market feeds, multiple browser windows and a couple of other tasks... I'm a beginner at day trading, hence it's not an immediate requirement from the get-go. Bottom line, I've hit a point with hardware enthusiasm where an "upgrade" is an itch I can't scratch lol. I can't shake it off, but I'm willing to wait a little longer until 11th Gen hits the shelves.



Zach_01 said:


> Just a clarification. It’s SmartAccessMemory and not smart cache. The large Infinity Cache all RX6000 have, is something different and unrelated to SAM.
> IC is just an enhancement of GPU<->VRAM communication. SAM is CPU access to full VRAM.



Thank you for the correction. I swear, I knew I should have gone back to the Zen 3 slides before making an a$$ of myself hehe.

Actually, I was meaning to ask: are there any independent reviews of SAM in action yet (other than AMD's slides)? Or is that something to expect when the RDNA 2 NDAs lift at launch?

Also, in your own words, can you simplify the SAM process? For example, my not-so-researched understanding is that the CPU uses both system memory (RAM) and VRAM for the added performance uplift. If games don't utilise more than, let's say, 10GB of RAM, and I've already got 16 gigs... does that mean I won't benefit from SAM, or have I got it all wrong?


----------



## Zach_01 (Nov 6, 2020)

The RX 6000 GPUs must be released first...
Around the 18th.

We don't know the specifics yet, but from my understanding, the CPU will be using empty VRAM for its own workload during gaming.


----------



## phanbuey (Nov 6, 2020)

Zach_01 said:


> GPUs (6000) must be released first...
> Around the 18th
> 
> We don’t know the specifics yet, but from my understanding, CPU will be using empty VRAM for its own workload during gaming.



I am curious to see how this works.  Could be massive, could be marketing nonsense.


----------



## Max(IT) (Nov 6, 2020)

X71200 said:


> Gaming does not load the CPU to its limits; if you had used an example like rendering, where the whole chip is loaded, that would have made more sense.
> 
> I have a couple of those and they work just fine when you don't push the CPU, which is what it sounds like the OP is going to do. Especially now that winter is coming.


I'm speaking from direct experience.
I bought a 3600X for my son, in a "cheap" gaming rig (3600X + MSI B450M Mortar Max + GTX 1660 Super), and the stock cooler was terrible: while gaming it struggled to keep the temperature under control with the fan at maximum speed.
I swapped it with the brand-new Wraith Prism from my 3900X (I'm using a Dark Rock 4 with it) and the situation dramatically improved in both temperatures and noise.
The Wraith Prism isn't a great cooler, but it is more than adequate for a 3600X/3700X, while the stock Wraith Stealth is not.

And if you don't believe my words, you can check for yourself: stealth vs prism
Even in gaming the difference is huge.

Even the cheap Spire is better than the garbage Stealth, and I don't know why AMD was providing the Spire with the 3600X but is now giving the Stealth to the more expensive 5600X.


----------



## wheresmycar (Nov 6, 2020)

Zach_01 said:


> GPUs (6000) must be released first...
> Around the 18th
> 
> We don’t know the specifics yet, but from my understanding, CPU will be using empty VRAM for its own workload during gaming.



Ah, that makes sense. I guess we'll have to see how much of an impact this feature will have.


----------



## londiste (Nov 6, 2020)

wheresmycar said:


> Also in your words, can you simplify the SAM process? For example, my not-so researched understanding is the CPU uses both system memory (RAM) and VRAM for the added performance uplift. If games don't utilise more than lets say 10GB RAM, and I've already got 16gigs... does that mean I won't benefit with SAM or have i got it all wrong?


AMD has not revealed details of what SAM does. Basically, there is nothing new on the hardware side: there is a GPU, and it can access system RAM over the PCIe bus. This has been the case since the very beginning of PCIe GPUs (and AGP before that). The best educated guess is that SAM is some sort of software/firmware/API solution that does traffic management or QoS to prioritize certain traffic.


----------



## wheresmycar (Nov 6, 2020)

londiste said:


> AMD has not revealed details of what SAM does. Basically, there is nothing new on the hardware side: there is a GPU, and it can access system RAM over the PCIe bus. This has been the case since the very beginning of PCIe GPUs (and AGP before that). The best educated guess is that SAM is some sort of software/firmware/API solution that does traffic management or QoS to prioritize certain traffic.



I was under the impression SAM is an older, inferior feature being revamped for superior gains. I guess we're all in the same boat with the unknowns (well, not in your boat lol - yours and Zach's technical know-how is definitely more advanced), but SAM does seem interesting, or, as phanbuey put it, possibly "marketing nonsense".


----------



## Zach_01 (Nov 6, 2020)

Max(IT) said:


> I'm speaking from direct experience.
> I bought a 3600X for my son, in a "cheap" gaming rig (3600X + MSI B450M Mortar Max + GTX 1660 Super), and the stock cooler was terrible: while gaming it struggled to keep the temperature under control with the fan at maximum speed.
> I swapped it with the brand-new Wraith Prism from my 3900X (I'm using a Dark Rock 4 with it) and the situation dramatically improved in both temperatures and noise.
> The Wraith Prism isn't a great cooler, but it is more than adequate for a 3600X/3700X, while the stock Wraith Stealth is not.
> ...


I agree that the Stealth is the bare minimum, there just to work, but it's not about CPU price - it's about power draw. The 3600, 3700X and 5600X are all 65W TDP parts; the 3600X is a 95W TDP part (88W PPT vs 125W PPT).
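Side note: those PPT figures track the commonly reported AMD rule of thumb that the socket power limit (PPT) is roughly 1.35× the rated TDP. A minimal sketch, assuming that 1.35 factor (the widely cited value, not an official per-SKU formula):

```python
# Commonly reported AMD rule of thumb: PPT (socket power limit) ≈ 1.35 × TDP.
# This is an approximation, not an official formula for every SKU.
def approx_ppt(tdp_watts: float) -> int:
    return round(tdp_watts * 1.35)

for name, tdp in [("3600/3700X/5600X", 65), ("3600X", 95), ("3900X", 105)]:
    print(f"{name}: {tdp} W TDP -> ~{approx_ppt(tdp)} W PPT")
```

The 65W parts land at ~88W PPT, matching the figure above; the 95W 3600X works out to ~128W by this rule, close to the 125W quoted.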


londiste said:


> AMD has not revealed details of what SAM does. Basically, there is nothing new on hardware side, there is GPU and it can access system RAM over PCIe bus. This has been the case since the very beginning of PCIe GPUs (and AGP before that). Best educated guess is that SAM is some sort of software/firmware/API solution that does traffic management or QoS to prioritize certain traffic.


Yes, CPU access to VRAM is old, but the difference with SAM is that up until now the CPU could only access a 256MB window of VRAM. Now AMD is giving the CPU access to the entire VRAM. And I suspect that PCI-E 4.0 is a requirement for such an operation, which can involve a large amount of data going back and forth for the CPU alone...
They said it: "full CPU access to VRAM", and there is a video that explains it on AMD's channel.
Whether it's marketing nonsense, we will see. It's not up to us to decide, especially when we don't know the specifics...


----------



## X71200 (Nov 6, 2020)

Max(IT) said:


> I'm speaking from direct experience.
> I bought a 3600X for my son, in a "cheap" gaming rig (3600X + MSI B450M Mortar Max + GTX 1660 Super), and the stock cooler was terrible: while gaming it struggled to keep the temperature under control with the fan at maximum speed.
> I swapped it with the brand-new Wraith Prism from my 3900X (I'm using a Dark Rock 4 with it) and the situation dramatically improved in both temperatures and noise.
> The Wraith Prism isn't a great cooler, but it is more than adequate for a 3600X/3700X, while the stock Wraith Stealth is not.
> ...



If you don't know what you're doing, you can overheat under any conditions, such as with auto voltages. A properly tuned 5600X, at STOCK settings, will not impact your gaming experience anywhere near as badly as you're making it out to be.

Of course the cooler with the heatpipes will perform better; I have MULTIPLE of them and use them effectively in numerous rigs. The main problem with the big Wraith is noise under heavier loads. Your link is also for the Pinnacle Ridge counterparts...

The cooler comes with TIM pre-applied; it's a slap-on solution and it DOES work, even at 100% load:

https://tpucdn.com/review/amd-ryzen-5-5600x/images/cpu-temperature.png


----------



## Zach_01 (Nov 6, 2020)

X71200 said:


> If you don't know what you're doing, you can overheat under any conditions, such as with auto voltages. A properly tuned 5600X, at STOCK settings, will not impact your gaming experience anywhere near as badly as you're making it out to be.
> 
> Of course the cooler with the heatpipes will perform better; I have MULTIPLE of them and use them effectively in numerous rigs. The main problem with the big Wraith is noise under heavier loads. Your link is also for the Pinnacle Ridge counterparts...
> 
> ...


From the TPU review of the 5600X:

_"We used a Noctua NH-U14S to measure the CPU temperature while running Blender. We picked an actual application as that better reflects real-life usage than a stress-testing application like Prime95."_

See the temperature page...

The 3600, with its 88W (actual) max power draw, can easily run at 80~85°C depending on case airflow. Maybe the 5600X, with its far better efficiency (76W actual draw), could run some 5°C lower, but nowhere near the 59°C TPU got.


----------



## X71200 (Nov 6, 2020)

Yeah, I thought it was stock, but anyway, the point still stands.


----------



## RandallFlagg (Nov 6, 2020)

wheresmycar said:


> AMD Ryzen 5 5600X Review
> 
> 
> Six Zen 3 cores beating eight Zen 2 Cores? That's exactly what's happening with the Ryzen 5 5600X. AMD's massive IPC gain helped it overcome a two-core deficitm, even in productivity tests. The Ryzen 5 5600X redefines what you really need for a high-end gaming PC.
> ...




The 10700 doesn't have Intel's Turbo Boost Max 3.0, while the K models do. That may be part of it. As for the max power levels and so on, TPU breaks it down within its review of the 10700. They use the green bar for reference in future benchmarks, but you can see the effect of unlocking power limits and the ~3% overclock possible on non-K chips.

Really, I think the whole thing is explained by chaos theory.

Take a look at the goofy results AnandTech got below the TPU image.

Goofy AT results:


----------



## _UV_ (Nov 6, 2020)

wheresmycar said:


> I'm in no rush though - my 7700K is holding up to anything and everything listed in the review charts and multi-threaded inferiority isn't a big deal with my "current" requirements....
> 
> My next upgrade will be put to use for 4yrs+


My 2 cents:

An i7-2600 non-K and a W3670 system have been in use here for almost 10 years with minor upgrades (SSD, GPU, RAM), and that was a breakthrough performance upgrade from a Core 2 Q6600 (itself upgraded from a Pentium D). The only thing frustrating me is the speed of Adobe Illustrator saving big files (almost entirely single-core); anything related to video encoding is handled by NVENC or CUDA. A year or two ago I was aiming to upgrade to something with 16 cores; now, with current tests showing that mainstream applications still use just 8c/16t, I think I will finally upgrade to a 10850K, and it will be fine for the next 5 years. For me it brings double the single-core performance and about 4x the multi-core performance. Yes, a 5950X would give about a 10x multi-core boost, but I have no tasks for it.

In your case the performance boost would be between 20 and 70%, depending on the task.


----------



## wheresmycar (Nov 7, 2020)

RandallFlagg said:


> 10700 doesn't have Intel's turbo boost  3.0, while the K models do.   That may be part of it.  As far as the max power levels and so on, within the review of the 10700 TPU breaks it down.  They use the green bar for reference in future benchmarks, but you can see the effect of unlocking power limits and the 3% overclock possible on non-K chips.
> 
> Really I think the whole thing is explained by chaos theory.
> 
> ...



OK, this one's got me baffled lol. Although it's 4K, I totally get it: "random irregularities", or just different games, applications, power configurations or system conditions producing different results. The 2700X beating the 3700X and the 10700 beating the 10700K definitely puts the chaos theory into action.

OK RandallFlagg, I've got to ask - assuming you were upgrading today with no budget concerns, and the targeted workload consists of the following:

1. Day trading with 3 broker applications running simultaneously (plus a bunch of other tools, indicators, market feeds, browser windows, etc.)

2. Gaming at 1440p/144Hz

3. Lightweight video rendering/transcoding

...which CPU would you target from the Zen 3 or 10th Gen SKUs? Overclocking isn't a requirement unless the performance returns make a meaningful difference for the intended tasks.


----------



## RandallFlagg (Nov 7, 2020)

wheresmycar said:


> OK this ones got me baffled lol Although it's 4K but i totally get it - "random irregularities" or just different types of games, applications, power configurations or system conditions achieving different results. The 2700X beating the 3700X  and the 10700 beating the 10700K definitely puts the chaos theory to action.
> 
> Ok *RandallFlag, i gotto ask - assuming you were upgrading today with no budget concerns and the targeted workload consists of the following
> 
> ...



If money is no object, I would get a 5950X and pair it with some very high-quality DDR4-3600 memory - at least from a pure performance perspective.

I'm not sure if that's really what you were asking, though.

I think the only weak point in the Zen 3 lineup is in the $250-400 range, where Intel's 10700/10700K and sometimes the 10850K (on sale) wind up staring down the 5600X. I.e., I would rather have any of those three than a 5600X, even for your use cases. If day-trading speed is of #1 importance, though, Zen 3 is winning on web tests as well as Java, which means it would probably perform better overall on the types of applications a day trader uses.

Also of note: if stability is of great importance, I would go Intel for that too. I'll get mobbed for saying that here, but there's no doubt in my mind that Intel is the more stable platform. It should be by now; after all, it's basically just an evolution of a 5-year-old platform.


----------



## thesmokingman (Nov 7, 2020)

lmao more stable... /facepalm



wheresmycar said:


> ....which CPU would you target from the Zen 3 or 10th Gen SKUs? Overclocking isn't a requirement unless the performance returns make a meaningful difference for the intended tasks.



A 5800X or 5900X (for padding) will more than cover that. A 5950X doesn't make sense on a cost basis unless you can actually make use of those extra cores, and you won't be doing that with the workload you described.


----------



## Melvis (Nov 7, 2020)

Apples to Apples


----------



## dir_d (Nov 7, 2020)

Melvis said:


> Apples to Apples


Really weird that she had problems in Warzone, where the CPU flew in every other review.


----------



## phanbuey (Nov 7, 2020)

The only issue is that the 10600K easily hits 5.1GHz and a 48x ring, while the 5600X doesn't seem to OC very well at all, so those percentages shrink massively: a tuned 10600K can match a stock 10900K. At the end of the day, you won't be able to tell the difference either way.


----------



## RandallFlagg (Nov 7, 2020)

phanbuey said:


> The only issue is that the 10600K easily hits 5.1Ghz and 48x ring, while the 5600x doesn't seem to OC very well at all so... Those percentages shrink massively as a tuned 10600K can match a 10900k at stock.  At the end of the day you won't be able to tell the difference either way.



The OC capability of the Intel K series is why those chips dominate user-submitted benchmark sites. 10900Ks running at 5.6GHz and 10980XEs at 5.1GHz with DDR4-4600 dominate the top 100 spots on PCMark's site.


----------



## Vya Domus (Nov 7, 2020)

RandallFlagg said:


> I'll get mobbed for saying that here, but there's no doubt in my mind that Intel is a more stable platform



Yes, you would, because it's untrue.


----------



## X71200 (Nov 7, 2020)

Stability is not 100% related to how mature a platform is, at all. He doesn't even talk about the node itself, or how the VRMs react to loads; it's just pure fanboy nonsense. If he did, he would have talked about the actual chip, or the boards, etc. You could make some random CPU and it could be more stable than your grossly produced Ryzen, or it could also be a lot less stable. All of this is obviously down to what comes out of the wafer, the power quality you're getting from your VRMs, your power supply and such. If anything, the AMD platform actually runs at far lower temperatures.

There are other bits, like Intel chips using LGA sockets, so you could mess the socket up if you switch CPUs or re-apply TIM every once in a while. I've grown to come back to pin grid array after messing up a couple of X299 boards. The pins on Zen chips don't get FUBARed as easily. Once, my CPU got stuck to the TIM and I pulled it straight out of the socket with the water block without opening the lock; I'm still using it without any issues.


----------



## Max(IT) (Nov 7, 2020)

X71200 said:


> If you don't know what you're doing, you can overheat under any conditions, such as with auto voltages. A properly tuned 5600X, at STOCK settings, will not impact your gaming experience anywhere near as badly as you're making it out to be.
> 
> Of course the cooler with the heatpipes will perform better; I have MULTIPLE of them and use them effectively in numerous rigs. The main problem with the big Wraith is noise under heavier loads. Your link is also for the Pinnacle Ridge counterparts...
> 
> ...


Are you serious???
The graph you linked is from a test using a Noctua NH-U14S, a far better cooler than the Stealth...



X71200 said:


> Yeah I thought it was stock, but anyways, point stands really.


No, it doesn't.
I showed numbers on how poor the stock cooler is for gaming and rendering.

Yes, if you are mostly doing browsing and office duties, you can keep it. For everything else you need a proper cooler.


----------



## Zach_01 (Nov 7, 2020)

RandallFlagg said:


> The OC capability of the K series Intels is why they dominate user submitted benchmark sites.  10900K's running 5.6 Ghz and 10980XE's at 5.1Ghz with DDR4-4600 dominate the top 100 spots on PCMark's site.


No doubt... but those are also the top 100 spots for the biggest power hogs among PCs...
And what about the cooling requirements for a chip alone drawing 250~300W?


----------



## X71200 (Nov 7, 2020)

Max(IT) said:


> Are you serious???
> The graph you linked is from a test using a Noctua NH-U14S, a far better cooler than the Stealth...
> 
> No it doesn't.
> ...



Stop with the nonsense.

https://eteknix-eteknixltd.netdna-ssl.com/wp-content/uploads/2020/11/25-2.jpg

I can't find a test where the stock cooler is used, but the thing seems to be even more power-friendly than the 3600 thanks to the improved node.

Of course the *insert high-core-count CPU here* will dominate your poor benchmark, Randall. The benchmark you're talking about scales to however many cores you throw at it.


----------



## Max(IT) (Nov 7, 2020)

X71200 said:


> Stop with the nonsense.
> 
> 
> 
> ...


Again, you keep posting random tests that don't use the stock cooler.
I DID, and it showed very poor results.


----------



## X71200 (Nov 7, 2020)

Max(IT) said:


> Again, you keep posting random tests that don't use the stock cooler.
> I DID, and it showed very poor results.



The test you linked used Pinnacle Ridge CPUs. Do you even know the difference between those and Vermeer?


----------



## Max(IT) (Nov 7, 2020)

X71200 said:


> The test you linked to was Pinnacle Ridge CPUs. Do you even know the difference between it and Vermeer?


The test was about the three AMD coolers using the same CPU, and it shows how poor the Stealth is.


----------



## X71200 (Nov 7, 2020)

Max(IT) said:


> the test was about the three AMD coolers using the same CPU. It shows how poor the Stealth is.



I did a test on a car with three tires from different companies and got some results. I didn't do the same test on another car to see how well they do there; I don't care, because I simply need to make sure my opinion looks right.


----------



## Max(IT) (Nov 7, 2020)

X71200 said:


> I did a test on a car with 3 tires from different companies, and got some results. I didn't do the same test on another car and see how well they do there, I don't care because I simply need to make sure my opinion looks right.


You are arguing for the sake of it.
You wrote something completely wrong and you don't want to admit it.

Here you can find a Prism vs Stealth comparison on a 3600.

Enough of this.

The Stealth is GARBAGE. And the Prism is barely acceptable.

stealth vs prism


----------



## X71200 (Nov 7, 2020)

Max(IT) said:


> you are arguing for the sake of it.
> You wrote something completely wrong and you don't want to admit it.
> 
> Here you can find a Prism vs Stealth comparison on a 3600.
> ...



Oh really?

https://static.techspot.com/articles-info/1871/bench/Temps-p.webp

Enough of your opinionated nonsense.


----------



## Max(IT) (Nov 7, 2020)

X71200 said:


> Oh really?
> 
> 
> 
> ...


lol, you just confirmed my point: 80°C in Blender is a clear indication of poor cooling on such a CPU.
I don't go past 70-72°C using a Prism (which is still a modest cooler).


----------



## X71200 (Nov 7, 2020)

It's a result that came out of an hour of Blender; you're never going to see that temperature under normal everyday gaming or general usage.

I've used the Prism and the older Bulldozer cooler (which sits somewhere between the non-heatpipe coolers and the Prism), and both did well on my 3600. I have never seen "barely enough" temperatures on either.


----------



## RandallFlagg (Nov 7, 2020)

Intel "K" chips are easily overclocked 10%, and many are getting a 15-20% OC.  I haven't heard of a non-K chip that won't take a 3% BCLK out of the box, though I'm sure one exists.

AMD Zen 2 and apparently Zen 3 struggle to get 100 MHz, i.e. about the 2-3% you get with a non-K chip on the Intel side.  They are at their limits from the factory.

Intel chips are known to run with much higher-speed RAM.  DDR4-4200 is easy on Intel Z490 platforms.  It is not uncommon to get DDR4-4800 on Intel.   AMD struggles with speeds over DDR4-3800; some get DDR4-4133.

What this means is that Intel chips/chipsets have a significant engineering margin built in.   AMD chips, by contrast, do not.


When you put a system, any system, near its limits, it will not be as stable as a system that is far from its limits.

Intel also sets many standards outside of the CPU, for example Thunderbolt.  If you have an AMD board, it's quite likely that board is using an Intel chip somewhere.

It's not like this is news to anyone; these things have been noted by AMD fans many times.  But like I said, the AMD fanboi mob would be here if anyone pointed it out in contrast to Intel, or as a plus for Intel.  The emotional nonsense mob aside, what I've presented above are self-evident facts.
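The BCLK math in the post above can be sketched quickly; the multiplier and clocks below are made-up illustrative numbers, not measurements from any specific chip:

```python
# Core clock on these platforms is base clock (BCLK) times the multiplier,
# so a locked chip that can only move BCLK gains exactly the BCLK percentage.
def effective_clock_mhz(bclk_mhz: float, multiplier: int) -> float:
    """Effective core clock in MHz from BCLK and multiplier."""
    return bclk_mhz * multiplier

stock = effective_clock_mhz(100.0, 47)  # hypothetical 4.7 GHz at stock 100 MHz BCLK
oced = effective_clock_mhz(103.0, 47)   # same multiplier with a +3% BCLK bump
gain_pct = (oced - stock) / stock * 100
print(f"{stock:.0f} MHz -> {oced:.0f} MHz (+{gain_pct:.1f}%)")
# -> 4700 MHz -> 4841 MHz (+3.0%)
```

Since the multiplier is fixed on a locked SKU, the percentage gain is capped at whatever BCLK bump the platform tolerates, which is the 2-3% figure quoted above.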


----------



## Zach_01 (Nov 7, 2020)

@X71200
@Max(IT) 

Come on, guys... Don't do that to the thread...

The Stealth cooler is just enough to work and nothing more. In typical (and most) cases it will manage to keep the 88W parts around 85°C. Anything much above or below that is not the usual.
And 95+°C is not possible without disabling the chip's power management (like OC). 95°C is the throttling point of Zen 2.

The 5600X, with its 76W draw, will probably be around 80°C, maybe a little less, in most cases.

When a fan shroud is thicker than the cooler's heatsink... something is not right!

Please now... cease fire...


----------



## X71200 (Nov 7, 2020)

RandallFlagg said:


> Intel "K" chips are easily overclocked 10% and many are getting 15-20% OC.  I haven't heard of a non-K chip that won't take a 3% BCLK out of the box, I'm sure one exists.
> 
> AMD Zen 2 and apparently Zen 3 struggle to get 100Mhz i.e. about the 2-3% you get with a non-K chip on the Intel side.  They are at their limits from the factory.
> 
> ...



Don't make me laugh; the reason for that lies in the inherent voltage/clock walls of Zen itself. That doesn't mean it's worse, when it performs very well at LOWER clocks.

Then you talk about "Thunderbolt". Hilarious. A standard dumped by Apple for many years, only to be resurrected on Windows setups, at first with expensive boards and cards. It still hasn't managed to settle down properly. My board uses a Realtek 2.5G Ethernet controller and it works flawlessly, just like my 10Gbit Intel X550 Ethernet card, which I bought mostly for kicks and giggles. Intel still makes a lot of its stuff on older process nodes, and most of it is outdated. My 980 Pro kicks my Optane in the nuts when it comes to 4K writes, even at a much lower capacity. You know you're posting Intel-sided material in these threads, and you should stop.


----------



## sepheronx (Nov 7, 2020)

AMD Ryzen 7 5800X Review

Having reviewed Ryzen 5000 12-core and 16-core models, today we're testing the Ryzen 7 5800X, AMD's latest 8-core CPU. So far we've been impressed by the Ryzen...

www.techspot.com

As one can see, the 5800X is also a solid choice of CPU.  Its performance is overall on par with or exceeds the 10700K's; in some cases by only 3%, and in others by larger margins.

Got to respect AMD for this.

Now I am looking forward to Intel's Rocket Lake to see how well it performs.  But I may make the switch to AMD by then.  My daughter needs a new PC, and I will get her to build one using the Z490 motherboard and the 10500 ES chip I got.


----------



## RandallFlagg (Nov 7, 2020)

Zach_01 said:


> No doubt... but also the top 100 spots for the most power hogs of PCs...
> And what about the cooling requirements for a 250~300W power draw of a chip alone?



Lots of sites/videos using various ~$80 to $150 240mm AIO coolers have the 10700K hitting 5.3 GHz all-core.  The boost in multi-core can be close to +30%, and about +10% in single-core.   This can be done on $200 motherboards as well.  The power draw of the chip is closer to 200W than 250W.

Paired up with some DDR4-4200 RAM, that combo is giving a score of ~13,000 in Time Spy CPU.  That is a tie with the top-performing 5800X right now on those charts.   It destroys the top-performing 5600X by about 35%.   <- Note this is for a max 5.3 GHz overclock.  The more extreme overclocks are another 10% faster, but they are getting up to 5.4/5.5 and will likely have really extreme cooling solutions.

I'm just saying you can do good out-of-the-box AIO cooling and pop off +20-30% performance on a 10700K without too much trouble.  Going beyond an all-core 5.2/5.3 turbo does indeed take a lot more effort, though.
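A back-of-the-envelope check on the percentages above (the ~13,000 score and the 35% figure are the claims from this post, not new benchmark data):

```python
# Percent-difference helper for comparing two benchmark scores.
def pct_faster(score_a: float, score_b: float) -> float:
    """How much faster score_a is than score_b, in percent."""
    return (score_a / score_b - 1) * 100

# If the OC'd 10700K scores ~13,000 and is "about 35% faster" than the
# top 5600X result, the implied 5600X CPU score would be roughly:
implied_5600x = 13000 / 1.35
print(f"implied 5600X Time Spy CPU score: {implied_5600x:.0f}")
# -> implied 5600X Time Spy CPU score: 9630
```

This is just arithmetic on the numbers as quoted; the actual chart entries vary run to run, as the rest of the thread makes clear.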


----------



## Zach_01 (Nov 7, 2020)

Too much for bragging rights. And I was/am talking about the 5.6 GHz results and the top-100 benchmark lists you mentioned. That can't happen at 200W. Not on this earth.


----------



## Max(IT) (Nov 7, 2020)

Zach_01 said:


> @X71200
> @Max(IT)
> 
> Come one guys... Don’t do that to the thread...
> ...


As you surely know, operating at temperatures around 85°C is not desirable on a Ryzen, because even if you are well below the 95°C maximum, PB2 won't allow the CPU to boost to its maximum theoretical clock. So you are hampering performance by using a garbage cooler like the Stealth.
Even a modest €30-40 cooler could keep a 3600/5600X at least 10°C cooler, giving better performance.

But with him it's always the same: he is right, even when he is wrong.


----------



## RandallFlagg (Nov 7, 2020)

Zach_01 said:


> Too much for bragging rights. And I was/am talking about the 5.6GHz and the top 100 benchmark lists you mentioned. That can’t happen with 200W. Not on this earth.



I don't look at the very top 100 much.  Limit the max frequency to 5.3 on the selector and sort by CPU score.  5.3 is very achievable using off-the-shelf components.  I agree the ones at 5.6+ GHz are not likely to be achieved by a normal user, but that is a red herring.  The K series are meant to be overclocked, and most can be overclocked by the average user.

Case in point: this is my lowly 10400 with an all-core 4.2 GHz.  Stock, it's 3.9 GHz.  ATM I just have Intel's stock cooler and a cheap $150 Asus Prime Plus Z490.   I'm basically getting a +7-8% multi-core performance boost for nothing.







Like this - note they are using the stock 10700K as the 0% baseline:


----------



## X71200 (Nov 7, 2020)

How about you look at these?



https://img.techpowerup.org/201107/untitled3.jpg

https://img.techpowerup.org/201107/untitled2.jpg

https://img.techpowerup.org/201107/untitled037.jpg


Done with a 3600 B-Die Trident kit worth 80 bucks today, at 16-16 timings. I've been testing left to right; no crashes (just updated the BIOS).

Of course, when things only look good on blue and poor on red, the CPU sucks. Now you're here telling me this older chip performs similarly to yours with 4200 RAM on a cheap AIO config... yet your CPU is somehow able to crush chips that are much better than mine? Like the 5600X in single-core? What about the 5800X and 5900X in multi-core? Single-core is also not all that relevant today; I have more CPU headroom left when multiple cores are used.

As for the other guy, I could install a setup with the absolute garbage cooler and get some results, but I'm not going to bother, since that's more work and he can't understand that 1+ hour of Blender does not equal gaming.


----------



## RandallFlagg (Nov 7, 2020)

X71200 said:


> How about you look at these?
> 
> 
> 
> ...




Your 12C/24T 3900X with dual 240mm AIO coolers gets whipped in Time Spy CPU by an 8C/16T 10700K with a 4.9 GHz all-core turbo and DDR4-4400 (which you probably can't do on a Ryzen, since it's not stable at those kinds of speeds).  In fact, it only barely beats the power-unlocked 10700 non-K systems on those charts.  That's really not very interesting at all, given you have 50% more cores and a huge cooling system.  You really should have bought an Intel K chip for what you're trying to do with that rig.  Edit: you also don't have a valid score, which is questionable.


----------



## X71200 (Nov 7, 2020)

RandallFlagg said:


> Your 12C / 24T 3900X with dual 240mm AIO coolers gets whipped in Time Spy CPU by a 8C / 16T 10700K with a 4.9Ghz all core turbo and DDR4-4400 (which you probably can't do on a Ryzen, since it's not stable with those kind of speeds).  In fact, it only barely beats the power unlocked 10700 non-K systems on those charts.  That's really not very interesting at all given you have 50% more cores and a huge cooling system.  You really should have bought an K Intel chip for what you're trying to do with that rig.



I don't need any of that cooling; I have it just because I can. You can keep the Intel CPUs to yourself. Gets whipped? By 100 points, right? I can bring Cinebench into the argument and tell you that I've gotten over 7,900 pts there in the past. When I buy a 5900X, you can be assured you'll be even more of a laughing stock. It's not about doing 4400; this RAM can do around 3800 easily. It's about the price you pay for 4400 RAM and the Infinity Fabric link speed. Move on.


----------



## sepheronx (Nov 7, 2020)

You guys are fighting over nothing.

AMD's new processors are great; they surpass Intel's processors by modest margins. Cool. That doesn't stop the Intel processors from being good at the job too.

Get the best in price and performance.


----------



## RandallFlagg (Nov 7, 2020)

The only laughing stock is you.


X71200 said:


> I don't need any of that cooling, I have it just because I can. You can keep the Intel CPUs to yourself. Gets whipped? By 100 points right? I can bring Cinebench into argument and tell you that I've gotten over 7900 pts there in past. When I buy a 5900X, you can be ensured you'll be even more laughing stock. It's not about doing the 4400, this RAM can do around 3800 easily. It's about the price you pay for 4400 RAM and the Infinity Fabric link speed. Move on.



Why don't you try running Time Spy CPU with normal settings, instead of "Time Spy Custom", and try getting a valid score instead of that invalid one.  There are a lot of power-unlocked 8c 10700 non-K chips, which cannot be overclocked more than 3% on the CPU, scoring over 12,000 on those charts.  And that benchmark is thread-sensitive, but it looks like you can't get it up even with 12 cores on that 3900X.  It's not my fault that you spent a bunch of money on cooling for a chip you can't OC for crap.

As far as Cinebench goes, well, that's the one AMD used to 'prove' that the FX series was 'faster' than Sandy Bridge and so on, now isn't it?


----------



## X71200 (Nov 7, 2020)

RandallFlagg said:


> The only laughing stock is you.
> 
> 
> Why don't you try running Time Spy CPU with normal settings, instead of "Time Spy Custom", and try getting a valid score instead of that invalid one.  There are a lot of power unlocked 8c 10700 Non-K chips which cannot be overclocked more than 3% on the CPU, scoring over 12,000 on those charts.  And that benchmark is thread sensitive, but it looks like you can't get it up even with 12 cores on that 3900X.  It's not my fault that you spent a bunch of money on cooling for a chip you can't OC for crap.
> ...



I'm going to make this simpler for you.

1) You have no right to personally insult me, as your one-sided information is all over the place, like your Thunderbolt example.
2) God forbid using your benchmark, which names itself after Zalu or some crap in the task manager, goes by a thumbs-up "Recommended", and finishes some badly designed particles in 15 seconds. No wonder 3DM is so horrid.
3) I ran custom settings because I only ran the CPU bench; I changed nothing else. The bench was inconsistent as heck, with me getting 12k once, then almost 13k another time. You don't even understand your own benchmark. What a pity that you're using that junk.
4) To repeat: I do not need that cooling. I can pull it off with my crappy Deepcool. You do not understand this, either.
5) FX has nothing to do with this, and there are other tests this CPU wrecks yours on.
6) Your suggested SKU costs $360, requires more cooling, and so on.
7) Last but not least, you need to move along. You are wrong.


----------



## RandallFlagg (Nov 8, 2020)

X71200 said:


> I'm going to do this more simple for you.
> 
> 1) You have no right to personally insult me as your sided information is all over the place with, like your Thunderbolt example.
> 2) God forbid using your benchmark which names itself after Zalu or some crap in the task manager, goes by a thumbs up Recommended and finishes some badly designed particles in 15 seconds. No wonder why 3DM is so horrid.
> ...



I tried to ignore your juvenile posts, but you kept on trolling, talking about me in the third person while I was ignoring your posts, so now you get what you give.  This Intel fanboy suggested a 5950X earlier in this thread, and even noted the 5600X is better at the Java/web workload someone asked about.  You, on the other hand, look to me like a frothing-at-the-mouth partisan with not a lick of common sense and no critical thinking skills whatsoever.


----------



## 95Viper (Nov 8, 2020)

Stay on topic.
Stop the name calling and inciting discontent.
This is a technical discussion... not a discussion about each other.

Thank You.
And, Have A Nice Day


----------

