
AMD Ryzen 5 3600

"Initial rumors" were pulled out of someone's a**. For $200 this is good enough.
They've always offered good value, no doubt, and kept Intel honest to a degree.
But "good enough" is the key phrase; I was looking forward to AMD being faster in everything, not just good enough.
 
Nice review. So that confirms I have no reason to upgrade from a 2600X for 1440p gaming :)
 
So quick question: is the 3600X worth it over the 3600? I can get a decent cooler like the H7 Quad Lumi with the 3600 for the same price as the 3600X.

Looks like AMD changed a bunch of things again; so, does the TDP rating (65 W vs 95 W) affect OC or PBO?
 
Honestly, what do you get with a 9600K? 5% with a 2080 Ti at FHD. It means it gets 199 fps instead of 190. You can't tell the difference. And when you switch to 1440p or 4K (which is the 2080 Ti's territory), that 5% disappears. The same is true for a 2080 or cheaper GPUs at FHD: 0% difference.

You can tell that AMD not only got better IPC and single-threaded workload performance than Intel, but they managed to equal it in gaming.

Obviously you get similar CPU results when the GPU becomes the bottleneck, and if you have zero plans to ever go beyond the Nvidia 2080, then your CPU choice is less important when all FPS are at similar levels. What the 9600K gives you when overclocked is twofold:

1 - Better performance at 720p, which means more headroom once more powerful GPUs are released, if you plan to go past RTX 2080 performance at some point. Many people fail to understand this point and simply believe 1440p or 4K should be the end-all be-all of gaming performance.

2 - From every gaming programmer interview I've read, they have stated how much easier Intel is to program for than AMD. It's much easier to load up cores one at a time than to program around HT, SMT, and what happens when user 1 has four cores, user 2 has eight, and user 3 has ten (see the sketch after this post).

So those are my reasons for my view. That said, it makes little real-world difference between same-price CPU offerings unless you have a specific niche for a specific CPU. It's not as if the 9600K can play a game the Ryzen 3600 can't, or vice versa. If you watch the video below, you will see that even people who review PC hardware for a living have a hard time choosing Intel over AMD or vice versa.
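To make that variable-core-count point concrete, here is a minimal sketch, assuming nothing beyond the standard C++ library (it is not from any actual engine), of how a game might size a worker pool when it cannot know the buyer's core count in advance:

```cpp
// Hypothetical worker-pool sizing for unknown core counts.
// hardware_concurrency() reports logical threads, so SMT/HT
// siblings are counted, and it is allowed to return 0.
#include <algorithm>
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    unsigned logical = std::thread::hardware_concurrency();
    if (logical == 0) logical = 4;  // unknown hardware: assume a quad core

    // Leave one thread for the main/render loop; clamp to a sane range.
    unsigned workers = std::clamp(logical > 1 ? logical - 1 : 1u, 1u, 15u);

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i)
        pool.emplace_back([i] { std::printf("worker %u ready\n", i); });
    for (auto& t : pool) t.join();
}
```

Whether this kind of scaling is "easier" on one vendor's CPUs is exactly what gets disputed below; the sketch only shows why four-, eight-, and ten-core users are all different targets.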

 
2 - From every gaming programmer interview I've read, they have stated how much easier Intel is to program for than AMD. It's much easier to load up cores one at a time than to program around HT, SMT, and what happens when user 1 has four cores, user 2 has eight, and user 3 has ten.
Could you link some of these interviews, which definitely exist and state that developing for Intel CPUs is easier than developing for AMD CPUs? Given that you're talking in terms of "every one you've read", I'm sure you'll be able to come up with a top five. Cheers!
 
Obviously you get similar CPU results when the GPU becomes the bottleneck, and if you have zero plans to ever go beyond the Nvidia 2080, then your CPU choice is less important when all FPS are at similar levels. What the 9600K gives you when overclocked is twofold:

1 - Better performance at 720p, which means more headroom once more powerful GPUs are released, if you plan to go past RTX 2080 performance at some point. Many people fail to understand this point and simply believe 1440p or 4K should be the end-all be-all of gaming performance.

2 - From every gaming programmer interview I've read, they have stated how much easier Intel is to program for than AMD. It's much easier to load up cores one at a time than to program around HT, SMT, and what happens when user 1 has four cores, user 2 has eight, and user 3 has ten.

So those are my reasons for my view. That said, it makes little real-world difference between same-price CPU offerings unless you have a specific niche for a specific CPU. It's not as if the 9600K can play a game the Ryzen 3600 can't, or vice versa. If you watch the video below, you will see that even people who review PC hardware for a living have a hard time choosing Intel over AMD or vice versa.

Dude, I was in the same dilemma a few days ago. But watch the vs. benchmark videos on YouTube. Recently released games like Shadow of the Tomb Raider, AC Odyssey, and Battlefield V are already pushing the i5 to 70-90% utilization. As someone who has been suffering from 100%-CPU-usage stutter for a few months, no way in hell am I going back to that mess in a few years by choosing the i5 9600K over the 3600/X.
 
Could you link some of these interviews, which definitely exist and state that developing for Intel CPUs is easier than developing for AMD CPUs? Given that you're talking in terms of "every one you've read", I'm sure you'll be able to come up with a top five. Cheers!

google.com

cheers!

P.S. If you watched the linked video, you would see they briefly touch on the subject.

Dude, I was in the same dilemma a few days ago. But watch the vs. benchmark videos on YouTube. Recently released games like Shadow of the Tomb Raider, AC Odyssey, and Battlefield V are already pushing the i5 to 70-90% utilization. As someone who has been suffering from 100%-CPU-usage stutter for a few months, no way in hell am I going back to that mess in a few years by choosing the i5 9600K over the 3600/X.

I'm not recommending one CPU over another for anyone; it's up to you to get whatever parts deliver the performance you want. I posted my opinion assuming the 9600K was selling at the price point stated in the review.

I do know from TechSpot's testing of ACO that the 8600K had higher CPU utilization than the Ryzen 1600X (those were the chips at the time of the test), but that also resulted in higher GPU utilization and better performance (more than playable on both CPUs), so CPU utilization alone is not a clear-cut indicator of performance:

Something key to note here is the GPU utilization, which is locked pretty much at 97% on the 8600K system. Now if we look at the R5 1600X (the GPU utilization is mixed in with the CPU threads, so sorry about that), we can see that GPU utilization is usually around 80% but does fluctuate quite a lot, and at times dropped as low as 53%. This is interesting, as CPU utilization almost never cracked 90% and was often around 80%. Despite this, due to the much lower GPU utilization, the Ryzen CPU was overall much slower.
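A toy model, with purely hypothetical per-frame timings, of the effect described in that excerpt: frame pacing is set by whichever side (CPU submission or GPU rendering) takes longer, so a CPU that never reaches 100% utilization can still hold GPU utilization down around 80%:

```cpp
// Toy bottleneck model with made-up timings; not measured data.
#include <algorithm>
#include <cstdio>

int main() {
    double cpu_ms = 9.0;  // hypothetical game logic + draw-call submission
    double gpu_ms = 7.0;  // hypothetical rendering work for the same frame
    double frame_ms = std::max(cpu_ms, gpu_ms);  // slower side sets the pace
    std::printf("~%.0f fps, GPU busy ~%.0f%%\n",
                1000.0 / frame_ms, 100.0 * gpu_ms / frame_ms);
}
```

With these made-up numbers the GPU sits at roughly 78% busy even though nothing ever reports 100%, which is the shape of the 1600X result quoted above.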

 
Obviously you get similar CPU results when the GPU becomes the bottleneck, and if you have zero plans to ever go beyond the Nvidia 2080, then your CPU choice is less important when all FPS are at similar levels. What the 9600K gives you when overclocked is twofold:

1 - Better performance at 720p, which means more headroom once more powerful GPUs are released, if you plan to go past RTX 2080 performance at some point. Many people fail to understand this point and simply believe 1440p or 4K should be the end-all be-all of gaming performance.

2 - From every gaming programmer interview I've read, they have stated how much easier Intel is to program for than AMD. It's much easier to load up cores one at a time than to program around HT, SMT, and what happens when user 1 has four cores, user 2 has eight, and user 3 has ten.

So those are my reasons for my view. That said, it makes little real-world difference between same-price CPU offerings unless you have a specific niche for a specific CPU. It's not as if the 9600K can play a game the Ryzen 3600 can't, or vice versa. If you watch the video below, you will see that even people who review PC hardware for a living have a hard time choosing Intel over AMD or vice versa.


1. There are so many things you have to assume for the 9600K to still be faster than the 3600 in games a few years from now. You need to assume games won't utilize CPU threads better in a few years. You need to assume graphics won't keep getting more GPU-intensive, so that the CPU becomes the bottleneck. You need to assume we'll be getting 2080 Ti performance in a mid-range card anytime soon. You need to assume programmers won't get better at optimizing for Zen, with both consoles using Zen and coming out in 2020.

2. Old architectures are easier to program for than new ones. Intel has essentially used the same architecture for 8+ years, so of course it's easier to program for, as developers have already learned all the tricks, but they had better get used to programming for Zen, as both new consoles will be utilizing it.

There's one reason I'd ever recommend a 9600K/9700K/9900K over a Ryzen 3xxx series at this point, and that is if you're a gamer who will 100% always lower the settings in your games so you can hit 144 fps on a 144 Hz+ monitor. Otherwise, at their current prices, the Intel CPUs just aren't worth it. The 3600 is 20% cheaper for 6.6% lower performance at 720p with a $1,300 GPU. With a mid-range GTX 1660 that gap would probably disappear, and that 20% doesn't even account for the extra $30+ you'll need to spend on a CPU cooler for the Intel part, widening the price gap further.
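A quick back-of-the-envelope check of that value claim, using only the numbers above ($200 vs $250, 6.6% slower at 720p):

```latex
% Perf-per-dollar ratio from the figures quoted above (illustrative).
\[
\frac{\text{perf}/\$\ (3600)}{\text{perf}/\$\ (9600\text{K})}
  = \frac{0.934/200}{1/250} \approx 1.17
\]
```

That works out to roughly 17% more gaming performance per dollar for the 3600, before counting any aftermarket cooler for the Intel chip.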
 
The 9600K is now $220 in more and more places, so the 3600 is NOT 20% cheaper, plus it has the crappiest of AMD's coolers (the Wraith Stealth), which pretty much everyone recommends upgrading. Besides, the 9600K is up to 20% faster in games, and by the time the 3600 might start catching up thanks to its greater thread count, they will both be ripe for replacement anyway. For pure gaming, it's still 9400F/9600K/9700K all the way.
 
I watched Hardware Unboxed's gigantic 36-game bench comparing the 9900K to the 3900X under the unrealistic test conditions of a 2080 Ti benched at 1080p. Under this academic setup, the 9900K was only 5% faster, or 6% faster when overclocked to 5 GHz.

So much for 'gaming king': the average difference across all the game types tested is almost meaningless, and when you consider that the 3900X uses less power under load despite four more cores, enjoys up to 45% better productivity performance, comes with a good cooler, and is on a socket with more longevity (Ryzen 4000 compatible), a 6% gap in gaming, and only when overclocked with a 2080 Ti @ 1080p, is a hilarious justification for opting for the Intel chip.

With the 3600, the pros and cons are largely the same compared to Intel's price equivalents.
 
Leaving CPU testing methodology aside (the principles of which should be abundantly clear to anyone who has followed computers for more than a week), a 2080 Ti at 1080p isn't quite as unrealistic as it might seem: there are plenty of people who want maximum frame rates while retaining very high settings, which often isn't possible at 1440p, not even with a 2080 Ti. I know three such people myself. Granted, none of them has only a 1080p monitor, but they still use one for FPS games at least. Next, the quoted 6% difference was at stock; it is larger when the 9900K is pushed to the limit. And at stock it does NOT consume more power than the 3900X, not in gaming at least, and especially not when the latter is paired with an X570 chipset. But perhaps more importantly, neither of these two chips should be considered just for gaming, especially not the 3900X.
 
The 9600K is now $220 in more and more places, so the 3600 is NOT 20% cheaper, plus it has the crappiest of AMD's coolers (the Wraith Stealth), which pretty much everyone recommends upgrading. Besides, the 9600K is up to 20% faster in games, and by the time the 3600 might start catching up thanks to its greater thread count, they will both be ripe for replacement anyway. For pure gaming, it's still 9400F/9600K/9700K all the way.
Where's this magical land where the 9600K is only $220? It's $250 everywhere. The 9600K has been $220 before, but it's not right now. The 3600 is $200 everywhere. That is... 20% cheaper.

"Up to 20% faster" is just a way of ignoring that the 9600K is only 6.6% faster on average in the games right on this site, at 720p with a 2080 Ti, no less. There will be outliers, which is why I said:
There's one reason I'd ever recommend a 9600K/9700K/9900K over a Ryzen 3xxx series at this point, and that is if you're a gamer who will 100% always lower the settings in your games so you can hit 144 fps on a 144 Hz+ monitor.... With a mid-range GTX 1660 that gap would probably disappear...
Because of that, if someone absolutely has to game at 144 fps on a 144 Hz monitor, then Intel might be the right choice. For literally everyone else, Zen offers better value with near-equal performance in the majority of games.
 
Leaving CPU testing methodology aside (the principles of which should be abundantly clear to anyone who has followed computers for more than a week), a 2080 Ti at 1080p isn't quite as unrealistic as it might seem: there are plenty of people who want maximum frame rates while retaining very high settings, which often isn't possible at 1440p, not even with a 2080 Ti. I know three such people myself. Granted, none of them has only a 1080p monitor, but they still use one for FPS games at least. Next, the quoted 6% difference was at stock; it is larger when the 9900K is pushed to the limit. And at stock it does NOT consume more power than the 3900X, not in gaming at least, and especially not when the latter is paired with an X570 chipset. But perhaps more importantly, neither of these two chips should be considered just for gaming, especially not the 3900X.


1. Actually, it's only a 5% difference when the 9900K is overclocked to 5 GHz and the 3900X is set to auto-overclock. So, as I said, almost totally insignificant, seeing as this is only when benched @ 1080p using a 2080 Ti. You may know people who game at 144 Hz with a 2080 Ti @ 1080p; I don't. Regardless, that is the nichest of niches.


2. I said less power under load, as in full load: it draws less power than a 9900K, which is great given that it has four more cores:

[Image: full-load CPU power consumption chart]
 
Where's this magical land where the 9600K is only $220? It's $250 everywhere. The 9600K has been $220 before, but it's not right now. The 3600 is $200 everywhere. That is... 20% cheaper.

"Up to 20% faster" is just a way of ignoring that the 9600K is only 6.6% faster on average in the games right on this site, at 720p with a 2080 Ti, no less. There will be outliers, which is why I said:
Because of that, if someone absolutely has to game at 144 fps on a 144 Hz monitor, then Intel might be the right choice. For literally everyone else, Zen offers better value with near-equal performance in the majority of games.
MicroCenter, for one, and over in Europe as well
(this is the store that AMD fans like to quote for numbers of CPUs sold, so now I occasionally check their prices just for reference; oh, and as far as I understand, they ship continent-wide, so you can't say it's just one store...).
Regarding value: a 3600 might be technically better value than, say, an 8700K, but there is another problem for the 3000 series as far as gaming is concerned. Also according to this site, a $150 9400F already matches or almost matches their best SKUs...
[Image: relative gaming performance at 1920x1080]
 
MicroCenter, for one, and over in Europe as well
(this is the store that AMD fans like to quote for numbers of CPUs sold, so now I occasionally check their prices just for reference; oh, and as far as I understand, they ship continent-wide, so you can't say it's just one store...).
Regarding value: a 3600 might be technically better value than, say, an 8700K, but there is another problem for the 3000 series as far as gaming is concerned. Also according to this site, a $150 9400F already matches or almost matches their best SKUs...
[Image: relative gaming performance at 1920x1080]

Silly argument, as the 9400F is only 4% behind Intel's 8700K too. And good luck to anyone using that for next year's games with its paltry six threads.

Aside from that, for general use you get what you pay for: the 9400F is a full 20% slower in CPU tests compared to the 3600, which is certainly something you'd notice day to day, unlike a 5% gap with a 2080 Ti @ 1080p. ;)
 
It's time to buy an AMD Processor :peace:
 
Silly argument, as the 9400F is only 4% behind Intel's 8700K too. And good luck to anyone using that for next year's games with its paltry six threads.

Aside from that, for general use you get what you pay for: the 9400F is a full 20% slower in CPU tests compared to the 3600, which is certainly something you'd notice day to day, unlike a 5% gap with a 2080 Ti @ 1080p. ;)
What games won't it play next year? I would like a list, please.
 
Lol @ people saying the unlocked multiplier is a bonus over Intel when these chips are already running maxed out, out of the box.

No, they're not maxed out right out of the box. And I'll just say it: misinformation like this is not constructive, and it's certainly not the truth.

And even if they were, then maybe you'd care to explain how I got my 3600X to run over 4.4 GHz on air with the stock Wraith cooler... all cores going, too.

According to you it can't do this... but it did. And we're talking about the X version, which I used, rated at 95 W; the regular 3600 being reviewed here is rated for 65 W.


 
What games won't it play next year? I would like a list, please.
Yeah, that's total bullshit from him. It'll take at least three years before 6c/6t chips start to really struggle, outside of maybe one badly optimized game. Developers need to have a reasonable common denominator in mind as far as CPU requirements are concerned (much more so than for GPUs, where settings are far more scalable), and currently that is STILL quad cores; we've only started to shift to six as a minimum.
 
No, they're not maxed out right out of the box. And I'll just say it: that's total Intel fanboy FUD.

And if they were, then maybe you'd care to explain how I got my 3600X to run over 4.4 GHz on air with the stock Wraith cooler... all cores going, too.
According to you it can't do this... but it did. And we're talking about the X version, which I used, rated at 95 W; the regular 3600 is rated for 65 W.

Well, according to this: https://www.techpowerup.com/review/amd-ryzen-5-3600/5.html
You gained a bit over 8% compared to a stock 3600X.

Also, you boast about an all-core overclock, but you've linked to a single-threaded benchmark ;)

What games won't it play next year? I would like a list, please.
I hear Minesweeper is getting one thread per square. It's going to murder CPUs :D
 
Well, according to this: https://www.techpowerup.com/review/amd-ryzen-5-3600/5.html
You gained a bit over 8% compared to a stock 3600X.

Also, you boast about an all-core overclock, but you've linked to a single-threaded benchmark ;)
I'm well aware it's single-threaded, and yes, to be fair, that is a factor, but do consider that it still had to boot the OS and everything else just to complete the run.

My point is that these are not maxed out right from the box, as claimed; they can and will do more.

And these use all available cores/threads; 4.3+ GHz isn't bad for stock air, and not that far off 4.4 GHz either.

Additional:
The thread itself isn't about the X version, but it has been said the non-X chips can do about as well as the X versions; with their lower wattage rating, they could possibly do even better once clocked up.

Oh yeah - Can't wait to try out the new Minesweeper bench! :D
 
I'm well aware it's single-threaded, and yes, to be fair, that is a factor, but do consider that it still had to boot the OS and everything else just to complete the run.

My point is that these are not maxed out right from the box, as claimed; they can and will do more.

And these use all available cores/threads; 4.3+ GHz isn't bad for stock air, and not that far off 4.4 GHz either.

Additional:
The thread itself isn't about the X version, but it has been said the non-X chips can do about as well as the X versions; with their lower wattage rating, they could possibly do even better once clocked up.
Well, I'm not sure what you're trying to argue here. You're just reinforcing precisely what you're trying to refute: compared to a standard configuration, there's only about 8% more to squeeze out of these.
Current power management won't squeeze everything out of Zen 2, but it's getting pretty darn close.
 
Well, I'm not sure what you're trying to argue here. You're just reinforcing precisely what you're trying to refute: compared to a standard configuration, there's only about 8% more to squeeze out of these.
Current power management won't squeeze everything out of Zen 2, but it's getting pretty darn close.

No, my whole point is that these are NOT maxed out right from the box.
Also bear in mind that what I did was on stock air; on better air, or even water, it would do even more, certainly beyond 8%, which BTW is still more than just "out of the box".
 
No, my whole point is that these are NOT maxed out right from the box.
Also bear in mind that what I did was on stock air; on better air, or even water, it would do even more, certainly beyond 8%, which BTW is still more than just "out of the box".
Come on, 4.4 GHz is like a brick wall for Zen 2. Sure, some chips will be able to hit 4.6 or maybe 4.8, but can you really compare this to old CPUs that would overclock 30% or more without breaking a sweat?

Plus, the original point was not whether these overclock or not; it was about the unlocked multiplier being pointless. Did you need to change the multiplier to achieve your overclock?
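For context on that last question, the relationship being discussed is just core clock = base clock x multiplier (BCLK is nominally 100 MHz on these platforms), so the 4.4 GHz figure above corresponds to a 44x multiplier:

```latex
% Core clock from base clock and multiplier (nominal values).
\[
f_{\text{core}} = f_{\text{BCLK}} \times k
  = 100\,\text{MHz} \times 44 = 4.4\,\text{GHz}
\]
```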
 