
Ryzen 5 3600 turbo speeds

Truth is no one knows LOL, that includes you. That's the reason why Der8auer set out to find out.
So don't talk like it's set in stone what the safe voltages are when all you have is anecdotal evidence.
Seriously, what is your malfunction? There is no goddamn safe voltage on Ryzen 2/3. It's freaking 2021, Zen 3 has been out a year, and you troll us all by suggesting waiting for an overclocker to do degradation testing on Zen 2, when it's old fucking news? Apparently most people know how these CPUs work but you. Read up and stop doubling down on stupid.
 
I have an 80 mm fan just above the CPU cooler acting as exhaust. It's doing something, but not enough, I guess. With the side of the case taken off, PBO enabled and PPT auto (~80 W power consumption), I get 76-77 C in Cinebench.

Honestly, I'm quite disappointed in the 3600 so far. If I knew that the difference in heat output compared to the 3100 would be so huge, I wouldn't have bought it.

How do your temps look if you max out your CPU fan and 80mm fan speed in BIOS?

Between strictly stock settings and PBO just "Enabled", PBO can contribute a fair bit to extra power and heat without providing much of a performance uplift. PBO Auto should always be functionally equivalent to Disabled, but you never know what bugs there might be in firmware.

All things considered it looks pretty normal. The only way I could get down to 71C in Cinebench on my old 83A PBO profile on my former 3700X was with a Dark Rock Pro 4. Otherwise, when stock, it held at about 68C or so.
 
I never managed to make Ryzen Master work for some reason. When I tried to install it last time, it said it's already installed, but I still can't find it anywhere.
Here's the path where it installs: "C:\Program Files\AMD\RyzenMaster\bin\AMD Ryzen Master.exe"

3300X is closer to a 3600 in terms of total power consumption.
In addition to being more difficult to cool: while it has the same TDP as the 3600 and 3700X, all the heat is generated by a single CCX. Here's mine after 30" of R20, with 24C ambient:
[Attachment: 3300X_R20.jpg]


You should check out those new gen Thermalright coolers.. they take the heat away :D
Also a big fan of Thermalright, the temp above courtesy of the Macho rev.C. Manual oc actually brought it down by 3 degs from stock.
 
In addition to being more difficult to cool: while it has the same TDP as the 3600 and 3700X, all the heat is generated by a single CCX. Here's mine after 30" of R20, with 24C ambient:
You have a static OC, so it's not easy to compare. There is a point in what you say, though.

The 3100, 3300X, and 3600 are all 65W TDP CPUs, but that is not their actual consumption (PPT).
65W TDP is their minimum cooling requirement as stated by AMD, nothing else.

From a little research, on a yCruncher load for example, the CPU power is like this:
R3 3100: ~60W
R3 3300X: 75~80W
R5 3600: 80~90W

Between the 3100 and the 3600 the difference is huge (roughly +33~50%).
R3 3100 should have a 45~50W TDP label.
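Just to sanity-check those deltas (my own back-of-the-envelope arithmetic; the wattages above are approximate to begin with):

# Rough check of the 3100 vs 3600 gap using the approximate yCruncher figures above
r3_3100 = 60              # W, approx.
r5_3600 = (80, 90)        # W, approx. range

for watts in r5_3600:
    print(f"+{(watts - r3_3100) / r3_3100:.0%}")   # prints +33% and +50%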
 
Thanks, everyone, but I'm not planning on overclocking. :D I just want stable clocks with acceptable thermals - which brings me to my next problem. My be quiet! Shadow Rock LP has arrived, but this thing is still sitting at 50 C idle and hits 80 C as soon as I ask it to do something. I played around with the Windows power settings, disabled literally every performance-increasing gimmick in BIOS (e.g. PBO, ASUS enhancements, etc.), still no effect. I never had this issue with the 5950X - maybe I should have gone with Vermeer again. *sigh* :shadedshu: Any advice? :(
Those be-quiet coolers might need a better paste.
 
After a little testing, I decided to use the 3100 for now, and pass the 3600 on to a friend who's in need of a new CPU anyway. It's a good CPU, but not for my use case it seems.

The little 3100 maxes out at 68 C in Cinebench all-core with the stock aluminium cube Wraith Stealth cooler on the Silent fan profile in BIOS. It'll probably be enough to feed my low profile GTX 1650, and I'll look at Intel's yard again for a replacement. We can say all bad things about their prehistoric 14 nm production node, but at least their TDP values aren't rainbow unicorn farts from the sky.

Sure, AMD "cares about gamers", but it seems they don't care much about people with cooling constraints. I mean, if there is a 3600X and non-X, why not differentiate the two with a lower TDP for the non-X variant, kind of like Intel's K and non-K lines? Oh well, what do I know... I'm just a random guy trying to cool a CPU in a slim case. :slap:

Here's the path where it installs: "C:\Program Files\AMD\RyzenMaster\bin\AMD Ryzen Master.exe"
There's no AMD folder in my Program Files, but the installer still says it can't proceed because it's installed. :confused: No worries, I've never been a big fan of software control anyway.

In addition to being more difficult to cool: while it has the same TDP as the 3600 and 3700X, all the heat is generated by a single CCX. Here's mine after 30" of R20, with 24C ambient:
[Attachment: 3300X_R20.jpg]


Also a big fan of Thermalright, the temp above courtesy of the Macho rev.C. Manual oc actually brought it down by 3 degs from stock.
70 C with a tower cooler and 68 Watts of package power? That's just... wow! o_O I'm starting to think that 7 nm isn't so great for everything.
 
There's no AMD folder in my Program Files, but the installer still says it can't proceed because it's installed. :confused: No worries, I've never been a big fan of software control anyway.
I shouldn't plug this, but here goes: Revo Uninstaller. It's been a saviour for me over the years, and it's free.
Open up the GUI and see if Ryzen Master is installed. If so, uninstall it and choose the advanced method to clear all traces of it, then reinstall the software.
 
I don't know.. I have seen some pretty decent 3600s posted.. but most of them seem like they're good for stock and are junk for playing with. They must get the worst part of the wafer or something.
 
3600 is a stupid easy cpu to cool.
 
After a little testing, I decided to use the 3100 for now, and pass the 3600 on to a friend who's in need of a new CPU anyway. It's a good CPU, but not for my use case it seems.

The little 3100 maxes out at 68 C in Cinebench all-core with the stock aluminium cube Wraith Stealth cooler on the Silent fan profile in BIOS. It'll probably be enough to feed my low profile GTX 1650, and I'll look at Intel's yard again for a replacement. We can say all bad things about their prehistoric 14 nm production node, but at least their TDP values aren't rainbow unicorn farts from the sky.

Sure, AMD "cares about gamers", but it seems they don't care much about people with cooling constraints. I mean, if there is a 3600X and non-X, why not differentiate the two with a lower TDP for the non-X variant, kind of like Intel's K and non-K lines? Oh well, what do I know... I'm just a random guy trying to cool a CPU in a slim case. :slap:


There's no AMD folder in my Program Files, but the installer still says it can't proceed because it's installed. :confused: No worries, I've never been a big fan of software control anyway.


70 C with a tower cooler and 68 Watts of package power? That's just... wow! o_O I'm starting to think that 7 nm isn't so great for everything.
The 3600 and 3600X/XT do have different power requirements:
the 3600 is an 88W PPT CPU, and
the 3600X/XT is a 125W PPT CPU,

and the 105W TDP CPUs are 142W PPT.

Again, for AMD the TDP designation is a minimum cooling requirement, not the max power consumption.

For Intel, TDP is the power consumption at the all-core base clock, and this is called PL1 (Power Level 1). At PL2 the Intel CPU can increase its power consumption by 50~100% over TDP.
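FWIW, on AM4 the stock PPT seems to simply be the TDP label times 1.35, which is where those 88W and 142W numbers come from. A quick sketch of that relationship (my own shorthand, not anything official from AMD):

# AM4 stock package power tracking (PPT) is reportedly TDP x 1.35
def am4_ppt(tdp_watts):
    return round(tdp_watts * 1.35)

print(am4_ppt(65))    # -> 88  (the 65W TDP parts)
print(am4_ppt(105))   # -> 142 (the 105W TDP parts)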
 
3600X is 95W TDP... close enough.
 
3600X is 95W TDP... close enough.
I know that; I was talking about PPT, not TDP. The actual max power is 125W for the 3600XT, just as 88W is for the non-X 3600.
 
I don't know.. I have seen some pretty decent 3600s posted.. but most of them seem like they're good for stock and are junk for playing with. They must get the worst part of the wafer or something.
Yep. You'll find those bronze samples in some stores. It's all pot luck unless you want to pay more for a binned chip.
 
Yep. You'll find those bronze samples in some stores. It's all pot luck unless you want to pay more for a binned chip.
I have a bronze 3600. It's probably from the first batch, purchased Aug 2019. It uses higher voltage compared to newer 3600s and has zero OC capability.

In general, 3600s are the worst-binned CPUs, hence the low boost clock of 4.2GHz.
 
I have a bronze 3600. It's probably from the first batch, purchased Aug 2019. It uses higher voltage compared to newer 3600s and has zero OC capability.
Silver sample here produced in May 2020. I leave it at stock and use Ryzen Master. I tried to OC it and it would just creep past 4.2GHz then crash.
Not worth the hassle tinkering with it.
 
After a little testing, I decided to use the 3100 for now, and pass the 3600 on to a friend who's in need of a new CPU anyway. It's a good CPU, but not for my use case it seems.

The little 3100 maxes out at 68 C in Cinebench all-core with the stock aluminium cube Wraith Stealth cooler on the Silent fan profile in BIOS. It'll probably be enough to feed my low profile GTX 1650, and I'll look at Intel's yard again for a replacement. We can say all bad things about their prehistoric 14 nm production node, but at least their TDP values aren't rainbow unicorn farts from the sky.

Sure, AMD "cares about gamers", but it seems they don't care much about people with cooling constraints. I mean, if there is a 3600X and non-X, why not differentiate the two with a lower TDP for the non-X variant, kind of like Intel's K and non-K lines? Oh well, what do I know... I'm just a random guy trying to cool a CPU in a slim case. :slap:


There's no AMD folder in my Program Files, but the installer still says it can't proceed because it's installed. :confused: No worries, I've never been a big fan of software control anyway.


70 C with a tower cooler and 68 Watts of package power? That's just... wow! o_O I'm starting to think that 7 nm isn't so great for everything.

Haha, I went through the same phase when I bought my former 3700X a month after launch, June 2019 production. The heat density and the idle behaviour were a bit of a culture shock coming even from a 4790K.

At some point I got used to it like most people do, it just takes time. Having an Asus board is a saving grace for Ryzen idle because of Q-fan's hysteresis controls, gotta use it. I run my 5900X under my souped-up C14S at a nearly flat fan curve of 40-45% between 0-83C, no ramping anymore.

At the end of the day, Intel's TDP is just as offensive, just in a different way. The only difference is that Comet Lake actually performs excellently on thermals because of its new IHS and die/substrate thinning, subverting our expectations. But 11th gen is a complete regression in thermals, so it's moot.

The issue with Ryzen will become quite apparent if you get a Renoir or Cezanne APU to play with. The monolithic Ryzens are both quite a bit cooler and seem to draw less power as well while clocking about the same if not better than their chiplet counterparts (same 4.1GHz all-core, albeit slightly slower), which makes them much better suited to SFF coolers. Same NH-U9S same NCASE same airflow, 75C on the 3700X (behaves similarly to the 3600), 60C on the 4650G.

And they run 10C cooler when pulling 110W through the entire chip than 60W through the CPU only. That should persuade you to stop regarding power draw as an indicator of temperature :D

They're technically also 65W parts, on paper. All in all, neither the AMD TDP nor PPT tell you too much. They're all "88W PPT", but the 5600X runs cooler than the 3600 as it doesn't max out its stock power limit, and the 4650G runs cooler than both. They're all "142W PPT", but the 5800X hits 80C+ on air in MT, while the 5900X runs at 70C in MT. Best to disregard advertised numbers and treat each CPU uniquely on thermals, same goes for Intel.
 
PPT is just a power draw number; it doesn't, of course, tell you thermal behavior. Thermals are about heat density: die sizes, number of active cores per CCX/CCD, number of CCDs, and so on...
Actually, 5600X has a 76W PPT stock limit and it does reach it, even though it has the same label for minimum cooling capacity (65W TDP).

For AMD, TDP is exactly this: Minimum cooling requirement under specific CPU and ambient thermal conditions.
TDP(Watts) = (tCase°C - tAmbient°C)/(HSF θca)
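For illustration, plugging in the approximate figures that have been quoted for AMD's 65W parts (tCase ≈ 61.8°C, tAmbient ≈ 42°C, θca ≈ 0.304°C/W - treat these as illustrative, not official spec):

# AMD's TDP formula with approximate values quoted for 65W parts (illustrative only)
t_case    = 61.8     # °C, max temperature allowed at the heat spreader (approx.)
t_ambient = 42.0     # °C, assumed intake air temperature (approx.)
theta_ca  = 0.304    # °C/W, required heatsink thermal resistance, case-to-ambient (approx.)

tdp = (t_case - t_ambient) / theta_ca
print(f"{tdp:.0f}W")   # -> 65W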

AMD TDP is not an accurate number for thermal behavior either. It's even less accurate than PPT, as AMD just segments a few different SKUs at each TDP level. It's a very rough number for cooling, as we can see from the actual power and thermal behavior of the 3100, 3300X, 5600X and 3600.

Steve Burke from GamersNexus claims that he had a discussion with AMD before the release of the following video.

 
Silver sample here produced in May 2020. I leave it at stock and use Ryzen Master. I tried to OC it and it would just creep past 4.2GHz then crash.
Not worth the hassle tinkering with it.
Lol, that dumpster fire of a software first reported my 3900X as a platinum sample, then as a bronze sample. Make up your mind, will ya? I guess it's gonna be a fried sample once your software gives it 1.55V and runs a Light AVX test... that's the only feedback I gave to 1usmus that he didn't reply to; he replied to every other one of my tweets, however.

That crap probably took 2 years off my CPU's lifespan.
 
Nah, degrading means it requires more voltage for certain clocks; the chip won't die.
And no, it won't degrade under gaming conditions, that's taboo.
They do. My 3700X started black-screening and needing more and more volts for the same clocks after a few months - changed to a different mobo, same higher voltages needed.

Yes, it's a real thing. Zen 2 is not good for all-core OCs.
 
They do. My 3700X started black-screening and needing more and more volts for the same clocks after a few months - changed to a different mobo, same higher voltages needed.

Yes, it's a real thing. Zen 2 is not good for all-core OCs.

So many factors at play here: voltages, current, workload, temperature, and hell, even high auto Vcore and SOC voltage can degrade the chip, so why does manual Vcore get all the blame? LOL.
If high Vcore alone killed CPUs, then extreme overclockers would have killed their CPUs under LN2 in a matter of seconds.

Anyway, I don't want to drag this on; everyone has a different take on safe voltages for Ryzen 2/3. Even Der8auer thinks 1.4V Vcore is safe for 24/7 usage, as long as temperature is under control. Der8auer has access to RMA statistics from multiple retailers, so I tend to believe him. Trusting anecdotal evidence at face value is just crazy; the chip could have failed at auto voltages for all I know.

I will just point again to Der8auer's video on degradation testing:

He is putting 3 Ryzen 5000 CPUs under 1.45V Vcore and a constant stress test; let's see his results after 6 months and a year. My 3600 has been running a full year at 1.37V Vcore, let's see how it goes....

Seriously, what is your malfunction? There is no goddamn safe voltage on Ryzen 2/3. It's freaking 2021, Zen 3 has been out a year, and you troll us all by suggesting waiting for an overclocker to do degradation testing on Zen 2, when it's old fucking news? Apparently most people know how these CPUs work but you. Read up and stop doubling down on stupid.

Yeah, sure, the 5600X and 5800X are Zen 2 CPUs, sure buddy. Heard the saying "ignorance begets confidence"? :roll:
If I were to trust anecdotal evidence at face value, then the Covid vaccine would have a 100% fatality rate :kookoo:
 
Haha, I went through the same phase when I bought my former 3700X a month after launch, June 2019 production. The heat density and the idle behaviour were a bit of a culture shock coming even from a 4790K.

At some point I got used to it like most people do, it just takes time. Having an Asus board is a saving grace for Ryzen idle because of Q-fan's hysteresis controls, gotta use it. I run my 5900X under my souped-up C14S at a nearly flat fan curve of 40-45% between 0-83C, no ramping anymore.

At the end of the day, Intel's TDP is just as offensive, just in a different way. The only difference is that Comet Lake actually performs excellently on thermals because of its new IHS and die/substrate thinning, subverting our expectations. But 11th gen is a complete regression in thermals, so it's moot.

The issue with Ryzen will become quite apparent if you get a Renoir or Cezanne APU to play with. The monolithic Ryzens are both quite a bit cooler and seem to draw less power as well while clocking about the same if not better than their chiplet counterparts (same 4.1GHz all-core, albeit slightly slower), which makes them much better suited to SFF coolers. Same NH-U9S same NCASE same airflow, 75C on the 3700X (behaves similarly to the 3600), 60C on the 4650G.

And they run 10C cooler when pulling 110W through the entire chip than 60W through the CPU only. That should persuade you to stop regarding power draw as an indicator of temperature :D

They're technically also 65W parts, on paper. All in all, neither the AMD TDP nor PPT tell you too much. They're all "88W PPT", but the 5600X runs cooler than the 3600 as it doesn't max out its stock power limit, and the 4650G runs cooler than both. They're all "142W PPT", but the 5800X hits 80C+ on air in MT, while the 5900X runs at 70C in MT. Best to disregard advertised numbers and treat each CPU uniquely on thermals, same goes for Intel.
True, but at least Intel keeps their locked SKUs on a leash - I mean, 65 W will be 65 W no matter what. It might reduce its turbo clocks a bit, but it will keep to its limits (unless you play with your Z-series motherboard's BIOS). 14 nm is old, but at least people know what to expect. AMD and their 7 nm is an experiment. A fun one, but still relatively new and unknown.

TDP is the only indication you have regarding heat output before buying a processor. It's vague as hell, I'll give you that, but there's nothing else to tell you what kind of cooling solution you need. If you don't know how to cool a chip before buying one, then you might argue that you have no idea what you're buying, and that's technically the manufacturer's / chip designer's fault in this case. I honestly don't mind, as I love toying around with new hardware (heck, I might even sell this newly built slim PC tomorrow if I wake up in that mood), but the average Joe building his first gaming computer might have a rough time - or might not even notice, and just kill his CPU by accident.

Edit: The other option is using the supplied cooler, which is an excellent idea with the 3100, but apparently a really crappy one with the 3600. Same 65 W TDP, same Wraith Stealth in the box. Surely, this is not right.

3600 is a stupid easy cpu to cool.
Of course it is... in a normal PC case with a tower cooler or cheap AIO. But that's not what I'm aiming for this time. ;)
 
Lol, that dumpster fire of a software first reported my 3900X as a platinum sample, then as a bronze sample. Make up your mind, will ya? I guess it's gonna be a fried sample once your software gives it 1.55V and runs a Light AVX test... that's the only feedback I gave to 1usmus that he didn't reply to; he replied to every other one of my tweets, however.

That crap probably took 2 years off my CPU's lifespan.
I agree the software is poop. My CPU is borderline Bronze. I didn't want to push it any further for fear of killing it and threw in the towel at 4.2GHz + a bit more.
Simply not worth the hassle. I let Ryzen Master do its job and leave it at that. ;)
 
Hi, hope it's OK to add my query to this thread. I thought it was fine since it's about Ryzen 5 3600 turbo speeds.
After reading a bit on here and other forums, I'm worried about the OC I have just achieved using AI Suite III.
I'm a newbie when it comes to OC, so I thought it best to ask what is going on. Screenshots attached.
Thanks for any replies. Hope I haven't broken any rules :)

PC Specs:

ASUS ROG Strix X570-F
Corsair AX850
Ryzen 5 3600
G.Skill Trident Neo 3200MHz CL16 (2x8GB)
Noctua NH-U12S
MSI GTX 1660 SUPER Gaming X
 

Attachments

  • Screenshot (74).png
  • Screenshot (77).png
Hi, hope it's OK to add my query to this thread. I thought it was fine since it's about Ryzen 5 3600 turbo speeds.
After reading a bit on here and other forums, I'm worried about the OC I have just achieved using AI Suite III.
I'm a newbie when it comes to OC, so I thought it best to ask what is going on. Screenshots attached.
Thanks for any replies. Hope I haven't broken any rules :)

PC Specs:

ASUS ROG Strix X570-F
Corsair AX850
Ryzen 5 3600
G.Skill Trident Neo 3200MHz CL16 (2x8GB)
Noctua NH-U12S
MSI GTX 1660 SUPER Gaming X
Static 1.4v?? Christ, revert that immediately
 