
AMD's Robert Hallock Confirms Lack of Manual CPU Overclocking for Ryzen 7 5800X3D

No big deal, since a static OC on Ryzen CPUs is kinda pointless. Limiting the voltage to 1.3-1.35 V presumably also nerfs the boost clock override offset, but again it's no big deal, since you gain almost no gaming performance from it. The only thing worth doing on Ryzen CPUs is raising the PBO limits as far as your cooling allows, to hold higher all-core clocks if you need them.
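For anyone unfamiliar, "raising the PBO limits" means lifting the three socket limits. A minimal sketch of the idea; the values below are the stock defaults for 105 W-TDP parts as I understand them, so treat them as illustrative, not tuning advice:

```python
# The three Precision Boost Overdrive limits you raise in BIOS or Ryzen
# Master. These values are (to my understanding) the stock defaults for
# 105 W-TDP parts like the 5800X; illustrative only, not recommendations.
pbo_limits = {
    "PPT_W": 142,  # Package Power Tracking: total socket power budget (W)
    "TDC_A": 95,   # Thermal Design Current: sustained current limit (A)
    "EDC_A": 140,  # Electrical Design Current: peak current limit (A)
}

# Raising these only helps while cooling keeps temperatures in check;
# boost still backs off against the thermal limit.
for name, value in pbo_limits.items():
    print(f"{name} = {value}")
```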
 
Incorrect. They said it themselves: the cache runs off the CPU core voltage. They couldn't design a separate voltage rail for it while keeping the pin layout compatible, so you go with what you have, really. And since it's an EPYC gimmick, EPYC parts were never tested against OC; those CPUs average 2 to 3.4 GHz.

Whether or not the extra cache can be powered down is irrelevant. It's a multi-layer CPU on a 7 nm process, and whether it was designed by AMD or Intel, it will always run hotter than a similar single-layer design.
 
I'm pretty sure there is a huge difference in all-core workloads. For example, I think I can hit 15k in CB R23 (that's about what a 5800X gets, right?) at around 45-50 W.
I thought you were talking about gaming, not CB R23.
 
No big deal, since a static OC on Ryzen CPUs is kinda pointless. Limiting the voltage to 1.3-1.35 V presumably also nerfs the boost clock override offset, but again it's no big deal, since you gain almost no gaming performance from it. The only thing worth doing on Ryzen CPUs is raising the PBO limits as far as your cooling allows, to hold higher all-core clocks if you need them.
It varies chip to chip.
My 5800X, for example, is one of the hotter-running ones, so a static OC gains me 200 MHz all-core and loses 500 MHz single-threaded, but also runs 30 °C cooler (well, with 40 °C ambients it did).

These are simply chips to be left on auto, or with minimal PBO tweaking (the curve offset can stay) - and that appeals to a MUCH larger userbase than overclockers.
 
Will this be 6nm or 7nm?
 
In gaming the difference is even bigger
Not to start another war, but I'm not so sure, since games generally utilize few cores most of the time. My 5600X in SOTTR (a game that utilizes many cores well) gets 230-240 fps CPU game avg at 1080p highest with a 76 W limit, and 215-225 fps with a 45 W limit, so only around a 5% fps loss last I checked. In both cases the I/O die uses 15-20 W. It may be more or less in other games, but generally less, since most games use fewer cores than SOTTR.

If you can test your 12th gen in SOTTR at 1080p highest, unlimited vs. a 45 W limit, that would be great :)

Ryzen 5000 scales poorly efficiency-wise in all-core loads if the voltage is above 1.25 V-ish (meaning 4.5-4.75 GHz all-core depending on bin, CO value etc.). I'm not sure where efficient scaling stops on 12th gen, but I wouldn't be surprised if it were around 1.2-1.3 V all-core like Ryzen 5000, maybe lower. My 12400F only runs all-core at 1.0 V, so I can't test scaling on that one.
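To illustrate why efficiency collapses above that point: dynamic power scales roughly as V²·f, so the last few hundred MHz cost far more than they give. A rough sketch under that textbook approximation (the operating points below are illustrative, not measured):

```python
# Classic dynamic-power approximation: P ≈ C * V^2 * f.
# C cancels out in the ratios; the (GHz, V) points are illustrative
# all-core operating points, not measurements.
def rel_power(volts: float, ghz: float) -> float:
    return volts ** 2 * ghz

points = [(4.5, 1.10), (4.75, 1.25), (5.0, 1.40)]  # (GHz, V)
base = rel_power(points[0][1], points[0][0])
for ghz, volts in points:
    print(f"{ghz:.2f} GHz @ {volts:.2f} V -> "
          f"{rel_power(volts, ghz) / base:.2f}x power for "
          f"{ghz / points[0][0]:.2f}x frequency")
```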
 
Not to start another war, but I'm not so sure, since games generally utilize few cores most of the time. My 5600X in SOTTR (a game that utilizes many cores well) gets 230-240 fps CPU game avg at 1080p highest with a 76 W limit, and 215-225 fps with a 45 W limit, so only around a 5% fps loss last I checked. In both cases the I/O die uses 15-20 W. It may be more or less in other games, but generally less, since most games use fewer cores than SOTTR.

If you can test your 12th gen in SOTTR at 1080p highest, unlimited vs. a 45 W limit, that would be great :)

Ryzen 5000 scales poorly efficiency-wise in all-core loads if the voltage is above 1.25 V-ish (meaning 4.5-4.75 GHz all-core depending on bin, CO value etc.). I'm not sure where efficient scaling stops on 12th gen, but I wouldn't be surprised if it were around 1.2-1.3 V all-core like Ryzen 5000, maybe lower.
Unless we have the same card it's kind of pointless. I'll win in efficiency just because I have a 3090 that pumps out 300 fps at 1080p. We could try locking the framerate to 150 or something and then checking the power draw, I guess.
Igor'sLAB and der8auer ran some tests, and ADL is insane in gaming efficiency.
 
Unless we have the same card it's kind of pointless. I'll win in efficiency just because I have a 3090 that pumps out 300 fps at 1080p. We could try locking the framerate to 150 or something and then checking the power draw, I guess.
Igor'sLAB and der8auer ran some tests, and ADL is insane in gaming efficiency.
No, you can compare: look at the CPU game avg, that's what your CPU produces and it will be the same no matter what GPU you use :) You may be right that you lose less performance power-limiting a 12th gen, but it would be nice to see how much it's affected. You will get different numbers than me, but the point is to show how much performance drops when power limited :)
 
No, you can compare: look at the CPU game avg, that's what your CPU produces and it will be the same no matter what GPU you use :) You may be right that you lose less performance power-limiting a 12th gen, but it would be nice to see how much it's affected. You will get different numbers than me, but the point is to show how much performance drops when power limited :)
The CPU game score actually is affected by the GPU: downclock your GPU and you'll get higher numbers. I don't know why it works that way, but it does.
 
The CPU game score actually is affected by the GPU: downclock your GPU and you'll get higher numbers. I don't know why it works that way, but it does.
I have not seen that; I get the same CPU game avg running a UV or an OC profile on the GPU, but your system might behave differently. Still, using the same GPU settings you can show how your CPU scales, which is what I was curious about. Can you do that? :)
 
I have not seen that; I get the same CPU game avg running a UV or an OC profile on the GPU, but your system might behave differently. Still, using the same GPU settings you can show how your CPU scales, which is what I was curious about. Can you do that? :)
Just got back home; I get 225 @ 45 W with E-cores off. Too bored to try E-cores on right now, maybe later :P
 
Just got back home; I get 225 @ 45 W with E-cores off. Too bored to try E-cores on right now, maybe later :P
But try at stock power and compare :)
 
Stock with no power limits I get around 330 if I remember correctly, but I've never checked how much it actually consumes.
Okay, so you lose about a third of your performance. How much does it use in SOTTR when not power limited? I haven't tested the 5800X, but considering I only lose 5% fps going from 76 W to 45 W, I would be surprised if the 5800X lost a lot more. All things considered, it seems gaming is less impacted by power limits than productivity is, which for instance the 10900K test at HWUB showed.
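Working the quoted numbers through (fps figures exactly as posted above; mid-points used where a range was given):

```python
# SOTTR CPU game avg figures quoted in this thread.
alder_lake = {"unlimited": 330, "45W": 225}   # 12th gen, E-cores off
ryzen_5600x = {"76W": 235, "45W": 220}        # mid-points of 230-240 / 215-225

adl_loss = 1 - alder_lake["45W"] / alder_lake["unlimited"]
zen_loss = 1 - ryzen_5600x["45W"] / ryzen_5600x["76W"]
print(f"12th gen at 45 W: {adl_loss:.0%} below unlimited")  # ~32%
print(f"5600X at 45 W: {zen_loss:.0%} below 76 W")          # ~6%, the '5%' above
```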
 
Okay, so you lose about a third of your performance. How much does it use in SOTTR when not power limited? I haven't tested the 5800X, but considering I only lose 5% fps going from 76 W to 45 W, I would be surprised if the 5800X lost a lot more. All things considered, it seems gaming is less impacted by power limits than productivity is, which for instance the 10900K test at HWUB showed.
I'll check, I never really paid attention. SOTTR though is kind of a weird game to test this with, because it's really memory- and cache-dependent more than it cares about the actual CPU.

The only game where I've noticed high power consumption is Cyberpunk with RT on at a very low resolution to make it CPU-bound. I've seen peaks at 170 W, which is insane for a game; usually it's under 100 W in most games.
 
I'll check, I never really paid attention. SOTTR though is kind of a weird game to test this with, because it's really memory- and cache-dependent more than it cares about the actual CPU.

The only game where I've noticed high power consumption is Cyberpunk with RT on at a very low resolution to make it CPU-bound. I've seen peaks at 170 W, which is insane for a game; usually it's under 100 W in most games.
SOTTR scales with everything, which makes it weird, but good for testing :) Cyberpunk has high CPU usage with DLSS or at native low res, and it scales well with bandwidth (it loves DDR5 and performs much better than with DDR4); too bad the built-in bench is inconsistent :/ If I remember correctly, CP2077 has some sections that utilize AVX2, which might explain the high CPU usage.
 
I'll check, I never really paid attention. SOTTR though is kind of a weird game to test this with, because it's really memory- and cache-dependent more than it cares about the actual CPU.

The only game where I've noticed high power consumption is Cyberpunk with RT on at a very low resolution to make it CPU-bound. I've seen peaks at 170 W, which is insane for a game; usually it's under 100 W in most games.
I just remembered that TPU did a power-scaling test:
[Chart: TPU relative gaming performance, 1280x720, at various power limits]

It seems ADL is very efficient down to around a 75 W limit or maybe a bit below, but it gets serious problems at 50 W, so somewhere between 50 and 75 W gaming performance tanks completely. The 5600X at a 76 W limit is as efficient as the 12900K at 75 W.

The 5800X does not behave the same and scales well at 65 W vs the stock 142 W:
gaming perf at 65 W is 98-99% of 142 W :O How low you can go on the 5800X before performance tanks is a big question, but I'm sure it loses a lot less perf at 45 W than the 12900K, since that one loses over 40% perf at 50 W while I lose 5% at 45 W.

The voltage/frequency curve of Ryzen 5000 is a bit weird in that you get very good, near-linear scaling up to around 1.1 V.
My 5600X tested at various speeds, lowest voltage/power usage in CB23:
4.2 GHz @ 0.99 V - 56 W
4.3 GHz @ 1.02 V - 60 W
4.4 GHz @ 1.05 V - 64 W
4.5 GHz @ 1.10 V - 69 W
4.6 GHz @ 1.18 V - 76 W
4.7 GHz @ 1.26 V - 95 W
4.8 GHz @ 1.34 V - 115 W
I think it's quite comparable to the 5800X. All these tests were done with 4000 MHz RAM, so the I/O die uses a bit more power than on an average 5600X setup.

Even though the I/O die only uses a bit of power (10-15 W under load with 3200 MHz RAM, 20-30 W with 4000 MHz RAM), the memory controller, I/O etc. on ADL also use a fair amount of power, just on the same die as the cores.

I bet the 12600K will have better performance at a low TDP due to fewer cores and less cache.
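Running that table through a quick script makes the knee obvious; core-MHz per watt is a crude metric since the package number includes the I/O die, but the trend is clear:

```python
# 5600X all-core points from the table above: (GHz, volts, package watts).
points = [
    (4.2, 0.99, 56), (4.3, 1.02, 60), (4.4, 1.05, 64),
    (4.5, 1.10, 69), (4.6, 1.18, 76), (4.7, 1.26, 95),
    (4.8, 1.34, 115),
]
CORES = 6  # 5600X
for ghz, volts, watts in points:
    mhz_per_watt = ghz * 1000 * CORES / watts
    print(f"{ghz:.1f} GHz @ {volts:.2f} V: {watts:>3} W -> "
          f"{mhz_per_watt:.0f} core-MHz/W")
# Falls from ~450 core-MHz/W at 0.99 V to ~250 at 1.34 V, with the
# drop accelerating right after the 1.10 V point.
```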
 
Well, AMD did what it promised:
"From this small sample though, we found that the 5800X3D is slightly faster than the 12900K when using the same DDR4 memory.

It's a small 5% margin, but that did make it 19% faster than the 5900X on average, so AMD's 15% claim is looking good."

[Attached chart: 5800X3D vs. 12900K gaming benchmark averages]


Quite an impressive comeback within the generation even without DDR5 support.
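Quick sanity check on those margins, just re-deriving from the percentages quoted above:

```python
# Margins as quoted: 5800X3D +5% vs 12900K (same DDR4), +19% vs 5900X.
x3d_vs_12900k = 1.05
x3d_vs_5900x = 1.19
implied = x3d_vs_5900x / x3d_vs_12900k
print(f"Implied 12900K vs 5900X: +{implied - 1:.0%}")  # ~+13%
print("Measured uplift over 5900X: +19% vs AMD's +15% claim")
```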
 
I just remembered that TPU did a power-scaling test:
[Chart: TPU relative gaming performance, 1280x720, at various power limits]

It seems ADL is very efficient down to around a 75 W limit or maybe a bit below, but it gets serious problems at 50 W, so somewhere between 50 and 75 W gaming performance tanks completely. The 5600X at a 76 W limit is as efficient as the 12900K at 75 W.
That's because the voltage is set a bit too high by default on the 12900K. With a little undervolting I managed a 14k CB R23 score @ 35 W. I can max out my 3090 with that power limit :P
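For scale, points-per-watt on the numbers quoted in this thread (the earlier ~15k figure was posted at 45-50 W; 47 W used as the mid-point):

```python
# CB R23 multi-core results as quoted in this thread.
results = {
    "12900K undervolted @ 35 W": (14000, 35),
    "12th gen earlier post @ ~47 W": (15000, 47),
}
for label, (score, watts) in results.items():
    print(f"{label}: {score / watts:.0f} pts/W")
# ~400 pts/W undervolted vs ~319 pts/W in the earlier post.
```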
 
That's because the voltage is set a bit too high by default on the 12900K. With a little undervolting I managed a 14k CB R23 score @ 35 W. I can max out my 3090 with that power limit :P
One could argue that AMD does the same; above 1.2 V the efficiency curve is garbage, and it's probably the same on ADL.
 
That's because the voltage is set a bit too high by default on the 12900K. With a little undervolting I managed a 14k CB R23 score @ 35 W. I can max out my 3090 with that power limit :P
I can max out my 5800X + 3090 and stay under 400 W; the moment you tweak values, it's an unfair comparison.
 