
14900k - Tuned for efficiency - Gaming power draw

Joined
Nov 16, 2023
Messages
1,575 (3.75/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
If they'd stayed at lower frequencies altogether, these chips wouldn't have a lot of the issues we deal with, mainly power draw that turns into heat that's hard to dissipate.

Just too much leakage. AMD and Intel both suffer from it. They should have boxed the chips with boost at least 400 MHz lower and left the headroom for the guys who want to OC to claim it.

Hopefully I can get a benchmark comparison of Cyberpunk 2077 times at some point. 60 bucks just to run an in-game benchmark. lol. I haven't actually played it yet...

Anyhow, I'll do all-core P and E at 4000 MHz, and 5.5 GHz P-cores only, with and maybe without HT enabled. Not sure how deep I want to get, I'm just really curious.

So far, in all honesty, frequency doesn't seem to make the impact on overclocking that it once did. But I suppose that's from having no headroom left.
 
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
My best CPU-Z benchmark score so far on the 14700K came from pushing all P-cores to x55 and all E-cores to x44 with a x48 ring. Unfortunately, I haven't had any luck getting the E-cores to boot into Windows any higher than that: it POSTs in the BIOS, starts loading Windows, and then the PC hard crashes. :banghead: Great chip overall, though; I wish I could get an all-E-core x45, or better still x46, ratio stable, because it would bump up the MT score significantly and make the E-core performance more consistent.
 
Joined
Nov 16, 2023
Messages
1,575 (3.75/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
My best CPU-Z benchmark score so far on the 14700K came from pushing all P-cores to x55 and all E-cores to x44 with a x48 ring. Unfortunately, I haven't had any luck getting the E-cores to boot into Windows any higher than that: it POSTs in the BIOS, starts loading Windows, and then the PC hard crashes. :banghead: Great chip overall, though; I wish I could get an all-E-core x45, or better still x46, ratio stable, because it would bump up the MT score significantly and make the E-core performance more consistent.
If the E-cores disappeared, I believe you would have a 6 GHz P-core chip. Mine happily does 5.9 GHz on the water loop at ambient. Another consideration for a 14700K, actually.
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
Zen 5 will be faster than Zen 4? You don't say? :eek:

Seriously though, Coffee Lake launched two months after I bought my 7700K, essentially relegating it to Core i5 status. Then a year later, Coffee Lake Refresh came, then Comet Lake, and my 7700K was suddenly competing with an i3. Then AMD launched the R3 3100 and 3300X which ran miles around the 7700K while costing a fraction of what I paid for it, and using half as much power. It's normal, it's called progress.

Or do you think the 14900K will remain the fastest Intel CPU forever? Look at the charts again and see where the 12900K, 11900K, or the 9900K are now.

Nobody in their right mind is expecting X3D CPUs to hold up against future generations. But we do expect them to fare a little bit better in the long run than their non-X3D siblings.
This is exactly what I'm trying to explain to you: X3D costs far too much relative to what it offers (the perf/$ section in the reviews leaves no room for discussion), and in this field one year is like 25 years of a person's life.
The history of the 5800X3D is a clear indication of the 7800X3D's future: while the much cheaper 7700X will keep its superior performance in most applications intact, the current $450 7800X3D will merely deliver the gaming performance of a future 8600(X).

After the ferocity with which you defend this 7800X3D, AMD's marketing has clearly worked wonderfully on you.

Question: if with a 4090 your processor cannot pull away from the 7600X at 4K (the $450 5800X3D barely beats a lowly 12100F by 1%), what difference do you get with the 4080, your video card?
Hint: not even the 4090 can deliver 300 fps in 4K AAA games. Not even at 1440p. If the card cannot deliver that many frames per second, the processor cannot be the deciding factor there. The reviews prove it, and I repeat: we see only 1% at 4K between the $450 5800X3D and the $90 12100F.

If you look closely, the 7800X3D gets that small lead thanks to a few games heavily optimized for AMD.

Let's talk about these X3Ds when they actually make a difference. For now, paying ~25% more for at most 1% extra gaming performance (comparing the 7700X with the 7800X3D), extra performance that is cancelled out by the 7700X's superiority in most applications, seems ridiculous to me.
 
Last edited:
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
If the E-cores disappeared, I believe you would have a 6 GHz P-core chip. Mine happily does 5.9 GHz on the water loop at ambient. Another consideration for a 14700K, actually.

I doubt it would be stable enough, even with all the E-cores disabled, without jamming a ton more voltage into it than I'd want to run daily. Also, I'd much sooner keep the MT higher than the ST myself. It's of higher practical use all around, even if it's nearly impossible to fully utilize 100% of it outside of benchmarks and stress tests.
 
Joined
Jan 14, 2019
Messages
13,198 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Current AMD 3D technology is repurposed server stuff for gaming and, as such, is not ideal - see the problems with cooling and the lower frequency. Much better would be the cache on the same piece of silicon as the cores.
It's not repurposed server stuff. More cache doesn't do anything with server workloads, as far as I'm aware. It only helps with gaming.

This is exactly what I'm trying to explain to you: X3D costs far too much relative to what it offers (the perf/$ section in the reviews leaves no room for discussion), and in this field one year is like 25 years of a person's life.
The history of the 5800X3D is a clear indication of the 7800X3D's future: while the much cheaper 7700X will keep its superior performance in most applications intact, the current $450 7800X3D will merely deliver the gaming performance of a future 8600(X).

After the ferocity with which you defend this 7800X3D, AMD's marketing has clearly worked wonderfully on you.

Question: if with a 4090 your processor cannot pull away from the 7600X at 4K (the $450 5800X3D barely beats a lowly 12100F by 1%), what difference do you get with the 4080, your video card?
Hint: not even the 4090 can deliver 300 fps in 4K AAA games. Not even at 1440p. If the card cannot deliver that many frames per second, the processor cannot be the deciding factor there. The reviews prove it, and I repeat: we see only 1% at 4K between the $450 5800X3D and the $90 12100F.

If you look closely, the 7800X3D gets that small lead thanks to a few games heavily optimized for AMD.

Let's talk about these X3Ds when they actually make a difference. For now, paying ~25% more for at most 1% extra gaming performance (comparing the 7700X with the 7800X3D), extra performance that is cancelled out by the 7700X's superiority in most applications, seems ridiculous to me.
I cannot see any logic in your post. Was the 5800X3D faster than the 5800X when it launched? Yes. Is it still faster by the same margin? Yes. So where's the problem?

You seriously cannot expect last gen's top CPUs to stay on top forever. Did you actually look at where the 11900K is? Every single current gen CPU is ahead of it by a mile. It's way lower down the chart than the 5800X3D. And the 11900K was way more expensive when it launched than the 7800X3D is now, mind you.

If the 7800X3D and the 14900K are on par on the gaming charts now, they will most probably be on par two years from now as well, and we're talking about a £370 vs a £560 CPU. On the other hand, expecting either of them to stay on top and compete with future high-end CPUs is lunacy.
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
@AusWolf
As I said, marketing works very well on you.
At the same price as the 5800X3D you would find the 5900X, which is now 1-2 percent weaker than the 5800X3D in games (with a 4090, of course), but keeps the same colossal lead everywhere else. I think around 30% in multithreaded work.
Question: what would be the difference between them in gaming if a 4090 were not used? Let's say a 4080. 1%, 0.1%, or zero?
What is the difference between X3D and non-X3D with video cards even weaker than the 4080?

You seem convinced that all gamers are using a 4090 or 7900 XTX, and that the rest of the video cards are only used for 1+1 in Excel and drafting invoices.

This time, AMD's trick didn't last as it did with the 5000 series. Prices are falling continuously, with the 7950X3D reaching the level of the 13/14900K.
They tried, but it didn't work. Another sign that you are among the few who still defend the exorbitant price of these processors for a tiny, even non-existent, increase in performance, and only in gaming.

Don't tell me that a 7600/12600/13600 or 7700 causes problems for a 4070/7800 XT... let alone anything below a 4080/7900 XT. At the price of the 7800X3D you can find the 7900X, which ridicules it in multithreaded work. Not to mention the 13/14700K(F).
Your argument is one fps gained and 1 kWh saved over two years of no-life gaming; the extra I'd spend on the processor would pay the bill for an MWh.

Since we are in a thread about Intel efficiency, here are two captures showing the behaviour of the 14700KF in two games, the ones I play the most.
In the first one, you can see ~20 W on the CPU with the video card limited to 100 fps (a session of about 30 minutes, 20 W average on the CPU).
In the second, you can see ~40 W on the CPU without a framerate limit, the video card rendering over 300 fps in some places.

Another thing I want to mention: the wattmeter shows a minimum of 21 W with the 13500 at idle. On the other system, the wattmeter does not drop below 67 W with the same processor. The processor is not to blame, but the other components are (different motherboard, more fans, AIO, etc.). That's with all the LEDs off; otherwise the consumption goes up to 90 W.
The difference is at least 46 W, which makes the discussion ridiculous on this front as well. Talking about a 10-20 W difference in a processor while you "decorate" your PC like a Christmas tree is like those who want to save the planet but arrive at the meetings in jets and powerful cars.
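For scale, a rough sketch of what that kind of platform-level difference adds up to over a year; the hours-per-day figure is just an assumption, not something measured here:

Code:
# Rough yearly energy from an idle power difference between two otherwise similar systems.
IDLE_DELTA_WATTS = 46.0   # 67 W - 21 W from the two wattmeter readings above
HOURS_ON_PER_DAY = 8.0    # hypothetical on-time per day

kwh_per_year = IDLE_DELTA_WATTS / 1000 * HOURS_ON_PER_DAY * 365
print(f"~{kwh_per_year:.0f} kWh per year from the platform difference alone")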

[Attached screenshots: gaming at 100 fps with V-Sync; gaming at over 300 fps uncapped]
 
Joined
Nov 16, 2023
Messages
1,575 (3.75/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
I doubt it would be stable enough, even with all the E-cores disabled, without jamming a ton more voltage into it than I'd want to run daily. Also, I'd much sooner keep the MT higher than the ST myself. It's of higher practical use all around, even if it's nearly impossible to fully utilize 100% of it outside of benchmarks and stress tests.
Oh, for sure. I understand totally. Most of the other PCs in the house run pretty much stock.
I abuse them for a while and then use them for a while. There's an office PC, the kids' gaming PCs, and then my bench PC where I do all the beating on stuff.

V-core? It just depends on -
A. Fear factor. (Not saying you; people in general.)
B. Cooling. (Not easy to cool when overvolting.)

But V-core shouldn't worry people.
The 13700K I ran under dry ice did 6.7 GHz at 1.670 V.
I've been posting screenshots in this thread with overclocks and underclocks from the very same chip. You'll know if it dies, because I won't have any more screenshots to share! lol.
 
Joined
Oct 30, 2022
Messages
243 (0.30/day)
Location
Australia
System Name Blytzen
Processor Ryzen 7 7800X3D
Motherboard ASRock B650E Taichi Lite
Cooling Deepcool LS520 (240mm)
Memory G.Skill Trident Z5 Neo RGB 64 GB (2 x 32 GB) DDR5-6000 CL30
Video Card(s) Powercolor 6800XT Red Dragon (16 gig)
Storage 2TB Crucial P5 Plus SSD, 80TB spinning rust in a NAS
Display(s) MSI MPG321URX QD-OLED (32", 4k, 240hz), Samsung 32" 4k
Case Coolermaster HAF 500
Audio Device(s) Logitech G733 and a Z5500 running in a 2.1 config (I yeeted the mid and 2 satellites)
Power Supply Corsair HX850
Mouse Logitech G502X lightspeed
Keyboard Logitech G915 TKL tactile
Benchmark Scores Squats and calf raises
I'm still amused by Beagles' "Ryzens are hard to cool."

You seem to perpetually equate temperature with how hard something is to cool. It's not about the temperature alone; it's about the wattage and energy.

You really need to learn the difference.
 
Joined
Nov 16, 2023
Messages
1,575 (3.75/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
I'm still amused by Beagles' "Ryzens are hard to cool."

You seem to perpetually equate temperature with how hard something is to cool. It's not about the temperature alone; it's about the wattage and energy.

You really need to learn the difference.
It's all one and the same.

The wattage you put in simply gets converted to thermal energy.
If you are "using" 105 W, you simply need to move 358 BTU/hr.
And do that while keeping the AMD CPU under 90/95 °C, depending on the generation.

If you can't move the BTU fast enough, they are hard to cool.
A delid and direct die will make it easier to move a given BTU/hr.
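A quick sketch of that watts-to-BTU/hr conversion, in case anyone wants to plug in their own numbers; the package power values below are just examples:

Code:
# Convert CPU package power (watts) into the heat a cooler has to move (BTU/hr).
WATT_TO_BTU_PER_HR = 3.412142  # 1 W = 3.412142 BTU/hr

for package_watts in (105, 253):  # example figures only
    btu_per_hr = package_watts * WATT_TO_BTU_PER_HR
    print(f"{package_watts} W -> {btu_per_hr:.0f} BTU/hr")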

Above is how I approach cooling. I know I need to get the water as close to the heat source as possible. I also know water doesn't shed heat as fast as it absorbs it, so add an extra radiator - a 120.2 isn't enough for most higher-end chips. The cold plate is too small. Acrylic top. The in-loop pump adds to the water delta.

Really, you can only cool a CPU in two dimensions, on a single surface.

My next hat trick:
Try to run the 13700K with passive cooling. XD
 
Joined
Oct 30, 2022
Messages
243 (0.30/day)
Location
Australia
System Name Blytzen
Processor Ryzen 7 7800X3D
Motherboard ASRock B650E Taichi Lite
Cooling Deepcool LS520 (240mm)
Memory G.Skill Trident Z5 Neo RGB 64 GB (2 x 32 GB) DDR5-6000 CL30
Video Card(s) Powercolor 6800XT Red Dragon (16 gig)
Storage 2TB Crucial P5 Plus SSD, 80TB spinning rust in a NAS
Display(s) MSI MPG321URX QD-OLED (32", 4k, 240hz), Samsung 32" 4k
Case Coolermaster HAF 500
Audio Device(s) Logitech G733 and a Z5500 running in a 2.1 config (I yeeted the mid and 2 satellites)
Power Supply Corsair HX850
Mouse Logitech G502X lightspeed
Keyboard Logitech G915 TKL tactile
Benchmark Scores Squats and calf raises
It's all one and the same.

The wattage you put in simply gets converted to thermal energy.
If you are "using" 105 W, you simply need to move 358 BTU/hr.
And do that while keeping the AMD CPU under 90/95 °C, depending on the generation.

It's not the same.

An AMD CPU running at 95 °C (at 100 W) vs a 14900K running at 95 °C (at 200+ W): the AMD chip WILL require less cooling.

Temperature is only part of the equation.

Otherwise you could cool a golf-ball-sized rock at 100 °C with the same bucket of water you'd use to cool a soccer-ball-sized rock at 100 °C.

It's about thermal energy and heat capacity, measured in joules.

Or more succinctly: it takes 10x the energy to heat 10 units of water (gallons, litres, cups) as it does a single unit. Otherwise I could boil a kettle for a cup of coffee and heat a bath to the same temperature with the same energy.
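A minimal sketch of that point using water's specific heat; the cup and bath sizes are just assumed round numbers:

Code:
# Energy needed to raise water by the same temperature difference: Q = m * c * delta_T.
SPECIFIC_HEAT_WATER = 4186.0  # J/(kg*K)
DELTA_T = 80.0                # heating from ~20 C to ~100 C

def joules_to_heat(mass_kg):
    return mass_kg * SPECIFIC_HEAT_WATER * DELTA_T

cup = joules_to_heat(0.25)    # ~0.25 kg of water for a coffee (assumed)
bath = joules_to_heat(100.0)  # ~100 kg of water for a bath (assumed)
print(f"cup:  {cup / 1000:.0f} kJ")
print(f"bath: {bath / 1000:.0f} kJ ({bath / cup:.0f}x the energy for the same temperature rise)")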

The wattage on CPU readouts is also a misnomer (set a 65 W eco cap and see how many CPUs pull just 65 W).
 
Joined
May 24, 2023
Messages
957 (1.61/day)
Heat flow depends on the temperature difference and the thermal resistance. The higher the temperature difference, the higher the heat flow. The higher the thermal resistance, the lower the heat flow.

(Thermal resistance increases with material thickness, decreases with the surface area through which the heat flows, and depends on the type of material; some of the best heat conductors are silver and copper, and aluminium is also pretty good.)

Say you run a 14600K and a 7800X3D, both at 75 W, with the same cooler on them. The 75 W of electrical power draw turns into 75 W of heat flow.

The 7800X3D has a piece of memory chip on top of the CPU die and a thick heatspreader on top of that, so it has significantly higher thermal resistance.

If you want to remove 75 W from the 7800X3D, it needs to heat up to a significantly higher temperature than the 14600K to overcome the higher thermal resistance.

Or restated:

With both the 14600K and the 7800X3D at the same temperature, the lower thermal resistance lets you remove significantly more heat from the 14600K than from the 7800X3D. For example, with both CPUs at 80°C, you would be able to remove 75 W from the 7800X3D and 200 W from the 14600K.
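A small sketch of that relationship (heat flow = ΔT / R_th); the thermal resistance values are picked purely to reproduce the 75 W / 200 W example above, not measured figures:

Code:
# Steady-state heat flow through a thermal resistance: Q = delta_T / R_th.
T_COOLANT_C = 25.0   # assumed coolant/ambient temperature
T_DIE_C = 80.0       # both CPUs held at the same temperature

def heat_flow_watts(r_th_c_per_w):
    return (T_DIE_C - T_COOLANT_C) / r_th_c_per_w

print(f"lower R_th (thin path to the cooler):  {heat_flow_watts(0.275):.0f} W")   # ~200 W
print(f"higher R_th (cache layer + thick IHS): {heat_flow_watts(0.733):.0f} W")   # ~75 W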
 
Last edited:
Joined
Mar 13, 2021
Messages
483 (0.35/day)
Processor AMD 7600x
Motherboard Asrock x670e Steel Legend
Cooling Silver Arrow Extreme IBe Rev B with 2x 120 Gentle Typhoons
Memory 4x16Gb Patriot Viper Non RGB @ 6000 30-36-36-36-40
Video Card(s) XFX 6950XT MERC 319
Storage 2x Crucial P5 Plus 1Tb NVME
Display(s) 3x Dell Ultrasharp U2414h
Case Coolermaster Stacker 832
Power Supply Thermaltake Toughpower PF3 850 watt
Mouse Logitech G502 (OG)
Keyboard Logitech G512
Current AMD 3D technology is repurposed server stuff for gaming and, as such, is not ideal - see the problems with cooling and the lower frequency. Much better would be the cache on the same piece of silicon as the cores.
Yes, but that incurs its own penalties: die space, which in turn drives up cost AND the likelihood of defective parts.

X3D is an alternative way of massively increasing cache size without impacting the other processing areas of the chip. Plus you get the added advantage of having one main core design for two high-end products.

If you want to remove 75 W from the 7800X3D, it needs to heat up to a significantly higher temperature than the 14600K to overcome the higher thermal resistance.

A flawed design choice by AMD, in my opinion: keeping the thicker IHS for compatibility with existing AM4 coolers at the cost of cooling efficiency. We have already seen the insane drops from removing the IHS on AM5 ¬_¬
 
Last edited:
Joined
Jan 14, 2019
Messages
13,198 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
@AusWolf
As I said, marketing works very well on you.
At the same price as the 5800X3D you would find the 5900X, which is now 1-2 percent weaker than the 5800X3D in games (with a 4090, of course), but keeps the same colossal lead everywhere else. I think around 30% in multithreaded work.
Question: what would be the difference between them in gaming if a 4090 were not used? Let's say a 4080. 1%, 0.1%, or zero?
What is the difference between X3D and non-X3D with video cards even weaker than the 4080?
Why would I buy a CPU that's 1-2% (or any percent) weaker for gaming? How many times do I have to repeat that I don't care about MT tests, rendering, or anything else that requires lots of cores? You can give me a CPU that is 5000% faster in Cinebench, but if it's slower in games, I won't care. You use your CPU for whatever you want to; I only need it for gaming. Being faster in a program that you never use is placebo at most.

Oh, did I mention that I used to own a 5950X at some point? But games used about 15-20% of it at most, so I sold it, swapped it for an R5 3600 (which I didn't like), then an 11700, which I still have. You can probably guess how much performance difference I saw between the 5950X and the 11700 in games: none.

You seem convinced that all gamers are using a 4090 or 7900 XTX, and that the rest of the video cards are only used for 1+1 in Excel and drafting invoices.
No, I am not. But I am convinced that people with a 7800 XT, like myself, will eventually upgrade to a 9800 XT or 6060 Ti, or something similar.

I am also not saying that a gamer has to buy a 7800X3D with a top-range GPU to enjoy gaming, I don't know where you get that idea from.

Don't tell me that a 7600/12600/13600 or 7700 causes problems for a 4070/7800 XT... let alone anything below a 4080/7900 XT. At the price of the 7800X3D you can find the 7900X, which ridicules it in multithreaded work. Not to mention the 13/14700K(F).
Your argument is one fps gained and 1 kWh saved over two years of no-life gaming; the extra I'd spend on the processor would pay the bill for an MWh.
Did you read my post where I said that if I could start 2023 all over again, I'd buy a 7700 non-X and call it quits? Paired with a mid-range card, a 7800X3D is absolutely not necessary, but neither is a 14700K.

I bought the 7800X3D because I was curious. I'm a hardware enthusiast, I like tinkering with my PC even when I have absolutely no logical reason to. Do I recommend others to do the same? Of course not.

If someone wants mid-range gaming advice from me, I'll say a 13500 or 7600 non-X and a 4070 or 7700 XT do the job just fine.

Edit: Saying that a 7900X is better than the 7800X3D, even if it's worse in gaming, just because it has more cores is the same argument as saying that an FX-8150 is better than a 2700K because it has more cores. Back then, I listened to this argument and bought an 8150, only to be terribly CPU-limited in games that didn't utilise lots of cores. Assassin's Creed 3 stuttering at 25-30 FPS with a Radeon HD 7970 that sat at 40% utilisation due to a huge (ST) CPU bottleneck was quite the experience, believe me! It's an exaggerated example, but I hope you get the point.
 
Last edited:
Joined
Jul 30, 2019
Messages
3,374 (1.70/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
The wattage on CPU readouts is also a misnomer (set a 65 W eco cap and see how many CPUs pull just 65 W).
If I recall correctly, on AM4 the 65 W eco cap is based on TDP. If you want to control total watts, you need to adjust PPT.
 
Joined
Jan 14, 2019
Messages
13,198 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
If I recall correctly, on AM4 the 65 W eco cap is based on TDP. If you want to control total watts, you need to adjust PPT.
Yes. Intel's TDP (or PL) = maximum watts. AMD's PPT = TDP (a somewhat arbitrary number) × 1.35 = x watts. So basically, Intel's TDP/PL and AMD's PPT play the same role, but the two TDPs mean entirely different things.

Edit: Some basic TDP/PPT values on AMD:
65 W TDP = 88 W PPT,
95 W TDP = 128 W PPT,
105 W TDP = 142 W PPT,
120 W TDP = 162 W PPT,
170 W TDP = 230 W PPT.
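A one-liner version of that rule of thumb, assuming the ×1.35 factor holds across the board:

Code:
# AMD socket power limit (PPT) is roughly TDP x 1.35 on AM4/AM5.
def ppt_from_tdp(tdp_watts):
    return round(tdp_watts * 1.35)

for tdp in (65, 95, 105, 120, 170):
    print(f"{tdp:>3} W TDP -> ~{ppt_from_tdp(tdp)} W PPT")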
 
Joined
Jul 30, 2019
Messages
3,374 (1.70/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
It's not repurposed server stuff. More cache doesn't do anything with server workloads, as far as I'm aware. It only helps with gaming.
This is where I see a problem with AMD's 3D cache design. They literally threw more cache at the performance problem like politicians tend to throw cash at political problems. More cache/cash can only get you so far, one still needs to use it effectively. This is where I hope AMD has a follow-up plan in trying to maximize the usage/potential for software to use it effectively. If they don't then they have a whole lot of cache not doing much but keeping the TDP of chips low and winning the power efficiency game.
 
Joined
Jan 14, 2019
Messages
13,198 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
This is where I see a problem with AMD's 3D cache design. They literally threw more cache at the performance problem like politicians tend to throw cash at political problems. More cache/cash can only get you so far, one still needs to use it effectively. This is where I hope AMD has a follow-up plan in trying to maximize the usage/potential for software to use it effectively. If they don't then they have a whole lot of cache not doing much but keeping the TDP of chips low and winning the power efficiency game.
Throwing cache, or cores at the problem is the same kind of dumb solution, in my opinion. But whatever works, works, you can't argue with results.
 
Joined
Jul 30, 2019
Messages
3,374 (1.70/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
Throwing cache, or cores at the problem is the same kind of dumb solution, in my opinion. But whatever works, works, you can't argue with results.
More cores at least can be 100% utilized. Getting the 3D cache 100% utilized is, I think, a challenge AMD needs to face going forward, otherwise the manufacturing complexity and cost may not be worth it. This is why I was wondering whether the trend of new games benefitting from X3D cache is going up or down. If it's going down, then X3D CPUs become obsolete going into the future and lose significant value. AMD needs to ensure X3D retains its value.
 
Joined
Sep 17, 2014
Messages
22,830 (6.06/day)
Location
The Washing Machine
System Name Tiny the White Yeti
Processor 7800X3D
Motherboard MSI MAG Mortar b650m wifi
Cooling CPU: Thermalright Peerless Assassin / Case: Phanteks T30-120 x3
Memory 32GB Corsair Vengeance 30CL6000
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Lexar NM790 4TB + Samsung 850 EVO 1TB + Samsung 980 1TB + Crucial BX100 250GB
Display(s) Gigabyte G34QWC (3440x1440)
Case Lian Li A3 mATX White
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse Steelseries Aerox 5
Keyboard Lenovo Thinkpad Trackpoint II
VR HMD HD 420 - Green Edition ;)
Software W11 IoT Enterprise LTSC
Benchmark Scores Over 9000
Fair enough, but consider the price of a new graphics card, plus losing on the performance. You'd have to game for roughly 10 years on the 4070 instead of the 7900 for the difference in your bill to return its price.
That, and a 7900 XT is much faster, so you can either limit it harder (framerate cap) or run it harder if needed. And yes, you can also get that card down to 150 W in game. I doubt it will perform notably worse than a 4070 at that point.

Power usage is hardly ever an argument about cost; it is much more one of heat and longevity. If you don't want to use energy, don't game. Efficient gaming is, in the larger scheme of things, simply a trade-off: quality & FPS versus cost, regardless of card.
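Roughly that payback math in a sketch; every input below is a placeholder, so swap in your own prices, wattage gap and hours:

Code:
# Back-of-the-envelope payback time for a cheaper but hungrier graphics card.
PRICE_GAP = 300.0            # hypothetical price difference between the two cards
EXTRA_WATTS_IN_GAME = 75.0   # hypothetical extra draw of the hungrier card while gaming
HOURS_PER_DAY = 2.0
PRICE_PER_KWH = 0.40

extra_kwh_per_year = EXTRA_WATTS_IN_GAME / 1000 * HOURS_PER_DAY * 365
years_to_break_even = PRICE_GAP / (extra_kwh_per_year * PRICE_PER_KWH)
print(f"~{extra_kwh_per_year:.0f} kWh extra per year, ~{years_to_break_even:.0f} years to recover the price gap")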
 
Joined
Jan 14, 2019
Messages
13,198 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
More cores at least can be 100% utilized.
Can be utilised is not the same as will be utilised. Paying more for more cache isn't a bigger gamble than paying for more cores, except that bigger cache can also be used by today's games:
[Chart: relative gaming performance at 1280×720]
 
Last edited:
Joined
Nov 16, 2023
Messages
1,575 (3.75/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
Ouch! Passive cooling works only if the heatsink gets pretty hot; you need larger gaps between the fins than with a forced-airflow cooler, and you also need to orient the fins so that the air can flow freely upwards.

Yes, well, I don't have a purpose-designed passive heatsink.

But from past experience, I might be able to passively cool up to about 50 W. Usually not more, because of the exponential temperature increase.

But right now, with the fan in silent mode, it's effectively passively cooled. The load is the challenge!!
 