
14900k - Tuned for efficiency - Gaming power draw

Cyberpunk 2077 has been known to use all/most available threads.

View attachment 325762
Even in Cyberpunk, HT off is faster...

I've tested running very crowded areas that peg the CPU with HT on or off, and the results are about the same, with HT slightly ahead. I'm testing with RT off, DLSS on, and the 4090 hitting a CPU bottleneck. Once you turn on RT, it's all GPU-bound again.
 
Even in Cyberpunk, HT off is faster.

I've tested running very crowded areas that peg the CPU with HT on or off, and the results are about the same, with HT slightly ahead.
Good to know. I've been running the last couple of days testing HT on/off but haven't gotten to try 2077 yet.
 
I think HT basically relies on pipeline stalls and dead time from cache misses, etc., to keep the CPU busy -- so it's a fake thread endpoint that basically sits and waits for dead time to use the processor. It seems like games can't really use that effectively. So even though Cyberpunk is loading those threads, they're not really performing in a way that boosts performance.
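
If anyone wants a quick way to A/B this without a BIOS round-trip, here's a rough Python sketch using psutil to pin a running game to one logical CPU per physical core, which roughly approximates HT off for that process. The process name, the 8-P-core count, and the adjacent-sibling layout are all my assumptions, not anything from this thread -- check your own topology (Task Manager or Sysinternals Coreinfo) first:

```python
import psutil

# Assumptions (not from this thread): 8 P-cores, and Windows enumerates
# each P-core's HT siblings as adjacent pairs (0,1), (2,3), ... with any
# E-cores after them. Verify with Task Manager or Coreinfo before use.
PHYSICAL_P_CORES = 8
primaries = [2 * i for i in range(PHYSICAL_P_CORES)]  # first sibling of each pair

# Find the game by executable name (hypothetical example name) and
# restrict it to one logical CPU per physical core -- "HT off" for it.
game = next(p for p in psutil.process_iter(['name'])
            if p.info['name'] == 'Cyberpunk2077.exe')
game.cpu_affinity(primaries)
print('New affinity:', game.cpu_affinity())
```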
 
Core counts are generally overrated for games.

Our RAM tests show this, as does individual testing with HT disabled, for example.

View attachment 325768
 
Core counts are generally overrated for games.

Our RAM tests show this, as does individual testing with HT disabled, for example.

View attachment 325768
Far Cry 5 is a great test for this, as it runs one of the most brittle async shader-loading background processes I've ever had to tune for (or outright disable). If anything is even remotely off, it will hitch like crazy. Not enough threads? Hitching! Too many threads that aren't fast enough? Hitching! RAM latency not ludicrously low? SSD not at the perfect temperature? ...you get the idea.
 
Hi,
Those silly world record creators at 9 GHz... that high? Hell, they could have done it with 100 W :laugh:
Yes, I agree.

So I want to test something more realistic for the topic, but I don't own anything intense or modern, I guess.

At the store I got 20 feet of 1/2" nylon tubing and a new submersible water pump.

Have to decide which water block; I have several, and they all have performance impacts.

Far Cry 5 is a great test for this, as it runs one of the most brittle async shader-loading background processes I've ever had to tune for (or outright disable). If anything is even remotely off, it will hitch like crazy. Not enough threads? Hitching! Too many threads that aren't fast enough? Hitching! RAM latency not ludicrously low? SSD not at the perfect temperature? ...you get the idea.
Is Far Cry DX12 at least?
 
Well, at first 253 W really, really meant something to you (other people too!).

And now 50 W means nothing!!!
With 3 cores? Thanks, but this isn't 2010 anymore.

If I want pure wattage, I'll use my netbook with a dual-core Celeron that tops out at 6 W with both the CPU and iGPU loaded.

Well, given that you can do this on air, I'd say this is pretty neat. It shows how far these Intel procs stretch, both towards the top and the bottom, all at the same time. I don't think Zen has nearly that tweakability; you can tweak it, but not to these extremes.

Say you love StarCraft. A 6 GHz 13th-gen core will play that game so hard. :D
It's funny because Intel used to be the plug-and-play CPU brand, and AMD the tweakers' dream, but nowadays, it's the other way around.

Not that I care, to be honest. Whatever works, works.
 
With 3 cores? Thanks, but this isn't 2010 anymore.

If I want pure wattage, I'll use my netbook with a dual-core Celeron that tops out at 6 W with both the CPU and iGPU loaded.
I can handle the 13700K with the MSI preset enabled on a box cooler; it will hit 100°C by the end of a CB R24 run, with a 24-thread multi-core score of around 1640 pts. That's with a Wraith Prism.

So that's with the 253 W restriction; the CPU pulls up to 210 W with an average Vcore of 1.31 V. If you change the CPU multiplier, the board automatically sets a 4094 W limit or whatever, and it just throttles a little earlier, with an average clock speed of 5196 MHz because of BCLK droop. Which drives me nuts, honestly. But that's part of the power-saving standards, I suppose.

But you're right. None of this applies to gaming.

Far Cry 5 is DX9 and requires a Ryzen 1600. Did I read that right?

This can't be 2018 testing any more??? Not valid??

FML.

Where, then, is the happy medium? I'd gladly test some gaming benchmarks, but I'd hold to a standard of DX12 and maybe 1440p, to target between 1080p and 4K gamers. So a mid-range GPU; I have a 6700 XT.

Target performance metric: frame-rate increases from CPU changes only; the card must remain at defaults.

But I need an adequate game. Flight Sim? It's DX11 and 12. Can I fly with an Xbox controller? I'll buy it to try it.

The last time I played Flight Simulator was maybe 1992, on an Apple with a green monochrome screen. I have Forza on Xbox One, though... prefer to drive :)
 
I may have to look into the HT on/off differences with a few games eventually, after I get the new system set up better. I'm curious how it behaves if you enable all HT, disable all HT, or disable HT on just a portion of the cores, and also whether it matters at all which cores keep their HT threads. I'd probably try running half or a quarter as many HT threads and see whether it matters if you alternate them every other core or keep them sequential from the primary cores.
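
If I do end up scripting that rather than toggling it in the BIOS, I imagine something like this sketch could generate the logical-CPU sets for each variant (all of it assumes the usual adjacent-sibling-pair layout, which would need verifying on the actual board):

```python
# Hypothetical helper for the variants described above: all HT, no HT,
# half the HT threads alternating every other core, or half sequential.
# Assumes P-core HT siblings are adjacent pairs (0,1), (2,3), ...
P_CORES = 8

def ht_subset(ht_cores):
    """One logical CPU per core, plus the HT sibling for cores in ht_cores."""
    cpus = []
    for core in range(P_CORES):
        cpus.append(2 * core)          # primary thread is always included
        if core in ht_cores:
            cpus.append(2 * core + 1)  # HT sibling only for selected cores
    return cpus

variants = {
    'all HT':            ht_subset(range(P_CORES)),
    'no HT':             ht_subset([]),
    'half, alternating': ht_subset(range(0, P_CORES, 2)),  # every other core
    'half, sequential':  ht_subset(range(P_CORES // 2)),   # first half only
}
for name, cpus in variants.items():
    print(f'{name:>17}: {cpus}')
```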

The BIOS has a lot of options for enabling/disabling HT per P-core, so it would be nice to look at what can be done with them and how. I think eventually on the 14700K I'll have to see how it behaves with HT disabled on all but the last two P-cores: set P-cores 2-5 at a 49 ratio (700 MHz lower than the 56 boost on the first two cores, 0-1), and drop the remaining two cores, 6-7, to a 42 ratio (1400 MHz lower than the first two) with HT enabled on them. So 4.2 GHz with HT, 5.6 GHz without, and a steadier 4.9 GHz in between the two. I set the AVX offset to 28 for the time being, which is exactly half of the 56 boost ratio, though I may bump it up to x35 or x42 later. I'm just looking for balanced usage, performance, and efficiency overall.

The use of hotkeys to change settings on demand seems like a great idea in concept. I'd like it if I could use the case reset button to change Asus motherboard profiles, if they could modify the software to allow for that. There is already a BIOS option to assign the reset switch to a few things on my Z790-H Strix, so it doesn't seem too far-fetched that it could tell the Asus software to toggle between two different profiles, or even cycle through more than two. Basically a modern-day turbo button, or a turbo/efficiency button.
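
Until a board vendor wires that up, a crude software stand-in is possible. The sketch below uses the third-party keyboard package plus Windows' stock powercfg tool to flip between the two built-in power plans on a hotkey; the hotkey combo is an arbitrary choice of mine, and you'd substitute your own tuned plans' GUIDs from `powercfg /list`:

```python
import subprocess
import keyboard  # third-party: pip install keyboard (needs admin rights on Windows)

# Windows' standard built-in plan GUIDs; replace with your own custom
# plans' GUIDs (see `powercfg /list`) for a tuned turbo/efficiency pair.
PLANS = [
    '8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c',  # High performance ("turbo")
    'a1841308-3541-4fab-bc81-f71556f20b4a',  # Power saver ("efficiency")
]
current = 0

def toggle():
    """Cycle to the next power plan -- a software turbo button."""
    global current
    current = (current + 1) % len(PLANS)
    subprocess.run(['powercfg', '/setactive', PLANS[current]], check=True)
    print('Switched to plan', PLANS[current])

keyboard.add_hotkey('ctrl+alt+t', toggle)  # arbitrary hotkey choice
keyboard.wait()  # block forever, listening for the hotkey
```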
 
Not that I care, to be honest. Whatever works, works.
Why so serious?

It's a benchmark, with bench settings, c'mon now. You have completely missed the point.

But you're just a gamer, right, no benching behind you?
 
Why so serious?

It's a benchmark, with bench settings, c'mon now. You have completely missed the point.

But you're just a gamer, right, no benching behind you?
I didn't mean it that way - I'm happy for all hardware. :)

I just don't see the point of running 6 GHz if I have to disable nearly the whole CPU to do so.

As for any benchmarking background - nah, not really. I use benchmarks to see if my system is running right, but that's all. Like I said, I'm a practical person - I'm looking for real-world gains: more performance in games, easier cooling, less noise, etc. The only place where numbers interest me is my banking app. But each to their own. :)
 
Why so serious?

It's a benchmark, with bench settings, c'mon now. You have completely missed the point.

But you're just a gamer, right, no benching behind you?
Cyberpunk 2077 any good? I'm DLing the 62 GB initial install. Lol.

I want to ask if I should update W10. It's at the base install from the thumb drive. Think there will be any compatibility issues there?
 
I want to ask if I should update W10. It's at the base install from the thumb drive. Think there will be any compatibility issues there?
I'd probably try it first; you might get lucky :)
 
Another topic strictly about Intel, and inflamed by AMD supporters. Didn't you have a garden or something? Play there.

Another topic strictly about Intel, and inflamed by AMD supporters. Didn't you have a garden or something? Play there.

The 14700KF is cheaper than the 7800X3D (at least here).
In single-threaded tests, it destroys this X3D. Not even the 7950X, AMD's fastest single-thread processor, can beat it.
In the multithreaded tests, it destroys it even with power capped at ~100 W. There you get results comparable to the non-X3D 7900X.
In gaming, I repeat, what does the X3D help me with? I have a 3070 Ti now.
Yes, the 7800X3D gains a minimum of 2% if you use an RTX 4090. What is the percentage of those using a 4090?

E.g., V-Ray:
7800X3D = 14334
14700KF, ~100 W optimized = see screenshot
View attachment 325816
 
I didn't mean it that way - I'm happy for all hardware. :)

I just don't see the point of running 6 GHz if I have to disable nearly the whole CPU to do so.

As for any benchmarking background - nah, not really. I use benchmarks to see if my system is running right, but that's all. Like I said, I'm a practical person - I'm looking for real-world gains: more performance in games, easier cooling, less noise, etc. The only place where numbers interest me is my banking app. But each to their own. :)

You don't exactly "need" to disable nearly the entire CPU to run 6 GHz, though. Basically, why heat an unoccupied room!? PBO does something similar, though it's more centered around boosting performance rather than curbing some of it for the sake of efficiency. Intel obviously pushed stock performance a bit more to the extreme, but that's also exactly why people are interested in taming it: it was a bit excessive, at least on the higher-end unlocked parts at full load, which I imagine most people aren't going to lean on heavily all the time. I picked up a 14700K not because I need that much ST boost or collective MT performance all the time, but because it's there if I need or want it, and it benefits me to have it available. Basically the same reason for having a certain threshold of system memory, VRAM, or storage capacity.
 
Another topic strictly about Intel, and inflamed by AMD supporters. Didn't you have a garden or something? Play there.

Another topic strictly about Intel, and inflamed by AMD supporters. Didn't you have a garden or something? Play there.

The 14700KF is cheaper than the 7800X3D (at least here).
In single-threaded tests, it destroys this X3D. Not even the 7950X, AMD's fastest single-thread processor, can beat it.
In the multithreaded tests, it destroys it even with power capped at ~100 W. There you get results comparable to the non-X3D 7900X.
In gaming, I repeat, what does the X3D help me with? I have a 3070 Ti now.
Yes, the 7800X3D gains a minimum of 2% if you use an RTX 4090. What is the percentage of those using a 4090?

E.g., V-Ray:
7800X3D = 14334
14700KF, ~100 W optimized = see screenshot
View attachment 325816
Yes, yes, we all know the 7800X3D is a gaming-centric chip that is really good at gaming, but that's it. I chose a 14700KF myself; I like Intel, I like the fast startup times, and I already had a compatible motherboard.

However, I don't think it's fair to compare the 7800X3D in things other than games, you know? Because it's a gaming, and... pretty much, a gaming-only chip, though general use is fine too, of course.
 
Honestly, I strongly considered a 7900X3D, but the combo pricing on the 14700K and Z790-H STRIX at $550 was hard to pass up on value, and they're similar enough overall in performance. Yes, there are inherent differences between the two CPUs, and I'd say objectively some things are clearly better or worse in both directions for different use cases. I somewhat fall into the either/or use case for both, so I wasn't overly concerned either way; value for the dollar was the biggest factor, and the combo that was available just felt better.

I would say there are a few cases where the higher ST/MT performance of the 14700K matters more than the L3 cache performance of the 7900X3D, and that weighed into my decision a little as well, since I dabble in a lot of different software at times, from my DAW interest to things like running a game server, 3D modeling, and photo editing. There are a lot of instances where core count is as important as, if not more important than, cache size, but there are scenarios that are the opposite, especially low-latency gaming.

I don't think it's at all as big an issue as most individuals tend to make it, but people have different expectations, and many get swept up in somewhat of a placebo effect of relative performance uplift. There are clearly measurable performance differences, though I'd say some individuals are overly critical of minor nuances between two similar setups that both perform very god damn well compared to PCs from 5 to 10 years ago.

I'd argue it's becoming increasingly hard to build a badly performing PC today on a moderate budget. I went a bit heavy-handed on the CPU upgrade, but I want it to last, and I can update the GPU later. The 14700K could realistically last me half a decade to a decade easily; it's going to be a solid overall performer for a good long while. It could hypothetically end up being the last desktop I ever build, since portable performance is only going to improve, and doing away with the PC isn't far-fetched. At the very least, your desktop a decade from now might be reserved strictly for an external storage array and/or an external GPU. The desktop will certainly still exist, but I suspect it won't really be necessary for what today's high-end desktops do, as node refinements bring down thermal requirements and improve overall efficiency by leaps and bounds.
 
I'd argue it's becoming increasingly hard to build a badly performing PC today on a moderate budget.
Totally. Pretty hard to get a bad CPU these days unless you specifically go out of your way to find one. I was just using a 12100 for a week or so while I waited for my 14700KF replacement, and I can't say I noticed any difference in day-to-day tasks. Never tried gaming on it, though... too addicted to Shining Force II on Genesis right now. But yeah, I was hoping this 14700KF and 4090 could last me as long as my 3570K and GTX 1070 did (a long-ass time!).
 
Searched through some older screenshots. GW2 is almost always CPU-limited. It's nearly impossible to benchmark due to the nature of online play; it's never the same scene twice. It's still utilizing all the cores, but it pegs P-cores 3 and 4 and still only hits 40 FPS.

View attachment 325764

Those are your "prime cores", the ones that hit the Thermal Velocity Boost target of 5.8 GHz on your 13900K. On my 13900KS, those are the same ones that run 6 GHz consistently. It seems Windows likes to throw things on those two whenever it can.
 
Searched through some older screenshots. GW2 is almost always CPU-limited. It's nearly impossible to benchmark due to the nature of online play; it's never the same scene twice. It's still utilizing all the cores, but it pegs P-cores 3 and 4 and still only hits 40 FPS.

View attachment 325764
Yeah, too bad the engine is hard-limited to, I believe, 100 or 160 FPS, or something in between. Either way, on my 8700K I would constantly bump into it outside of raids and towns. The game chokes hard on loading lots of assets; try reducing the number of different character models in the scene (there's a setting for it) and watch your FPS in busy areas skyrocket.
 
Another topic strictly about Intel, and inflamed by AMD supporters. Didn't you have a garden or something? Play there.
This is an Intel thread, so if someone feels like sharing some experiences with another vendor, it must be for no other reason than that person being an avid supporter of the opposition, which automatically invalidates their every point, regardless of content. Damn those ignorant peasants trying to contribute from a different perspective, we don't want their dirty feet muddying our clear waters! :rolleyes:

If you were referring to me for whatever reason, then I guess circa 80% of my CPUs during my lifetime as a gamer being Intel, as well as 6 out of the 8 that I currently own, make me an AMD supporter just because the one in my main gaming rig happens to be AMD right now. I guess, hating the R5 3600 that I had for a couple weeks at some point before quickly passing it on, and loving (and still owning) the i7-11700 that I replaced it with, because I could extract way more performance out of it in a SFF case with limited airflow before throttling, also make me an AMD fanboy.

And of course the 7800X3D is absolute garbage because it's only the fastest gaming CPU, nothing else. I probably should have opted for more cores and an architecture that works best with Windows 11 just because you said so, despite the fact that I only care about gaming performance, don't need more cores, and have no intention to install Windows 11 ever.

If you're happy with the 14700K, all the power to you. Nobody says that your experience is any inferior just because other CPUs suit other people's needs better. You have nothing to prove to me, just as I have nothing to prove to you. There's no reason why either of us couldn't be happy with what we have, and there's no reason why both of us couldn't be happy for OP's achievements with the 14900K.

Oh, and the Ryzen owners' thread is open to everyone, as far as I'm aware.

I don't think it's at all as big an issue as most individuals tend to make it, but people have different expectations, and many get swept up in somewhat of a placebo effect of relative performance uplift. There are clearly measurable performance differences, though I'd say some individuals are overly critical of minor nuances between two similar setups that both perform very god damn well compared to PCs from 5 to 10 years ago.
This is very true! I tend to call it review syndrome. We look at graphs that over-exaggerate differences just so they can be shown to the public in a clearly understandable way. No one mentions that those differences are minuscule in a real-life scenario. Nobody notices +5 FPS when you're already over 200 (a 2.5% difference).
 
I'd probably try it first; you might get lucky :)
Hope so. I know that without updates, I may be missing or gaining performance.

5.7 GHz all-core max benchmark speed.
1.40+ V VID.
This equates to:
E-cores disabled
5.5 GHz all-core with HT
So far, average 1.270 V
194.786 W
Vdroop testing in CB R23
90°C load, 26°C idle
Run #13 complete.
How many? As many as it takes.

"I ran a game and got X FPS" isn't exactly great testing. I could go as far as binning each core, but all-core 5.5 GHz scores an average of just over 23K in CB R23.

That's about 5900X multi-core performance.
 
Behold the true 35 W powa! With the 13500T it has the same FPS :)
IMG-20231207-WA0035.jpeg
 
However, I don't think it's fair to compare the 7800X3D in things other than games, you know? Because it's a gaming, and... pretty much, a gaming-only chip, though general use is fine too, of course.
I didn't bring the X3D here. AMD supporters brought it. It's a topic strictly related to Intel, but children can't help themselves.
Until we have a review of this processor with entry-level to mid-range video cards, I think it costs too much for what it offers.

This is an Intel thread, so if someone feels like sharing some experiences with another vendor, it must be for no other reason than that person being an avid supporter of the opposition, which automatically invalidates their every point, regardless of content. Damn those ignorant peasants trying to contribute from a different perspective, we don't want their dirty feet muddying our clear waters! :rolleyes:

If you were referring to me for whatever reason, then I guess circa 80% of my CPUs during my lifetime as a gamer being Intel, as well as 6 out of the 8 that I currently own, make me an AMD supporter just because the one in my main gaming rig happens to be AMD right now. I guess, hating the R5 3600 that I had for a couple weeks at some point before quickly passing it on, and loving (and still owning) the i7-11700 that I replaced it with, because I could extract way more performance out of it in a SFF case with limited airflow before throttling, also make me an AMD fanboy.

And of course the 7800X3D is absolute garbage because it's only the fastest gaming CPU, nothing else. I probably should have opted for more cores and an architecture that works best with Windows 11 just because you said so, despite the fact that I only care about gaming performance, don't need more cores, and have no intention to install Windows 11 ever.

If you're happy with the 14700K, all the power to you. Nobody says that your experience is any inferior just because other CPUs suit other people's needs better. You have nothing to prove to me, just as I have nothing to prove to you. There's no reason why either of us couldn't be happy with what we have, and there's no reason why both of us couldn't be happy for OP's achievements with the 14900K.

Oh, and the Ryzen owners' thread is open to everyone, as far as I'm aware.
Ok, but what are AMD processors doing here?
 