
14900k - Tuned for efficiency - Gaming power draw

So, since most people / reviewers push the 14900K to over 200 watts and draw their conclusions from that, let's see what a properly set up 14900K can do with minimal effort. The game of choice is TLOU because it has the highest power draw right now; in every other game, power drops to between 50 and 70 W after the tuning, so testing those is pointless.

First video is a 14900K running stock, out of the box. Power draw almost hits 200 watts. DC LL (loadline) is calibrated, so the reported power draw is accurate.
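(For anyone who wants to cross-check the reported numbers outside of HWiNFO-style tools: a minimal sketch that reads the package energy counter through Linux's RAPL powercap interface. I'm assuming a kernel with the intel_rapl driver loaded and root access; it should be roughly the same telemetry the monitoring software reads.)

Code:
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"  # package-0 energy counter, in microjoules

def read_energy_uj() -> int:
    with open(RAPL) as f:
        return int(f.read())

prev = read_energy_uj()
while True:
    time.sleep(1.0)
    cur = read_energy_uj()
    delta = cur - prev
    prev = cur
    if delta < 0:
        continue  # counter wrapped around its max range; skip this sample
    print(f"package power: {delta / 1e6:.1f} W")  # microjoules per second -> watts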


Second video is after spending 10 minutes in the BIOS, with no undervolting done: basically, I turned off HT and locked the cores to 5.5 GHz (and also tuned memory). Now performance is around 15% higher while peak power draw dropped from 200 W to 120 W.
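(Side note: the BIOS isn't the only way to do the HT part. On Linux the kernel exposes a runtime SMT switch, so you can flip it without rebooting; a minimal sketch below, assuming root and a kernel new enough to have /sys/devices/system/cpu/smt. On Windows you're stuck with the BIOS toggle as far as I know.)

Code:
SMT_CONTROL = "/sys/devices/system/cpu/smt/control"  # kernel runtime SMT switch

def set_smt(enabled: bool) -> None:
    # Writing "off" offlines every sibling thread; "on" brings them back.
    with open(SMT_CONTROL, "w") as f:
        f.write("on" if enabled else "off")

def smt_state() -> str:
    with open(SMT_CONTROL) as f:
        return f.read().strip()  # "on", "off", "forceoff" or "notsupported"

if __name__ == "__main__":
    print("SMT before:", smt_state())
    set_smt(False)
    print("SMT after:", smt_state())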



What are the negatives of turning HT off? Well, you lose around 10% multithreaded performance (the CB R23 score went from 41k to 36-37k), but temperatures and power draw dropped considerably. If maximum multithreaded performance isn't a priority (remember, even with HT off the 14900K is still one of the fastest CPUs in MT workloads) and gaming is more your thing, turning it off is worth it.

I also have some results after tuning and undervolting, power draw dropped to below 100w even in TLOU, but that requires stability testing so I don't feel it's relevant here.

Also, below is a tuned 12900K just for comparison.

 
Crazy results! I didn't know you could turn off hyper-threading in the BIOS.
I play a lot of Tarkov and that would actually help a lot.
 
"My 14900K will draw less power if I disable hyper-threading" is about as smart as "my body will require less food if I cut off both of my legs". If you buy a $600 CPU only to immediately disable half of its features just to get acceptable power draw, you're not being smart - you are, in fact, being the exact opposite.

I'm getting really tired of seeing these "Intel CPUs can be power efficient too" threads/posts. Nobody cares that they can be, the point is that, at stock, they are not. The fact that it's possible to make these CPUs consume sane amounts of power is not the saving grace that everyone who uses them seems to think it is. If it's not good out of the box, i.e. how the vast majority of users will experience it because most users don't tweak CPU power consumption, it's not good period.
 
If you're going to turn off hyperthreading in an i7/i9, just buy an i5 instead.
 
Not really. HT provides at most around a 25% uplift, and it often does more harm than good. In this case it isn't even close to that: 37k to 41k is roughly a 10-11% difference, not 25%. Still, sometimes it's good to have it.
 
Good results, I would opt for UV as well, probably get below 100 W :)
 
"My 14900K will draw less power if I disable hyper-threading" is about as smart as "my body will require less food if I cut off both of my legs". If you buy a $600 CPU only to immediately disable half of its features just to get acceptable power draw, you're not being smart - you are, in fact, being the exact opposite.

I'm getting really tired of seeing these "Intel CPUs can be power efficient too" threads/posts. Nobody cares that they can be, the point is that, at stock, they are not. The fact that it's possible to make these CPUs consume sane amounts of power is not the saving grace that everyone who uses them seems to think it is. If it's not good out of the box, i.e. how the vast majority of users will experience it because most users don't tweak CPU power consumption, it's not good period.
I think the point was gaming performance. If you notice, the gaming performance increased... If all you're using the CPU for is gaming, then you're not really "disabling HALF the features", you're optimizing what you have for what you do. It may also be on a per-game basis. I'm not saying the 14900K is the best thing in the world, it's just the fastest Intel CPU available right now. Most people with a new i9 have a motherboard with OC capabilities, so this is actually really valuable to a large number of people.

I pretty much only game on my PC. What an idiot I am for buying the best available CPU for gaming, then tweaking it for gaming while playing games that don't utilize HT..... LOL
 
Some of y'all are forgetting this.

 
So, since most people / reviewers push the 14900K to over 200 watts and draw their conclusions from that, let's see what a properly set up 14900K can do with minimal effort. The game of choice is TLOU because it has the highest power draw right now; in every other game, power drops to between 50 and 70 W after the tuning, so testing those is pointless.

First video is a 14900K running stock, out of the box. Power draw almost hits 200 watts. DC LL (loadline) is calibrated, so the reported power draw is accurate.


Second video is after spending 10 minutes in the BIOS, with no undervolting done: basically, I turned off HT and locked the cores to 5.5 GHz (and also tuned memory). Now performance is around 15% higher while peak power draw dropped from 200 W to 120 W.



What are the negatives of turning HT off? Well, you lose around 10% multithreaded performance (the CB R23 score went from 41k to 36-37k), but temperatures and power draw dropped considerably. If maximum multithreaded performance isn't a priority (remember, even with HT off the 14900K is still one of the fastest CPUs in MT workloads) and gaming is more your thing, turning it off is worth it.

I also have some results after tuning and undervolting, power draw dropped to below 100w even in TLOU, but that requires stability testing so I don't feel it's relevant here.

Also, below is a tuned 12900K just for comparison.


An interesting way to approach some tuning, and there is nothing wrong with it at all. You produced some results, and they look decent.

I usually cut the E-cores for most benchmarks, because I don't really game. And for some, like, say, 3DMark Ice Storm, I cut HT and E-cores. It's similar to 3DMark06: it doesn't scale past 6 cores for the CPU test.

Now with that much power draw slashed, that gives room for CPU overclocking. It's much easier to obtain an all-core 5.7/5.8 GHz with only 8 cores running. And the per-core e-peen is nice with HT off. So yeah, some games and benchmarks are going to reflect HT turned off, as long as you have enough cores to cover the game engine, physics and so forth.

Thermal headroom is good, but I feel you could fill that headroom in a little by increasing the P-core frequency. If you've slashed enough power, you could most likely add 1x to the multiplier. Yeah, the power draw goes up, but the CPU returns the favor with better performance.
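(Rough numbers on that trade-off, since dynamic power scales roughly with frequency times voltage squared and an extra multi usually needs a voltage bump. The voltages below are made up purely for illustration, not measurements from this chip.)

Code:
def rel_power(f_ghz: float, vcore: float, f0_ghz: float = 5.5, v0: float = 1.25) -> float:
    """Relative dynamic power versus a 5.5 GHz / 1.25 V baseline (P ~ f * V^2)."""
    return (f_ghz / f0_ghz) * (vcore / v0) ** 2

# +1x multi (5.6 GHz) with a modest 30 mV bump:
print(f"{rel_power(5.6, 1.28):.2f}x power for {5.6 / 5.5:.2f}x clocks")
# -> about 1.07x the power for about 1.02x the clocks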

Does it make sense to manage your system on the fly, or are you the set-it-and-forget-it kind of guy? If I were gaming and wanted a frame-rate increase with a power reduction, I feel your approach is good. If the game never uses the E-cores, might as well cut those too.

Keep on keepin on!
 
This is exactly right. Here are my results with and without HT:

SOTR = Shadow of the Tomb Raider benchmark -- it's pretty indicative of how most games will scale. 13700KF.

[attached screenshot: SOTR benchmark results with HT on vs. off]


Basically, turning HT off is one of the best things you can do -- my overall gaming performance increased substantially while power draw and temps decreased. I lost about 18% in full multithread, but then again I'm still pushing ~24.5k in CB R23, which is pretty damn fast. It also allows you to run a less aggressive LLC stably as well.

HT on CPUs with E-cores doesn't have nearly as large a performance impact as it does on designs without E-cores.

I have more data, but basically every game so far is benefiting from running without HT. Cyberpunk and Far Cry 5 are playing really smoothly -- even Starfield seems to like it. Much better minimum FPS.
 
If you're going to turn off hyperthreading in an i7/i9, just buy an i5 instead.
Does an i5 have the same clocks and L3 cache as the i7 and i9 chips, then? You seem to be implying it's the same but without HT.

I have no problem with the default shipped state of the CPU being reviewed as-is, that's on Intel. But I don't see an issue with users helping each other out to run their chips in a more efficient configuration.
 
Does an i5 have the same clocks and L3 cache as the i7 and i9 chips, then? You seem to be implying it's the same but without HT.

I have no problem with the default shipped state of the CPU being reviewed as-is, that's on Intel. But I don't see an issue with users helping each other out to run their chips in a more efficient configuration.
They're lower, but not by enough to hurt performance to the point where it warrants paying more and then turning HT off.
 
14700K + 3070 Ti
Default settings versus my XTU profile, activated by simply pressing those keys when playing.

Nice, what does your profile change?

The video reminded me of when I chopped off about 70-80 W of my power draw on the Spice Wars map screen by changing my GPU profile on the fly. :)
 
PL1/PL2: 125 W
IccMax: 170 A
Voltage offset: -0.050 V
Nothing spectacular.

According to the TPU review, the 14900K at 125 W loses 2% at 1440p and 0% at 4K, with an RTX 4090. If you lose 6 fps out of 300, the tragedy exists only in the review charts.
With a 3070 Ti, the 14900K doesn't lose anything even at 720p, I suppose.
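(If anyone wants the same PL1/PL2 cap outside of Windows, where XTU isn't available: a minimal sketch using the Linux intel_rapl powercap interface, assuming the package domain shows up at intel-rapl:0 and the script runs as root. IccMax and the voltage offset aren't exposed there, so those would stay in the BIOS.)

Code:
PKG = "/sys/class/powercap/intel-rapl:0"  # package power domain
LIMIT_W = 125

def set_limit(constraint: int, watts: int) -> None:
    with open(f"{PKG}/constraint_{constraint}_power_limit_uw", "w") as f:
        f.write(str(watts * 1_000_000))  # interface takes microwatts

set_limit(0, LIMIT_W)  # constraint 0 = long-term limit (PL1)
set_limit(1, LIMIT_W)  # constraint 1 = short-term limit (PL2)
print(f"PL1/PL2 capped at {LIMIT_W} W")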
 
PL1/PL2: 125 W
IccMax: 170 A
Voltage offset: -0.050 V
Nothing spectacular.

According to the TPU review, the 14900K at 125 W loses 2% at 1440p and 0% at 4K, with an RTX 4090. If you lose 6 fps out of 300, the tragedy exists only in the review charts.
With a 3070 Ti, the 14900K doesn't lose anything even at 720p, I suppose.
I have no idea what IccMax does, but I've seen it mentioned more than once now, so some reading for me to do. :)
 
IccMax is the chip's maximum current limit.

The 14900K is just a single chip that behaves like Intel chips have been behaving since the Core i9-12900K, really.
The efficiency curve of these chips can be optimized quite a lot with some simple undervolting and TDP limiting.

This isn't restricted to gaming, either. Multi-tile rendering can also see quite heavy reductions in CPU power draw if properly tuned for efficiency.
"OK, so why isn't Intel doing this at the factory?"
Well, this has to do with how chips are verified and tested. There's a lot of fat to cut, because that's how wafer-to-product works in the industry today, on AMD's side too.

What I would suggest Intel do, and have suggested in the past: prepare an optimization algorithm and put it in XTU for users who want the best efficiency they can get per TDP.
Make it so you can select from a bunch of TDP options and let the CPU run under that testing algorithm to see how efficiently it can run.
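Very roughly, something like the sketch below. The helper functions are hypothetical stubs standing in for whatever XTU or the firmware would actually call; the point is just the shape of the search (fix the power limit, walk the offset down, keep the last stable step), not a real API.

Code:
def set_power_limit_w(watts: int) -> None:
    print(f"[stub] PL1/PL2 -> {watts} W")  # hypothetical: a real tool would program the limit

def set_voltage_offset_mv(mv: int) -> None:
    print(f"[stub] core voltage offset -> {mv} mV")  # hypothetical

def run_stability_test(minutes: int) -> bool:
    return True  # stand-in for a real stress test; always passes in this stub

def tune_for_tdp(tdp_w: int, step_mv: int = 10, floor_mv: int = -150) -> int:
    """Walk the voltage offset down at a fixed power limit, keep the last stable step."""
    set_power_limit_w(tdp_w)
    best, offset = 0, 0
    while offset - step_mv >= floor_mv:
        offset -= step_mv
        set_voltage_offset_mv(offset)
        if not run_stability_test(minutes=10):
            break
        best = offset
    set_voltage_offset_mv(best)  # fall back to the last known-stable offset
    return best

for tdp in (65, 95, 125, 253):  # let the user pick a TDP target
    print(tdp, "W -> best offset", tune_for_tdp(tdp), "mV")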
 
This thread feels like an extension of the other 14900k vs 7800X3D thread, but without mentioning the 7800X3D over here because if you did, a lot of people would probably laugh at you.

Sure you can tune the CPU to be more efficient, but did you actually gain anything? If anything you're going to lose performance in the grand scheme of things unless your thermal solution is just woefully not up to the task. At least the 7800X3D works well out of the box without being a space heater.
 
This thread feels like an extension of the other 14900k vs 7800X3D thread, but without mentioning the 7800X3D over here because if you did, a lot of people would probably laugh at you.

Sure you can tune the CPU to be more efficient, but did you actually gain anything? If anything you're going to lose performance in the grand scheme of things unless your thermal solution is just woefully not up to the task. At least the 7800X3D works well out of the box without being a space heater.
Please don't bring the fanboy stuff into this thread from that one; a rational technical discussion is preferred to CPU bashing.

The problem with many CPUs and GPUs nowadays is that they seem to be tuned for outright performance as if efficiency were of no concern. Not all of us agree with this, and it applies to AMD as well: on my AMD rig I have XFR disabled to make it much more efficient.

Some of us are OK with losing a bit of performance if efficiency increases, and of course it can actually gain performance if you're bouncing off a thermal or TDP limit. My GPU performs better after I configured my own custom curve to undervolt it.

In my day-to-day use I haven't seen my chip go much above about 70 W package power, but for me efficiency is my version of overclocking; I do it on all my hardware. If I had 170 W package power in a game like that video, I would definitely be looking at what I could do about it.
 
Please don't bring the fanboy stuff into this thread from that one; a rational technical discussion is preferred to CPU bashing.

The problem with many CPUs and GPUs nowadays is that they seem to be tuned for outright performance as if efficiency were of no concern. Not all of us agree with this, and it applies to AMD as well: on my AMD rig I have XFR disabled to make it much more efficient.

Some of us are OK with losing a bit of performance if efficiency increases, and of course it can actually gain performance if you're bouncing off a thermal or TDP limit. My GPU performs better after I configured my own custom curve to undervolt it.

In my day-to-day use I haven't seen my chip go much above about 70 W package power, but for me efficiency is my version of overclocking; I do it on all my hardware. If I had 170 W package power in a game like that video, I would definitely be looking at what I could do about it.
That's not the point. It's that the 14900K out of the box consumes as much power as my 3930K with a heavy overclock. I mentioned out of the box, right? People shouldn't have to fiddle with their CPUs to get them within their advertised TDP, or to keep them from overvolting like crazy when the entire CPU isn't being utilized. The 14900K was released without any efficiency in mind beyond the E-cores. Can it do it? Sure, but you shouldn't have to.

Also, don't start slinging the term fanboy around. The only AMD products I use right now are a Vega 64 and a Radeon Pro 5600M. It's not like I have an AMD CPU and I'm shilling for them. I like Intel and they make good chips, but for the last 3 or 4 generations, Intel has just been flat-out stupid when it comes to power consumption OOTB. It's laughable when the rated TDP is exceeded by double without any modifications whatsoever.

The 14900K in stock form is flat out an inefficient chip, regardless of what AMD has in their lineup. The numbers speak for themselves.

Another case in point: show me a 14th-gen SKU that's intended to be low power. You know, like the previous T-series CPUs? Right now everything is K and KF, which I have to assume is because they don't care about power consumption outside of the server market. Sorry, but for ~$500 USD, I expect better from Intel.
 
That's not the point. It's that the 14900K out of the box consumes as much power as my 3930K with a heavy overclock. I mentioned out of the box, right? People shouldn't have to fiddle with their CPUs to get them within their advertised TDP, or to keep them from overvolting like crazy when the entire CPU isn't being utilized. The 14900K was released without any efficiency in mind beyond the E-cores. Can it do it? Sure, but you shouldn't have to.

Also, don't start slinging the term fanboy around. The only AMD products I use right now are a Vega 64 and a Radeon Pro 5600M. It's not like I have an AMD CPU and I'm shilling for them. I like Intel and they make good chips, but for the last 3 or 4 generations, Intel has just been flat-out stupid when it comes to power consumption OOTB. It's laughable when the rated TDP is exceeded by double without any modifications whatsoever.

The 14900K in stock form is flat out an inefficient chip, regardless of what AMD has in their lineup. The numbers speak for themselves.

Another case in point: show me a 14th-gen SKU that's intended to be low power. You know, like the previous T-series CPUs? Right now everything is K and KF, which I have to assume is because they don't care about power consumption outside of the server market. Sorry, but for ~$500 USD, I expect better from Intel.

The point is that this is a thread about tuning for efficiency - not if you should or shouldn't have to. It was started weeks before the current jerk-off session in the other thread started.

Nobody wants or cares about your off topic comments and your yelling at the clouds rants. Take it somewhere else.
 
The point is that this is a thread about tuning for efficiency - not if you should or shouldn't have to. It was started weeks before the current jerk-off session in the other thread started.

Nobody wants or cares about your off topic comments and your yelling at the clouds rants. Take it somewhere else.
You do realize that there is an entire review talking about this, and that nobody replied to this thread until Thursday. I'm simply calling out the elephant in the room, because you wouldn't need to do this if Intel and motherboard manufacturers didn't treat the long-term boost power as something you can run 100% of the time, 24/7/365. It is related, and at least I'm not taking pot shots at you personally for your opinion.

Just because you don't like what I'm saying doesn't mean that I'm wrong or that it's not related.
 
and at least I'm not taking pot shots at you personally for your opinion.
You just did ^^

Just because you don't like what I'm saying doesn't mean that I'm wrong or that it's not related.
Read my post again. It has nothing to do with whether I agree with you or not. You've come into an entirely different thread bringing your ranting about the OOB experience with Intel. There's a whole other thread about that. Again, take it somewhere else.
 
This thread feels like an extension of the other 14900k vs 7800X3D thread, but without mentioning the 7800X3D over here because if you did, a lot of people would probably laugh at you.

Sure you can tune the CPU to be more efficient, but did you actually gain anything? If anything you're going to lose performance in the grand scheme of things unless your thermal solution is just woefully not up to the task. At least the 7800X3D works well out of the box without being a space heater.
Hi,
Yeah, the OP got kicked off that one, so he made his own Intel love thread.

I don't mind tweaking in the BIOS, so more power (or in the OP's case, less power) to you all; out of the box really doesn't matter to them :laugh:

The people I feel sorry for are the owners/buyers of H mobile chips instead of the more expensive HX chips; they are stuck with locked chips.
I found this out just in time, so I canceled and went AMD mobile: locked, but at least not a space heater :cool:

I don't get the disabling of HT either, frankly.
The enemy is the thermally defective E-cores, not HT lol
 