
14900k - Tuned for efficiency - Gaming power draw

With CPUs... I go both ways, no ties to anyone :)
 
Where do "efficient" and "too much power" cross lines? At the tracks where AMD and Intel meet.

Should I be using Intel's "Efficiency Cores" instead of the performance cores? Is that possible for gaming? Or is the windblows scheduler gonna force everything onto the single P core, which is essential for POST?

Scratches head...
So part of Intel is efficient and the other part is performance. Think I've been running all the wrong cores all along... fml.
 
HT does indeed seem like a power hog.

Playing Imperium Galactica: 22 W package power with HT, 7 W package power without HT.
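For anyone wanting to check package power themselves, here's a minimal sketch of reading it on Linux via the intel_rapl powercap driver (assuming intel-rapl:0 is the package domain; the exact path varies by kernel, and reading the counter typically needs root - HWiNFO or similar does the same job on Windows):

```python
# Minimal sketch: average CPU package power from the Linux intel_rapl
# powercap counter. Assumes intel-rapl:0 is the package domain and that
# the cumulative counter doesn't wrap during the sample window.
import time

ENERGY = "/sys/class/powercap/intel-rapl:0/energy_uj"  # cumulative microjoules

def read_uj() -> int:
    with open(ENERGY) as f:
        return int(f.read())

def package_watts(seconds: float = 5.0) -> float:
    start = read_uj()
    time.sleep(seconds)
    return (read_uj() - start) / 1e6 / seconds  # uJ -> J, then J/s = W

print(f"avg package power: {package_watts():.1f} W")
```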
 
I dunno man.. in hard loads I can see 250w ppt and ~260w package power.. I don’t think it’s very easy to cool lol..
 
I set all P and E cores to 4 GHz club speed. 1.20 V, nice and cool, and it seems... efficient!
Oh, 147 W in CBR24.
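The voltage part of that tune is BIOS territory, but the clock cap alone can be reproduced from a running Linux system, as a rough sketch (assuming root and a cpufreq driver that exposes scaling_max_freq):

```python
# Minimal sketch: cap every core's maximum frequency at 4 GHz via cpufreq.
# Run as root. This only mirrors the clock cap; the 1.20 V part of the
# tune has to be set in the BIOS.
import glob

CAP_KHZ = "4000000"  # cpufreq takes kHz, so 4,000,000 kHz = 4 GHz

for path in glob.glob("/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_max_freq"):
    with open(path, "w") as f:
        f.write(CAP_KHZ)
```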
 
I dunno man.. in hard loads I can see 250w ppt and ~260w package power.. I don’t think it’s very easy to cool lol..

I don't think you cool it... you just sort of let it hit 100C and bounce around at max clocks and dry out your paste.
 
I don't think you cool it... you just sort of let it hit 100C and bounce around at max clocks and dry out your paste.
I wouldn't feel comfortable with my CPU running at Tjmax. I'd probably just lower the thermal or power limits until it's a couple degrees cooler in every workload. Then, I'd buy a bigger cooler, provided that my case allows for it.
 
I didn't bring X3D here. AMD supporters brought it. This is a topic strictly related to Intel, but children can't help themselves.
Until we have a review of this processor with entry-level and mid-range video cards, I think it costs too much for what it offers.


Ok, but what are AMD processors doing here?
Okay, I actually agree there was no need to start talking about the X3D or anything. This was just an experiment in reducing the 14900K's gaming power draw.

14900K is def not optimal price/perf for gaming only, but w/e; if you have money, multiple computing needs, and want to play around with your new toy, nothing wrong with that.
 
I wouldn't feel comfortable with my CPU running at Tjmax. I'd probably just lower the thermal or power limits until it's a couple degrees cooler in every workload. Then, I'd buy a bigger cooler, provided that my case allows for it.
I agree -- I like to just push it down to 220 W or whatever the cooler is comfortable with. If I keep it at 250 W or 288 W, it's going to hit 100C on a 360 AIO almost instantly; with just 30 W less, at 220 W, you can comfortably cool it even on some high-end air. Plus, I don't see that wattage touched outside of Cinebench/Prime even with a mild overclock.
 
Same preset in Cyberpunk. The min/max/avg differences fall below 1%, within the margin of error of this benchmark.


For reference, the cooler is an Arctic Liquid Freezer II 240, bought for 330 RON (~64 euros) a month before the Russian invasion of Ukraine.

I didn't need the 14700KF. The 13500 does very well, but I was missing those overclocking/undervolting tools that I enjoy playing with. For a 3070 Ti, a 13400F/14400F is enough; so is a 10th- or 11th-gen processor, or a Ryzen 5. That's why I ask (because many people only have gaming in their sights): how does X3D help if the video card is not in the top five?

Even so, you don't buy a premium video card for 1080p, and I don't see big differences between (to give an example) the 7700X and the 7800X3D.
Anyway, the idea is that Intel processors can be "trained" dramatically on power consumption. An activity that, at least for me, is pleasant.
 

Why spend lots of money on a CPU just to game? You can have a 13500T on eBay for $175. At stock (35 W) it matches the 12600K, and with the power limit set to 105 W it can do 12700 levels. And to cool it? Whatever junk cooler with a 9 cm fan.
 
That's why I ask (because many people only have gaming in their sights): how does X3D help if the video card is not in the top five?
Only in the way that you (potentially) won't have to upgrade it so soon. 1080p and lower benchmarks exist as an attempt to predict future games with future graphics cards in a CPU-bound situation.

There's nothing else to it, really. One could ask the same about anything over i5 or R7 non-X3D.

I don't see big differences between (to give an example) the 7700X and the 7800X3D.
As someone who has owned both at some point (did I say I'm a curious type?), I can say that strictly performance-wise, in current games, with a modern mid-range graphics card, there is none.

With that said, I prefer the 7800X3D way more because it's much more economical with power out of the box, which also makes its boost way smoother.

I mean, the 7700X maxes out its 142 W PPT in all-core work, and boosts only as far as the situation allows. Mine did about 5.05-5.1 GHz in Cinebench, but it may do more or less than that in other programs, with different memory configurations, SoC voltages, etc.
The 7800X3D, on the other hand, has a PPT of 162 W, but only uses around 80-85 W max, so it can keep a 4.8 GHz all-core clock in every single application, regardless of memory config.

85 W at 4.8 GHz vs 142 W at 5.1 GHz - this makes the 7800X3D a hands down winner compared to the 7700X, in my opinion. The extra cache is just a bonus that might come in handy in the future.
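Putting those observations side by side (a back-of-the-envelope sketch that treats all-core clock as a loose proxy for throughput, which it only roughly is):

```python
# Rough efficiency comparison from the all-core figures quoted above.
x3d_w, x3d_ghz = 85, 4.8   # 7800X3D: observed power and all-core clock
x_w, x_ghz = 142, 5.1      # 7700X: observed power and all-core clock

print(f"7800X3D: {x3d_ghz / x3d_w * 1000:.1f} MHz/W")  # ~56.5 MHz/W
print(f"7700X:   {x_ghz / x_w * 1000:.1f} MHz/W")      # ~35.9 MHz/W
print(f"~{(1 - x3d_ghz / x_ghz) * 100:.0f}% lower clock "
      f"for ~{(1 - x3d_w / x_w) * 100:.0f}% less power")  # ~6% for ~40%
```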

I'm 100% happy with the X3D, but if I could do 2023 all over again, I'd probably just buy a 7700 non-X and call it a day, as that would give me an identical gaming experience at a much lower price.

But enough about AMD in an Intel thread, right? :ohwell:
 
Some of y'all are forgetting this.


I love all the reviews that @W1zzard does together with the other reviewers here on TPU. You do a banger job, and I respect the conclusions and use them when talking about computer hardware :lovetpu:

Instead of OC, I would love to see UV on the graphics cards, since AMD, but especially Nvidia, pushes their cards to the max. It would show what general UV settings for the different GPU models could be, and what each tested card could do individually.

Because I don't feel like OCing anymore with these electricity prices, and I'm sure I'm not the only one.

That's one reason why I'm thinking about downgrading from my RX 7900 XT to an RTX 4070: the 7900 XT's TDP is 300 W and the RTX 4070's is 200 W. I know the RX 7900 XT is about 38% better in relative performance, but it needs 50% more power, and 50% more power isn't buying 50% more performance.

Yes, I do undervolt myself, and found a stable setting at 1.020 V running Alan Wake 2 with max settings at 1440p. But again, when the RTX 4070 can be undervolted down to about 150 W, it's really wow, because even if the RX 7900 XT can go down to 280 or 250 W, it's still not beating the overall performance per watt that Nvidia's 40 series delivers.

I will probably miss the AMD software and everything; that's why I owned three 6800 XT models, plus a 3090 and a 3070.
 
Instead of OC, I would love to see UV on the graphics cards, since AMD, but especially Nvidia, pushes their cards to the max. It would show what general UV settings for the different GPU models could be, and what each tested card could do individually.
Not all of them. I've had cards in the last couple of years that run very close to max temperature out of the box, but I've also had some very conservative ones. It all comes down to the manufacturer's design.

That's one reason why I'm thinking about downgrading from my RX 7900 XT to an RTX 4070: the 7900 XT's TDP is 300 W and the RTX 4070's is 200 W. I know the RX 7900 XT is about 38% better in relative performance, but it needs 50% more power, and 50% more power isn't buying 50% more performance.
I wouldn't waste my money on a new GPU. Gaming 4 hours a day with a £0.3 per kWh price (which is normal for the UK, but quite high by world average) saves you £3.72 per month if your GPU eats 100 W less.

Other than these, I agree. :)
 
Not all of them. I've had cards in the last couple of years that run very close to max temperature out of the box, but I've also had some very conservative ones. It all comes down to the manufacturer's design.


I wouldn't waste my money on a new GPU. Gaming 4 hours a day with a £0.3 per kWh price (which is normal for the UK, but quite high by world average) saves you £3.72 per month if your GPU eats 100 W less.

Other than these, I agree. :)
Well, most days and even weekends, prices go up to £0.23-0.46 per kWh because of the situation in the EU, so it's no fun.

I used to have a powerful machine running 24/7, but with these prices, where I used to have an electricity bill of about £57.87 each month, today it would be double that if I kept it running.

So I downsized to a tiny PC that does the same job, and now my electricity bill is £13.89 each month. I mostly only game during the weekends, for 2-8 hours depending on what I need to do.

I want to do more gaming but also want to lower power consumption.
 
Well, most days and even weekends, prices go up to £0.23-0.46 per kWh because of the situation in the EU, so it's no fun.

I used to have a powerful machine running 24/7, but with these prices, where I used to have an electricity bill of about £57.87 each month, today it would be double that if I kept it running.

So I downsized to a tiny PC that does the same job, and now my electricity bill is £13.89 each month. I mostly only game during the weekends, for 2-8 hours depending on what I need to do.

I want to do more gaming but also want to lower power consumption.
Fair enough, but consider the price of a new graphics card, plus losing on the performance. You'd have to game for roughly 10 years on the 4070 instead of the 7900 for the difference in your bill to return its price.
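The arithmetic behind that estimate, as a quick sketch (the card price here is a made-up placeholder, not a figure from this thread - plug in a real quote):

```python
# Rough payback estimate for swapping to a GPU that draws ~100 W less.
gpu_price_gbp = 450.0   # hypothetical RTX 4070 price, assumed for illustration
saved_watts = 100       # approximate 7900 XT -> 4070 difference
hours_per_day = 4
price_per_kwh = 0.30    # GBP, the UK-ish figure used above

monthly_saving = saved_watts / 1000 * hours_per_day * 31 * price_per_kwh
print(f"saving: ~£{monthly_saving:.2f}/month")                          # ~£3.72
print(f"payback: ~{gpu_price_gbp / (monthly_saving * 12):.0f} years")   # ~10
```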
 
It's insane how compute power has increased in recent years. I have a Kill A Watt on my wall socket.
When you start shaving some watts to play a game at a max of 75 Hz... Memory from 3733 MHz down to 3200: almost 10 W saved. Change the coolers' RPM or remove some: another 7 W. Remove the spinning rust disk from the system and get better SSDs to save some watts: another 7 W. I have the 6700 XT, a power hog, but undervolted, underclocked, and with the TBP lowered by 6%.
Force V-Sync or FreeSync and you can barely hit 150 W at the wall with all gear.
My system on the default BIOS can eat 75-78 W; with all power efficiency features activated it can draw 38 W idle on Windows, about 0.9 kWh per day.
 
Why would you turn off HT on a 7800X3D?
I have a 14900K, which is what this post is about.
AMD calls it SMT if you were gonna do that, though.

Why spend lots of money on a CPU just to game? You can have a 13500T on eBay for $175. At stock (35 W) it matches the 12600K, and with the power limit set to 105 W it can do 12700 levels. And to cool it? Whatever junk cooler with a 9 cm fan.
Why have lots of money and not spend it on a CPU just to game?

The overclocking/undervolting/HT-off/E-core tweaking WORKS, and I gained about 15-20 frames in Tarkov (very noticeable at my resolution/settings) above the already insane OOTB performance. Gonna try to keep pushing the clock past 6.2/5.7 'cause it's real stable here.

Anyone with more specific suggestions or settings that worked for their 14900, I would love to hear it!

I really don't care about power consumption, but I do care about a hot PC, and this tune so far is faster AND runs way cooler/quieter.

Thanks to the OP for bringing this to my attention, 'cause it was the first post I found as a dumbass with a new setup wondering why tf the i9 was thermal throttling in benchmarks. Thought I had a bad chip at first. I've learned a lot - no thanks to the people ranting about some AMD X3D nonsense lol

The HT OFF was the largest gain in performance from a single setting in Tarkov, and in thermals too.
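On Windows the HT toggle lives in the BIOS/UEFI, but for the curious, Linux exposes the same switch at runtime - a minimal sketch (root required; the kernel calls it SMT regardless of vendor):

```python
# Minimal sketch: read and toggle SMT/Hyper-Threading at runtime on Linux.
# Writing "off" parks the sibling threads; "on" brings them back.
SMT = "/sys/devices/system/cpu/smt/control"

with open(SMT) as f:
    print("current SMT state:", f.read().strip())

def set_smt(state: str) -> None:
    assert state in ("on", "off", "forceoff")
    with open(SMT, "w") as f:  # needs root
        f.write(state)

# set_smt("off")  # uncomment to disable HT without a BIOS trip
```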
 
I have a 14900K, which is what this post is about.
AMD calls it SMT if you were gonna do that, though.



You claim that you bought the best gaming CPU, and now you're saying that you have a 14900K... doesn't quite add up...
 
I don't think you cool it... you just sort of let it hit 100C and bounce around at max clocks and dry out your paste.

I'm probably going to limit my 14700K to 231 W on the PL2, or maybe even lower to more like 189 W, once I get settings stable and optimized at stock; for the time being I'm keeping the PLs at stock. I think the highest temps I've seen so far on my 240 AIO are like 72C to 74C, but I haven't done an overabundance of "prolonged" full-load testing. It looks pretty good so far in some shorter full-load testing; I just don't know how much it'll creep up over time at sustained full load with the 240 AIO, but it's set up for push/pull, so I think it'll cope reasonably well.
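Those PL values are normally set in the BIOS or XTU, but as a sketch, Linux can also poke them through the intel_rapl powercap driver (root required; firmware may clamp or ignore values outside its own limits):

```python
# Minimal sketch: set PL1/PL2 via Linux powercap. On the package domain,
# constraint_0 is long_term (PL1) and constraint_1 is short_term (PL2).
DOMAIN = "/sys/class/powercap/intel-rapl:0"

def set_limit(constraint: int, watts: int) -> None:
    with open(f"{DOMAIN}/constraint_{constraint}_power_limit_uw", "w") as f:
        f.write(str(watts * 1_000_000))  # interface takes microwatts

set_limit(0, 189)  # PL1, the lower target floated above
set_limit(1, 231)  # PL2
```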
 
I wanna see some Linpack numbers.. people are talking wattages and clocks.. let's see what happens with Linpack Xtreme :)

10 GB load, at least 5 times :D
 
You claim that you bought the best gaming CPU, and now you're saying that you have a 14900K... doesn't quite add up...
Again, I did not claim I bought the best CPU, just the fastest Intel one. My god, people, this is about 14900K tuning, not about what's best for gaming in existence or X3D comparisons... just the best settings for a 14900K. That's all.
 

That didn't come as a surprise, to be honest :laugh:.
 
Why have lots of money and not spend it on a CPU just to game?
Because it's pointless. Just because you're padded with cash, it doesn't mean you have to flush it down the toilet.
 