
14900k - Tuned for efficiency - Gaming power draw

Joined
Dec 14, 2023
Messages
10 (0.03/day)
Location
United States
System Name Overbuilt Intel P.O.S. (In White)
Processor i9 14900k 6.2Ghz
Motherboard MSI Z790 MPG
Cooling Lian Li 360
Memory T-Create Expert DDR5 7200 34
Video Card(s) MSI 4080 OC 3Ghz
Storage Samsung 990 Evo 2Tb
Display(s) 27" flat 2k 144hz G sync
Case Lian Li o11 Dynamic Evo
Power Supply ThermalTake 1000W
Mouse Razer Basilisk Ultimate
Keyboard Keychron K2 75%
I think you didn't get the idea.
~2150 pts ST and 25,000 pts MT are obtained at a maximum power draw of 76.2 W.
I don't know exactly how much it consumes in the single-core tests, but HWiNFO reported that peak after running both the single and multi tests in Cinebench and CPU-Z. At default settings, the 14700K(F) gets over 35,000 pts at 253 W, and over 36,000 fully unlocked.
So I can either make it consume as much as an X3D and still get much better performance than the 7800X3D, or let it fly free and get double the 7800X3D's score. I repeat: at a lower price.

If someone proves to me that the 7800X3D helps the 3070 Ti achieve better performance than the 14700K does, we have a topic for discussion. Until then, the review comparisons regarding their potential beyond gaming still stand.
And one more thing: if the 3070 Ti doesn't gain anything from replacing the i5-13500 with the i7-14700KF (verified by me, I don't need help), then it's clear to me that I have enough reserve for the next series of video cards. Maybe not the future 5090, but I bet it won't have any problems with the 5070.
The 4000 series is skipped because I paid a lot for the 3070 Ti in the madness of 2021.
I came here to discuss i9 settings and HT information,
but this thread has become another AMD/Intel battle. If you want the sad details everyone is spamming, here they are:

From the tests I've seen, the 7800X3D can outperform my tuned i9 (and an i7) in all gaming situations by a small bit. It will almost always be better IN GAMING than your 14700K, and you will see slight frame increases even with a beastly or decent GPU (I had a 14700 and tested it) - around +5 frames avg at 2K settings. Your 14700 will almost always outperform a 7800X3D in CPU benchmarks.

I don't know why anyone cares about benchmarking outside of stability tests if you aren't ever replicating that kind of data usage, though. The only valuable benchmark is the games you play and the tasks you do most often.

The 7800X3D would have been a really good and cheaper option for me (and you), and it also performs better out of the box. I was fully aware of this; my last build was AMD and I hated it, so this time around I wanted to go Intel despite the obvious price/performance discrepancy IN GAMING that everyone is freaking out about. Guess what: the AMD people are right, the CPU friggin rips. Also, the new Intel shit friggin rips. Game on, Gamers.

Also, the 14700 I tested was tuned and a bit faster than OOTB (about 5.9 GHz), tested with a 4080 OC.


Anyone been able to get their i9 past 6.3 GHz with HT off?
 
Last edited:
Joined
Jan 14, 2019
Messages
13,195 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
I think you didn't get the idea.
~2150 pts ST and 25,000 pts MT are obtained at a maximum power draw of 76.2 W.
I don't know exactly how much it consumes in the single-core tests, but HWiNFO reported that peak after running both the single and multi tests in Cinebench and CPU-Z. At default settings, the 14700K(F) gets over 35,000 pts at 253 W, and over 36,000 fully unlocked.
So I can either make it consume as much as an X3D and still get much better performance than the 7800X3D, or let it fly free and get double the 7800X3D's score. I repeat: at a lower price.

If someone proves to me that the 7800X3D helps the 3070 Ti achieve better performance than the 14700K does, we have a topic for discussion. Until then, the review comparisons regarding their potential beyond gaming still stand.
And one more thing: if the 3070 Ti doesn't gain anything from replacing the i5-13500 with the i7-14700KF (verified by me, I don't need help), then it's clear to me that I have enough reserve for the next series of video cards. Maybe not the future 5090, but I bet it won't have any problems with the 5070.
The 4000 series is skipped because I paid a lot for the 3070 Ti in the madness of 2021.
OK, I think we've got this clear:
  • I don't care about superior application performance with the i7 and i9 because I don't work with my CPU - I only need it for gaming. I'm not saying that your score is not great, just that I don't give a damn about Cinebench points. Out of the box, the X3D is better at gaming and much more conservative with power. That's all I care about.
  • You don't care about superior gaming performance with the X3D because it doesn't help the 3070 Ti, and by the time you upgrade your GPU, you'll probably upgrade your CPU as well.
Shall we move on? :)
 
Joined
Nov 16, 2023
Messages
1,575 (3.75/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
Sounds fun, let me try to get some numbers for comparison...
Ryzen 7 5800X3D (with -30 CO) + Thermalright Burst Assassin 120
View attachment 326235
(had to run R24 twice [I forgot to disable the 10-min throttle mode...], so the Avg will be higher)
Works for me.

I was shocked, but Raptor Lake has some impressive IPC at just about any frequency.
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
OK, I think we've got this clear:
  • I don't care about superior application performance with the i7 and i9 because I don't work with my CPU - I only need it for gaming. I'm not saying that your score is not great, just that I don't give a damn about Cinebench points. Out of the box, the X3D is better at gaming and much more conservative with power. That's all I care about.
  • You don't care about superior gaming performance with the X3D because it doesn't help the 3070 Ti, and by the time you upgrade your GPU, you'll probably upgrade your CPU as well.
Shall we move on? :)
I think you should also be interested in the other scores because they reflect the computing power of your system. I doubt you use that PC only as a console.
I have just demonstrated that, at approximately the same power draw, the 14700K(F) outperforms the 7800X3D by ~25% in the single/multi-core tests, with the option of getting double in multithreading at maximum power. I repeat: at a lower cost.
I think I will save this preset under the name "X3D mode", because it offers double what the 3070 Ti needs, and all the AMD supporters can bring to the table is: "hey, I paid 30% more for the X3D, but I saved 1.02 watts :clap:".
No, I will not need another processor for a future generation of video cards, and you know that very well. Even with the king of video cards (this superb 4090), the differences between the two processors are negligible at 1440p and zero at 4K. After the pitiful way Intel topics are attacked by AMD fans, I wouldn't be surprised if the new argument were that I can notice the difference between 394 fps and 381 fps while playing, and that this is the reason to pay 30% more for the X3D.

Don't get me wrong, I'm not saying that the 7800X3D is a weak processor. It's not, but it's too expensive for what it offers, and I find it embarrassing to argue that a processor is only useful in games. The absolute embarrassment comes especially from those who waved flags with the 5800X3D and have now swapped them for the 7800X3D. And I ask: hey, what happened to the 5800X3D? It was only released last year. Where did "future proof" go?
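For anyone who wants to check the arithmetic, here is the points-per-watt math as a minimal Python sketch. It uses only the scores and wattages quoted in this thread (single forum runs, not a controlled benchmark):

```python
# Multi-core efficiency of the two 14700K configurations quoted in this
# thread: tuned (~25,000 pts at a 76.2 W peak) vs default (~35,000 pts
# at 253 W). Single forum runs, so treat the output as rough.
configs = {
    "14700K tuned   (76.2 W)": (25_000, 76.2),
    "14700K default (253 W) ": (35_000, 253.0),
}

for name, (score, watts) in configs.items():
    print(f"{name}: {score / watts:6.0f} pts/W")

# The tuned profile trades ~29% of the MT score for ~2.4x the efficiency.
ratio = (25_000 / 76.2) / (35_000 / 253.0)
print(f"Tuned vs default efficiency: {ratio:.1f}x")
```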

I am posting a new Cinebench R23 test run, with the same settings but with "Instant 6 GHz" enabled in the BIOS. It only has an impact on the single-core score.

14700K 140A Ciner23multi.jpg

14700K 140A Ciner23single.jpg
 
Last edited:
Joined
Jan 14, 2019
Messages
13,195 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
I think you should also be interested in the other scores because they reflect the computing power of your system. I doubt you use that PC only as a console.
You're right - I also use it for web browsing and video playback, which is something that even a Celeron can do, especially with GPU hardware acceleration.

So you're planning to keep the i7 for several generations. That's great! :) I might also keep the X3D for a while as it's way more powerful than what I need right now. The extra cache may become more useful in the long run, just as much as your extra cores might be. I don't think either of our approaches is wrong.

I don't know about people upgrading from a 5800X3D. I guess they just wanted the best, the same as people upgrading from a 12900K to a 13900K or 14900K. We're PC enthusiasts; a lot of upgrades don't make monetary sense, we just do it for fun. As for me, I upgraded from an i7-11700 to a 7700X, and while the uplift in application performance was massive, I didn't notice much, if anything at all, in games and during everyday use. Then I got the X3D just to see what all the hype was about, and I probably would have sold it and switched back to the 7700X if not for the much lower power consumption and much more stable boost clocks out of the box. Whether it was worth it performance-wise, we'll see in the coming years, I guess. Like I said, if I were a sensible person, I'd be just about to start thinking about upgrading from a 7700K right now. But I'm not - I'm a hobby PC builder, and I'm not pretending that it isn't an expensive hobby by any means. :)
 
Joined
Jun 6, 2022
Messages
622 (0.66/day)
There is life beyond gaming, believe me.
I can say the same about gaming: I will not notice any difference if I swap the 14700K for the 7800X3D, but I have a real reserve of computing power, it didn't cost me anything extra, and it is welcome for the future.

As for the future of your processor in gaming, you can take a look at the fate of the 5800X3D. Last year it was king.
 
Joined
Nov 16, 2023
Messages
1,575 (3.75/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
I think you should also be interested in the other scores because they reflect the computing power of your system. I doubt you use that PC only as a console.
I have just demonstrated that, at approximately the same power draw, the 14700K(F) outperforms the 7800X3D by ~25% in the single/multi-core tests, with the option of getting double in multithreading at maximum power. I repeat: at a lower cost.
I think I will save this preset under the name "X3D mode", because it offers double what the 3070 Ti needs, and all the AMD supporters can bring to the table is: "hey, I paid 30% more for the X3D, but I saved 1.02 watts :clap:".
No, I will not need another processor for a future generation of video cards, and you know that very well. Even with the king of video cards (this superb 4090), the differences between the two processors are negligible at 1440p and zero at 4K. After the pitiful way Intel topics are attacked by AMD fans, I wouldn't be surprised if the new argument were that I can notice the difference between 394 fps and 381 fps while playing, and that this is the reason to pay 30% more for the X3D.

Don't get me wrong, I'm not saying that the 7800X3D is a weak processor. It's not, but it's too expensive for what it offers, and I find it embarrassing to argue that a processor is only useful in games. The absolute embarrassment comes especially from those who waved flags with the 5800X3D and have now swapped them for the 7800X3D. And I ask: hey, what happened to the 5800X3D? It was only released last year. Where did "future proof" go?

I am posting a new Cinebench R23 test run, with the same settings but with "Instant 6 GHz" enabled in the BIOS. It only has an impact on the single-core score.

View attachment 326275
View attachment 326276
Sorry to cut in here.

77 W, and the score is 10k low. I demonstrated an all-core 4000 MHz at 120 W with the same score.

Can you fill me in on what happened? Was that a tuned all-core run or something?
 
Joined
Jan 22, 2007
Messages
932 (0.14/day)
Location
Round Rock, TX
Processor 9950x
Motherboard Asus Strix X870E-E
Cooling Kraken Elite 280
Memory 64GB G.skill 6000mhz CL30
Video Card(s) Sapphire 7900XTX Pulse
Storage 1X 4TB MP700 Pro - 1 X 4TB SN850X
Display(s) LG 32" 4K OLED + LG 38" IPS
Case Lian Li o11 Air Mini
Power Supply Corsair RM1000x
Software WIndows 11 Pro
I seriously don't get all the Intel vs AMD love/hate. I like both and I own both.

Also, let's take a moment and realize how nice it is that these conversations are even happening. It wasn't very long ago that AMD was so far behind this wasn't even a discussion; they have definitely redeemed themselves.
 
Joined
Jul 30, 2019
Messages
3,374 (1.70/day)
System Name Still not a thread ripper but pretty good.
Processor Ryzen 9 7950x, Thermal Grizzly AM5 Offset Mounting Kit, Thermal Grizzly Extreme Paste
Motherboard ASRock B650 LiveMixer (BIOS/UEFI version P3.08, AGESA 1.2.0.2)
Cooling EK-Quantum Velocity, EK-Quantum Reflection PC-O11, D5 PWM, EK-CoolStream PE 360, XSPC TX360
Memory Micron DDR5-5600 ECC Unbuffered Memory (2 sticks, 64GB, MTC20C2085S1EC56BD1) + JONSBO NF-1
Video Card(s) XFX Radeon RX 5700 & EK-Quantum Vector Radeon RX 5700 +XT & Backplate
Storage Samsung 4TB 980 PRO, 2 x Optane 905p 1.5TB (striped), AMD Radeon RAMDisk
Display(s) 2 x 4K LG 27UL600-W (and HUANUO Dual Monitor Mount)
Case Lian Li PC-O11 Dynamic Black (original model)
Audio Device(s) Corsair Commander Pro for Fans, RGB, & Temp Sensors (x4)
Power Supply Corsair RM750x
Mouse Logitech M575
Keyboard Corsair Strafe RGB MK.2
Software Windows 10 Professional (64bit)
Benchmark Scores RIP Ryzen 9 5950x, ASRock X570 Taichi (v1.06), 128GB Micron DDR4-3200 ECC UDIMM (18ASF4G72AZ-3G2F1)
I seriously don't get all the Intel vs AMD love/hate. I like both and I own both.
There are no bad CPUs, only bad prices and PITA RMA processes.
Also, let's take a moment and realize how nice it is that these conversations are even happening. It wasn't very long ago that AMD was so far behind this wasn't even a discussion; they have definitely redeemed themselves.
AMD almost got snuffed out of existence. If they hadn't Ryzen to the occasion, we would still be on another 14 nm++++++++++++++ quad-core CPU, but with 28 E-cores, today.
 
Joined
Jan 14, 2019
Messages
13,195 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
There is life beyond gaming, believe me.
I can say the same about gaming: I will not notice any difference if I swap the 14700K for the 7800X3D, but I have a real reserve of computing power, it didn't cost me anything extra, and it is welcome for the future.
And I have a reserve of extra gaming power. If you can show me what else I could ever need in a gaming PC, I'll be happy to consider it.

As for the future of your processor in gaming, you can take a look at the fate of the 5800X3D. Last year it was king.
It's next to the 12700K, sometimes 12900K, right where it's expected to be. And your point is...?

Edit: If you mean that Zen 4 and Raptor Lake perform better, of course they do! It's called progress. But the 5800X3D is still way faster than other Zen 3 CPUs and most of the Alder Lake lineup.

I seriously don't get all the Intel vs AMD love/hate. I like both and I own both.
Same here, and I couldn't agree more. :)
 
Last edited:
Joined
Nov 16, 2023
Messages
1,575 (3.75/day)
Location
Nowhere
System Name I don't name my rig
Processor 14700K
Motherboard Asus TUF Z790
Cooling Air/water/DryIce
Memory DDR5 G.Skill Z5 RGB 6000mhz C36
Video Card(s) RTX 4070 Super
Storage 980 Pro
Display(s) Some LED 1080P TV
Case Open bench
Audio Device(s) Some Old Sherwood stereo and old cabinet speakers
Power Supply Corsair 1050w HX series
Mouse Razor Mamba Tournament Edition
Keyboard Logitech G910
VR HMD Quest 2
Software Windows
Benchmark Scores Max Freq 13700K 6.7ghz DryIce Max Freq 14700K 7.0ghz DryIce Max all time Freq FX-8300 7685mhz LN2
So I did that little 4 GHz all-core + E-cores run and ran a little Time Spy.

The CPU seems efficient enough to get near my average scores from multiple runs.

4000 MHz 24T vs 5700 MHz 16T CPU Time Spy score difference: minimal impact to Time Spy.

Time Spy scales linearly with the GPU, in this case an RX 6800.

I suppose getting all the cores to a high enough frequency to catch the 5.7 GHz score would really increase the wattage - I estimate about 148-152 W at load, and I would hope to only need around 4.2 GHz. I prefer the lower wattage at the 4 GHz it's at now, though: roughly 123 W peak vs 235 W. 5.7 GHz on 16 threads is closer to 260 W peaks. :)
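For context on why the wattage falls off so fast at lower clocks: CMOS dynamic power scales roughly as P ≈ C·V²·f - linear in frequency but quadratic in voltage, and the last GHz needs a lot of extra voltage. A minimal sketch below; the Vcore values are hypothetical placeholders for illustration, not readings from this chip:

```python
# Rough dynamic-power scaling: P ~ C * V^2 * f.
# The voltages are HYPOTHETICAL illustrative values, not measured Vcore.
def relative_power(freq_ghz, vcore, ref_freq=5.7, ref_vcore=1.40):
    """Power relative to a 5.7 GHz / 1.40 V reference operating point."""
    return (freq_ghz / ref_freq) * (vcore / ref_vcore) ** 2

for freq, volts in [(5.7, 1.40), (4.2, 1.10), (4.0, 1.05)]:
    print(f"{freq} GHz @ {volts:.2f} V -> {relative_power(freq, volts):.2f}x power")
```

That ~0.4-0.5x result at 4.0-4.2 GHz lands in the same ballpark as the ~123 W vs ~260 W peaks above.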

Efficient enough? Perhaps this is where the E-cores really shine: when the P-cores are at the same speed.
(Usually I overclock, this is like an all new weirded out adventure and I don't know why I'm doing it.)
13700K go burrrrrrrr

EDIT: Note, this is on air cooling. I'll nab a pic next time I post another benchmark comparison.

TimeSpy CPU score comparison.png
 
Last edited:
Joined
May 24, 2023
Messages
957 (1.61/day)
...
All I do know is that my E-cores want an outstanding amount of Vcore to run 4.7 GHz, even with P-core reduction and frequency reduction. Horrible design mess.
Do you know that 4 E cores fit in the area of 1 P core, that an E core has approx. 70% of a P core's performance at the same frequency, and that they have similar energy efficiency?

E cores running at 4.4 GHz compared to P cores at 5.7 GHz produce approx. DOUBLE the heat output per silicon area of the P cores. You have to be very careful about what frequency you make the E cores run at.

E cores at 4700 MHz have approx. 2.4x higher heat output per silicon area compared to P cores running at 5.7 GHz.

E cores got their name from SILICON AREA EFFICIENCY, not energy efficiency. You get a lot more performance from a given area of silicon with E cores than with P cores.
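To make the area math explicit: if 4 E cores occupy the footprint of 1 P core, the E-core cluster's heat flux is four times one E core's power over that same area. A quick sketch of the arithmetic; the per-core wattages are hypothetical placeholders chosen only so the output reproduces the ~2x and ~2.4x ratios stated above:

```python
# Heat flux (W per unit of die area), using the claim above that 4 E cores
# fit in the area of 1 P core. Per-core wattages are HYPOTHETICAL values
# picked to match the ratios in this post, not measurements.
E_CORES_PER_P_CORE_AREA = 4

p_core_w_at_5p7 = 30.0  # one P core at 5.7 GHz (placeholder)
e_core_w_at_4p4 = 15.0  # one E core at 4.4 GHz (placeholder)
e_core_w_at_4p7 = 18.0  # one E core at 4.7 GHz (placeholder)

p_flux = p_core_w_at_5p7                            # W per one P-core area
e_flux_4p4 = E_CORES_PER_P_CORE_AREA * e_core_w_at_4p4
e_flux_4p7 = E_CORES_PER_P_CORE_AREA * e_core_w_at_4p7

print(f"E cluster @ 4.4 GHz vs P core @ 5.7 GHz: {e_flux_4p4 / p_flux:.1f}x heat flux")
print(f"E cluster @ 4.7 GHz vs P core @ 5.7 GHz: {e_flux_4p7 / p_flux:.1f}x heat flux")
```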
 
Last edited:
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
From what I've been seeing with the CPU-Z ST/MT benchmark, you want the E-core and P-core ratios moderately close together, with a slight offset towards a higher ratio on the P-cores, but there is a point where jacking up the P-core ratio seriously hurts the overall ST/MT results - at least on the 14700K. The E-cores provide far more uplift overall - the MT uplift is enormous, ST not so much - and a better sustained all-P-core clock seems preferable to an occasional boost on 1-2 P-cores while the others run at lower frequencies. Setting the E-core ratio higher is nice, but it seems tricky to get stable beyond a certain point.

I'm still trying to gauge and figure out the instabilities on this system. There is a lot of complexity to the design, which is both good and bad: it's harder to figure out, but once optimized well it should be better, given the amount of fine-tuning control on offer.
 
Joined
Dec 7, 2022
Messages
56 (0.07/day)
"My 14900K will draw less power if I disable hyper-threading" is about as smart as "my body will require less food if I cut off both of my legs". If you buy a $600 CPU only to immediately disable half of its features just to get acceptable power draw, you're not being smart - you are, in fact, being the exact opposite.

I'm getting really tired of seeing these "Intel CPUs can be power efficient too" threads/posts. Nobody cares that they can be; the point is that, at stock, they are not. The fact that it's possible to make these CPUs consume sane amounts of power is not the saving grace that everyone who uses them seems to think it is. If it's not good out of the box - i.e., how the vast majority of users will experience it, because most users don't tweak CPU power consumption - then it's not good, period.
-
If you're going to turn off hyperthreading in an i7/i9, just buy an i5 instead.
-
Exactly my thoughts.
-
If you can afford to buy a $600 CPU, IT IS EXACTLY ONLY YOUR CHOICE WHAT YOU WANT TO DO WITH IT, PERIOD!!!
 

Toothless

Tech, Games, and TPU!
Supporter
Joined
Mar 26, 2014
Messages
9,675 (2.45/day)
Location
Washington, USA
System Name Veral
Processor 7800x3D
Motherboard x670e Asus Crosshair Hero
Cooling Corsair H150i RGB Elite
Memory 2x24 Klevv Cras V RGB
Video Card(s) Powercolor 7900XTX Red Devil
Storage Crucial P5 Plus 1TB, Samsung 980 1TB, Teamgroup MP34 4TB
Display(s) Acer Nitro XZ342CK Pbmiiphx, 2x AOC 2425W, AOC I1601FWUX
Case Fractal Design Meshify Lite 2
Audio Device(s) Blue Yeti + SteelSeries Arctis 5 / Samsung HW-T550
Power Supply Corsair HX850
Mouse Corsair Nightsword
Keyboard Corsair K55
VR HMD HP Reverb G2
Software Windows 11 Professional
Benchmark Scores PEBCAK
If you can afford to buy a $600 CPU, IT IS EXACTLY ONLY YOUR CHOICE WHAT YOU WANT TO DO WITH IT, PERIOD!!!
It's a dumb choice. Let's be real.
 
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
I'll probably eventually look into how disabling HT on some individual P-cores works out versus disabling or enabling all of them - someday, after dialing in stability across 20 cores with individual ratios and piles of voltage offsets. There are some scenarios where HT enabled/disabled helps or hinders. It's worth looking at if you want to get the most out of your system for how you use it.

It's no different from any of the numerous other settings that need attention for stability or performance reasons. It's not too different from the AVX offset being multifaceted and useful, but not always.

If you can get away with keeping HT enabled on only a few individual cores, half of them, or even three quarters of them, that might still be worth doing if there is enough of a qualitative difference between the different setups.
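On most Z790 boards per-core HT is a BIOS toggle, but for anyone on Linux there is a rough runtime equivalent: offlining the sibling thread of selected cores through sysfs. A minimal sketch using the standard kernel topology/hotplug files (needs root; the core IDs in the example are hypothetical, check yours with lscpu first):

```python
# Offline the HT sibling thread of selected CPUs via the Linux sysfs
# hotplug interface. Run as root.
from pathlib import Path

CPU_ROOT = Path("/sys/devices/system/cpu")

def disable_ht_on(cpus):
    """For each given CPU, offline every sibling thread except the first."""
    for cpu in cpus:
        siblings_file = CPU_ROOT / f"cpu{cpu}" / "topology" / "thread_siblings_list"
        # The file looks like "0,16" or "0-1" depending on enumeration.
        ids = sorted(int(x) for x in
                     siblings_file.read_text().strip().replace("-", ",").split(","))
        for sibling in ids[1:]:  # keep the first thread, offline the rest
            (CPU_ROOT / f"cpu{sibling}" / "online").write_text("0")
            print(f"cpu{cpu}: offlined sibling cpu{sibling}")

# Hypothetical example: drop HT on the first four P cores only.
disable_ht_on([0, 1, 2, 3])
```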
 
Joined
Jan 14, 2019
Messages
13,195 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
I'll probably eventually look into how disabling HT on some individual P-cores works out versus disabling or enabling all of them - someday, after dialing in stability across 20 cores with individual ratios and piles of voltage offsets. There are some scenarios where HT enabled/disabled helps or hinders. It's worth looking at if you want to get the most out of your system for how you use it.

It's no different from any of the numerous other settings that need attention for stability or performance reasons. It's not too different from the AVX offset being multifaceted and useful, but not always.

If you can get away with keeping HT enabled on only a few individual cores, half of them, or even three quarters of them, that might still be worth doing if there is enough of a qualitative difference between the different setups.
I still think that if you need to disable cores and/or HT, then you either chose the wrong CPU, or you seriously need to upgrade your cooling.

I'm not denying that there's a lot of efficiency to be gained on 14th gen, but perhaps the highest of the highest end isn't always necessary. I'm sure that at least half of the i9 owners would be just as happy with an i5.
 
Joined
May 24, 2023
Messages
957 (1.61/day)
Reposting my findings from running a frequency-limited 14900K with an RTX 4070 and a 1440p monitor (HT off) to this more appropriate thread, and adding one more data point:

Cyberpunk's built-in benchmark with RT off, DLSS off, settings on high. The low end of the reported power draw is from while the camera pans the bar and is reliable and repeatable; the maximum power number is not.

P cores MHz / E cores MHz - power - avg/min/max fps

5700/4400 - 122-160 W - 90/73/122 *
5000/4000 - 66-83 W - 94/73/126
4800/4000 - 60-75 W - 93/73/126
4500/3600 - 48-63 W - 95/58/126 **

* The game drew more than 160 W before I got to the benchmark, and then the CPU overheated during the benchmark under my air cooler; I removed the power limit for this test.

** It seems that the dip in min FPS was caused by the E cores running at 3600 MHz; the 4500/4000 combination was fine.

The insane stock frequencies more than doubled the power draw for no benefit compared to the 4800/4000 settings (in my case even worsening performance due to overheating).
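Running that table through a quick fps-per-watt calculation (avg fps over the midpoint of each reported power range) makes the same point numerically. The sketch below uses only the data from this post:

```python
# fps per watt from the Cyberpunk runs above: (clocks, W low, W high, avg fps).
runs = [
    ("5700/4400", 122, 160, 90),
    ("5000/4000",  66,  83, 94),
    ("4800/4000",  60,  75, 93),
    ("4500/3600",  48,  63, 95),
]

for clocks, w_lo, w_hi, avg_fps in runs:
    mid_w = (w_lo + w_hi) / 2  # midpoint of the reported power range
    print(f"{clocks} MHz: {avg_fps / mid_w:.2f} avg fps/W (~{mid_w:.0f} W)")
```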
 
Last edited:
Joined
Mar 21, 2016
Messages
2,508 (0.78/day)
No one is saying you need to, though. Some people are just inherently into tweaking and optimizing. And regardless of how happy they would be, it's their money, and they might reach a point where they aren't just as happy with an i5. I mean, I could run my i7 just as happily as any weaker-performing CPU, and at a lower power draw than it would use at stock; it's entirely capable of operating however I wish. In fact, Asus even has profile setups you can create for specific programs, which seems pretty useful.

That is, until they're not happy with the i5 and wish they had an i7's or i9's MT capabilities and/or a bit more boost frequency binning - though realistically more so the MT, for a given task that actually benefits from it.
 
Joined
Jan 14, 2019
Messages
13,195 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Reposting my findings from running a frequency-limited 14900K with an RTX 4070 and a 1440p monitor (HT off) to this more appropriate thread, and adding one more data point:

Cyberpunk's built-in benchmark with RT off, DLSS off, settings on high. The low end of the reported power draw is from while the camera pans the bar and is reliable and repeatable; the maximum power number is not.

P cores MHz / E cores MHz - power - avg/min/max fps

5700/4400 - 122-160 W - 90/73/122 *
5000/4000 - 66-83 W - 94/73/126
4800/4000 - 60-75 W - 93/73/126
4500/3600 - 48-63 W - 95/58/126 **

* The game drew more than 160 W before I got to the benchmark, and then the CPU overheated during the benchmark under my air cooler; I removed the power limit for this test.

** It seems that the dip in min FPS was caused by the E cores running at 3600 MHz; the 4500/4000 combination was fine.

The insane stock frequencies more than doubled the power draw for no benefit compared to the 4800/4000 settings (in my case even worsening performance due to overheating).
Maybe the results with higher clocks are similar to those with lower clocks due to overheating, a GPU limit, or a combination of both?
 
Joined
May 24, 2023
Messages
957 (1.61/day)
Maybe the results with higher clocks are similar to those with lower clocks due to overheating, a GPU limit, or a combination of both?
Sure. I am not testing a CPU; I am testing my CPU-GPU combination, with a moderately powerful GPU which is GPU-limited at the resolution I want to use. However, I lowered the game's settings to make the task as easy as possible for the GPU. The results are comparable even when the CPU was not overheating, so the indicated constant performance level (94/73/126) is due to the GPU limitation.

It feels like I could/should be using a more powerful GPU with this CPU; however, the 4070 is about as expensive as the CPU (in my case I paid 25% more for the GPU - I got one with a nice cooler on it, and I got the 14900K quite cheap). So it is not like I am using a GPU which costs a third of what the CPU costs.
 
Last edited:
Joined
Jan 14, 2019
Messages
13,195 (6.03/day)
Location
Midlands, UK
Processor Various Intel and AMD CPUs
Motherboard Micro-ATX and mini-ITX
Cooling Yes
Memory Anything from 4 to 48 GB
Video Card(s) Various Nvidia and AMD GPUs
Storage A lot
Display(s) Monitors and TVs
Case The smaller the better
Audio Device(s) Speakers and headphones
Power Supply 300 to 750 W, bronze to gold
Mouse Wireless
Keyboard Wired
VR HMD Not yet
Software Linux gaming master race
Sure. I am not testing a CPU; I am testing my CPU-GPU combination, with a moderately powerful GPU which is GPU-limited at the resolution I want to use. However, I lowered the game's settings to make the task as easy as possible for the GPU. The results are comparable even when the CPU was not overheating, so the indicated constant performance level (94/73/126) is due to the GPU limitation.

It feels like I could/should be using a more powerful GPU with this CPU; however, the 4070 is about as expensive as the CPU (in my case I paid 25% more for the GPU - I got one with a nice cooler on it, and I got the 14900K quite cheap). So it is not like I am using a GPU which costs a third of what the CPU costs.
A hard GPU limit like yours just proves what was said above: you don't need an expensive CPU for gaming, especially with a mid-range GPU and reasonable expectations.
 
Joined
Oct 21, 2005
Messages
7,080 (1.01/day)
Location
USA
System Name Computer of Theseus
Processor Intel i9-12900KS: 50x Pcore multi @ 1.18Vcore (target 1.275V -100mv offset)
Motherboard EVGA Z690 Classified
Cooling Noctua NH-D15S, 2xSF MegaCool SF-PF14, 4xNoctua NF-A12x25, 3xNF-A12x15, AquaComputer Splitty9Active
Memory G-Skill Trident Z5 (32GB) DDR5-6000 C36 F5-6000J3636F16GX2-TZ5RK
Video Card(s) ASUS PROART RTX 4070 Ti-Super OC 16GB, 2670MHz, 0.93V
Storage 1x Samsung 990 Pro 1TB NVMe (OS), 2x Samsung 970 Evo Plus 2TB (data), ASUS BW-16D1HT (BluRay)
Display(s) Dell S3220DGF 32" 2560x1440 165Hz Primary, Dell P2017H 19.5" 1600x900 Secondary, Ergotron LX arms.
Case Lian Li O11 Air Mini
Audio Device(s) Audiotechnica ATR2100X-USB, El Gato Wave XLR Mic Preamp, ATH M50X Headphones, Behringer 302USB Mixer
Power Supply Super Flower Leadex Platinum SE 1000W 80+ Platinum White, MODDIY 12VHPWR Cable
Mouse Zowie EC3-C
Keyboard Vortex Multix 87 Winter TKL (Gateron G Pro Yellow)
Software Win 10 LTSC 21H2
Reposting my findings from running a frequency-limited 14900K with an RTX 4070 and a 1440p monitor (HT off) to this more appropriate thread, and adding one more data point:

Cyberpunk's built-in benchmark with RT off, DLSS off, settings on high. The low end of the reported power draw is from while the camera pans the bar and is reliable and repeatable; the maximum power number is not.

P cores MHz / E cores MHz - power - avg/min/max fps

5700/4400 - 122-160 W - 90/73/122 *
5000/4000 - 66-83 W - 94/73/126
4800/4000 - 60-75 W - 93/73/126
4500/3600 - 48-63 W - 95/58/126 **

* The game drew more than 160 W before I got to the benchmark, and then the CPU overheated during the benchmark under my air cooler; I removed the power limit for this test.

** It seems that the dip in min FPS was caused by the E cores running at 3600 MHz; the 4500/4000 combination was fine.

The insane stock frequencies more than doubled the power draw for no benefit compared to the 4800/4000 settings (in my case even worsening performance due to overheating).
I had a similar experience with the 12900KS. The extra frequency is mostly marketing. Possibly with a massive radiator (1080 mm) it would be worthwhile, as one could avoid throttling.
 
Joined
Nov 13, 2007
Messages
10,884 (1.74/day)
Location
Austin Texas
System Name stress-less
Processor 9800X3D @ 5.42GHZ
Motherboard MSI PRO B650M-A Wifi
Cooling Thermalright Phantom Spirit EVO
Memory 64GB DDR5 6400 1:1 CL30-36-36-76 FCLK 2200
Video Card(s) RTX 4090 FE
Storage 2TB WD SN850, 4TB WD SN850X
Display(s) Alienware 32" 4k 240hz OLED
Case Jonsbo Z20
Audio Device(s) Yes
Power Supply Corsair SF750
Mouse DeathadderV2 X Hyperspeed
Keyboard 65% HE Keyboard
Software Windows 11
Benchmark Scores They're pretty good, nothing crazy.
If you step back for a second... you will see that 14th gen is now a two-year-old architecture on a 10 nm (7 nm-ish) node, competing with a newer architecture on a newer 4 nm node, and it really only loses badly to the model with 3D cache in gaming efficiency.

If you compare it to the non-3D-cache parts, or to the 3D-cache parts in anything but gaming, it's less efficient, but it's also faster...

The fact that this chip is in contention for anything is nothing short of a miracle. There should be no way team blue is in the top 10 slots of any FPS chart, yet there they are. The only way to really do that is to yeet power until you hit 6 GHz.
 
Joined
Dec 7, 2022
Messages
56 (0.07/day)
It's a dumb choice. Let's be real.
It's not. Here's the thing: you turn on the PC, hit Del, after 1 second the UEFI shows up, shortcut to your "HT off" preset, save, restart; with a PCIe NVMe drive, 8 seconds later you're good to go with HT off and 5-20% more fps (min/max/avg), per your own personal preference. Why? Because you paid the $$$ for it.

It's not dumb at all; it's just an individual use case / showcase.
 