
Intel Core i9-14900K Raptor Lake Tested at Power Limits Down to 35 W

Joined
Oct 23, 2020
Messages
57 (0.04/day)
The 7950X3D is an improved version of the 7950X, not the other way around. AMD added 3D cache and lowered power consumption while keeping similar performance; that's progress, and it's good.
The 7950X gets about the same performance as the 3D variant when set to the same power limit. The efficiency gains in MT at 142 W are mostly due to the power limit being closer to the chip's sweet spot.
And the 7900X at 142 W is on average ~2% slower than stock in heavy MT workloads, an even smaller difference than the 7950X's ~5% average.
 

cchi

New Member
Joined
Nov 12, 2022
Messages
9 (0.01/day)
I don't get why such tests stop at 125 W. I have seen multiple outlets doing this. We would like to see at least one point between 125 W and 253 W, preferably two.
One important use case for such tests is people who cannot run at 253 W or higher due to thermal limitations (cooling cost, warm climate, small room), so they can understand how much they need to lower the power and the associated performance loss.
 
Joined
Nov 9, 2022
Messages
39 (0.05/day)
Why not a 14700K? Seems like this 8P+12E SKU is actually the one to get, with the pricing being what it is and all. The 14900K... well, it's cheaper than a 13900KS and is basically one, so I guess you can't knock it too much, other than for being as low-effort as Intel could have made it.

The 13700K is currently about a $60 USD saving (having been discounted), and based on current testing across multiple sites, I have yet to see a situation where I'd use the extra E-cores.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,960 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I don't get why such tests stop at 125 W. I have seen multiple outlets doing this. We would like to see at least one point between 125 W and 253 W, preferably two.
One important use case for such tests is people who cannot run at 253 W or higher due to thermal limitations (cooling cost, warm climate, small room), so they can understand how much they need to lower the power and the associated performance loss.
Pick a number, I’ll do a round of testing just for you
 
Joined
Sep 28, 2012
Messages
982 (0.22/day)
System Name Poor Man's PC
Processor Ryzen 7 9800X3D
Motherboard MSI B650M Mortar WiFi
Cooling Thermalright Phantom Spirit 120 with Arctic P12 Max fan
Memory 32GB GSkill Flare X5 DDR5 6000MHz
Video Card(s) XFX Merc 310 Radeon RX 7900 XT
Storage XPG Gammix S70 Blade 2TB + 8 TB WD Ultrastar DC HC320
Display(s) Xiaomi G Pro 27i MiniLED
Case Asus A21 Case
Audio Device(s) MPow Air Wireless + Mi Soundbar
Power Supply Enermax Revolution DF 650W Gold
Mouse Logitech MX Anywhere 3
Keyboard Logitech Pro X + Kailh box heavy pale blue switch + Durock stabilizers
VR HMD Meta Quest 2
Benchmark Scores Who need bench when everything already fast?


So it's a $500++ Celeron. Got it.
 
Joined
Nov 9, 2022
Messages
39 (0.05/day)
Ain't bottle a wonderful thing. :toast:



Honestly, at that strict a power limit, you can probably optimize per use case. I'd shave off 7 of the P-cores and three clusters of E-cores and try to maintain frequency.

It would probably still kick for a lot of applications.

At that point, fewer cores with more clock speed will come into play, and an i5-13600 non-K will win easily with good energy efficiency (especially as a non-K has lower PL1/PL2). And you won't have to perform all sorts of oddball tweaks.


Pick a number, I’ll do a round of testing just for you

I would be interested to see what you would get with that setting (seeing as non-K CPUs like the i7-13700 have a PL1 of 65 W and a PL2 of 219 W, which I believe is the rumored/leaked spec of the 14700 as well), or dropping to a PL1 of 65 W and a PL2 of 200 W with an undervolt most CPUs would be expected to handle without issue (0.1 V? 0.075 V?). But my feeling is there's probably little reason to, other than doing a non-K vs. K power/performance comparison.
 
Joined
Mar 19, 2009
Messages
2,487 (0.43/day)
System Name Always changing
Processor Always changing
Motherboard Always changing
Cooling Always changing
Memory Always changing
Video Card(s) Always changing
Storage Always changing
Display(s) Always changing
Case Always changing
Audio Device(s) Always changing
Power Supply Always changing
Mouse Always changing
Keyboard Always changing
I don't get why such tests stop at 125 W. I have seen multiple outlets doing this. We would like to see at least one point between 125 W and 253 W, preferably two.
One important use case for such tests is people who cannot run at 253 W or higher due to thermal limitations (cooling cost, warm climate, small room), so they can understand how much they need to lower the power and the associated performance loss.
I highly doubt anyone worried about thermal limitations such as cooling cost would even be thinking about buying a 14900K. It's like buying a Raptor and being concerned about how many miles per gallon you get on your daily commute :laugh: But really, on a serious note, these two don't go hand in hand for the intended buyer, LOL. Especially when the relative game performance difference between 125 W and 253 W is already a tiny 1.4% at 1440p, with a 35.2 °C drop in temps; running at 125 W is what most tinkerers would do anyway. Why split hairs? But, you do you, boo.
The 13700K is currently about a $60 USD saving (having been discounted), and based on current testing across multiple sites, I have yet to see a situation where I'd use the extra E-cores.
Yet another weird point. Save $25 ($393.99 vs. $419.00 as current pricing for the 13700K and 14700K) instead of getting the latest option? That shouldn't even be a consideration on a new build :kookoo:. Heck, even the $60 you mention wouldn't be on my radar with such a minuscule difference in the overall cost. I don't see why anyone in their right mind would choose a 13th-gen X700K(F) on a fresh build now, regardless of whether the extra cores are being taken advantage of at this point in time. Even better, you could get the 14700KF and it would cost the same, at $394.00. Who needs integrated graphics when you know you are going to buy a GPU anyway? A failsafe, I suppose, in case you get a DOA GPU. Of course, if you already have a 13th-gen i7, I wouldn't feel the need to jump on the 14th-gen bandwagon. That makes sense.

Just my thoughts, LOL. Fire away.
 
Joined
Dec 26, 2006
Messages
3,862 (0.59/day)
Location
Northern Ontario Canada
Processor Ryzen 5700x
Motherboard Gigabyte X570S Aero G R1.1 BiosF5g
Cooling Noctua NH-C12P SE14 w/ NF-A15 HS-PWM Fan 1500rpm
Memory Micron DDR4-3200 2x32GB D.S. D.R. (CT2K32G4DFD832A)
Video Card(s) AMD RX 6800 - Asus Tuf
Storage Kingston KC3000 1TB & 2TB & 4TB Corsair MP600 Pro LPX
Display(s) LG 27UL550-W (27" 4k)
Case Be Quiet Pure Base 600 (no window)
Audio Device(s) Realtek ALC1220-VB
Power Supply SuperFlower Leadex V Gold Pro 850W ATX Ver2.52
Mouse Mionix Naos Pro
Keyboard Corsair Strafe with browns
Software W10 22H2 Pro x64
Looks like 65 W to 95 W would be the sweet range... depending on what you're doing.

also
 
Joined
Nov 9, 2022
Messages
39 (0.05/day)
Yet another weird point. Save $25 ($393.99 vs. $419.00 as current pricing for the 13700K and 14700K) instead of getting the latest option? That shouldn't even be a consideration on a new build :kookoo:. Heck, even the $60 you mention wouldn't be on my radar with such a minuscule difference in the overall cost. I don't see why anyone in their right mind would choose a 13th-gen X700K(F) on a fresh build now, regardless of whether the extra cores are being taken advantage of at this point in time. Even better, you could get the 14700KF and it would cost the same, at $394.00. Who needs integrated graphics when you know you are going to buy a GPU anyway? A failsafe, I suppose, in case you get a DOA GPU. Of course, if you already have a 13th-gen i7, I wouldn't feel the need to jump on the 14th-gen bandwagon. That makes sense.

Just my thoughts, LOL. Fire away.

One, I usually don’t get F variants because I find QuickSync a useful feature, and you don’t get that on an F. Two, not $393.99, but $363.99 vs $419.99.

I have nothing against your thoughts, but my reasoning (I had an i5) works for me:

- The additional two P-cores (and their Hyper-Threading) are useful to me. I'd have bought an i7 at the time, but Raptor Lake Refresh hinted at much more of a performance improvement than there turned out to be, so I bought the i5 to save a bit (I can now sell it, though) before likely getting RL Refresh. I now believe that was a mistake, but hindsight is always 20/20.
- The 14700K, upon release, was shown to have little to no performance gain over a 13700K in what I do day to day. At the same time, it was shown to consume even more power and run even warmer. Why should I work on cooling four additional E-cores that provide me no tangible benefit?
- The money I save can be used for better cooling or for upgrading from my EVGA RTX 3080 Ti down the road, either of which makes more sense to me (I'm skipping Ada Lovelace and waiting for the generation after).
- I wasn't going with a fresh build. I have a Gigabyte Z690 Aorus Master I got open-box for half price at Micro Center early this year, and it's the perfect board for me: a near-flagship board at a more normal price. I got open-box G.Skill Trident Z DDR5 RAM as well; between those two components I saved almost $300 off new prices. I'm picky about mainboards, and I see myself keeping this platform for a while.

One of the rumors that turned out to be false was that the i5-14600K would be eight P-cores and eight E-cores. A second was that there would be some new voltage regulation tech on the "14th" gen. That was what I was looking ahead to, along with possible process improvements, as the i5 has much lower wattage than the i7 or i9. Now that the reality of the "14th" gen is here, all the post-release research I've done leads me to believe my best choice is an i7-13700K tuned for efficiency. If I could have predicted all of this, I admit I'd likely have skipped the i5 purchase and just gone i7 in the first place.

You do you. My reasoning works for me; it doesn't have to work for anyone else (though Steve at GN, Jay, and others have largely said that if 13th gen saves you more than a few bucks, go with what's cheaper as well). I take no offense; I just know what my needs are, and they aren't yours.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,960 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I would be interested to see what you would get with that setting (seeing as non-K CPUs like the i7-13700 have a PL1 of 65 W and a PL2 of 219 W, which I believe is the rumored/leaked spec of the 14700 as well), or dropping to a PL1 of 65 W and a PL2 of 200 W with an undervolt most CPUs would be expected to handle without issue (0.1 V? 0.075 V?). But my feeling is there's probably little reason to, other than doing a non-K vs. K power/performance comparison.
You can't reliably simulate chips like that; we don't know their voltages, clocks, and binning.

If just one, then 200 W; if two, then 165 W and 210 W. Thanks! :lovetpu:
I'll see what I can do next week.
 
Joined
Jun 6, 2022
Messages
622 (0.67/day)
Very cheap for the performance and efficiency it delivers. The diluted average means you won't see the games where the X3D beats the i9 by 30-40%, like TW. Plus, people who really need powerful CPUs because "time is money" should go for a TR workstation, not desktop CPUs.

What are you talking about? MSRP? Because I often see the price of the X3D dropping to almost the price of the i5-13600K.
Cheap? Seriously? In all the reviews, the X3D processors have the most disastrous performance-per-dollar ratio, even when you use an RTX 4090.
I tested all my installed games that include a benchmark and I didn't see any difference between the i5-13500 and the 14700KF. None! The video card is a 3070 Ti: mainstream in 2021, mid-range now.
So, if I replace the 13500 with a 7800X3D for this video card, I do big damage to my wallet and get zero increase in performance in games and applications.
The tests reviewers avoid are X3D versus non-X3D comparisons using video cards from the entry to enthusiast range. We only have comparisons at the enthusiast level, even though over 90% of players use weaker video cards. I'm really curious to see what performance boost the 7800X3D brings over the 7700X with an RTX 4070 Ti or weaker, or an equivalent AMD card. We know for sure that the 7700X is cheaper (much cheaper) and beats the 7800X3D in applications.
 
Joined
Mar 7, 2007
Messages
1,426 (0.22/day)
Processor E5-1680 V2
Motherboard Rampage IV black
Video Card(s) Asrock 7900 xtx
Storage 500 gb sd
Software windows 10 64 bit
Benchmark Scores 29,433 3dmark06 score
I see an Arctic 420 mentioned in the setup; am I missing something, or are there no results with a water cooler involved?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,960 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
I see an Arctic 420 mentioned in the setup; am I missing something, or are there no results with a water cooler involved?
Correct, I used the Noctua only. The Test Setup table has been updated.
 
Joined
Aug 21, 2015
Messages
1,752 (0.51/day)
Location
North Dakota
System Name Office
Processor Ryzen 5600G
Motherboard ASUS B450M-A II
Cooling be quiet! Shadow Rock LP
Memory 16GB Patriot Viper Steel DDR4-3200
Video Card(s) Gigabyte RX 5600 XT
Storage PNY CS1030 250GB, Crucial MX500 2TB
Display(s) Dell S2719DGF
Case Fractal Define 7 Compact
Power Supply EVGA 550 G3
Mouse Logitech M705 Marathon
Keyboard Logitech G410
Software Windows 10 Pro 22H2
"It's not only important how much power is consumed, but also how quickly a task is completed—taking both into account results in "efficiency."

Taking both into account, you get "energy".

Both are true in context. As you've noted,

Power x Time = Energy

For a fixed workload like a CB run, leveraging that relationship can reveal relative efficiency. To keep things simple, let's assume two configurations that use the same amount of power. Using the equation above, the setup that took less time would have consumed less energy, indicating greater efficiency.

The wording is maybe not the clearest, but I don't think it's wrong.
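
To make that concrete, here's a minimal sketch with made-up numbers: two configurations at the same 125 W package power finishing the same fixed workload in different times.

```python
# Minimal sketch with illustrative numbers (not measured data): two configs
# running the same fixed workload at the same 125 W package power.
def energy_joules(power_watts: float, time_seconds: float) -> float:
    """Energy = average power x time."""
    return power_watts * time_seconds

energy_a = energy_joules(125, 600)  # config A finishes in 600 s -> 75,000 J
energy_b = energy_joules(125, 540)  # config B finishes in 540 s -> 67,500 J

# Same work done, less energy consumed: config B is the more efficient setup.
print(f"A used {energy_a:.0f} J, B used {energy_b:.0f} J")
```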
 
Joined
Aug 13, 2010
Messages
5,479 (1.04/day)
The only thing that bothers me is Intel's lack of software ingenuity in automating the process of fine-tuning processors for users, however long it might take the software to do so.
Software that selects a few operation modes and tunes them per silicon sample should have been part of XTU the moment 12th gen landed.

It is worth mentioning that people who want to absolutely min-max 13th and 14th gen capabilities can tune each core's peak voltage and frequency individually, really squeezing out every last drop the silicon has to offer in terms of binning. Roughly setting PL1/PL2 peak wattage with a bit of undervolting bias is a simpler solution that should also be offered as a basic software function for users. Not as something you delve into, but as a generic function that lets you pick a package power and then runs stability tests in order to set a safe voltage.
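
Roughly, such a function could look like the sketch below; the three helpers are hypothetical placeholders, not real XTU or driver APIs, since a real tool would supply its own mechanism for each step.

```python
# Rough sketch of that generic function. All three helpers are hypothetical
# placeholders (not real XTU or driver APIs); a real tool would supply its
# own mechanism for each step.

def apply_power_limit(pl1_watts: int, pl2_watts: int) -> None:
    """Placeholder: set PL1/PL2 through whatever interface the platform exposes."""
    ...

def apply_voltage_offset(offset_mv: int) -> None:
    """Placeholder: apply a core voltage offset in millivolts (negative = undervolt)."""
    ...

def run_stability_test(minutes: int) -> bool:
    """Placeholder: run a load test and return True if no errors or crashes occur."""
    ...

def find_safe_undervolt(pl1: int, pl2: int, step_mv: int = 10, max_mv: int = 150) -> int:
    """Walk the undervolt deeper until the load test fails, then keep the last stable offset."""
    apply_power_limit(pl1, pl2)
    last_stable = 0
    for offset in range(0, max_mv + step_mv, step_mv):
        apply_voltage_offset(-offset)
        if not run_stability_test(minutes=15):
            break
        last_stable = offset
    apply_voltage_offset(-last_stable)  # settle on the last known-good offset
    return last_stable
```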
 
Joined
Feb 1, 2019
Messages
3,666 (1.70/day)
Location
UK, Midlands
System Name Main PC
Processor 13700k
Motherboard Asrock Z690 Steel Legend D4 - Bios 13.02
Cooling Noctua NH-D15S
Memory 32 Gig 3200CL14
Video Card(s) 4080 RTX SUPER FE 16G
Storage 1TB 980 PRO, 2TB SN850X, 2TB DC P4600, 1TB 860 EVO, 2x 3TB WD Red, 2x 4TB WD Red
Display(s) LG 27GL850
Case Fractal Define R4
Audio Device(s) Soundblaster AE-9
Power Supply Antec HCG 750 Gold
Software Windows 10 21H2 LTSC
What's interesting with 13th gen, and perhaps you can try it with 14th gen if you get time, is that undervolting the cache seems as effective as vcore adjustments.

But yeah, this testing shows that the chips are basically factory overclocked: circa 10% performance gained by more than doubling the power consumption. Way too far up the curve.
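
A quick back-of-the-envelope check of that curve, using only the rough figures quoted above (illustrative, not new measurements):

```python
# Illustrative only: roughly 10% more MT performance at stock (253 W) than at
# a 125 W limit, per the figures discussed in this thread.
perf_125, watts_125 = 1.00, 125
perf_253, watts_253 = 1.10, 253

perf_per_watt_125 = perf_125 / watts_125  # ~0.0080
perf_per_watt_253 = perf_253 / watts_253  # ~0.0043

ratio = perf_per_watt_125 / perf_per_watt_253
print(f"The 125 W limit delivers ~{ratio:.1f}x the performance per watt of stock")  # ~1.8x
```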

This is why some of us don't buy the "any temp below 100 °C is wasted performance" nonsense; Intel has basically cranked it up no matter the cost.

So without an undervolt, 125 W is the sweet spot and probably should have been the official spec, and the temps don't demand an insane cooler either; if you're prepared to undervolt, 95 W.

Feels like the car industry in the late 1970s and early '80s, when they had to deal with sorting out their inefficient engines.

@dj-electric Yeah, I'd like to see auto-undervolt features added to the auto-OC tools.
 
Joined
Oct 6, 2021
Messages
1,605 (1.37/day)
Cheap? Seriously? In all the reviews, the X3D processors have the most disastrous performance-per-dollar ratio, even when you use an RTX 4090.
I tested all my installed games that include a benchmark and I didn't see any difference between the i5-13500 and the 14700KF. None! The video card is a 3070 Ti: mainstream in 2021, mid-range now.
So, if I replace the 13500 with a 7800X3D for this video card, I do big damage to my wallet and get zero increase in performance in games and applications.
The tests reviewers avoid are X3D versus non-X3D comparisons using video cards from the entry to enthusiast range. We only have comparisons at the enthusiast level, even though over 90% of players use weaker video cards. I'm really curious to see what performance boost the 7800X3D brings over the 7700X with an RTX 4070 Ti or weaker, or an equivalent AMD card. We know for sure that the 7700X is cheaper (much cheaper) and beats the 7800X3D in applications.
The price difference between the 7700X and the 7800X3D is only US$40-50, which isn't a significant dent in your budget, and it's impressive among chips in the high-end gaming CPU segment.

But I agree that an i5 or even a Ryzen 7600/5600X/5700X is more than enough for most people.
 
Joined
Dec 30, 2010
Messages
2,200 (0.43/day)
I don't get why such tests stop at 125 W. I have seen multiple outlets doing this. We would like to see at least one point between 125 W and 253 W, preferably two.
One important use case for such tests is people who cannot run at 253 W or higher due to thermal limitations (cooling cost, warm climate, small room), so they can understand how much they need to lower the power and the associated performance loss.

Offices don't need maxed-out CPUs. That's why that whole option to cap power consumption is in the BIOS. In office PCs there's usually no high-end cooling, just stock CPU coolers.

It's amazing to see that between 125 W and 253 W there's barely any difference. Intel needs to stop with the 16 E-cores and head back toward performance cores again.
 
Joined
Jul 20, 2020
Messages
1,151 (0.71/day)
System Name Gamey #1 / #3
Processor Ryzen 7 5800X3D / Ryzen 7 5700X3D
Motherboard Asrock B450M P4 / MSi B450 ProVDH M
Cooling IDCool SE-226-XT / IDCool SE-224-XTS
Memory 32GB 3200 CL16 / 16GB 3200 CL16
Video Card(s) PColor 6800 XT / GByte RTX 3070
Storage 4TB Team MP34 / 2TB WD SN570
Display(s) LG 32GK650F 1440p 144Hz VA
Case Corsair 4000Air / TT Versa H18
Power Supply EVGA 650 G3 / EVGA BQ 500
Offices don't need maxed-out CPUs. That's why that whole option to cap power consumption is in the BIOS. In office PCs there's usually no high-end cooling, just stock CPU coolers.

It's amazing to see that between 125 W and 253 W there's barely any difference. Intel needs to stop with the 16 E-cores and head back toward performance cores again.

Neither do they need kneecapped i9s; that's a waste of money and resources. They need 35-65 W 6P+4E i5s, which are still overkill for the vast majority of office work.
 
Joined
Jun 6, 2022
Messages
622 (0.67/day)
The only thing that bothers me is Intel's lack of software ingenuity in automating the process of fine-tuning processors for users, however long it might take the software to do so.
Software that selects a few operation modes and tunes them per silicon sample should have been part of XTU the moment 12th gen landed.

It is worth mentioning that people who want to absolutely min-max 13th and 14th gen capabilities can tune each core's peak voltage and frequency individually, really squeezing out every last drop the silicon has to offer in terms of binning. Roughly setting PL1/PL2 peak wattage with a bit of undervolting bias is a simpler solution that should also be offered as a basic software function for users. Not as something you delve into, but as a generic function that lets you pick a package power and then runs stability tests in order to set a safe voltage.
Motherboards and the silicon lottery make it impossible to establish an effective preset. You can manually set PL1/PL2, and the rest is handled by software with automatic adjustment of frequencies and voltages. There is a whole palette of tools in the BIOS or XTU for fine-tuning. You can manually adjust the voltage for each P or E core and for the uncore (SA, MC, cache).

The price difference between the 7700X and the 7800X3D is only US$40-50, which isn't a significant dent in your budget, and it's impressive among chips in the high-end gaming CPU segment.

But I agree that an i5 or even a Ryzen 7600/5600X/5700X is more than enough for most people.
You pay an extra $50... for what?
That $50 is almost enough to step up from a 7700 XT to a 7800 XT on the purchase list, video cards that the 7800X3D cannot help. And the 7700X, according to reviews, surpasses the 7800X3D in applications.
It is just one of many examples where choosing an X3D over a non-X3D only damages the wallet.

Intel needs to stop with the 16 E-cores and head back toward performance cores again.
AMD disagrees with you. The reason I don't use the E-cores is that they are still working toward something viable. Ray-tracing history repeats itself.
And since you can completely disable these cores, only those who do not have processors with E-cores seem to have a problem with them.

....................................

100 minutes with the 14700KF (stock settings), just web browsing (news and forums) with YouTube in the background.
Average CPU power: 6 W.
If we activate all the LED crap (I can't stand it), we triple the consumption.
I think I'll survive the next power bill.
Clipboard01.jpg
 

Mussels

Freshwater Moderator
Joined
Oct 6, 2004
Messages
58,413 (7.91/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
This goes to show that Intel really needs to release a P-core-only series for gamers; the concept of one "best of everything" processor is failing them badly.

Best single-core performance
Best multi-core performance
Low wattage

Pick one at a time, because it can't do them all at once.

Yet without the E-cores raising the wattage, you wouldn't need such high-end motherboards, cooling, PSUs, etc.

AMD disagrees with you. The reason I don't use the E-cores is that they are still working toward something viable. Ray-tracing history repeats itself.
And since you can completely disable these cores, only those who do not have processors with E-cores seem to have a problem with them.
Incorrect; AMD's C cores are an entirely different approach from E-cores, since AMD's C cores support the exact same instructions as their regular cores.
AMD's mini Zen 4c cores explained: They're nothing like Intel's Efficient cores | PC Gamer

Intel's E-cores are not the same, and that's where they cause issues: those missing instructions break programs when they get bounced from a P-core to an E-core, because suddenly they can't operate and they crash. This happens with a few games' anti-cheat mechanisms, as one common example.
I've got three CPUs here with E-cores, and while I don't disable them (because I need them for testing purposes), I have zero interest in using them either. They're a band-aid to win short benchmarks, not something useful to consumers.
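
(Aside: if you want to keep E-cores enabled globally but keep one particular program off them, a per-process affinity mask is one option. The sketch below assumes the usual i9-14900K enumeration, where logical CPUs 0-15 are the eight P-cores with Hyper-Threading and 16-31 are the E-cores; verify the mapping on your own system first.)

```python
# Hedged sketch: restrict one process to P-cores only, leaving the E-cores
# available to everything else. The "logical CPUs 0-15 = P-cores" mapping is
# an assumption based on the common 8P+16E enumeration; confirm your own
# CPU's topology before relying on it.
import psutil

P_CORE_LOGICAL_CPUS = list(range(16))  # assumed, not auto-detected

def pin_to_p_cores(pid: int) -> None:
    """Set the given process's CPU affinity to the P-core logical CPUs."""
    psutil.Process(pid).cpu_affinity(P_CORE_LOGICAL_CPUS)

# Usage (hypothetical PID): pin_to_p_cores(12345)
```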
 
Joined
Jun 14, 2020
Messages
3,530 (2.14/day)
System Name Mean machine
Processor 12900k
Motherboard MSI Unify X
Cooling Noctua U12A
Memory 7600c34
Video Card(s) 4090 Gamerock oc
Storage 980 pro 2tb
Display(s) Samsung crg90
Case Fractal Torrent
Audio Device(s) Hifiman Arya / a30 - d30 pro stack
Power Supply Be quiet dark power pro 1200
Mouse Viper ultimate
Keyboard Blackwidow 65%
Intel's E-cores are not the same, and that's where they cause issues: those missing instructions break programs when they get bounced from a P-core to an E-core, because suddenly they can't operate and they crash. This happens with a few games' anti-cheat mechanisms, as one common example.
I've got three CPUs here with E-cores, and while I don't disable them (because I need them for testing purposes), I have zero interest in using them either. They're a band-aid to win short benchmarks, not something useful to consumers.
Have you actually tested games with and without E-cores? Everything I've tested runs considerably worse without them.
 
Joined
Aug 13, 2010
Messages
5,479 (1.04/day)
Motherboards and the silicon lottery make it impossible to establish an effective preset. You can manually set PL1/PL2, and the rest is handled by software with automatic adjustment of frequencies and voltages. There is a whole palette of tools in the BIOS or XTU for fine-tuning. You can manually adjust the voltage for each P or E core and for the uncore (SA, MC, cache).
My whole take is that you shouldn't establish voltage presets, only power ones, and then let the software play with the voltages during load tests.
 