Tuesday, July 16th 2024

Intel Planning P-core Only "Bartlett" LGA1700 Processor for 2025

In a surprising development, Intel plans to extend the longevity of its Socket LGA1700 platform even as the newer LGA1851 platform, led by the Core Ultra 200 "Arrow Lake," remains on track for a late-Q3/early-Q4 2024 debut. This, according to a sensational leak by Jaykihn. Intel plans to do this with brand-new silicon for LGA1700, codenamed "Bartlett." What's on offer should particularly interest gamers. Imagine the "Raptor Lake-S" die, but with four additional P-cores replacing the four E-core clusters, making a 12-core pure P-core processor—that's "Bartlett." At this point we're not sure which P-core is in use—whether it's the current "Raptor Cove," or whether Intel will attempt to backport a variant of "Lion Cove" to LGA1700.

This wouldn't be the first pure P-core client processor from Intel after its pivot to heterogeneous multicore—the "Alder Lake" H0 die has six "Golden Cove" P-cores, and lacks any E-core clusters. Intel is planning to launch an entirely new "generation" of processor SKUs for LGA1700 that use Intel's newer client processor nomenclature, the Core 200-series, but without the "Ultra" brand extension. There will be SKUs in the Core 3, Core 5, Core 7, and Core 9 brand extensions. Some of these will be Hybrid, based on the rehashed "Raptor Lake-S" 8P+16E silicon, and some on "Alder Lake-S" 8P+8E; but "Bartlett" will be distinctly branded within the series, probably using a letter next to the numerical portion of the processor model number. There will not be any Core 3 series chips based on "Bartlett," only Core 5, Core 7, and Core 9.
The Core 5 "Bartlett" series will feature an 8-core configuration: that's 8 P-cores, and no E-cores. The Core 7 "Bartlett" will be 10-core with no E-cores. The Core 9 "Bartlett" will draw the most attention as a 12-core part. If Intel is using "Raptor Cove" P-cores, these should be 8-core/16-thread, 10-core/20-thread, and 12-core/24-thread, respectively. Depending on whether they are K or non-K SKUs, these chips will feature a processor base power value of 125 W, 65 W, or even 45 W.

Intel is planning to launch these non-Ultra Core Socket LGA1700 processors in Q1-2025, but the "Bartlett" silicon won't arrive before Q3-2025.
Source: Jaykihn (Twitter)

140 Comments on Intel Planning P-core Only "Bartlett" LGA1700 Processor for 2025

#26
N/A
Intel just went too far with the Raptor. Nothing will be ironed out. Let's face it, 6.3/5.9 GHz just isn't sustainable long term. This Bartlett is old news actually. Can't believe it.. And who would want that and then have to throttle down to 4.5 GHz to get normal gaming power? 200 W in Cyberpunk for the CPU alone is not normal.. Skymont, the E-core, is already at Raptor-level IPC at the same 4.5 GHz.
Posted on Reply
#27
oxrufiioxo
Outback BronzeYeah, might have to start Raptor Baking mine soon, get a credit for a new one :pimp:
I was half joking.... I certainly hope that isn't the case.... I have multiple buddies with 13900k/14900k.

I do give them shit though.... Me: Yo your cpu burn up yet... Them: nah.... me: soon....

A couple needed voltages to be manually set to be stable though.
Posted on Reply
#28
JustBenching
oxrufiioxoI think he means it'll consume 250-300w under heavy load, not that it'll be hard to cool. Regardless of what temp the actual cpu runs at, you'll still need to dissipate whatever energy it consumes into your room/office... Not an issue for me but paired with a 4090/5090 it wouldn't surprise me if it hits 600-700w total system power in some games.
It will consume as much as intel decides it should. The 14900 doesn't consume any specific amount either, it exists in 35w,65w and 250w versions. Power draw is never an issue, a cpu with more cores will always be more efficient because it will be faster at the same power.

People keep restating the idea that Intel can't add more p cores cause of power draw which is fundamentally a misunderstanding of basic physics.
oxrufiioxoI expect it to consume a similar amount of power to the 13900k/14900k with the bios Intel recommends, which is around 260w..... At least till they degrade and Intel recommends the 125w bios.... which doesn't perform very well, even in gaming.

Hopefully that's ironed out by then.
If it's used for gaming 125w is more than enough for max performance if you set it up properly - for games. HT off and some undervolting will get you to below 100w even on the heaviest of games with a 4090 completely cpu bound.
Posted on Reply
#29
nguyen
chrcolukThey don't recommend that; this is something the board vendors did off their own backs.

Also 125w is more than enough for gaming. I have never seen my CPU go anywhere near 125w, unless of course, guess what? I run Cinebench.
Dead Space Remake, Battlefield 2042 and HellDivers 2 can push 13900/14900k over 125W, into the 150W range
Posted on Reply
#30
Launcestonian
Is it known at this stage if Bartlett will be using the same memory controller from Raptor Lake? If this is the case, and I believe it could very well be, then DDR4 will have a longer life span :eek:... in 2025!
Posted on Reply
#31
oxrufiioxo
fevgatosIf it's used for gaming 125w is more than enough for max performance if you set it up properly - for games. HT off and some undervolting will get you to below 100w even on the heaviest of games with a 4090 completely cpu bound.
Both samples I worked with lost bad to a 7800X3D at 100w and by about 10% at 125w... with a 4090 so that's my only frame of reference.

It's actually why I reluctantly picked up a 7950X3D while they are cheap. I have mine faster than the 7800X3D I had for a week, but it's almost more work than it's worth getting there, and I'm not a huge fan of having to rely on third-party programs.
Posted on Reply
#32
InVasMani
There are rumblings that it's not simply the 13900K/14900K, either. I think it's just a case of chips that haven't had the most stress placed on the ring bus being the most faulty in terms of degradation. It might even impact the Alder Lake generation, but just be a slower burn. The first ones to go would be the 12900K if that's the case. It's not necessarily all SKUs from Alder Lake to Raptor Lake Refresh, but rather particular ones pushed over-aggressively across different metrics, leading to heavy degradation of the ring bus. That is, of course, if it is the ring bus, as is currently suspected to be the case.

I can see why that's suspected based on my own experiences with memory tuning, because I started noticing memory errors once I pushed the ring above x39 to x40 with the same memory settings. Initially I suspected it was more heavily attributable to the E-cores, but dropping the ratios on those didn't really help stability and crashing in the way I was expecting.

For me x39 appeared stable, but x40 showed occasional errors in certain tests like row hammer and moving inversions. I think moving inversions is where it heavily screws up, for the record, from what I recall.
Posted on Reply
#33
Redwoodz
fevgatosIt will consume as much as intel decides it should. The 14900 doesn't consume any specific amount either, it exists in 35w,65w and 250w versions. Power draw is never an issue, a cpu with more cores will always be more efficient because it will be faster at the same power.

People keep restating the idea that Intel can't add more p cores cause of power draw which is fundamentally a misunderstanding of basic physics.


If it's used for gaming 125w is more than enough for max performance if you set it up properly - for games. HT off and some undervolting will get you to below 100w even on the heaviest of games with a 4090 completely cpu bound.
"Power draw is never an issue, a cpu with more cores will always be more efficient because it will be faster at the same power.
People keep restating the idea that Intel can't add more p cores cause of power draw which is fundamentally a misunderstanding of basic physics."

Do you now? Seems to me you are making a bunch of generalizations that can effect either scenario.
Power draw is always an issue, no way you can say it isn't. Especially when a cpu is on the border line of what a ATX PC can draw safely with normal cooling methods.
Being more efficient is largely governed by the application in a hybrid cpu. A game that only uses 8 threads will not benefit from more cores while a rendering load will benefit.
Go ahead and run a gaming benchmark at 1080p and tell me you get the same performance while limiting your cpu to 100w.
Posted on Reply
#34
Cheeseball
Not a Potato
oxrufiioxoBoth samples I worked with lost bad to a 7800X3D at 100w and by about 10% at 125w... with a 4090 so that's my only frame of reference.

It's actually why I reluctantly picked up a 7950X3D while they are cheap. I have mine faster than the 7800X3D I had for a week, but it's almost more work than it's worth getting there, and I'm not a huge fan of having to rely on third-party programs.
You don't need to rely on any third-party software to get a 7950X3D faster than a 7800X3D, unless you mean extracting more performance beyond stock (like using Process Lasso or Game Bar to determine what is a game or not).

The 7950X3D turbos higher than the 7800X3D by default (EDIT: with regard to the 3D cache CCD as well), so it should beat its lesser-cored brother by a small margin. You can even do an apples-to-apples comparison if you disable CCD1 in your motherboard's UEFI, even though most people wouldn't do that (especially if they use the extra cores for actual multithreaded tasks).
Posted on Reply
#35
JustBenching
oxrufiioxoBoth samples I worked with lost bad to a 7800X3D at 100w and by about 10% at 125w... with a 4090 so that's my only frame of reference.

It's actually why I reluctantly picked up a 7950X3D while they are cheap. I have mine faster than the 7800X3D I had for a week, but it's almost more work than it's worth getting there, and I'm not a huge fan of having to rely on third-party programs.
Well I got a test on tlou with an underclocked 14900k at 95w if you want to compare. I don't think your 7800x 3d will be faster. It's on YouTube if you want a link
Posted on Reply
#36
dgianstefani
TPU Proofreader
oxrufiioxoA lot of 13900k/14900k owners gonna need new cpus soon and some 12900k owners who got carried away with Overclocking lol perfect timing
Actually people who overclocked are probably better off.

Setting a manual vcore will likely be 1.4 V or lower. It's the auto boosting broken TVB/borked power settings combo taking chips above 1.5 V at high temperatures that caused degradation.
Posted on Reply
#37
JustBenching
Redwoodz"Power draw is never an issue, a cpu with more cores will always be more efficient because it will be faster at the same power.
People keep restating the idea that Intel can't add more p cores cause of power draw which is fundamentally a misunderstanding of basic physics."

Do you now? Seems to me you are making a bunch of generalizations that can effect either scenario.
Power draw is always an issue, no way you can say it isn't. Especially when a cpu is on the border line of what a ATX PC can draw safely with normal cooling methods.
Being more efficient is largely governed by the application in a hybrid cpu. A game that only uses 8 threads will not benefit from more cores while a rendering load will benefit.
Go ahead and run a gaming benchmark at 1080p and tell me you get the same performance while limiting your cpu to 100w.
I did limit my cpu to 95w at 1080p and got the same results. On a 4090 that is. But gaming is completely irrelevant here.

My point is, power isn't what stops Intel from adding more P-cores like you seem to be implying. A 20 P-core chip running at the same power as an 8 P-core chip will be a lot faster, a lot more efficient, and a lot easier to cool.
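
As a purely illustrative sketch of that point (a made-up V/f curve and the textbook cores × V² × f approximation for dynamic power, not anything measured or published by Intel), spreading a fixed power budget over more cores lets each core sit lower on the V/f curve, so total throughput at the same wattage goes up:

```python
# Toy model: throughput at a fixed package power budget.
# Assumptions (illustrative only): dynamic power ~ cores * V^2 * f,
# voltage rises roughly linearly with frequency, and the workload
# scales perfectly across cores (throughput ~ cores * f).

def voltage(freq_ghz):
    """Very rough V/f curve: ~0.8 V at 3 GHz, ~1.4 V at 6 GHz."""
    return 0.8 + 0.2 * (freq_ghz - 3.0)

def package_power(cores, freq_ghz, k=3.0):
    """Dynamic power in watts; k is an arbitrary fitting constant."""
    return cores * k * voltage(freq_ghz) ** 2 * freq_ghz

def max_freq_at_power(cores, budget_w):
    """Highest frequency (GHz), in 50 MHz steps, that fits the budget."""
    f = 3.0
    while f < 6.0 and package_power(cores, f + 0.05) <= budget_w:
        f += 0.05
    return f

for cores in (8, 12, 20):
    f = max_freq_at_power(cores, budget_w=125)
    print(f"{cores} cores -> ~{f:.2f} GHz each, relative throughput ~{cores * f:.0f}")
```

With these made-up numbers the 20-core configuration clocks lower per core but delivers the most total throughput inside the same 125 W, which is the gist of the argument; real chips obviously add static power, uncore power, and workloads that don't scale perfectly.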
Posted on Reply
#38
oxrufiioxo
fevgatosWell I got a test on tlou with an underclocked 14900k at 95w if you want to compare. I don't think your 7800x 3d will be faster. It's on YouTube if you want a link
Ran 5 games I'm currently playing locally on 4 different systems that's enough hands on for me.

While I like reviews, I really only purchase something after using it locally and don't really care for anyone's results but my own.

I hated the 7950X3D at launch but it's quite a bit better now as long as you set it up right but I guess you could say the same about the 13900K/14900k

TLOU wasn't a game I tested; I finished that game when it launched on PC, so any results from it are pointless to me. Also, the X3D chips don't perform much better in it than the non-X3D chips in my testing, so it probably doesn't take good advantage of the extra cache. I'd for sure need an overlay to even tell them apart, even at 1080p with reasonable settings ofc.

Don't get me wrong, performance was great on all 4, though it was just slower in everything I currently play except Warzone, and worse in decompression, which is what I use my CPU for the most outside of gaming, especially when capped to 125w or lower...

Actually I'd own a 7800X3D if it wasn't so bad at decompression lol...
dgianstefaniActually people who overclocked are probably better off.

Setting a manual vcore will likely be 1.4 V or lower. It's the auto boosting broken TVB/borked power settings combo taking chips above 1.5 V at high temperatures that caused degradation.
Yeah you're probably right I always manually set voltages on my intel systems out of habit.
Posted on Reply
#39
Nostras
I'm not sure what the point of these processors is. For gaming you won't need more than 8 cores. Not for a long time.
With E-cores Intel is at least somewhat competitive in heavily multithreaded applications, and for non-gaming you can toss your background processes on them.
Having 10 or 12 P-cores is sure to make these processors break new records for inefficiency in multi-threaded applications. Do review it though.
The only 2 reasons I can see are if you're paranoid about scheduling issues, or if you see 6+8 as a long-term disaster, thinking 6 P-cores won't be enough.
If you absolutely must have 8 P-cores just... Buy AMD?

Unless they're dirt cheap of course. But Intel is surely going to market them as gaming chips, especially considering the 6-core is missing, so I doubt that.
Posted on Reply
#40
AusWolf
oxrufiioxoWhile I like reviews, I really only purchase something after using it locally and don't really care for anyone's results but my own.
This is the most sensible comment here.

I've hated hardware that reviews praised, and I've loved hardware that reviews condemned so many times that I even take hard numerical data with a pinch of salt.
Posted on Reply
#41
JustBenching
NostrasI'm not sure what the point of these processors is. For gaming you won't need more than 8 cores. Not for a long time.
Oh you absolutely do need more than 8 for gaming. Check the new game Once Human, for example; it absolutely hammers 8-core chips.
Posted on Reply
#42
oxrufiioxo
AusWolfThis is the most sensible comment here.

I've hated hardware that reviews praised, and I've loved hardware that reviews condemned so many times that I even take hard numerical data with a pinch of salt.
Yeah, the universally praised 7800X3D wasn't for me. Doesn't make it a bad product; it's really good at what it's good at, and it just works, which is nice....

I would like a 12 P core intel chip more than my 7950X3D though even if it was slightly slower.... Shame intel waited so long.... Assuming they have the degradation under control.

I'd also like a 12 core single ccd x3d chip lol hopefully amd is listening.
Posted on Reply
#43
JustBenching
oxrufiioxoRan 5 games I'm currently playing locally on 4 different systems that's enough hands on for me.

While I like reviews, I really only purchase something after using it locally and don't really care for anyone's results but my own.

I hated the 7950X3D at launch but it's quite a bit better now as long as you set it up right but I guess you could say the same about the 13900K/14900k

TLOU wasn't a game I tested; I finished that game when it launched on PC, so any results from it are pointless to me. Also, the X3D chips don't perform much better in it than the non-X3D chips in my testing, so it probably doesn't take good advantage of the extra cache. I'd for sure need an overlay to even tell them apart, even at 1080p with reasonable settings ofc.

Don't get me wrong, performance was great on all 4, though it was just slower in everything I currently play except Warzone, and worse in decompression, which is what I use my CPU for the most outside of gaming, especially when capped to 125w or lower...

Actually I'd own a 7800X3D if it wasn't so bad at decompression lol...



Yeah you're probably right I always manually set voltages on my intel systems out of habit.
Well yeah, the point is that if you set it up for games it sips power; if you set it up for max Blender performance then it can casually hit 200 watts even in games. I've hit 207w on a 14900k without any overclocking in The Last of Us, lol.
Posted on Reply
#44
oxrufiioxo
fevgatosWell yeah, the point is that if you set it up for games it sips power; if you set it up for max Blender performance then it can casually hit 200 watts even in games. I've hit 207w on a 14900k without any overclocking in The Last of Us, lol.
Yeah I've seen similar 200w ish stock in that game but I've seen almost as high in UE5 games.

I didn't have The First Descendant to test. Currently playing it, and performance doesn't seem that great on X3D with it. Whenever I get around to it I'll mess around with my buddy's 14900k system on it out of curiosity; online games are pretty difficult to accurately benchmark though.
Posted on Reply
#45
AusWolf
oxrufiioxoYeah, the universally praised 7800X3D wasn't for me. Doesn't make it a bad product; it's really good at what it's good at, and it just works, which is nice....
My 7800X3D didn't work with water, no matter what I did. There must be something with my AIO's cold plate, or I don't know. But as soon as I got an air cooler for it, I loved it.

My most recent positive examples are a 6500 XT which I adore for being a small and quiet GPU that sips power, perfect for older games, and an i7-11700 which is awesome for its configurability.
My most recent negative example is the Ryzen 5 3600 which I couldn't for the love of god keep from throttling in a low-profile SFF case. Like you said, it's not bad, just didn't do it for me.
oxrufiioxoI would like a 12 P core intel chip more than my 7950X3D though even if it was slightly slower.... Shame intel waited so long.... Assuming they have the degradation under control.
To be honest, I've given up on Intel with Alder/Raptor Lake, but yet another P-core only monolithic design in the making got some long lost juices flowing in me. :rolleyes:
Posted on Reply
#46
oxrufiioxo
AusWolfMy 7800X3D didn't work with water, no matter what I did. There must be something with my AIO's cold plate, or I don't know. But as soon as I got an air cooler for it, I loved it.
Don't have any open loop experience but both the 7800X3D/7950X3D run cool on the Corsair 150 elite lcd and the LF III 360...

For me the 16-core runs cooler in both gaming and MT workloads even though it boosts higher in both, which I found odd even when disabling the second CCD... Could just be the samples I worked with; they all varied a little bit.

The 7800X3D hit 80 compiling shaders vs 70 on the 7950X3D, both at 100% utilization.
Posted on Reply
#47
Nostras
fevgatosOh you absolutely do need more than 8 for gaming. Check the new game Once Human, for example; it absolutely hammers 8-core chips.
Based on some very surface-level googling, it appears that the game has a problem with real-time rendering of shaders. That would, aside from being a massive fail for the devs, work just fine on the E-cores.
Posted on Reply
#48
AusWolf
oxrufiioxoDon't have any open loop experience but both the 7800X3D/7950X3D run cool on the Corsair 150 elite lcd and the LF III 360...
I have a be quiet! Silent Loop 2 280 mm, which works great with the 7700X even at full load, but can't push the 7800X3D past 50-55 W. I've tried different pastes, offset mounts, etc, nothing worked. I don't think I'll ever fully understand why.
oxrufiioxoFor me the 16-core runs cooler in both gaming and MT workloads even though it boosts higher in both, which I found odd even when disabling the second CCD... Could just be the samples I worked with; they all varied a little bit.

The 7800X3D hit 80 compiling shaders vs 70 on the 7950X3D, both at 100% utilization.
Dual CCD units spread their heat across a greater area. If we take the 7700X and 7950X, that's a 142 and a 230 W PPT across 1 and 2 CCDs. Let's take 20 W for the IO die, then we're left with 122 W per CCD on the 7700X, and 105 W per CCD on the 7950X.
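
As a quick sketch of that arithmetic (the ~20 W IO-die figure is just the rough estimate used above, not a measured value):

```python
# Rough per-CCD power split from PPT, using the ~20 W IO-die estimate above.
def watts_per_ccd(ppt_w, ccds, io_die_w=20):
    return (ppt_w - io_die_w) / ccds

print(watts_per_ccd(142, 1))  # 7700X, one CCD  -> 122.0
print(watts_per_ccd(230, 2))  # 7950X, two CCDs -> 105.0
```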

Disabling the second CCD is a different matter, though. I don't know why you saw lower temps then. Ryzen is weird.
Posted on Reply
#49
oxrufiioxo
AusWolfDual CCD units spread their heat across a greater area. If we take the 7700X and 7950X, that's a 142 and a 230 W PPT across 1 and 2 CCDs. Let's take 20 W for the IO die, then we're left with 122 W per CCD on the 7700X, and 105 W per CCD on the 7950X.
Yeah, I'm fully aware; it's what I've observed on my 5800X vs 5950X. But even when disabling the second CCD, the 7950X3D was both cooler and ran at a higher wattage, which makes no sense to me. Although I don't have an accurate enough way to monitor voltages, my guess is it ran at a lower voltage.

I am looking forward to this mythical 12 core though and hope it does well enough for them to just do it on their new socket as well.... Would love it as my secondary setup.
Posted on Reply
#50
P4-630
I was planning an Arrow Lake build, and still do. I guess I'll keep my current GB Aorus Master Z690 system as backup instead of selling it, which I always did when I got a new build...
Assuming the Z690 Aorus Master will still get a BIOS update to run these new CPUs.
Posted on Reply