Tuesday, July 16th 2024
Intel Planning P-core Only "Bartlett" LGA1700 Processor for 2025
In a surprising development, Intel plans to extend the longevity of its Socket LGA1700 platform even as the newer LGA1851 platform, led by the Core Ultra 200 "Arrow Lake" series, remains on track for a late-Q3/early-Q4 2024 debut, according to a leak by Jaykihn. Intel plans to do this with brand-new silicon for LGA1700, codenamed "Bartlett," which should particularly interest gamers. Imagine the "Raptor Lake-S" die, but with four additional P-cores replacing the four E-core clusters, making a 12-core pure P-core processor: that's "Bartlett." At this point we're not sure which P-core is in use, whether it's the current "Raptor Cove," or whether Intel will attempt to backport a variant of "Lion Cove" to LGA1700.
This wouldn't be the first pure P-core client processor from Intel since its pivot to heterogeneous multicore; the "Alder Lake" H0 die has six "Golden Cove" P-cores and lacks any E-core clusters. Intel is planning to launch an entirely new "generation" of processor SKUs for LGA1700 that uses Intel's newer client processor nomenclature, the Core 200-series, but without the "Ultra" brand extension. There will be SKUs under the Core 3, Core 5, Core 7, and Core 9 brand extensions. Some of these will be Hybrid, based on the rehashed "Raptor Lake-S" 8P+16E silicon or "Alder Lake-S" 8P+8E silicon; but "Bartlett" will be distinctly branded within the series, probably using a letter next to the numerical portion of the processor model number. There will not be any Core 3 series chips based on "Bartlett," only Core 5, Core 7, and Core 9. The Core 5 "Bartlett" series will feature an 8-core configuration: 8 P-cores and no E-cores. The Core 7 "Bartlett" will be 10-core with no E-cores, and the Core 9 "Bartlett" will draw the most attention at 12 cores. If Intel uses "Raptor Cove" P-cores, these should be 8-core/16-thread, 10-core/20-thread, and 12-core/24-thread, respectively. Depending on whether they are K or non-K SKUs, these chips will feature processor base power values of 125 W, 65 W, or even 45 W.
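The rumored "Bartlett" line-up above can be summarized in a minimal sketch. All tier names and core counts come from the leak; the 2-threads-per-core figure is an assumption that only holds if Intel uses an HT-capable P-core like "Raptor Cove":

```python
# Rumored "Bartlett" SKU tiers per the Jaykihn leak (not confirmed by Intel).
RUMORED_BARTLETT_SKUS = {
    "Core 5": {"p_cores": 8},
    "Core 7": {"p_cores": 10},
    "Core 9": {"p_cores": 12},
}

# Assumption: "Raptor Cove" P-cores support Hyper-Threading,
# i.e. 2 threads per core; thread counts follow directly.
THREADS_PER_P_CORE = 2

for tier, sku in RUMORED_BARTLETT_SKUS.items():
    cores = sku["p_cores"]
    threads = cores * THREADS_PER_P_CORE
    print(f"{tier} 'Bartlett': {cores}C/{threads}T, 0 E-cores")
```

If Intel instead backports a core without Hyper-Threading, the thread counts would simply equal the core counts.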
Intel plans to launch these non-Ultra Core Socket LGA1700 processors in Q1 2025, but the "Bartlett" silicon won't arrive before Q3 2025.
Source:
Jaykihn (Twitter)
140 Comments on Intel Planning P-core Only "Bartlett" LGA1700 Processor for 2025
I do give them shit though.... Me: Yo your cpu burn up yet... Them: nah.... me: soon....
A couple needed voltages to be manually set to be stable though.
People keep restating the idea that Intel can't add more P-cores because of power draw, which is a fundamental misunderstanding of basic physics. For gaming, 125 W is more than enough for max performance if you set the chip up properly. HT off and some undervolting will get you below 100 W even in the heaviest of games, completely CPU-bound with a 4090.
It's actually why I reluctantly picked up a 7950X3D while they're cheap. I have mine faster than the 7800X3D I had for a week, but it's almost more work than it's worth getting there, and I'm not a huge fan of having to rely on third-party programs.
I can see why that's suspected based on my own experiences with memory tuning, because I started noticing memory errors once I pushed the ring ratio from x39 to x40 with the same memory settings. Initially I suspected the E-cores were more to blame, but dropping their ratios didn't really help stability or stop the crashing the way I expected.
For me x39 appeared stable, but x40 showed occasional errors in certain tests like row hammer and moving inversions. For the record, moving inversions is where it screws up most heavily, from what I recall.
"People keep restating the idea that Intel can't add more p cores cause of power draw which is fundamentally a misunderstanding of basic physics."
Do you now? Seems to me you are making a bunch of generalizations that can affect either scenario.
Power draw is always an issue; no way you can say it isn't, especially when a CPU is on the borderline of what an ATX PC can draw safely with normal cooling methods.
In a hybrid CPU, efficiency is largely governed by the application. A game that only uses 8 threads will not benefit from more cores, while a rendering load will.
Go ahead and run a gaming benchmark at 1080p and tell me you get the same performance while limiting your CPU to 100 W.
The 7950X3D turbos higher than the 7800X3D by default (EDIT: with regard to the 3D cache CCD as well), so it should beat its lesser-cored brother by a small margin. You can even do an apples-to-apples comparison if you disable CCD1 in your motherboard's UEFI, even though most people wouldn't do that (especially if they use the extra cores for actual multithreaded tasks).
Setting a manual vcore will likely be 1.4 V or lower. It's the auto boosting broken TVB/borked power settings combo taking chips above 1.5 V at high temperatures that caused degradation.
My point is, power isn't what stops Intel from adding more P-cores, like you seem to be implying. A 20 P-core chip running at the same power as an 8 P-core chip will be a lot faster, a lot more efficient, and a lot easier to cool.
While I like reviews, I really only purchase something after using it locally, and I don't really care for anyone's results but my own.
I hated the 7950X3D at launch, but it's quite a bit better now as long as you set it up right. I guess you could say the same about the 13900K/14900K.
TLOU wasn't a game I tested; I finished that game when it launched on PC, so any results from it are pointless to me. Also, the X3D chips don't perform much better in it than the non-X3D chips in my testing, so it probably doesn't take good advantage of the extra cache. I'd for sure need an overlay to even tell them apart, even at 1080p with reasonable settings, of course.
Don't get me wrong, performance was great on all 4; it was just slower in everything I currently play except Warzone, and worse in decompression, which is what I use my CPU for the most outside of gaming, especially when capped to 125 W or lower...
Actually, I'd own a 7800X3D if it wasn't so bad at decompression lol... Yeah, you're probably right; I always manually set voltages on my Intel systems out of habit.
With E-cores, Intel is at least somewhat competitive in heavily multithreaded applications, and for non-gaming use you can toss your background processes on them.
Having 10 or 12 P-cores is sure to make these processors break new records for inefficiency in multi-threaded applications. Do review it, though.
The only two reasons I can see are if you're paranoid about scheduling issues, or if you see 6+8 as a long-term disaster, thinking 6 P-cores won't be enough.
If you absolutely must have 8 P-cores just... Buy AMD?
Unless they're dirt cheap, of course. But Intel is surely going to market them as gaming chips, especially considering the 6-core is missing, so I doubt that.
I've hated hardware that reviews praised, and I've loved hardware that reviews condemned so many times that I even take hard numerical data with a pinch of salt.
I would like a 12 P-core Intel chip more than my 7950X3D though, even if it was slightly slower... Shame Intel waited so long... Assuming they have the degradation under control.
I'd also like a 12-core single-CCD X3D chip lol, hopefully AMD is listening.
I didn't have The First Descendant to test; I'm currently playing it, and performance doesn't seem that great on X3D with it. Whenever I get around to it, I'll mess around with my buddy's 14900K system on it out of curiosity. Online games are pretty difficult to accurately benchmark, though.
My most recent positive examples are a 6500 XT which I adore for being a small and quiet GPU that sips power, perfect for older games, and an i7-11700 which is awesome for its configurability.
My most recent negative example is the Ryzen 5 3600 which I couldn't for the love of god keep from throttling in a low-profile SFF case. Like you said, it's not bad, just didn't do it for me. To be honest, I've given up on Intel with Alder/Raptor Lake, but yet another P-core only monolithic design in the making got some long lost juices flowing in me. :rolleyes:
For me the 16-core runs cooler in both gaming and MT workloads even though it boosts higher in both, which I found odd, even when disabling the second CCD... Could just be the samples I worked with; they all varied a little bit.
The 7800X3D hit 80°C compiling shaders vs 70°C on the 7950X3D, both at 100% utilization.
Disabling the second CCD is a different matter, though. I don't know why you saw lower temps then. Ryzen is weird.
I am looking forward to this mythical 12 core though and hope it does well enough for them to just do it on their new socket as well.... Would love it as my secondary setup.
Assuming the Z690 Aorus Master will still get a BIOS update to run these new CPUs.