Sunday, June 28th 2020
Intel "Alder Lake-S" Confirmed to Introduce LGA1700 Socket, Technical Docs Out for Partners
Intel's Core "Alder Lake-S" desktop processor, which succeeds the 11th generation "Rocket Lake-S," is confirmed to introduce a new CPU socket, LGA1700. The socket has been in the rumor mill since 2019. LGA1700 is Intel's biggest mainstream desktop processor package change since LGA1156: the package is now physically larger, and may be incompatible with coolers designed for LGA115x sockets. The larger package is seen as an attempt by Intel to give itself real estate to build future multi-chip modules, while the increased pin count points to more I/O being centralized on the processor package.
The "Alder Lake-S" silicon is rumored to be Intel's first 10 nm-class mainstream desktop processor, combining a hybrid core setup of "Golden Cove" high-performance CPU cores and "Gracemont" low-power cores. The processor's I/O feature set is expected to include dual-channel DDR5 memory, PCI-Express gen 4.0, and possibly preparation for gen 5.0 on the motherboard side. In related news, Intel put out technical documentation for the "Alder Lake-S" microarchitecture and LGA1700 socket. Access, however, is restricted to Intel's industry partners. The company also put out documentation for "Rocket Lake-S."
34 Comments on Intel "Alder Lake-S" Confirmed to Introduce LGA1700 Socket, Technical Docs Out for Partners
hahahaaa
Yes, Intel is moving forward with the H6 LGA 1700 socket....
Vs
AMD is also changing their socket too! AMD's AM5 socket....
Intel's first mainstream PCIe 5.0 platform is Meteor Lake-S, Intel's first 7 nm+ part, on the 700-series H6 LGA 1700 socket.
Alder Lake is Intel's second-generation PCIe 4.0 platform, on the 600-series H6 LGA 1700 socket.
Intel only gets two years out of PCIe 4.0 technology, with Rocket Lake-S and Alder Lake-S.
The last time Intel released two desktop CPUs in the same year was the 7700K (Q1 2017) and the 8700K (Q4 2017).....
Now Rocket Lake-S launches in Q1 2021, with Alder Lake-S in Q4 2021,
and Meteor Lake-S (Q4 2022) replacing Alder Lake-S one year later....
The next 3 years are going to be very interesting from both AMD and Intel, with new sockets plus PCIe 5.0, DDR5, USB4, WiFi 6E, and 5G on AMD AM5 and Intel H6 LGA 1700 motherboards...
Very exciting news!
So 1700 just takes over from there, adds more PCIe lanes, and maybe the chipset moves onto the CPU. There is no need for a chipset to sit on the motherboard.
They sell at 3700X prices here.
I was considering one, but decided to stick with a 10th-gen i5 because the savings are big and gaming performance is still previous-gen i7 / current-gen Ryzen 7 level. When I need more, I'd like it to come with PCIe 4.0 too.
The 2-year lifespan is exactly why I am skipping LGA1200/Z490: no PCIe 4.0 or other real improvements, etc.
I would not want to be stuck on a platform where I lose performance every year. Not to mention the advances the next two or three generations will bring.
A few years from now these 14 nm parts will be seen as ancient and slow, especially if Intel finally gets its IPC increases rolling again. By the end of this decade we will have double or even triple the IPC of these parts, quadruple the cores in mainstream, 3D-stacked memory, and new instructions. Optimal use would be 2-5 years. Once you go past 5 you really start to notice the slowdowns on an older platform. I used a 2500K from 2012 to 2019. The last two years were painful to use a 4c/4t CPU, made worse by the Meltdown/Spectre patches that decreased performance further.
So that 14 nm part serves them better for what they do, but a 7 nm R3000 will magically stay fresh and modern? Right, yeah right.
In 2010 an enthusiast mainstream platform could have a 970 with 6c/12t and DDR3; now a 10900 with 10c/20t and DDR4.
So yeah, quadruple, quintuple, shmuple.
You think people would buy this 14nm++++ crap if Intel had a faster 10 nm or even 7 nm product? Of course not. But because they don't, people try to justify it and find reasons to recommend this 14 nm crap. And what would that be? An imperceptible 5% better average with a 2080 Ti @ 1080p low-medium settings? Better than this "9 month" platform at the very least. You're comparing an Extreme CPU then with mainstream now. Sure, if I take a 64c/128t 3990X and 5 GHz DDR4 today, I too could say in 10 years: look, there has not been much progress.
But that would be blatantly false, as the performance of the 970 from 2010 can be had today (and even better, due to much better IPC) for a fraction of what it cost back then.
Double IPC - Intel's own estimate vs Skylake for the next ~4 years.
Quadruple the cores - We had 4c/8t high-end mainstream CPUs only 3 years ago. Now we have 16c/32t mainstream high-end CPUs. Only in the span of 3 years! If you think this is it and we will sit at 16c/32t for the next decade, you need to learn your history. 64c/128t will be the high-end mainstream CPU in a decade, with at least double the IPC of today's parts, likely DDR5 twice as fast as DDR4, and PCIe 5.0 or 6.0. And that's likely not the highest core/thread-count option either.
Except where it now costs 4k for the CPU alone, in a decade it will cost 400. That's how progress works.
I'm very curious.
7 nm is a more efficient node for sure, but 5% at 1080p low is just not correct.
Try 15% at 1080p ultra,
stock:
www.computerbase.de/2020-05/intel-core-i9-10900k-i5-10600k-test/3/#abschnitt_benchmarks_in_spielen_full_hd_und_uhd
And what is a "7nm workload" for that matter? Or a "14nm workload"?
That's the very definition of niche and overpaying for those few extra frames.
Intel does have some other small niches, like AVX-512 or QuickSync, but these are even less useful for most people.
For 2080 Ti buyers, sure, it makes sense to pair it with a 10900K or even a 10600K, but for everyone else it's just a bad return on investment.
While the 3300X is better value than both the 3600 and the 3700X.
If you're gonna make a point, at least make it consistent.
That's all I'm gonna say.
I do wonder what those extra 500 pins will bring. More I/O?
What I do dislike with the current lineups of both companies is the fuzzy transition between upper mainstream and HEDT. While many of you might disagree with me, I would have preferred if the i9 and Ryzen 9 CPUs were on their respective HEDT platforms instead of increasing the VRM requirements and costs of the mainstream platforms. I think they pushed it too far (note that I don't think they should have retained just 4 cores).
The cool thing is the two CPU packages could ping-pong power throttling between inactivity and activity: when one package gets engaged, the other can disengage to reduce heat and energy. If they can do that and sync it well, it could be quite effective, much like the fan profiles on GPUs, which, at least when set up and working right, are quite nice: from the 0 dB fan profiles, to when they trigger higher fan RPMs, to how long they run to cool things down, to winding the RPMs back down after they've lowered the GPU temperatures.
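As a rough illustration of that ping-pong idea, here is a minimal Python sketch that splits a shared power budget between two packages in proportion to their load, keeping a small idle floor so the inactive one can wake quickly. All names and numbers are hypothetical, not from any Intel documentation.

```python
# Hypothetical "ping-pong" power budgeting between two CPU packages:
# when one package is busy, the other is throttled down, keeping the
# combined power under a shared budget. Illustrative numbers only.

TOTAL_BUDGET_W = 125  # shared package power budget (illustrative)
IDLE_FLOOR_W = 10     # minimum power kept for an inactive package

def split_budget(load_a: float, load_b: float) -> tuple[float, float]:
    """Divide the shared budget between packages A and B by load.

    load_a / load_b are utilization fractions in [0, 1]. The idle
    package is clamped to a small floor so it can wake quickly.
    """
    total_load = load_a + load_b
    if total_load == 0:
        # both packages idle: each keeps only the floor
        return IDLE_FLOOR_W, IDLE_FLOOR_W
    budget_a = max(IDLE_FLOOR_W, TOTAL_BUDGET_W * load_a / total_load)
    budget_b = max(IDLE_FLOOR_W, TOTAL_BUDGET_W * load_b / total_load)
    # if the idle floor pushed the total over budget, trim the busier side
    overshoot = budget_a + budget_b - TOTAL_BUDGET_W
    if overshoot > 0:
        if budget_a > budget_b:
            budget_a -= overshoot
        else:
            budget_b -= overshoot
    return budget_a, budget_b
```

When package A is fully loaded and B is idle, nearly the whole budget swings to A; reverse the loads and it swings back, which is the ping-pong behavior described above.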
I think clock skew between the two would be hell and a half to compensate for and manage.