Monday, June 6th 2022

Intel LGA1851 to Succeed LGA1700, Probably Retain Cooler Compatibility

Intel's next-generation desktop processor socket will be the LGA1851. Leaked documents point to the next-generation socket being of identical dimensions to the current LGA1700, despite the higher pin-count, which could indicate cooler compatibility between the two sockets, much in the same way as the LGA1200 retained cooler compatibility with prior Intel sockets tracing all the way back to the LGA1156. The current LGA1700 will serve only two generations of Intel Core: the 12th Generation "Alder Lake," and the next-gen "Raptor Lake" due later this year. "Raptor Lake" will be Intel's last desktop processor built on monolithic silicon, as the company transitions to multi-chip modules.

Intel Socket LGA1851 will debut with the 14th Gen Core "Meteor Lake" processors due in late-2023 or 2024, and will hold out until the 15th Gen "Arrow Lake." Since "Meteor Lake" is a 3D-stacked MCM with a base tile stacked below the logic tiles, the company is adjusting the IHS thickness to end up with a package thickness identical to the LGA1700's, which, besides the socket's physical dimensions, is key to cooler compatibility. Intel probably added pin-count to the LGA1851 by eating into the "courtyard" (the central gap in the land grid), because the company states that the pin-pitch hasn't changed from LGA1700.
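For a sense of scale, here is a back-of-the-envelope sketch of how much courtyard area those 151 extra lands would claim if the pitch really is unchanged; the pitch value below is an assumed placeholder, not a published Intel spec:

```python
# Rough arithmetic: courtyard area needed by the extra lands.
# ASSUMED_PITCH_MM is a placeholder, not a published Intel spec.
EXTRA_PINS = 1851 - 1700       # 151 additional lands over LGA1700
ASSUMED_PITCH_MM = 0.9         # hypothetical pad pitch, said to be unchanged

area_per_pad = ASSUMED_PITCH_MM ** 2     # mm^2 one land occupies on a square grid
extra_area = EXTRA_PINS * area_per_pad   # total courtyard area the new lands need

print(f"{EXTRA_PINS} extra lands need roughly {extra_area:.0f} mm^2 of courtyard")
# -> 151 extra lands need roughly 122 mm^2 of courtyard
```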
Sources: BenchLife.info, VideoCardz

197 Comments on Intel LGA1851 to Succeed LGA1700, Probably Retain Cooler Compatibility

#76
Why_Me
A new socket means new bright shiny boards. I'm down.
eidairaman1Scamintel. Uninteligent. AM4 was around 5 years. I think the clear choice is AMD at this rate with AM5.
Intel recently broke the ice for DDR5 and PCIe 5.0. New sockets sometimes mean new technology.
#77
Crackong
Another meaningless cash grab
#78
AlwaysHope
AusWolfSo much for LGA-1700 having a long life.
Yep, just like the previous generation. Ain't life fun being a PC enthusiast! :laugh:
#79
AusWolf
AlwaysHopeYep, just like the previous generation. Ain't life fun being a PC enthusiast! :laugh:
Personally, I don't really care. I upgrade my PC whenever I feel like it (and have the money for it), regardless of platform compatibility. :roll:
#80
Psychoholic
I'm probably in the minority here, but I normally upgrade my board when I upgrade the CPU; it just seems right, lol.
Even when I went 1700X to 2700X, I went from X370 to X470, then to X570 when I bought my 3900X.
#81
AlwaysHope
PsychoholicI'm probably in the minority here, but I normally upgrade my board when I upgrade the CPU; it just seems right, lol.
Even when I went 1700X to 2700X, I went from X370 to X470, then to X570 when I bought my 3900X.
I'll add one more to that "minority". :) Although it's nice if you can move your cooler across multiple platforms.
#82
mama
So AM5 likely has the longevity aspect ticked off against the competition. Let's see what's what on other comparison points before everyone gets even more worked up.
#83
Mussels
Freshwater Moderator
birdieYou assume every person out there upgrades their CPU every bloody release, which is blatantly false. I had my Core i5 2500 for 10 years and upgraded to a Ryzen 3700X. It's amazing how you reply to my post while completely ignoring what I said. We've had polls here on TPU on the topic and, nope, even PC enthusiasts don't upgrade their CPUs every year; most do it every 3 to 5 years, at which point platform longevity means almost nothing. OK, AMD has dragged their AM4 socket along for five or six years now? People still swap their motherboards because AM4 motherboards from six years ago suck terribly in terms of IO. The number of people who use a 5950X with their six-year-old motherboards is vanishingly small. This "argument" doesn't stand, period.
I still have my 2500K and 3570K, because the performance gain from them to anything prior to 8th-gen Intel was basically nothing.

My X370, on the other hand: the Ryzen 1400 it started with to the 5600X I'm putting in it is worlds apart.

You do things a certain way because you had to. You're used to it.
Given wings, you'd just walk around because you've always had those legs; everything's designed for legs, and you're just gonna leg it like you always have and everyone else has.
You know, except everything else with wings who thinks you're daft for not realizing the obvious freedom you have by using that option.
#84
Unregistered
I love the way people using AMD who will never, ever use Intel argue about why this is so bad when it does not matter at all to them anyway.
#85
maxfly
I hate them both! Always have, always will. They never make a perfect lifelong platform and I will never forgive them for it. Rotten money grubbing good for nothings is all they are. Killing our environment with their stupid wattage and their you know... things!
Now get off my lawn you dirty kids before I tell your ma!
#86
ThrashZone
birdieYou assume every person out there upgrades their CPU every bloody release, which is blatantly false.
Hi,
Only because Intel has trained you well over the years.
One board, only two chip series, for Intel.

I might want to upgrade from a 10900K, but the 11900K is my only option, and it's a downgrade seeing as I'd lose 2 cores and only gain a little single-core performance; so big whoop on that so-called upgrade path.

The 12900K is not an option because it obviously requires a new board. So I'm sure a lot of people would like to upgrade; maybe not everyone, but far more than you might think.

I read all the time about AMD folks upgrading from years-old series to nearly the newest; boy, that sure would be nice to do if I could on an Intel platform, but sadly it is not an option.

X99 EOL'd with Haswell-E, but Broadwell-E was no prize either; it did have a 10-core, but it was stupidly priced.
X299, also EOL, was about the only Intel platform that had 3 series options (79xx, 99xx, 109xx), but frankly the 79xx was a thermal-defect series that should have never been created.
#87
john_
TiggerI love the way people using AMD who will never, ever use Intel argue about why this is so bad when it does not matter at all to them anyway.
.........
TiggerLots of probablys, which means: I don't know.
#88
ThrashZone
john_.........
Hi,
Just to fill in one or two of those dots if you don't mind

I've only gone with Intel platforms in the past, and I find it silly to EOL so many of my boards just because Intel requires it.
Funny that Intel chose the "lakes" naming scheme; I wonder how many real lakes could be filled with all these silly socket changes :laugh:
#90
TheoneandonlyMrK
TiggerI love the way people using AMD who will never, ever use Intel argue about why this is so bad when it does not matter at all to them anyway.
Just as I love the comedy of people defending a company based on their purchases and their use cases.

Some would buy Intel, but Intel makes platforms with No upgrade path besides storage and GPU.

I personally wouldn't buy gen 1 of a new architecture; gen 2 is always better, so I never would have an upgrade path via Intel.

But Intel does power my two laptops; I'm no Intel hater.


But.


When it comes to a chip, mounted on a circuit board (substrate = pin-in to pin-out adapter), that's put in a socket on a circuit board, having its socket swapped ad nauseam just to push board sales...

I have a problem with that.
#91
JustBenching
john_1. Not an argument, amigo. Except if you are an upset representative of Intel, seeing that you still have to defend Intel's business plan after so many years.
2. Not an argument, amigo, except if you are an Intel shareholder. (By the way, I am an Intel shareholder)*
3. Not an argument, amigo; it's a completely irrelevant comment.
4. You sabotage yourself, amigo. Saying that people keep their systems for 3-7 years is an argument in favor of offering more CPU generations to the owner of a motherboard. And AM4 shows that someone CAN have meaningful upgrade options after 3-7 years, options that can offer over 100% more performance in multitasking (5950X) or in games (5800X3D).
5. The only one screaming here is you, because we have the nerve to point at an obvious fact: that buyers of Raptor Lake CPUs wouldn't be able to upgrade to something newer.

You do seem to want to vindicate what Intel does, and it does look like you do care.

And by the way: forcing people to replace equipment, because there isn't an upgrade path longer than two gens, does translate to pollution. All companies pollute with their choices; it's just that some pollute more when they decide that a particular part will lose its value faster.

*But I am also a customer.
Really? Who in their right mind would put a 5950X or a 3D in a B350 motherboard? Why the heck would you do that? We are talking about people who keep their CPUs for a long time, right? So these people will be stuck with PCIe 3 for the next, like, 5 years? Missing out on faster drives, performance increases in new GPUs, and possibly DirectStorage.

Also, who in their right mind that keeps their CPU for 5 to 7 years buys an almost 2-year-old CPU? 'Cause that's how old the 5950X is.

X370 and B350 are completely outdated right now, and besides the 3D, every other CPU that is supported is 2 years old. So yeah, not a great option.
ARFIn another reality. Actually, Intel offers terrible performance per watt, as you can see in the reviews:


Intel Core i9-12900K and Core i5-12600K Power Consumption and Efficiency - Intel Core i9-12900K and Core i5-12600K Gaming CPUs Review | Tom's Hardware (tomshardware.com)
The 12900K scores 28K at 156 watts and 15K+ at 35 W, making it the most efficient CPU on planet Earth. I've uploaded results in the CBR23 thread.
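For reference, that efficiency claim reduces to simple points-per-watt arithmetic on the figures quoted above; a minimal sketch, assuming the poster's (unverified) scores and power limits:

```python
# Points-per-watt on the quoted Cinebench R23 figures.
# The scores and power limits are the poster's claims, not verified data.
configs = {
    "12900K @ 156 W": (28_000, 156),
    "12900K @ 35 W": (15_000, 35),
}
for name, (score, watts) in configs.items():
    print(f"{name}: {score / watts:.0f} points/W")
# -> ~179 points/W at 156 W vs ~429 points/W at 35 W: efficiency climbs
#    steeply as the power limit drops, which is the poster's point.
```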
#92
Unregistered
TheoneandonlyMrKJust as I love the comedy of people defending a company based on their purchases and their use cases.

Some would buy Intel, but Intel makes platforms with No upgrade path besides storage and GPU.

I personally wouldn't buy gen 1 of a new architecture; gen 2 is always better, so I never would have an upgrade path via Intel.

But Intel does power my two laptops; I'm no Intel hater.


But.


When it comes to a chip, mounted on a circuit board (substrate = pin-in to pin-out adapter), that's put in a socket on a circuit board, having its socket swapped ad nauseam just to push board sales...

I have a problem with that.
Guess you won't be buying AM5 then.
#93
TheoneandonlyMrK
TiggerGuess you won't be buying AM5 then.
No, not on day one; why would I?

Maybe 2nd generation, maybe not.

@fevgatos, stepped upgrades, clearly beyond you, but you could have bought a 2600X and X470 (I don't Dooo 1st gen), then bought a 3800X a year later.

Then buy an X570, then buy a 5950X.

Stepped upgrades, not the most extreme example you spout of a B350 and a 1700 to a 5950X.

Some also sell systems on the cheap, so they can leverage that to their own system's advantage.
#94
JustBenching
TheoneandonlyMrK@fevgatos, stepped upgrades, clearly beyond you, but you could have bought a 2600X and X470 (I don't Dooo 1st gen), then bought a 3800X a year later.

Then buy an X570, then buy a 5950X.

Stepped upgrades, not the most extreme example you spout of a B350 and a 1700 to a 5950X.

Some also sell systems on the cheap, so they can leverage that to their own system's advantage.
And what exactly is the net benefit of doing that compared to what you would do with Intel? Say you bought an 8700K + a Z390 and then moved to Alder Lake.
#95
TheoneandonlyMrK
fevgatosAnd what exactly is the net benefit of doing that compared to what you would do with Intel? Say you bought an 8700K + a Z390 and then moved to Alder Lake.
Well, if I have to explain the benefits of going from 6/12 to 8/16 to 16/32 cores, or PCIe 3 to 4, you're in the wrong forum.

As for whether that would be better, that's a personal choice depending on your own use case.

I'm not saying my way is best, I am saying my way suited me best.

And it's a choice I would rather have than NOT have.

Plus, 8700K to Alder Lake would have been two to three years of lacking performance in x86 tasks; go you.
#96
JustBenching
TheoneandonlyMrKWell, if I have to explain the benefits of going from 6/12 to 8/16 to 16/32 cores, or PCIe 3 to 4, you're in the wrong forum.

As for whether that would be better, that's a personal choice depending on your own use case.

I'm not saying my way is best, I am saying my way suited me best.

And it's a choice I would rather have than NOT have.
Not trying to be mean, but what suits you personally (or me) best is irrelevant. The question is: does the long-term mobo support offer something substantial? The answer is, not really. Especially the way AMD is doing it; it basically took them almost 2 years to support... 2-year-old CPUs. I'm sorry, but that is not good support.
#97
TheoneandonlyMrK
fevgatosNot trying to be mean, but what suits you personally (or me) best is irrelevant. The question is: does the long-term mobo support offer something substantial? The answer is, not really. Especially the way AMD is doing it; it basically took them almost 2 years to support... 2-year-old CPUs. I'm sorry, but that is not good support.
The answer to you is not really.

That's called an opinion, not a fact.

I got No snags doing it my way, and all it took was knowing what to buy, when, and what it supported; dramatically hard, I know.

All your points against are still aimed at a B350 owner going through the series of CPUs.

Great understanding of what I said there.
#98
JustBenching
TheoneandonlyMrKThe answer to you is not really.

That's called an opinion, not a fact.

I got No snags doing it my way, and all it took was knowing what to buy, when, and what it supported; dramatically hard, I know.

All your points against are still aimed at a B350 owner going through the series of CPUs.

Great understanding of what I said there.
Well, if you don't give me a substantial benefit then I'll assume there isn't any. That, or it's so important you wanna keep it to yourself :eek:

On the other hand, there are multiple benefits to what Intel is doing, so...
#99
Gica
AusWolfHow do you make it consume 0.7 Watts? :eek:
Idle and light tasks (reading and writing in this forum).
Total power (wattmeter) is 25 W idle.
When we talk about that extreme consumption, only the torture scenario is taken into account. In the real world, consumption is much, much lower.
This system does not exceed 300 Wh per day over 8 hours of operation (www, multimedia, Office, WoT, and some old games).
I will upgrade to a 10500 only when the AV1 codec completely replaces VP9 on YouTube and Netflix.
Ironically, without the integrated graphics processor, I would have had to buy a video card much more expensive than the motherboard under discussion in this topic. :laugh:


An 11600K and 3070 Ti eat ~400 W in AAA games. At factory settings, it goes to 500 W.
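To keep the units straight (watts measure power; watt-hours measure energy), here is a minimal sketch using the draw figures quoted above; the daily hours are an assumption:

```python
# Watts are power; watt-hours are energy (power x time).
# Draw figures come from the post above; the 8 h/day is an assumption.
IDLE_W = 25        # whole-system idle draw at the wall
GAMING_W = 400     # 11600K + 3070 Ti in AAA games, per the post
HOURS = 8          # assumed hours of use per day

print(f"Idle all day:   {IDLE_W * HOURS} Wh")      # -> 200 Wh
print(f"Gaming all day: {GAMING_W * HOURS} Wh")    # -> 3200 Wh
```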
TheoneandonlyMrKSo by your logic, no chip is the best then, all Intel offerings get trounced by a 5950X in some cases, and marginally by the 5800X3D in others, all on AM4 no less.
Meanwhile, Intel also beats all AMD chips in some applications.

So none are the best then?!
Or they're all good.
In other words, it is important to buy what is cheaper and offers better performance for your requirements. I see a problem if you buy processor X because it's better in gaming but you don't play, or you don't have a video card to highlight it. How does the 5800X3D help a 6500XT?
#100
TheoneandonlyMrK
fevgatosWell, if you don't give me a substantial benefit then I'll assume there isn't any. That, or it's so important you wanna keep it to yourself :eek:

On the other hand, there are multiple benefits to what Intel is doing, so...
Make assumptions; go ahead, that'll pan out.

What I do with more cores is my business alone, but I will say: every PC I have owned in the last ten years spent its time on and working at 100%, 24/7, 362 days a year (it does require maintenance).

I can think of only two benefits: one, IO improvement (I updated the mobo, same CPU, after 16 months for this anyway); two, Intel's bottom line...

F#@k Intel's bottom line.

@Gica I have no clue what you're on about; I have my systems listed, and a 6500XT isn't on them.