
Intel Socket LGA1851 Only Supports DDR5 Memory

Ok, but CL30 is not so good...
And CL14 used to be awful compared to DDR3 (CL6), and so on and so forth. If you ignore all the advantages of DDR5 over DDR4, namely its frequency, burst length, and its 2x32-bit independent channels, and only focus on latency, we might as well still be using DDR3; it had the lowest latency out of all DDR generations so far.
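A quick back-of-the-envelope illustration of that point (my own numbers, assuming typical kit speeds rather than any specific products): CAS latency is counted in memory-clock cycles, so converting it to nanoseconds shows how little the real-world latency has moved even as the CL figure keeps climbing.

```python
# Rough sketch: first-word CAS latency in nanoseconds.
# The memory clock runs at half the transfer rate, so latency_ns = 2000 * CL / (MT/s).
def cas_latency_ns(cl: int, data_rate_mts: int) -> float:
    return 2000.0 * cl / data_rate_mts

# Illustrative kits (assumed typical speeds, not specific products):
for name, cl, rate in [
    ("DDR3-1600 CL6", 6, 1600),
    ("DDR4-3200 CL14", 14, 3200),
    ("DDR5-6000 CL30", 30, 6000),
]:
    print(f"{name}: {cas_latency_ns(cl, rate):.1f} ns")
# -> 7.5 ns, 8.8 ns, 10.0 ns: the CL number quintuples, the absolute latency
#    only creeps up a couple of nanoseconds, and bandwidth goes up massively.
```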
Having DDR4 support allows people with good DDR4 to reuse their memory. If you are doing a new build and need memory anyway, sure, DDR5 makes more sense.
Yes, and Intel did that with their current gen. Would it be nice to have DDR4 support forever? Sure, I guess, but it doesn't make sense financially to keep supporting it. DDR4 prices are only going to get higher from here on out, and like I posted, there's not a significant enough price difference between the two to justify continuing to support both.
 
Regardless of the memory involved, unless the next gen of CPUs/chipsets can provide some REAL and SIGNIFICANT performance increases, instead of the MINUSCULE ones from the last 10 generations, moving to a new socket will only be beneficial for the "latest-greatest", "keep up with the Joneses", "benchy-dweeb" crowd :)
 
Regardless of the memory involved, unless the next gen of CPUs/chipsets can provide some REAL and SIGNIFICANT performance increases, instead of the MINUSCULE ones from the last 10 generations, moving to a new socket will only be beneficial for the "latest-greatest", "keep up with the Joneses", "benchy-dweeb" crowd :)
The 12th generation was a fairly decent upgrade; perhaps even bigger than Sandy Bridge.
 
Strange that they're doing this; DDR4 is more attractive with CL14 (or 16) and costs less than DDR5 at CL38 or more...
RAM prices will continue to drop, and Intel CPUs don't need the highest-quality memory such as CL14 (16). Higher-latency RAM will run just fine.
 
Strange that they're doing this; DDR4 is more attractive with CL14 (or 16) and costs less than DDR5 at CL38 or more...

This platform will come out at the end of 2024, if not 2025. DDR5 has already come down in price drastically; I expect it might be on par or even cheaper by then.


Arrow Lake should have really good efficiency, but Intel is definitely playing catch-up in this department.
 
Especially when you consider the price of 3600-4000 CL14 kits. The 6000 CL30 kits are half the price at 2x16GB capacities, because 2x16GB is the most common DDR5 capacity, whereas with DDR4 it was 2x8GB.

6000 CL30 is literally the best there is right now in terms of speed/latency. Yes there are some lower speed CL28 kits but they're slower.
So what is the point of this new socket? Sounds like there will be little expectation of any sizable performance uplift? Or no?
 
So what is the point of this new socket?

Could be this, maybe:
Meanwhile, the LGA1851 platform is expected to lose DDR4 compatibility (which is not surprising as we are dealing with 2nm-based CPUs and DDR4's 1.2V may be too high for such parts).
 
So what is the point of this new socket? Sounds like there will be little expectation of any sizable performance uplift? Or no?
More pins = more power and more lanes. Mainstream sockets keep growing and are nearing 2000 pins although not all pins are used. Some are reserve pins. Server sockets have already passed 6000 pins.
 
And CL14 used to be awful compared to DDR3 (CL6), and so on and so forth. If you ignore all the advantages of DDR5 over DDR4, namely its frequency, burst length, and its 2x32-bit independent channels, and only focus on latency, we might as well still be using DDR3; it had the lowest latency out of all DDR generations so far.

Yes, and Intel did that with their current gen. Would it be nice to have DDR4 support forever? Sure, I guess, but it doesn't make sense financially to keep supporting it. DDR4 prices are only going to get higher from here on out, and like I posted, there's not a significant enough price difference between the two to justify continuing to support both.
For people with good DDR4 kits to reuse, it is a $90 savings right there. It is a good transition strategy for consumers.

Besides, the slightly higher latency is compensated for by the two independent command channels on each DDR5 DIMM and the doubled bank count. In addition, the REFsb command spreads DRAM refresh across banks rather than refreshing all banks at once. DDR5 is better than DDR4 in many ways, and quite a few games run faster with DDR5. The 13900k is limited by DDR4 in some games: in Cyberpunk, 1% lows are 19% better with DDR5 at 1080p. A 13900k with DDR4 is slower than a 12900k with DDR5 in this game.

Nice! A 13900k with an rtx 4090 running DDR4 and 1080p!
 
For people with good DDR4 kits to reuse, it is a $90 savings right there. It is a good transition strategy for consumers.


Nice! A 13900k with an rtx 4090 running DDR4 and 1080p!
The test is a bit contrived, but it is a harbinger of things to come, as future GPUs will be even faster and will be CPU-limited at higher resolutions. There are other examples of performance differences in favour of DDR5 as well. The attached image is edited from @W1zzard's comparison of the 12900k with DDR4 and DDR5.

 
DDR4 is not the more attractive option anymore. That was maybe a good argument a year ago, but DDR5 is only marginally more expensive than DDR4 now.

Just a quick look at PCPartPicker (quick $/GB math at the end of this post):
A DDR4 3600 CL16 2x16 kit is $65 (only 3 kits below $90).
A DDR5 6000 CL30 2x16 kit is $90 (10 kits below $100).
The absolute cheapest DDR4 RAM (2x8 kit) is $26.
The absolute cheapest DDR5 RAM (2x8 kit) is $37.
The only DDR4 kits that are competitive with DDR5 (besides Micron, which is very slow) are binned Samsung B-die kits. And the 2x16 kits are expensive ($130 and above)...

If you care about gaming, there's no reason to go with DDR4 anymore.
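For what it's worth, turning those listed prices into price per gigabyte (same snapshot prices as above, which will obviously drift over time):

```python
# $/GB for the kits quoted above (PCPartPicker snapshot prices).
kits = [
    ("DDR4-3600 CL16 2x16GB", 65, 32),
    ("DDR5-6000 CL30 2x16GB", 90, 32),
    ("Cheapest DDR4 2x8GB",   26, 16),
    ("Cheapest DDR5 2x8GB",   37, 16),
]
for name, price_usd, capacity_gb in kits:
    print(f"{name}: ${price_usd / capacity_gb:.2f}/GB")
# -> roughly $2.03 vs $2.81 per GB at 2x16GB, and $1.63 vs $2.31 at 2x8GB:
#    the DDR5 premium works out to well under a dollar per gigabyte.
```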
 
The only DDR4 kits that are competitive with DDR5 (besides Micron, which is very slow) are binned Samsung B-die kits. And the 2x16 kits are expensive ($130 and above)...

If you care about gaming, there's no reason to go with DDR4 anymore.

I am still using my B-die DDR4, waiting until LGA1851 to switch to DDR5.
 
DDR4 was a scam all along. idk what you expected. I totally skipped it and went from ddr3 haswell to ddr5
 
I am still using my B-die DDR4, waiting until LGA1851 to switch to DDR5.
Me too...

DDR4 was a scam all along. idk what you expected. I totally skipped it and went from ddr3 haswell to ddr5
DDR5 is great; the problem was the absurd prices, but now there's no reason to build new with DDR4, unless you find some great deals on Zen 3 or Intel 10th/11th gen.
 
DDR4 was a scam all along. idk what you expected. I totally skipped it and went from ddr3 haswell to ddr5
Scam? I don't know if it's that great; I haven't tried it yet (got a few modules for a budget build), but I don't know if the word 'scam' is appropriate for describing it.
 
I'm planning to swap back to Intel from AMD - I'll be waiting for this new platform also haha.
5800X3D will keep up for now.
Yup, keeping my 5800X3D until maybe the Ryzen 9800X3D, then switching to DDR5!
 
Strange that they're doing this; DDR4 is more attractive with CL14 (or 16) and costs less than DDR5 at CL38 or more...
CL14 (16) at what speed? My last DDR4 kit was 18/3600, which has the same latency as my current DDR5 kit of 34/6800 (and that's not considering the other advantages of 5). Can't just look at these things in a vacuum.
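Quick sanity check on that, since first-word latency in nanoseconds is roughly 2000 × CL / (MT/s): 2000 × 18 / 3600 = 10 ns for the DDR4 kit and 2000 × 34 / 6800 = 10 ns for the DDR5 kit, so the two really are identical on absolute latency while the DDR5 kit offers nearly double the bandwidth.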
 
CL14 (16) at what speed? My last DDR4 kit was 18/3600, which has the same latency as my current DDR5 kit of 34/6800 (and that's not considering the other advantages of 5). Can't just look at these things in a vacuum.

My 2x16 DDR4 kit is 4000 CL16; I haven't really played with it much, but it could probably do 3600 CL14.
 