Tuesday, December 14th 2021

Samsung Sampling 24 Gbps GDDR6 Memory Chips

Samsung has started sampling high-speed GDDR6 memory chips rated at 24 Gbps. Just to be clear, these are standard GDDR6 chips built to JEDEC specifications, not GDDR6X, a derivative standard co-developed by NVIDIA and Micron that leverages PAM4 signaling. Samsung's 24 Gbps chips can be used by both NVIDIA and AMD, provided their GPU designs can handle the data rates. The specific part number is "K4ZAF325BC-SC24." The chip has a density of 16 Gb (2 GB), which means eight of these make up 16 GB across a 256-bit wide memory bus, and twelve make 24 GB across a 384-bit bus.
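The capacity and bus-width figures above follow from simple per-chip arithmetic; a minimal sketch (assuming the standard 32-bit interface per GDDR6 package, and the helper name is mine):

```python
# Per-chip figures from the article: 16 Gb density = 2 GB per chip.
# Assumption: each GDDR6 chip exposes a 32-bit interface, as is standard.
CHIP_DENSITY_GB = 2
CHIP_BUS_BITS = 32

def memory_config(num_chips):
    """Return (total capacity in GB, total bus width in bits) for num_chips chips."""
    return num_chips * CHIP_DENSITY_GB, num_chips * CHIP_BUS_BITS

print(memory_config(8))   # (16, 256): 16 GB across a 256-bit bus
print(memory_config(12))  # (24, 384): 24 GB across a 384-bit bus
```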

At 24 Gbps, these chips offer 50% more bandwidth than 16 Gbps chips, and 71% more than 14 Gbps chips. A hypothetical 6 nm refresh of "Navi 21" paired with these chips would hence have 768 GB/s of memory bandwidth on top of its Infinity Cache bandwidth, compared to 512 GB/s on the current Radeon RX 6800 XT. Since the chip is sampling, both AMD and NVIDIA likely have their hands on it. There's no word on when the chip hits mass production, but it could well happen within 2022.
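The bandwidth numbers quoted above come from data rate times bus width divided by eight bits per byte; a quick check (function name is mine, bus widths taken from the article's examples):

```python
def bandwidth_gb_s(data_rate_gbps, bus_width_bits):
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps) x bus width / 8."""
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(24, 256))  # 768.0 -> hypothetical Navi 21 refresh at 24 Gbps
print(bandwidth_gb_s(16, 256))  # 512.0 -> Radeon RX 6800 XT at 16 Gbps

# Relative gains quoted in the article
print(round((24 / 16 - 1) * 100))  # 50 (% over 16 Gbps)
print(round((24 / 14 - 1) * 100))  # 71 (% over 14 Gbps)
```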
Source: Olrak29_ (Twitter)

12 Comments on Samsung Sampling 24 Gbps GDDR6 Memory Chips

#1
defaultluser
Is this finally going to drop the incredible power consumption of GDDR6X? The reuse of the same signaling as Micron (except clocked higher) makes me fear the worst!
#2
Minus Infinity
defaultluser: Is this finally going to drop the incredible power consumption of GDDR6x? The reuse of the same signaling as Micron (except clocked higher) makes me fear-for-the-worst!
The article clearly says it's NOT based on the same PAM4 signaling developed by Ngreedia and Micron for GDDR6X.
#3
wolf
Better Than Native
Minus Infinity: Ngreedia
:sleep:
"At 24 Gbps, these chips offer 50% more bandwidth than 16 Gbps"
Not bad at all! Should bode well for what 2022 brings.
#4
Prima.Vera
How do those compare to the latest GDDR6X?
#6
Richards
Minus Infinity: The articles clearly said it's NOT based on the same PAM4 signaling as developed by Ngreedia and Micron for GDDR6X
Thank god.. gddr6x is shat lol
#7
wolf
Better Than Native
Richards: Thank god.. gddr6x is shat lol
I'd say not optimal, but it allowed the GA102 products to get the bandwidth they needed at the time, and it's allowed me to mine and pay for a 3080 twice over, so I'm not at all upset about the inclusion.

Decision made at a point in time to bridge a gap.
#8
londiste
wolf: Decision made at a point in time to bridge a gap.
The first card that used 16 Gbps GDDR6 was the 2080 Super in July 2019, running it at 15.5 Gbps for some reason.
The first card running 16 Gbps GDDR6 proper was the 6800 XT in November 2020.
The RTX 3080 was released in September 2020 with 19 Gbps GDDR6X.
#9
ghazi
Nvidia really got REKT with this one. Spent untold millions for Micron to develop GDDR6X and got forced into buying it all up, put out cards with 350W TDP as a result, and now plain old GDDR6 is pushing far beyond the data rates possible with GDDR6X. Hilarious really.

And for those skeptical, no, Ampere did not need GDDR6X; it could have benefited from it if not for the terrible power efficiency. The A6000 is consistently faster than the 3090 FE in games -- and while yes, it is a higher bin -- it also has a TDP of only 300 W and only 768 GB/s of bandwidth, yet still clocks higher and performs better. Further proof can be found in the joke that is the 3070 Ti.
#10
Richards
ghazi: Nvidia really got REKT with this one. Spent untold millions for Micron to develop GDDR6X and got forced into buying it all up, put out cards with 350W TDP as a result, and now plain old GDDR6 is pushing far beyond the data rates possible with GDDR6X. Hilarious really.

And for those skeptical, no, Ampere did not need GDDR6X, it could have benefited from it if not for the terrible power efficiency. The A6000 is consistently faster than the 3090 FE in games -- and while yes, it is a higher bin -- it also has a TDP of only 300W and only 768GB/s bandwidth, yet still clocks higher and performs better. Further proof can be found in the joke that is the 3070 Ti.
Nothing but truth... Nvidia got scammed by Micron lol. This Samsung GDDR6 absolutely destroys GDDR6X, Nvidia must be embarrassed.
#11
wolf
Better Than Native
ghazi: Nvidia really got REKT with this one. Spent untold millions for Micron to develop GDDR6X and got forced into buying it all up, put out cards with 350W TDP as a result, and now plain old GDDR6 is pushing far beyond the data rates possible with GDDR6X. Hilarious really.
I must have missed the part where these cards with GDDR6X flew off shelves and helped make them untold millions in profits. Hilarious really.
ghazi: The A6000 is consistently faster than the 3090 FE in games
Got a source on that? Everything I've seen shows it trailing a 3090 or at best matching it depending on the game.
#12
kayfabe
ghazi: Nvidia really got REKT with this one. Spent untold millions for Micron to develop GDDR6X and got forced into buying it all up, put out cards with 350W TDP as a result, and now plain old GDDR6 is pushing far beyond the data rates possible with GDDR6X. Hilarious really.

And for those skeptical, no, Ampere did not need GDDR6X, it could have benefited from it if not for the terrible power efficiency. The A6000 is consistently faster than the 3090 FE in games -- and while yes, it is a higher bin -- it also has a TDP of only 300W and only 768GB/s bandwidth, yet still clocks higher and performs better. Further proof can be found in the joke that is the 3070 Ti.
Excuse me for contacting you this way, Mister Ghazi, but are you the same "al-ghazi" from the WCCFtech Disqus??