Tuesday, March 11th 2014

Samsung Mass Producing Industry's Most Advanced 4 Gb DDR3, Using 20 nm Process

Samsung Electronics Co., Ltd., the world leader in memory technology, today announced that it is mass producing the industry's most advanced DDR3 memory, based on a new 20-nanometer (nm) process technology, for use in a wide range of computing applications. With this 20nm 4-gigabit (Gb) DDR3 DRAM, Samsung has pushed the envelope of DRAM scaling while still using currently available immersion ArF lithography.

With DRAM, where each cell consists of a capacitor and a transistor linked to one another, scaling is more difficult than with NAND flash memory, in which a cell needs only a transistor. To keep scaling toward more advanced DRAM, Samsung refined its design and manufacturing technologies, developing a modified double-patterning technique and improved atomic layer deposition.
Samsung's modified double-patterning technology marks a new milestone: it enables 20nm DDR3 production with current photolithography equipment and establishes the core technology for the next generation of 10nm-class DRAM production. Samsung also created the ultrathin dielectric layers of the cell capacitors with unprecedented uniformity, resulting in higher cell performance.
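
To see why the one-transistor, one-capacitor cell makes scaling hard, consider the refresh burden a denser die carries. The sketch below uses standard DDR3 datasheet values (a 64 ms retention window split into 8192 refresh commands, and a typical tRFC for a 4 Gb die); these are general DDR3 figures assumed for illustration, not numbers from this announcement:

    # Fraction of time a 4 Gb DDR3 die spends refreshing its leaky 1T1C cells.
    tREFI_ns = 7812.5   # average interval between refresh commands (64 ms / 8192)
    tRFC_ns = 260.0     # time one refresh command occupies the die (typical 4 Gb value)

    refresh_overhead = tRFC_ns / tREFI_ns
    print(f"Refresh overhead: {refresh_overhead:.1%}")  # ~3.3% of all time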

By applying these technologies to the new 20nm DDR3 DRAM, Samsung has also improved manufacturing productivity, which is over 30 percent higher than that of the preceding 25nm DDR3 and more than twice that of 30nm-class DDR3.

In addition, modules based on the new 20nm 4Gb DDR3 can save up to 25 percent of the energy consumed by equivalent modules fabricated with the previous 25nm process technology. This improvement provides the basis for delivering the industry's most advanced green IT solutions to global companies.
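
As a back-of-envelope sketch of where a saving of that size can come from (the release does not break the figure down): DRAM dynamic power scales roughly with C x V^2 x f, so a modest voltage drop plus the lower capacitance of a shrunken cell compound quickly. The voltage and capacitance numbers below are illustrative assumptions only, not Samsung's figures:

    # Illustrative-only numbers; not Samsung's figures.
    v_old, v_new = 1.35, 1.25   # assumed low-voltage DDR3 specs, old vs. new node
    c_scale = 0.85              # assumed capacitance reduction from the shrink

    p_ratio = c_scale * (v_new / v_old) ** 2   # dynamic power ~ C * V^2 at fixed f
    print(f"Relative power: {p_ratio:.2f} (~{1 - p_ratio:.0%} lower)")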

"Samsung's new energy-efficient 20-nanometer DDR3 DRAM will rapidly expand its market base throughout the IT industry including the PC and mobile markets, quickly moving to mainstream status," said Young-Hyun Jun, executive vice president, memory sales and marketing, Samsung Electronics. "Samsung will continue to deliver next-generation DRAM and green memory solutions ahead of the competition, while contributing to the growth of the global IT market in close cooperation with our major customers."

According to market research data from Gartner, the global DRAM market will grow from $35.6 billion in 2013 to $37.9 billion in 2014.

30 Comments on Samsung Mass Producing Industry's Most Advanced 4 Gb DDR3, Using 20 nm Process

#1
IvantheDugtrio
Any chance we can get more Samsung miracle memory?
#2
Ja.KooLit
Hope they will also produce these for desktop. I wonder about the price and timing, and how they differ from current DDR3.
#3
Steevo
Why don't we skip DDR4 and jump right to DDR5, with some of the tech advances we have learned, like chip-independent timings and on-the-fly error correction for overclocking? Combine that with a few simple ECC bits in hardware on the CPU, and core control and data security provided by something like an ARM co-processor...
#4
Ja.KooLit
Steevo: Why don't we skip DDR4 and jump right to DDR5, with some of the tech advances we have learned, like chip-independent timings and on-the-fly error correction for overclocking? Combine that with a few simple ECC bits in hardware on the CPU, and core control and data security provided by something like an ARM co-processor...
This sounds nice. I never knew there was anything like that with an ARM processor.
#5
buggalugs
What kind of voltage are these running? <1v?
#6
Jack1n
buggalugs: What kind of voltage are these running? <1v?
This is STILL DDR3, so no, it's 1.35-1.65 V.
#7
alwayssts
buggalugs: What kind of voltage are these running? <1v?
1.25 V +/- 0.06 V.

So 1.2-1.3ish.

One of those great PR spin jobs where they are essentially saying it CAN run at 1.2 V (for low-voltage products), whereas 30nm was running at 1.5 V for 'average' products, when in reality the nominal spec of 1.25 V (for low-voltage) is similar to the 30nm spec of 1.35 V (for low-voltage).

So... yeah. TBH, quite surprised they couldn't pull a 2133 MHz bin from that spec... but someone probably will make such a product.
#8
Planet
Steevo: Why don't we skip DDR4 and jump right to DDR5, with some of the tech advances we have learned, like chip-independent timings and on-the-fly error correction for overclocking? Combine that with a few simple ECC bits in hardware on the CPU, and core control and data security provided by something like an ARM co-processor...
Right... because DDR5 has already been developed, right?
#9
Jorge
There isn't much to be gained in actual system performance from this DRAM. It's just a refinement of existing DDR3. The slightly lower power consumption and cool tech are nice and all, but you won't actually see any tangible system gains. They are just applying some DDR4 requirements to DDR3, as DDR4 is primarily for servers and not a cost-effective DRAM solution for desktops or laptops.
#10
FreedomEclipse
~Technological Technocrat~
So much mass production, and still the price of RAM is high.
#11
Steevo
Planet: Right... because DDR5 has already been developed, right?
Considering we are using it in the new consoles and on graphics cards... I know it's "G"DDR5, but the specifications can't be that hard to implement, considering it's being done now.

But thanks for knowing this already, and making a useful post instead of just being an asshole; we all appreciate it.
#12
WaroDaBeast
Steevo: Considering we are using it in the new consoles and on graphics cards... I know it's "G"DDR5, but the specifications can't be that hard to implement, considering it's being done now.

But thanks for knowing this already, and making a useful post instead of just being an asshole; we all appreciate it.
You have suggested that we skip a technology which God knows how many people have worked on for a long time. I do not think that was a clever suggestion.
#13
Steevo
WaroDaBeast: You have suggested that we skip a technology which God knows how many people have worked on for a long time. I do not think that was a clever suggestion.
en.wikipedia.org/wiki/GDDR4

Yeah, GDDR4 was a huge success, and of course the amount of time spent on something is highly indicative of its ultimate success, and not of market adoption or real-world performance.

Perhaps we could get together and watch an HD DVD or a LaserDisc and post about it to our Myspace pages on our BlackBerrys? Oh, could we!!!?
#14
Arjai
^ Haha, He said "MySpace!!" :laugh:
#15
yotano211
Steevo: en.wikipedia.org/wiki/GDDR4

Yeah, GDDR4 was a huge success, and of course the amount of time spent on something is highly indicative of its ultimate success, and not of market adoption or real-world performance.

Perhaps we could get together and watch an HD DVD or a LaserDisc and post about it to our Myspace pages on our BlackBerrys? Oh, could we!!!?
Dammit, you guys are so far ahead; I'm still using Betamax.
#16
Steevo
yotano211: Dammit, you guys are so far ahead; I'm still using Betamax.
Make sure to embrace the future, but only one step at a time, so the big good companies who only care about our personal lives can make the pittance they do serving us gods.
#17
WaroDaBeast
Steevo: en.wikipedia.org/wiki/GDDR4

Yeah, GDDR4 was a huge success, and of course the amount of time spent on something is highly indicative of its ultimate success, and not of market adoption or real-world performance.

Perhaps we could get together and watch an HD DVD or a LaserDisc and post about it to our Myspace pages on our BlackBerrys? Oh, could we!!!?
If I understand correctly, you're assuming that DDR4 will be a letdown because GDDR4 was not so great? I think I'll have fun quoting you again if it is indeed successful.

Secondly, you're talking about technologies that failed to remain relevant. That being said, I'm not so sure you predicted that (i.e., their failure to stay relevant).

Finally, you're comparing tech that is sinking, or has sunk, with a tech that hasn't hit the market yet.

P.S.: I was going to talk about the usefulness of posts and the use of sarcasm in said posts, but I changed my mind when I realized you don't like being on the receiving end of criticism.
#18
de.das.dude
Pro Indian Modder
Wonder if the Samsung memory in the Samsung laptop I recently got is this sort.
#19
Steevo
WaroDaBeast: If I understand correctly, you're assuming that DDR4 will be a letdown because GDDR4 was not so great? I think I'll have fun quoting you again if it is indeed successful.

Secondly, you're talking about technologies that failed to remain relevant. That being said, I'm not so sure you predicted that (i.e., their failure to stay relevant).

Finally, you're comparing tech that is sinking, or has sunk, with a tech that hasn't hit the market yet.

P.S.: I was going to talk about the usefulness of posts and the use of sarcasm in said posts, but I changed my mind when I realized you don't like being on the receiving end of criticism.
But yet you did, good sir!!!!

I am not going to spend the time to break all of my reasoning down for you further than this: the APU powering the PS4 uses GDDR5, has no issues, and squeezes great performance out of it. Manufacturers are already making GDDR5, and it's years ahead of current RAM. Who is making DDR4? Who has adopted it, despite it having been laid out and planned since 2009? If you had the choice between a higher-speed new version and a lower-speed old version, would you honestly choose the older version?

We the consumers have the final say in it, with our money. Much like GDDR4, a good small stepping stone that was ultimately cast aside, current DDR4 (it is in production and has been waiting in the wings for a while now) offers no performance or cost improvement over DDR3, apart from a small low-power server segment, while we have, and know of, working high-performance use of GDDR5.

Set your sights higher than trying to disagree with another person.
#20
Prima.Vera
Steevo: Considering we are using it in the new consoles and on graphics cards... I know it's "G"DDR5, but the specifications can't be that hard to implement, considering it's being done now.

But thanks for knowing this already, and making a useful post instead of just being an asshole; we all appreciate it.
Steevo, GDDR5 is based on DDR3. It's just a different implementation.
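
For reference, a quick sketch of the per-pin data-rate arithmetic behind that relationship; the clock figures (DDR3-1600 and 5 Gbps GDDR5) are typical examples assumed for illustration, not values from this thread:

    # Per-pin data-rate arithmetic for DDR3 vs. GDDR5 (illustrative figures).
    # Both use an 8n prefetch; GDDR5 adds a write clock (WCK) at twice the
    # command clock and moves data on both WCK edges.

    ddr3_io_clock_mhz = 800            # DDR3-1600: 800 MHz I/O clock
    ddr3_mt_s = ddr3_io_clock_mhz * 2  # double data rate -> 1600 MT/s per pin

    gddr5_ck_mhz = 1250                # example command clock for "5 Gbps" GDDR5
    gddr5_mt_s = gddr5_ck_mhz * 4      # WCK = 2x CK, DDR on WCK -> 5000 MT/s per pin

    print(f"DDR3-1600: {ddr3_mt_s} MT/s per pin")
    print(f"GDDR5:     {gddr5_mt_s} MT/s per pin")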
#21
bogami
Wow, good job, Samsung development team. 30% is not so little relative to the norms required for operation, even with the somewhat disruptive high latency (CL-11 at 12600).
I hope we will soon see a new batch of RAM from manufacturers such as G.SKILL and so on. :clap:
#22
MikeMurphy
GDDR5 tech is not feasible as system memory due to its very high power consumption.

I wonder if this will bring us 16 GB DIMMs?

My 32 GB of memory is feeling a bit claustrophobic.
#23
Steevo
MikeMurphy: GDDR5 tech is not feasible as system memory due to its very high power consumption.

My 32 GB of memory is feeling a bit claustrophobic.
So the fact that the PS4 uses it, and that all current graphics cards use it yet can draw just a few watts during light use...

I don't know what's so hard to comprehend: it's usable, and it's faster.

More than 32 GB of memory in anything other than a production environment is currently useless; the bandwidth constraints of paging through 32 GB for execution make it useless with current-generation processors. We need faster RAM and a wider bus.
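
As a rough sanity check on that bandwidth argument, the figures below are typical dual-channel DDR3-1600 numbers assumed for illustration, not anyone's benchmark:

    # How long does it take just to stream once through 32 GB of DDR3?
    capacity_gb = 32
    channel_bw_gb_s = 12.8   # DDR3-1600 on a 64-bit channel: 1600 MT/s * 8 B
    channels = 2             # typical dual-channel desktop platform

    sweep_time_s = capacity_gb / (channel_bw_gb_s * channels)
    print(f"Full pass over {capacity_gb} GB: ~{sweep_time_s:.2f} s")  # ~1.25 s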
#24
WaroDaBeast
Steevo: I am not going to spend the time to break all of my reasoning down for you further than this: the APU powering the PS4 uses GDDR5, has no issues, and squeezes great performance out of it.
While that's nice, I fail to see how that is a demonstration of DDR5's advantages over DDR4. Now, if you're talking about using GDDR5 as system memory, I wonder if that is feasible. Your sole example is a console, whose architecture is not exactly the same as our PCs', I believe.
Steevo: we have, and know of, working high-performance use of GDDR5.
Yet none of us knows how much time and work would be needed for us to have usable DDR5 as system memory. So, should we wait for DDR5 RAM sticks' availability and use DDR3 in the meantime, or should we adopt DDR4 until that occurs?

You've made a bold claim (that the next generation of memory should be DDR5 in lieu of DDR4), and you've called another user names just because he was being sarcastic. Then you yourself were sarcastic towards me. Of course I'm going to disagree with you. ;)
#25
Arjai
Please guys, /Flame

Currently, it is all tilting at windmills. Let it go; you both may be wrong, and then what? He's "wrong-er"?