Monday, December 3rd 2007

Samsung Develops GDDR5 Memory at 6Gbps

Samsung Electronics has announced that it has developed the world's fastest memory, a GDDR5 (series five, graphics double-data-rate memory) chip that can transfer data at six gigabits per second (Gbps). Samsung's GDDR5, which will be introduced as a 512 Mb (16Mb x 32) chip, is capable of transmitting moving images and associated data at 24 gigabytes per second (GBps). The new Samsung graphics memory operates at 1.5 volts, representing an approximate 20% improvement in power consumption over today's most popular graphics memory, GDDR3. Samples of Samsung's new GDDR5 chip were delivered to major graphics processor companies last month, and mass production is expected in the first half of 2008. Samsung expects GDDR5 memory chips to become standard in the top-performing segment of the market, capturing more than 50% of the high-end PC graphics market by 2010.
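The headline figures can be sanity-checked with simple arithmetic: 6 Gbps on each pin of the chip's 32-bit (x32) interface works out to the quoted 24 GB/s per chip. A minimal back-of-the-envelope sketch:

```python
# Sanity check of the bandwidth figures quoted in the article.
data_rate_gbps_per_pin = 6   # 6 gigabits per second on each data pin
interface_width_bits = 32    # 16Mb x 32 chip organization

total_gbits_per_s = data_rate_gbps_per_pin * interface_width_bits  # 192 Gbit/s
bandwidth_gbytes_per_s = total_gbits_per_s / 8                     # 24 GB/s

print(f"{bandwidth_gbytes_per_s:.0f} GB/s per chip")
```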
Source: DigiTimes

30 Comments on Samsung Develops GDDR5 Memory at 6Gbps

#1
GLD
Damnit! I am still on DDR1. :eek:
#3
KennyT772
My god...Who wants to bet ATI will be all over this?
#4
OnBoard
GLD: Damnit! I am still on DDR1. :eek:
Actually you are not, this is graphics memory and you are using GDDR4 ;) I'm still on 3 and even after next GPU upgrade I'll be on it.
#5
EastCoasthandle
It looks like this will require a powerful GPU in order to take full advantage of the bandwidth.
#6
a111087
GLD: Damnit! I am still on DDR1. :eek:
lol, you are fine, this is GDDR, not DDR :laugh:
and i also think that not every GPU will be able to really take advantage of this memory
#7
a111087
KennyT772: My god...Who wants to bet ATI will be all over this?
+1 :toast:
#8
jocksteeluk
KennyT772: My god...Who wants to bet ATI will be all over this?
I doubt it; with AMD struggling on component pricing and Samsung wanting to drive RAM prices up, I can't see anyone rushing to implement this technology at a premium just yet.
#9
robodude666
Can't wait to see how well these will work :)

Think it's possible that someone will use these for an SSD, even though they are meant as graphics memory? 6Gbps is a hella lot of speed!
#10
Ketxxx
Heedless Psychic
At last, maybe GDDR2 will now finally drop off the face of the earth. I mean my god, wtf were they thinking with GDDR2? It ran hotter, performed slower even with its massively higher clock speeds, and was only capable of a 128bit memory bus. I hope whoever had the idea of "improving" GDDR with GDDR2 got fired.
#11
Random Murderer
The Anti-Midas
Ketxxx: I hope whoever had the idea of "improving" GDDR with GDDR2 got fired.
lol, he died when a gddr2 chip exploded and hit him in the eye:roll:
#12
Ketxxx
Heedless Psychic
Haha, I hope that actually happened :D
#14
lemonadesoda
If GDDR(x) is faster and more power efficient than regular DDR(x), then why doesn't the industry move over to GDDR(x) for system memory? Obviously it's more expensive, but with SO MANY enthusiast mobos and components, surely this is what we need to BREAK the benchmarks.
#15
Random Murderer
The Anti-Midas
lemonadesoda: If GDDR(x) is faster and more power efficient than regular DDR(x), then why doesn't the industry move over to GDDR(x) for system memory? Obviously it's more expensive, but with SO MANY enthusiast mobos and components, surely this is what we need to BREAK the benchmarks.
because the architecture is too different.
#16
Scrizz
Holy S***! GDDR5. I can't wait till GDDR6.
wow
#17
Unregistered
I think these will get bought for ati/nvidia to use on their flagship graphics cards tho' at least.
#18
effmaster
You know this will be implemented by ATI ASAP, just like they did with the X1950 XTX and GDDR4 memory to compete with the then-about-to-release Nvidia 8800s :laugh:
#20
Random Murderer
The Anti-Midas
effmaster: You know this will be implemented by ATI ASAP, just like they did with the X1950 XTX and GDDR4 memory to compete with the then-about-to-release Nvidia 8800s :laugh:
um, gddr4 was implemented quickly by ati because they created it.
#21
Unregistered
How does the jump to each consecutive form of graphics RAM work then? Is it a smaller process, different chips, or a different method of addressing the RAM? I kinda understand it with system RAM.
#22
Random Murderer
The Anti-Midas
tigger69: How does the jump to each consecutive form of graphics RAM work then? Is it a smaller process, different chips, or a different method of addressing the RAM? I kinda understand it with system RAM.
well, gddr4 isn't really a big enough improvement over gddr3 to warrant a new name, but ati uses weird marketing :shadedshu it's really more like gddr3.5.
the new name generally denotes a new architecture type.
there was never gddr1, because graphics cards actually used ddr system ram chips. gddr2 is specialized ddr2 system memory, and gddr3 was actually a completely new architecture. gddr4 was optimized gddr3, and gddr5 is ddr3 system memory that has been shrunk to reduce power usage but still allow high frequencies and throughput.
#23
Ketxxx
Heedless Psychic
There is no such term as GDDR per se; the "G" simply denotes "Graphics", and thus makes it easier for people to differentiate between system and graphics memory. If it were me, I'd just use the term I invented to differentiate between system and graphics memory - VidRAM. Just makes more sense to me.
#24
happita
This just gives me the giddies, less power consumption. Might make for more efficient vid cards with these bad boys. Implement these into the next-gen cards and I think I'll be set for a LOOOOOONG time :respect:
#25
largon
The slowest GDDR5 is double the speed of the fastest GDDR3/4...
Ketxxx: There is no such term as GDDR per se, (...)
Oh yes there is, it's official JEDEC jargon. Graphics DDRs (GDDRs) are completely different from DDR-SDRAMs.
GDDR3 is not even remotely similar to DDR3.
Ketxxx: At last, maybe GDDR2 will now finally drop off the face of the earth. I mean my god, wtf were they thinking with GDDR2? It ran hotter, performed slower even with its massively higher clock speeds, and was only capable of a 128bit memory bus.
GDDR2 already dropped from the face of the earth a long, long time ago...
There are no graphics boards on the market that use GDDR2.