# GDDR3 vs. DDR2



## stupidbiznitch9 (Feb 22, 2007)

So, in graphics cards, is there a significant difference between these two types of memory?


----------



## Ketxxx (Feb 22, 2007)

Yes, mainly DDR2 runs hotter and is only 128bit, not to mention slower. DDR3 all the way.


----------



## stupidbiznitch9 (Feb 22, 2007)

So is regular GDDR really bad?


----------



## Ketxxx (Feb 22, 2007)

DDR for graphics cards is actually better as it can be 256bit. DDR3 is obviously what anyone should be looking at when choosing a new card.


----------



## SlipSlice (Feb 22, 2007)

Ketxxx said:


> DDR for graphics cards is actually better as it can be 256bit. DDR3 is obviously what anyone should be looking at when choosing a new card.



I would go for DDR3 if you're looking to get a new graphics card. It's definitely the fastest out right now, at least until the X2800 series cards roll out. I think one of those cards is going to have GDDR4 memory, but I'm not sure which one. Maybe the XT?


----------



## stupidbiznitch9 (Feb 22, 2007)

Yeah, well, I'm definitely buying a new one, but I was just wondering because mine is GDDR, a GeForce 6600. It kinda sucks, but hey, I got it for free...


----------



## SlipSlice (Feb 22, 2007)

stupidbiznitch9 said:


> Yeah, well, I'm definitely buying a new one, but I was just wondering because mine is GDDR, a GeForce 6600. It kinda sucks, but hey, I got it for free...



Well hey, can't go wrong there.


----------



## tkpenalty (Feb 22, 2007)

lol, the 6600 is quite decent. DDR2/GDDR2 isn't very efficient and not great in terms of performance; GDDR3 is good and powerful, as is GDDR4. DDR is basically great when clocked high, but it generates a lot of heat.


----------



## SPHERE (Feb 22, 2007)

Ketxxx said:


> DDR for graphics cards is actually better as it can be 256bit. DDR3 is obviously what anyone should be looking at when choosing a new card.





Ketxxx said:


> Yes, mainly DDR2 runs hotter and is only 128bit, not to mention slower. DDR3 all the way.




http://www.newegg.com/Product/Product.asp?Item=N82E16814141026

Dude, you should really check your facts before spreading disinformation.


----------



## tkpenalty (Feb 22, 2007)

SPHERE said:


> http://www.newegg.com/Product/Product.asp?Item=N82E16814141026
> 
> Dude, you should really check your facts before spreading disinformation.



Hahahhaahahah finally Ketxxx gets pwned  
Ket got pwned Ket got pwned so bad


----------



## Scavar (Feb 22, 2007)

SPHERE said:


> http://www.newegg.com/Product/Product.asp?Item=N82E16814141026
> 
> Dude, you should really check your facts before spreading disinformation.




Err, isn't that the interface for the CARD and not the GDDR2?


----------



## largon (Feb 22, 2007)

Ketxxx said:


> Yes, mainly DDR2 runs hotter and is only 128bit, not to mention slower.


It was _GDDR2_ that was too hot and slow; _DDR2_, on the other hand, is a good budget solution, and it actually consumes less power than GDDR3. GDDR2 has more in common with DDR1 than with DDR2, and in fact GDDR2 is older tech than plain DDR2, which is the chip of choice for budget graphics cards today; the exact same chips are used as main system memory on most recent AMD/Intel platforms.





> DDR3 all the way.


DDR3 is not the same as GDDR3. GDDR3 is a DDR2 chip modified to be used as high-bandwidth texture buffer memory where latency is not an issue. 

And btw, the 64bit/128bit/256bit/512bit memory interface on graphics cards is not a feature of a single memory chip, nor is it limited by the type of chip used (DDR2/GDDR3/etc.). In fact, you could build a 1024bit bus using chips of any given generation. Memory bus width is the product of chip count and chip width.
For example, the X1900XT I had some time ago had 8 Samsung GDDR3 chips (K4J52**32**4QC-BJ11). As you know, the X1900XT has a 256bit wide memory bus. The bolded "32" in the part number is the width of the chip; chip width is basically the number of parallel cell arrays inside the chip, and thus the number of bits the chip can send or receive within a single clock cycle.

One "x32" chip forms a 32bit bus, so 8 such chips together form a 256bit wide bus (8 × 32bit = 256bit), as long as the GPU's memory controller has a dedicated bus for each of the chips.
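The arithmetic above can be sketched in a couple of lines of Python (the function name is made up for illustration; the X1900XT figures are the ones from this post):

```python
def bus_width_bits(chip_count: int, chip_width_bits: int) -> int:
    """Total memory bus width when every chip gets its own dedicated bus:
    bus width = chip count x chip width."""
    return chip_count * chip_width_bits

# X1900XT example: 8 Samsung GDDR3 x32 chips -> 256bit bus.
print(bus_width_bits(8, 32))   # → 256

# The same 256bit bus could also be built from 16 narrower x16 chips.
print(bus_width_bits(16, 16))  # → 256
```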


----------



## SPHERE (Feb 22, 2007)

largon said:


> It was _GDDR2_ that was too hot and slow; _DDR2_, on the other hand, is a good budget solution, and it actually consumes less power than GDDR3. GDDR2 has more in common with DDR1 than with DDR2, and in fact GDDR2 is older tech than plain DDR2, which is the chip of choice for budget graphics cards today; the exact same chips are used as main system memory on most recent AMD/Intel platforms. DDR3 is not the same as GDDR3. GDDR3 is a DDR2 chip modified to be used as high-bandwidth texture buffer memory where latency is not an issue.
> 
> And btw, the 64bit/128bit/256bit/512bit memory interface on graphics cards is not a feature of a single memory chip, nor is it limited by the type of chip used (DDR2/GDDR3/etc.). In fact, you could build a 1024bit bus using chips of any given generation. Memory bus width is the product of chip count and chip width.
> For example, the X1900XT I had some time ago had 8 Samsung GDDR3 chips (K4J52**32**4QC-BJ11). As you know, the X1900XT has a 256bit wide memory bus. The bolded "32" in the part number is the width of the chip; chip width is basically the number of parallel cell arrays inside the chip, and thus the number of bits the chip can send or receive within a single clock cycle.
> ...


You can also "stack" memory chips (for lack of knowing the technical name).

A great example of this is system memory.

You can have 16x 32-bit memory chips form a 256-bit bus, where each pair, instead of being 64 bits total, remains 32 bits (two 32-bit chips only adding 32 bits to the total bus width). There are many possible configurations; these are only a few of them.

I suppose a good analogy for what I'm talking about would be 8 pairs of 32V, 1A batteries, each pair hooked up in parallel, then the pairs hooked up in series:

(32V 1A + 32V 1A [in parallel] = 32V 2A) × 8 [in series] = 256V 2A
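A rough sketch of the layout described above (the function and its parameters are hypothetical, just for illustration): chips sharing the same data lines add capacity but not bus width, while chips on dedicated lines add width.

```python
def bus_and_capacity(chip_count: int, chip_width_bits: int,
                     chip_capacity_mb: int, chips_per_group: int):
    """chips_per_group chips share one set of data lines ("stacked"),
    so a group adds capacity but contributes only one chip's width."""
    groups = chip_count // chips_per_group
    bus_width = groups * chip_width_bits
    capacity = chip_count * chip_capacity_mb
    return bus_width, capacity

# 16 x32 chips of 32 MB, stacked in pairs: still a 256-bit bus,
# but double the capacity of an unstacked 8-chip card.
print(bus_and_capacity(16, 32, 32, 2))  # → (256, 512)

# The same 16 chips each on dedicated lines would give a 512-bit bus.
print(bus_and_capacity(16, 32, 32, 1))  # → (512, 512)
```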




tkpenalty said:


> Hahahhaahahah finally Ketxxx gets pwned
> Ket got pwned Ket got pwned so bad



Heh (don't take this the wrong way, dude), but I've noticed him "assuming" things a lot. It's all good, though; I know he's just trying to help out, and I've seen a lot of that from him.



Scavar said:


> Err, isn't that the interface for the CARD and not the GDDR2?


largon did a good job of explaining the answer to that question, but basically, no: it's the combined interface of all the memory chips on the card.


----------



## stupidbiznitch9 (Feb 22, 2007)

I love how people ask questions to get answers and other people get carried away with other crap...


----------



## stupidbiznitch9 (Feb 22, 2007)

Also, what's the difference between 128-bit and 256-bit?


----------



## largon (Feb 22, 2007)

Difference between 128-bit and 256-bit? 

256bit has double the bandwidth of 128bit, given the frequency is equal.
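As a sketch (the 700 MHz effective clock is just an example figure, not from any particular card): peak bandwidth is the bus width in bytes times the effective data rate, so doubling the bus width doubles the bandwidth at the same frequency.

```python
def peak_bandwidth_gbps(bus_width_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer x transfers per second."""
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

# Same 700 MHz effective clock, different bus widths:
print(peak_bandwidth_gbps(128, 700))  # → 11.2 (GB/s)
print(peak_bandwidth_gbps(256, 700))  # → 22.4 (GB/s)
```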





stupidbiznitch9 said:


> I love how people ask questions to get answers and other people get carried away with other crap...


Well, I would rather read a long thread with correct answers than a short one with inaccurate ones.


----------



## SPHERE (Feb 22, 2007)

largon said:


> Difference between 128-bit and 256-bit?
> 
> 256bit has double the bandwidth of 128bit, given the frequency is equal.


Theoretically.


largon said:


> Well, I would rather read a long thread with correct answers than a short one with inaccurate ones.


cheers


----------

