Friday, May 18th 2007

GECUBE lists Radeon HD 2600 XT

The company website has the Radeon HD 2600 XT listed in detail, while the Radeon HD 2400 Series link is present but empty. GECUBE opted for a special "X-Turbo2 Silent Fan", so the card may be clocked higher than default. No mention of actual clock speeds is made, but the card features 256MB GDDR4, built-in HDMI (HD video with 5.1 surround audio) and dual DVI (2x dual-link, HDCP).
Source: GECUBE

45 Comments on GECUBE lists Radeon HD 2600 XT

#26
Mussels
Freshwater Moderator
Generally, 256/512MB versions are the same bit width, with the exception of, say, the PowerColor EZ series, which used slower DDR1 on its 256MB versions, compared to DDR2 on its 512MB versions.
#27
largon
GDDR4 is different, it's exclusively 64MB 32bit/ic.
#28
Mussels
Freshwater Moderator
ah i see, so it depends on the amount of chips.
#29
Tatty_Two
Gone Fishing
Pinchy: doesn't the 2900XTX use GDDR4 while the 2900XT uses GDDR3?

I would understand it if that was the case; ATi are making the difference between the XT and XTX physical, so it can't just be a simple overclock that makes the difference.

Anywho, if these HD 2600XTs are cheap enough, I'm gonna sell my X1950 Pro and buy two of 'em :D

I find it weird that ATI have 512-bit cards ---> 128-bit ---> 64-bit. I'd think they would scrap 64-bit altogether and go 512 --> 256 --> 128.

Unless they are planning to release 2*50 cards or rename all their cards, where they have double the bitrate :p (They wait for all these cards to sell like mad and then release a new product line :D)
You're right, but I was of the belief that the 2900XTX is binned and will not be retail released.
#30
Ripper3
Even if it isn't being released, someone will likely make a 2900XT with GDDR4; it's pin-compatible with GDDR3 modules and can use the same memory controller anyhow, so there wouldn't be much difficulty in squeezing on some GDDR4.
#31
Mussels
Freshwater Moderator
true but no one would buy it... it was cancelled because the DDR4 XTX was no faster than the DDR3 XT - the core was limited, not the RAM speed.
#32
largon
Mussels,
Yes, bus width depends on the "width" of the chip and the amount of chips used.

DDR1 and DDR2 = 8 and 16bit/ic
gDDR1 and 2 = 16bit/ic
gDDR3 and 4 = 32bit/ic
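largon's table boils down to simple arithmetic: the total bus width is just bits-per-chip times the number of chips. A minimal sketch of that calculation (the bit-per-IC figures come from the posts above; treating GDDR4 as fixed 64MB, 32-bit chips follows largon's earlier comment):

```python
# Bus width from memory chip count, per largon's table:
#   DDR1/DDR2 = 8 or 16 bit/ic, gDDR1/2 = 16 bit/ic, gDDR3/4 = 32 bit/ic

def bus_width(num_chips: int, bits_per_ic: int) -> int:
    """Total memory bus width in bits = chips * bits per chip."""
    return num_chips * bits_per_ic

# GDDR4 comes exclusively as 64MB, 32 bit/ic chips (per largon),
# so a 256MB GDDR4 card like this 2600 XT uses 4 chips:
chips = 256 // 64
print(bus_width(chips, 32))  # prints 128 -> a 128-bit bus
```

This is why the chip count, not just the total memory size, determines the bus width of a given card.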
#33
Mussels
Freshwater Moderator
handy info that i will forget within a week. :D
#34
Tatty_Two
Gone Fishing
Mussels: handy info that i will forget within a week. :D
Damn, you're good..... I have forgotten it already :eek: I am old tho so don't function very well these days. :p
#35
Pinchy
Ripper3: I don't see how putting a full 256-bit bus on a graphics card can be more expensive than putting GDDR4 or even just GDDR3 memory, so all of the mid-range cards should really have it by now. Although knowing both companies' strategies, maybe they'll soon have a 2900 Pro or a 2900 GTO version, for the upper-mid-range.
Exactly.

If they were to release a 2600xt with 256-bit, then what purpose would the 2900PRO/GT serve?
It's not how much it costs the actual company... I mean, there wouldn't be that much of a price difference in producing an X2900XT and an X300 :p.
largon: Mussels,
Yes, bus width depends on the "width" of the chip and the amount of chips used.

DDR1 and DDR2 = 8 and 16bit/ic
gDDR1 and 2 = 16bit/ic
gDDR3 and 4 = 32bit/ic
Not sure about GDDR4, but doesn't GDDR3 have 64bit/ic as well?
#38
Green Planet
All information about the Radeon HD 2600 & 2400 (PCI Express and AGP versions) has been posted by Sapphire (a privileged partner of ATI):

Thanks to Matbe and, of course, to Sapphire!
#39
Pinchy
Green Planet: All information about the Radeon HD 2600 & 2400 (PCI Express and AGP versions) has been posted by Sapphire (a privileged partner of ATI):

Thanks to Matbe and, of course, to Sapphire!
Is it just me, or does anyone else find it weird that the 2600XT has higher clock speeds than the 2900XTX? :wtf:

Also, if those pics are valid....Woot for AGP DX10 :D
#40
largon
Pinchy,
Well, GPUReview is wrong, as it is with the HD 2900 XT: their chart says it has 8x 64-bit chips, when in reality it has 16x 32-bit ones.

The 512MB X800XL is truly a horrible abomination.
It has not 8 but 16 32MB, 32bit/ic gDDR3 chips, with half of the chips' I/O pins disabled in the card's BIOS so that the active bus is still 256-bit despite the number of chips.
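The HD 2900 XT and 512MB X800XL cases work out with the same chips-times-width arithmetic; a quick sketch (modeling the X800XL's BIOS-disabled I/O pins as a simple halving factor is an assumption about how the crippling nets out):

```python
def bus_width(num_chips: int, bits_per_ic: int, io_fraction: float = 1.0) -> int:
    """Active bus width: chips * bits/chip, scaled by the
    fraction of I/O pins actually enabled (1.0 = all active)."""
    return int(num_chips * bits_per_ic * io_fraction)

# HD 2900 XT: 16 chips at 32 bit/ic, all pins active.
print(bus_width(16, 32))        # prints 512 -> 512-bit bus

# 512MB X800XL: 16 chips at 32 bit/ic, but half the I/O pins
# are disabled in the BIOS, leaving a 256-bit active bus.
print(bus_width(16, 32, 0.5))   # prints 256
```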
#41
Green Planet
Pinchy: Is it just me, or does anyone else find it weird that the 2600XT has higher clock speeds than the 2900XTX? :wtf:

Also, if those pics are valid....Woot for AGP DX10 :D
Also, HD 2600 XT (65nm) has higher clock speeds than the HD 2900 XT (80nm).

We must wait until July 1st, probably, to see if all of this is correct...
#42
Zubasa
Why not?
The X1600XT has higher core clock speed than my X1950Pro anyways.
#43
Pinchy
Yeah... but the 2900 is meant to be the flagship of ATI's cards... it should have the highest clock speeds :p
#44
Tatty_Two
Gone Fishing
Green Planet: Also, the HD 2600 XT (65nm) has higher clock speeds than the HD 2900 XT (80nm).

We must wait until July 1st, probably, to see if all of this is correct...
Why would you think that's odd? I believe they have different cores or derivatives; the 8600GTS has higher clock speeds than both the 8800GTS and GTX... again, because it has different cores... G80... G84... G86, etc.
#45
WarEagleAU
Bird of Prey
Looks to be an awesome-performing midrange card. Maybe worth going xfire with these?