The company website has the Radeon HD 2600 XT listed in detail, while the Radeon HD 2400 Series link is present but empty. GeCube opted for a special "X-Turbo2 Silent Fan", so the card may be clocked higher than default. No mention of actual clock speeds is made, but the card features 256MB of GDDR4, built-in HDMI (HD video with 5.1 surround audio) and dual DVI (2x dual-link, HDCP).
45 Comments on GECUBE lists Radeon HD 2600 XT
www.fudzilla.com/index.php?option=com_content&task=view&id=939&Itemid=34
Maybe they're trying to get past the 128-bit bus by using higher-clocked GDDR4. I also agree that they've probably got a steady, cheap supply of GDDR4 left over from the X1950 XTX and are taking advantage of it.
Sort of like the 7600GT: 1600MHz GDDR3 on a 128-bit wide bus. If they stuck slow RAM on a bus that narrow it would be crap (the 7600GS, for example).
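A quick back-of-the-envelope sketch of the bandwidth math behind that comparison. The clock figures and the GDDR4 speed for the 2600 XT are illustrative assumptions, not confirmed specs:

```python
# Peak memory bandwidth for the "narrow bus + fast RAM" argument.
# All clock figures below are illustrative assumptions, not confirmed specs.

def peak_bandwidth_gbs(bus_width_bits: int, effective_clock_mhz: int) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * effective data rate."""
    return bus_width_bits / 8 * effective_clock_mhz / 1000

examples = {
    "7600GT-style: 128-bit, 1600 MHz effective GDDR3": (128, 1600),            # ~25.6 GB/s
    "7600GS-style: 128-bit,  800 MHz effective RAM":   (128, 800),             # ~12.8 GB/s
    "hypothetical 2600 XT: 128-bit, 2200 MHz effective GDDR4": (128, 2200),    # ~35.2 GB/s
    "same 1600 MHz RAM on a 256-bit bus": (256, 1600),                         # ~51.2 GB/s
}

for name, (width, clock) in examples.items():
    print(f"{name}: {peak_bandwidth_gbs(width, clock):.1f} GB/s")
```

So faster GDDR4 on a 128-bit bus narrows the gap, but a 256-bit bus with ordinary GDDR3 would still deliver more raw bandwidth.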
I would understand it if that were the case; ATi are making the difference between the XT and XTX physical, so it can't just be a simple overclock that makes the difference.
Anywho, if these HD 2600 XTs are cheap enough, I'm gonna sell my X1950 Pro and buy two of 'em :D
I find it weird that ATI have 512-bit cards ---> 128-bit ---> 64-bit. I'd think they would scrap 64-bit altogether and go 512 --> 256 --> 128.
Unless they're planning to release 2*50 cards, or rename all their cards, with double the bus width :p (They wait for all these cards to sell like mad and then release a new product line :D)
The X1900 is more like a beefed-up version of the X1800.
And all the X1000-series cards after the X1800 use 3 pixel shaders per pixel pipeline.
No, it doesn't. The memory uses a 128-bit bus. But anyway, the 2600s should definitely use a 256-bit memory bus.
It has a 256-bit bus for one direction only, I believe. I'd have to look it up in some old magazines, but it's half 128-bit, half 256-bit.
EDIT: I skimmed through some responses and didn't see that erocker had already said Ring Bus, heh.
Maybe you thought it had 256-bit since the X1650 XT is basically an X1950 Pro with only 24 PPs (or pixel shaders, w/e; it's based on the R570 core, that's the important part).
I don't see how putting a full 256-bit bus on a graphics card can be more expensive than putting GDDR4, or even just GDDR3, memory on it, so all of the mid-range cards should really have it by now. Although, knowing both companies' strategies, maybe they'll soon have a 2900 Pro or 2900 GTO version for the upper mid-range.
That said, perhaps it's the memory controller that's expensive?
Edit: Oh, and IMO DX10 hardware needs a lot more GPU balls now: with physics and more effects pushed off the CPU onto the GPU, the GPU has to be more and more powerful to keep up, as opposed to the memory.