ATI Radeon HD 2900 XT

On par, yes. That's why it's a flop. We all expected much more, especially since the card is 6 months late! :wtf:

8800 owners won't make the jump unless the card sucks ass in DX10... which I seriously doubt. BTW, I was an ATi fanboy and bought the G80, and I love this card. Best card I've had since the ATi 9700. Buy what's best!

meh... :banghead: it's a DX10 card, why won't people just hold off till we get unbiased DX10 benchies? It will run every DX9 game out there at max settings at insane resolutions. Now please hang on with all this moaning till we can test it as it is designed to be used...
 
ATITool doesn't work with my HD 2900 XT, any ideas??? I use amdtool to OC, but in this review you say ATITool works OK to change voltages, and I can't get it to work.

I tried with ATITool 0.27.
 
The main developer of ATITool is very busy; I doubt he has much time to make ATITool work with the HD series.
 
You might keep an eye on Ray Adams' ATI Tray Tools. He should come up with something for that card.
 
The current development build with voltage/monitoring/fan-control support isn't publicly released yet.
 
Just a quick question, W1z: will we need the 8-pin plugged in for ATITool to overclock?

Oh, and let me know if you need help testing. lol
 
But isn't the extra power NEEDED for overclocking? So wouldn't using ATITool without the extra power connector just result in failed overclocks and crashes, or am I missing something here? Wouldn't mind knowing, 'cos I'm gonna get one eventually.
 
I'm willing to try without. I think the headroom will just be lower with two 6-pins vs. a 6 and an 8. I get paid today, and I'll be contacting Corsair for the 8-pin cable anyway.
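For the connector question, a back-of-the-envelope sketch of why the 8-pin matters for headroom, using the per-connector limits from the PCI Express spec (slot 75 W, 6-pin 75 W, 8-pin 150 W):

```python
# Rough PCIe power-budget arithmetic for the 6-pin vs. 6+8-pin question.
# Spec limits: the x16 slot delivers up to 75 W, a 6-pin connector 75 W,
# and an 8-pin connector 150 W.
SLOT_W = 75
SIX_PIN_W = 75
EIGHT_PIN_W = 150

budget_two_six = SLOT_W + 2 * SIX_PIN_W                    # two 6-pins
budget_six_plus_eight = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 6-pin + 8-pin

print(budget_two_six, budget_six_plus_eight)  # 225 300
```

So two 6-pins cap the board at 225 W in-spec, while 6+8 allows 300 W; the card may still boot and overclock a little on two 6-pins, just with less margin.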
 
All this being said, doesn't anyone wonder how well the ATi card would do if there were a way to increase the shader clock?
 
Just like nvidia?
 
Hmmm, isn't there a way to do that? I mean, compared to the 8800s, how is the shader clock controlled?
 
Good news is, it beat the 8800 GTX in some benchmarks, and in most it was either at the top or very close to it. I'm very impressed with this, as it's meant to compete against the GTS (640 MB and 320 MB versions). Very well done indeed. Power consumption is the only yellow exclamation point I see.

Spot-on review, W1zzard, but you shouldn't be disappointed in the HD 2900 XT's performance. You should marvel at it. It does quite well. The 6-month delay just proves that.
 
Anyone wondering that should probably also wonder how they'd cool and power a shader-overclocked R600, given its already huge power requirements :p.
 
While I agree with most of what you said, I still think the 8800 GTS 640 MB is better. It's cheaper and consumes far less energy. Although you can get a free Logitech G5 from Newegg with the purchase of any HD 2900 XT.
 
I wouldn't say it's better; if you can use the video (hemi?) part of the 2900 XT, the extra $80 or whatever would be well worth it. Only time will tell what it's really capable of. It has huge OC'ing potential, as expected from the majority of ATi products. I hope they come out with one with a die shrink, and I'm going to wait until DX10 games really hit the market before I buy a DX10 card now. My X1900 GT OC'd to 660+/810 takes on all my DX9 games with ease; it even fares well in Oblivion at very high settings. (I need to give it 1.4 vGPU to reach up to 690/810 stable, and I will, since I'm waiting now.)
 
When the 9700 came out, it was the best card ever (still got mine); those were great times. This 2900 XT is a great card, but it reminds me of the Voodoo Banshee: a kick-ass card that is merely as good as the other card. Good news is there are only two real choices out there. :twitch:
 
It's not that hard. ATi needs to redo the PCB, though. I mean, dear god, there's lazy and there's lazy, but that PCB design has introduced a whole new rank of lazy PCB design.
 
Notice I didn't say anything about the stock cooler ;), just that it's not that hard to cool the 2900 XT given the right modding tools.
 
I really don't understand why everyone complains about its heat output. I see 75 °C during heavy gaming according to the AMD GPU Clock Tool's log. My X1800 XT ran just as hot.
 
Exactly :) 80 nm can take that heat fine. I'm just saying it can be improved on.
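If you're watching temps from a clock-tool log like the one mentioned above, pulling out the peak is a one-liner. A minimal sketch; the log format here is made up for illustration (the AMD GPU Clock Tool's actual log format may differ):

```python
# Find the peak GPU temperature in a (hypothetical) clock-tool log where
# each line looks like: "12:03:45, core=743MHz, temp=75C"
import re

def peak_temp(log_text):
    """Return the highest temp=NNC value found, or None if there are none."""
    temps = [int(m) for m in re.findall(r"temp=(\d+)C", log_text)]
    return max(temps) if temps else None

log = """12:03:44, core=743MHz, temp=68C
12:03:45, core=743MHz, temp=75C
12:03:46, core=743MHz, temp=72C"""
print(peak_temp(log))  # 75
```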
 
True.

I think I'm gonna skip straight past any lapping or other basic mods and jump into water cooling it. I'm thinking of picking up a 3×120 mm rad and an MCW60 block with some RAM sinks.
 
1.4 vGPU?? What's the stock 3D-mode voltage? 'Cos the X1900 XTX is 1.45-ish I think; I take it up to 1.57 for a ~760 core OC, though I could probably get away with a lower vcore.
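As a rough sanity check on why voltage bumps heat things up so fast: dynamic power scales roughly with frequency times voltage squared. A sketch using the numbers from this exchange (1.45 V stock, 1.57 V at ~760 MHz; the 650 MHz stock core is an assumption here, and leakage is ignored, so this underestimates the real increase):

```python
# Rough CMOS dynamic-power scaling: P ~ f * V^2.
def power_ratio(f0, v0, f1, v1):
    """Dynamic-power ratio of an overclocked state vs. stock."""
    return (f1 * v1 ** 2) / (f0 * v0 ** 2)

# Thread numbers: ~1.45 V stock, 1.57 V at ~760 MHz.
# The 650 MHz stock core clock is an assumption for illustration.
ratio = power_ratio(650, 1.45, 760, 1.57)
print(round(ratio, 2))  # 1.37
```

So that 1.45 → 1.57 V, 650 → 760 MHz bump is roughly 37% more dynamic power, which is why the stock cooler starts to struggle well before the silicon gives out.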
 