# Why the HD 2900xt is better.



## m3lisk (Jul 2, 2007)

Well, looking at all the reviews out there, I have come to a few conclusions:
1- I am going to buy this card
2- It is better than the 8800gtx
3- It is cheaper than the 8800gtx

You may say: what, are you crazy? Look at all the reviews, the 8800gtx gets way better frames. Well, not in all games. In fact, there are quite a few that the 2900xt has a lead in. Plus, in the 3DMarks, it also has a slight lead. This says the card has locked-in potential in my eyes. I mean, look at the specs of it. Technically speaking it should blow the hell out of the 8800gtx. I blame it on drivers. The 8800gtx has had tons of driver re-polishings. I theorize that ATI will wait until everyone has bought an 8800, then go MWAHAHAHA! and release new drivers which will unlock the 2900's true potential from within and make it the best card out there. Then they will know who is truly on their side. Those with the 2900 will reign supreme and ultimately conquer the 8800 USERS!!! Seriously though, wait a little while and ATI will have nice drivers for the 2900 that will let it easily keep up with the 8800gtx. The prior rant was built up inside, I *had* to get it all out.


----------



## GJSNeptune (Jul 2, 2007)

Lots of theories in there. Hope they pan out. Good luck.


----------



## Deleted member 3 (Jul 2, 2007)

A GTS was cheaper last time I checked and performs at about the same level while using a lot less power.


----------



## mandelore (Jul 2, 2007)

the standard 2900xt was really aimed at the gts market tbh, with the higher clocked 1gb going after the gtx, at least that's what I believe was stated


----------



## Chewy (Jul 2, 2007)

I don't see them waiting to release killer drivers; they want sales now and it's not happening too well for them. I'm sure Niv has a HUGE lead in sales even just since the 2900 was released... and that's not good for ati/amd.

Personally I want to wait till next gen cards, but if I find a good deal/sale on a higher end dx10 card I will take it. I see how you can't really wait around though, and good luck with the 2900xt, it's a good card, hopefully it gets better.


----------



## GJSNeptune (Jul 2, 2007)

Niv? Nivea for Men?


----------



## Chewy (Jul 2, 2007)

mandelore said:


> the standard 2900xt was really aimed at the gts market tbh, with the higher clocked 1gb going after the gtx, at least that's what I believe was stated


How is the 1gb model pairing with the gtx? It has a higher clock but I don't think it's going to surpass it yet.. have you tried oc'ing yours yet? Wonder if it will oc higher than the 512 version.




GJSNeptune said:


> Niv? Nivea for Men?


  Nividea!!1! Nivida :O

Edit: ok its Nivdia.. I never was a fan of Nivdia cards... I actually hated their organisation with a passion, therefore never paid attention to them.


----------



## m3lisk (Jul 2, 2007)

DanTheBanjoman said:


> A GTS was cheaper last time I checked and performs at about the same level while using a lot less power.



I was talking about a gtx, not a gts, and the 640mb gts is about the same price as the 2900xt, and the 2900 still owns it.

As for the power, boo-hoo, go get a new PSU, lol.


----------



## hat (Jul 2, 2007)

The 2900XT is a good buy IMO. The Sapphire model I believe is $370... it has potential, just wait until ATi releases good drivers for it. It beats the GTX on paper, and the 8 series went through all its driver revisions already; however, the 2900 has not done so yet.


----------



## GJSNeptune (Jul 2, 2007)

Chewy said:


> Nividea!!1! Nivida :O
> 
> Edit: ok its Nivdia.. I never was a fan of Nivdia cards... I actually hated their organisation with a passion, therefore never paid attention to them.


Wow. No, it's none of the above.

NVIDIA.


----------



## Tatty_One (Jul 2, 2007)

m3lisk said:


> Well, looking at all the reviews out there, I have come to a few conclusions:
> 1- I am going to buy this card
> 2- It is better than the 8800gtx
> 3- It is cheaper than the 8800gtx
> ...



I am not in a position to comment specifically on your points as I have neither card, but from the reviews I have read... loosely, you are about right. I have no doubt that currently, across the board (I mean on average), the 8800GTX is the more powerful card, but I also have no doubt that the 2900XT is the better "Bang for Buck", probably by some margin, and certainly up there in some benches, so I agree it's the better bet. Although once you get to really high res with lots of AA enabled, I think the 2900XT's performance kind of disappears on the horizon (unless the latest driver release sorted AA at high res??? Not sure.... don't quote me).

What I would say to you however, as I say to anyone with similar decisions to make, is this: why are you judging a DX10 card's performance on current DX9 benches/games, which may have no reflection whatsoever on your gaming experience in DX10? Unless of course you may be in the market to buy another card in 3-6 months..... but if that's the case, surely you will be contemplating a similar decision, only with the 8900 rather than the 8800.

My advice, for what it's worth, is to be patient for a little while longer and see what DX10 brings for sure in a bug-free full version game; benchmarks are largely irrelevant, especially nearly 2 year old ones, but if that's your flavour.... then yes, the 2900XT is the better bet IMO.


----------



## mandelore (Jul 2, 2007)

Chewy said:


> How is the 1gb model pairing with the gtx? It has a higher clock but I don't think it's going to surpass it yet.. have you tried oc'ing yours yet? Wonder if it will oc higher than the 512 version.
> 
> 
> 
> ...



on that website from Canada, extremepc, the 1gb 2900xt is right there with the 8800 ULTRA, so yes, it's better than the 8800 gtx in terms of 3dmark06 benchies


----------



## Tatty_One (Jul 2, 2007)

hat said:


> The 2900XT is a good buy IMO. The Sapphire model I believe is $370... it has potential, just wait until ATi releases good drivers for it. It beats the GTX on paper, and the 8 series went through all its driver revisions already; however, the 2900 has not done so yet.



I agree, but with each Forceware release there are still gains to be had, some considerable.


----------



## mandelore (Jul 2, 2007)

and with each catalyst I say ditto


----------



## mandelore (Jul 2, 2007)

lol a flametastic thread name if ever there was one


----------



## Chewy (Jul 2, 2007)

Idle power consumption factors in for me a lot since I don't just game on my computer. I have more idle time than gaming time, and the 2900 doesn't use too much juice idling.

Buying a new psu to run your card sucks.. you already have one, and if you just bought your computer not too long ago like me then you don't want to have to go buy a new psu when you just got one.

*Tries to stay on topic* but I guess I can't go all out overclocking with me using a 2900xt.. which is a bummer.. I was expecting to have more headroom for the 2900xt when I bought it.


----------



## Mediocre (Jul 2, 2007)

I'd just worry about the early dx10 reviews I've read. There were SERIOUS artifact/rendering issues with the 2900; it didn't even render certain elements of the benchmark.... This was the first couple weeks they were out, have they been fixed with driver updates??
You can have all the speed in the world, but if you're not rendering the whole scene, who cares?


----------



## Tatty_One (Jul 2, 2007)

mandelore said:


> and with each catalyst I say ditto



You may well do so, but I was not saying that the gains had already been made in the drivers, as was the case for the comment on Forceware. The last update gave me 200 points in 3DMark06.


----------



## m3lisk (Jul 2, 2007)

Lol, thread explosion *FTW*!


----------



## Kasparz (Jul 2, 2007)

GJSNeptune said:


> Wow. No, it's none of the above.
> 
> NVIDIA.


Is nvidia still in business?
I thought nvidia died before 3dfx.


----------



## m3lisk (Jul 2, 2007)

Kasparz said:


> Is nvidia still in business?
> I thought nvidia died before 3dfx.



:shadedshu


----------



## jagjitnatt (Jul 2, 2007)

I simply don't understand why people are so immature.
They jump straight to conclusions without even thinking once. The HD 2900XT is a very nice card.
It is better than the 8800GTX. See what happened when NVIDIA came out with the 8800 series in Nov?
Their hardware looked very nice because it was being compared to the X1950XTX.
Their drivers were immature; I should say their drivers plain SUCKED.
It took them over 6 months to achieve the performance that they get now.
Give ATI some time. And those people who feel drivers aren't making a difference, check this link, which shows the kind of improvement drivers have been making on the HD2900.

http://www.computerbase.de/news/tre...7/mai/ati_radeon_hd_2900_xt_treibervergleich/

The HD2900 is being compared to the GTX and GTS, so sure, people expect it to be slightly ahead of or behind them. But I don't understand: there are people who believe that because ATI was 6 months late, the HD2900 should have scored ten times what the 8800 did.
Guys, every gen of cards has seen this trend: the top of the line cards are pretty much similar in performance (8-12% diff).
Now ATI is slow because all the features it offers in hardware are not being implemented in games. But once DX10 is mainstream we'll see those features, and then it will be like GeForce 7 vs the X1k series, where the X1k series could do AA+HDR whereas nVIDIA didn't implement it in hardware.
Complex shaders and tessellation are such features. Games today use scalar shaders, which is why they only see 64 shaders in the 2900 whereas they see 128 shaders in the 8800GTX.
Once DX10 games are here, they will be able to utilise the advanced shaders, which would give the card the power of around 256 shaders in the best case (64x4, when all complex shaders are in use).
Moreover, if you look at DX10 games like Company of Heroes, you'll see that ATI performs better than the 8800GTX.
So I would say m3lisk made the correct decision and went for the best and most future proof card out there.

Rock on ATI users
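The shader arithmetic above can be put into code. To be clear, this is just the post's own model (64 superscalar units, a 4-op packing factor in the best case), not actual R600 scheduling behaviour:

```python
# Effective shader count under the post's model (hypothetical illustration):
UNITS = 64  # superscalar units the post says DX9 games "see" on the HD 2900

def effective_shaders(ops_packed_per_unit):
    # Scalar-only code issues 1 op per unit; the post's DX10 best case packs 4.
    return UNITS * ops_packed_per_unit

print(effective_shaders(1))  # 64  - what today's scalar-shader games see
print(effective_shaders(4))  # 256 - the post's "64x4" best case
```

Whether real DX10 titles ever reach that packing factor is exactly the open question in this thread.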


----------



## mandelore (Jul 2, 2007)

Mediocre said:


> I'd just worry about the early dx10 reviews I've read. There was SERIOUS artifact/rendering issues with the 2900, it didn't even render certain elements of the benchmark....This was the first couple weeks they were out, have they been fixed with driver updates??
> You can have all the speed in the world, but if you're not rendering the whole scene who cares?



you will find it's software related, nothing to do with the hardware, and which benchy are you talking about? I ran 3dmark fine.   Some dx10 games, like Lost Planet, had issues, but again, software. It wouldn't surprise me if Nvidia snuck in some ebil anti-ati code hahahaha, well probs not, but who knows... they appear to pay game devs an awful lot


----------



## hat (Jul 2, 2007)

Kasparz said:


> Is nvidia still in business?
> I thought nvidia died before 3dfx.


You live in a hole?? nVidia bought 3DFX a LONG time ago... nVidia and ATi are the 2 video card manufacturers. I don't see how you can't tell that nVidia is still in business... :/ :/ :/


----------



## mandelore (Jul 2, 2007)

lool, that must have been a joke..


----------



## m3lisk (Jul 2, 2007)

mandelore, where the f#$& did you get your avatar?


----------



## Tatty_One (Jul 2, 2007)

mandelore said:


> on that website from Canada, extremepc, the 1gb 2900xt is right there with the 8800 ULTRA, so yes, it's better than the 8800 gtx in terms of 3dmark06 benchies



And on the 2600XT thread I posted a linkie to a review from only last week (28 June) with the 2900XT versus the 8800GTS 640MB; both had up to date drivers, both were overclocked to their max, and the GTS beat the 2900XT in over half the benches. Reviews can be very selective.

here ya go, now as I said they can be selective so I am not supporting the findings.... merely posting them for info:

http://it-review.net/index.php?option=com_content&task=view&id=1435&Itemid=91


----------



## Chewy (Jul 2, 2007)

jagjitnatt said:


> Moreover, if you look at DX10 games like Company of Heroes, you'll see that ATI performs better than the 8800GTX.
> So I would say m3lisk made the correct decision and went for the best and most future proof card out there.
> 
> Rock on ATI users




Nice post man, but when the heat was turned up, AA/AF at higher res 1920x1200, the 2900xt fell behind in the benchies, and that's where I want my gaming to be: higher res with AA/AF maxed.. hopefully it's a driver issue. Good info in your post though. "Nvidia" (happy Mandelore?) has come a long way with image quality as well, so I'm not sure if the lower fps was due to better image quality, which ati once dominated.


----------



## Tatty_One (Jul 2, 2007)

jagjitnatt said:


> I simply don't understand why people are so immature.
> They jump straight to conclusions without even thinking once. The HD 2900XT is a very nice card.
> It is better than the 8800GTX. See what happened when NVIDIA came out with the 8800 series in Nov?
> Their hardware looked very nice because it was being compared to the X1950XTX.
> ...



Who in this thread has said that drivers don't make a difference.... unless I clicked on the wrong thread title?


----------



## Xaser04 (Jul 2, 2007)

jagjitnatt said:


> I simply don't understand why people are so immature.
> They jump straight to conclusions without even thinking once. The HD 2900XT is a very nice card.
> It is better than the 8800GTX. See what happened when NVIDIA came out with the 8800 series in Nov?
> Their hardware looked very nice because it was being compared to the X1950XTX.
> ...




Whilst I agree that the HD2900XT will improve with future driver revisions, a lot of the additional features it has will most likely go to waste.

Features such as the tessellation unit will probably never be used in PC games until nvidia have a card capable of the same thing; this is mainly due to game developers not wanting to cause a rift between them and the graphics card companies.

Also, a common misconception is that the 7xxx series could not do AA + HDR together. This is only partially correct. They could do integer based HDR and AA perfectly fine together (ala the Source engine); it was FP16 HDR and AA that caused the problem (Oblivion).

This really was not a problem, as unless you were running a Xfire setup, HDR and AA together in Oblivion made the game unplayable in certain areas (at higher resolutions).

Of course this is not to say that the HD2900XT will not improve, and given its current price point, if significant improvements are made then it certainly will offer excellent bang for buck.

The question is of course: will these improvements come as too little, too late?!


----------



## L|NK|N (Jul 2, 2007)

Whichever card PWNS in Crysis will be the card I will get.  But the game is still a ways away, so hopefully ATI will have greatly improved their drivers by then (I always want the underdog to win), because I'd really like to have the HDMI!!!


----------



## m3lisk (Jul 2, 2007)

LiNKiN said:


> Whichever card PWNS in Crysis will be the card I will get.  But the game is still a ways away, so hopefully ATI will have greatly improved their drivers by then (I always want the underdog to win), because I'd really like to have the HDMI!!!



Yeah, ditto to that. Underdogs should always come out on top, I mean look at the Russians in WWII, lol.


----------



## mandelore (Jul 2, 2007)

m3lisk said:


> mandelore, where the f#$& did you get your avatar?



i made it


----------



## d44ve (Jul 2, 2007)

mandelore said:


> i made it



I have always liked your avatar....

however, everytime I look at it, I cant help but look at his (or her?) right hand. It looks a bit "off"


----------



## mandelore (Jul 2, 2007)

Chewy said:


> Nice post man, but when the heat was turned up, AA/AF at higher res 1920x1200, the 2900xt fell behind in the benchies, and that's where I want my gaming to be: higher res with AA/AF maxed.. hopefully it's a driver issue. Good info in your post though. "Nvidia" (happy Mandelore?) has come a long way with image quality as well, so I'm not sure if the lower fps was due to better image quality, which ati once dominated.



oh, that wasn't me that pointed out the nvidia whatchamacallit


----------



## mandelore (Jul 2, 2007)

d44ve said:


> I have always liked your avatar....
> 
> however, everytime I look at it, I cant help but look at his (or her?) right hand. It looks a bit "off"



it's actually alyx, but in goth, and yes, i had finger poser for gmod10 enabled and it went funny; i could never be bothered to repose it as it took so frikkin long to get it right... lol


----------



## d44ve (Jul 2, 2007)

mandelore said:


> it's actually alyx, but in goth, and yes, i had finger poser for gmod10 enabled and it went funny; i could never be bothered to repose it as it took so frikkin long to get it right... lol



hell... either way..... it's WAAAAAAAAAAAAY better than anything I can do


----------



## m3lisk (Jul 2, 2007)

Ahh, HL2. Very nice indeed. So, quick off topic poll: x1650xt in crossfire, or single x1950 while I wait for the 2950xt(lol)?


----------



## mandelore (Jul 2, 2007)

yes yes.. totally off topic, but tickled sum1 took an interest, the hand is actually clipping with  the right boobie and is a lill twisted, that happened when i made the ragdoll a statue.. oh well


----------



## m3lisk (Jul 2, 2007)

I'd tap that.


----------



## d44ve (Jul 2, 2007)

m3lisk said:


> I'd tap that.




LOL


----------



## zekrahminator (Jul 2, 2007)

m3lisk said:


> I blame it on drivers. The 8800gtx has had tons of driver re-polishings. I theorize that ATI will wait until everyone has bought an 8800, then go MWAHAHAHA! and release new drivers which will unlock the 2900's true potential from within and make it the best card out there.



They won't do that on purpose.

But seriously, DirectX 9 is a very crappy way of benchmarking a DirectX 10 card.


----------



## zekrahminator (Jul 2, 2007)

mandelore said:


> yes yes.. totally off topic, but tickled sum1 took an interest, the hand is actually clipping with  the right boobie and is a lill twisted, that happened when i made the ragdoll a statue.. oh well



Who's the green-eyed girl, again?


----------



## mandelore (Jul 2, 2007)

oh, that's Elexis Sinclaire, featured in the Sin games, and this model is from the Sin Episodes: Emergence game provided by valve/steam. i think she's pretty nice for a pixelated chick   especially in the first cut scene and the bathing pool scene tee-hee-hee


----------



## mandelore (Jul 2, 2007)

oh, and she is modeled after Bianca Beauchamp

i've seen better pics, but she's hot






thread degeneration 4TW


----------



## LonGun (Jul 2, 2007)

Look at the "special edition"/"Ultra?" new version of the Nvidia 8800GTX, which costs almost $1000. What a shame on them :shadedshu .


----------



## m3lisk (Jul 2, 2007)

LonGun said:


> Look at the "special edition"/"Ultra?" new version of the Nvidia 8800GTX, which costs almost $1000. What a shame on them :shadedshu .



Ditto.


----------



## mandelore (Jul 2, 2007)

yeah, it's a crazy price; you could get 2 1gb 2900's and blow the shit out of that mighty expensive purchase


----------



## Darknova (Jul 2, 2007)

The only problem I have with the 2900 is the amount of power it uses. Buying a new PSU is all well and good until you take a look at your power bill, then you won't be so happy.


----------



## mandelore (Jul 2, 2007)

I'm running mine on my 580W psu, 35A on the 12v rail; I think I'll only need to upgrade when going to crossfire


----------



## Tatty_One (Jul 2, 2007)

mandelore said:


> I'm running mine on my 580W psu, 35A on the 12v rail; I think I'll only need to upgrade when going to crossfire



But you haven't overclocked yours yet


----------



## hv43082 (Jul 2, 2007)

Well, obviously the OP has made up his mind, and even more obviously he's an ATI/AMD fanboy.  So why even bother posting your comment?  Hell, I could have shown you numerous proofs of why the 8800GTX is the best card and the 8800GTS 320MB is the best bang for the buck out there, and you still would not have changed your mind.  So is this kind of like self justification for your purchase, or a very late attempt to say: "ATI/AMD rules even though every publication says otherwise?"  I am no fanboy of any company, but if this thread was meant to be a glorification of ATI/AMD, I apologize for my intrusion.


----------



## mandelore (Jul 2, 2007)

Tatty_One said:


> But you haven't overclocked yours yet



no, that's true; as soon as I can I will, and then I'll see if my rig explodes


----------



## m3lisk (Jul 2, 2007)

hv43082 said:


> Well, obviously the OP has made up his mind, and even more obviously he's an ATI/AMD fanboy.  So why even bother posting your comment?  Hell, I could have shown you numerous proofs of why the 8800GTX is the best card and the 8800GTS 320MB is the best bang for the buck out there, and you still would not have changed your mind.  So is this kind of like self justification for your purchase, or a very late attempt to say: "ATI/AMD rules even though every publication says otherwise?"





LOL. You, sir, seem to have an air of snobbishness.


----------



## DaMulta (Jul 2, 2007)

Why the HD 2900xt is better.


Because ATi made it

Thread Over/


----------



## hv43082 (Jul 2, 2007)

DaMulta said:


> Why the HD 2900xt is better.
> 
> 
> Because ATi made it
> ...



At least DaMulta has the balls to just come out and say it straight: "It's better because I said so!" and not beat around the bush with some bs excuse about reviews and future driver optimization.  Hey DaMulta, give me my heat.


----------



## HookeyStreet (Jul 2, 2007)

m3lisk said:


> Well, looking at all the reviews out there, I have come to a few conclusions:
> 1- I am going to buy this card
> 2- It is better than the 8800gtx
> 3- It is cheaper than the 8800gtx
> ...



A word of warning: get a better PSU before getting the HD 2900XT


----------



## mandelore (Jul 2, 2007)

lol DaMulta, thread justification in 1 sentence ^^


----------



## demonbrawn (Jul 2, 2007)

blasted power consumption. I wish they could find a way to make next gen video cards not so power hungry.


----------



## LonGun (Jul 2, 2007)

Unlike the guys with big bucks, if your paycheck is three or four figures or less, then depending on what your career and interests are, sometimes it's better to stay one step behind "some" technology to save some bling bling ($). A Mr. nice guy like Mandelore will definitely hook you up with a good deal on his 2900XT when he collects his 2950XT  .


----------



## LonGun (Jul 2, 2007)

demonbrawn said:


> blasted power consumption. I wish they could find a way to make next gen video cards not so power hungry.



I wish my Accord could go 70 miles per gallon of gas.


----------



## m3lisk (Jul 2, 2007)

Yup. I make $320 a week before taxes working for my dad, and I don't start work for a few more days (just moved), and I gotta buy car insurance and sh1t, and besides, I too am going to wait for a 2950xt... OR MAYBE AN x3000xt!!! lol


----------



## mandelore (Jul 2, 2007)

LonGun said:


> Unlike the guys with big bucks, if your paycheck is three or four figures or less, then depending on what your career and interests are, sometimes it's better to stay one step behind "some" technology to save some bling bling ($). A Mr. nice guy like Mandelore will definitely hook you up with a good deal on his 2900XT when he collects his 2950XT  .



actually, I had to save a lot for that card; I would consider myself "piss poor". I thank my credit card for giving me the breathing room to get it, and I gotta sell my old card to make up some of the cost, amongst other things


----------



## LonGun (Jul 2, 2007)

3000XT lol, hopefully AMD is still hanging around by then. jk kekeke.
And Mandelore, what a sweat you've been through to own that card lol.


----------



## m3lisk (Jul 2, 2007)

mandelore said:


> actually, I had to save a lot for that card; I would consider myself "piss poor". I thank my credit card for giving me the breathing room to get it, and I gotta sell my old card to make up some of the cost, amongst other things



what's your old card?


----------



## mandelore (Jul 2, 2007)

m3lisk said:


> what's your old card?



x1900 xtx with a dangerden tyee waterblock, plastered with nice ramsinks on the hotspots and a small 40mm fan blowing over them

Edit: actually here's a piccy of it:


----------



## m3lisk (Jul 2, 2007)

Oh yeah. I saw that and was like, I want it, but then I realized I don't have water cooling.


----------



## Tatty_One (Jul 2, 2007)

m3lisk said:


> LOL. You, sir, seem to have an air of snobbishness.



Whichever way you perceive him..... he is right tho.   And did you check the link in post 27?


----------



## m3lisk (Jul 2, 2007)

How much for the card in US dollars, with the stock cooler, shipped to the US?


----------



## yogurt_21 (Jul 2, 2007)

lol, $320 a week, wow. that'd about cover my gas. lol


----------



## LonGun (Jul 3, 2007)

yogurt_21 said:


> lol, $320 a week, wow. that'd about cover my gas. lol



I know, the gas price in California is like around $3.50+ per gal. Geez, lucky my damn computer doesn't need gas to run (but my car does lol).


----------



## tkpenalty (Jul 3, 2007)

m3lisk said:


> I was talking about a gtx, not a gts, and the 640mb gts is about the same price as the 2900xt, and the 2900 still owns it.
> 
> As for the power, boo-hoo, go get a new PSU, lol.



Power wise it's not THAT dramatic... any normal 600W PSU can handle it fine


----------



## Ketxxx (Jul 3, 2007)

Power consumption is an issue for the HD2900XT, but it can be alleviated to a degree. Everybody seems to be forgetting 2D and 3D profiles; just edit the 2D speed profile to something like 500/1000. You don't need anything more than that for desktop work, hell, even those clocks are overkill. Who gives a crap if a word processing app gets 700 or 400fps?


----------



## AsRock (Jul 3, 2007)

m3lisk said:


> Well, looking at all the reviews out there, I have come to a few conclusions:
> 1- I am going to buy this card
> 2- It is better than the 8800gtx
> 3- It is cheaper than the 8800gtx
> ...



From what I see, the GTX does beat the 2900 in most benchmarks.
AA is one of the main problems with it too, even with driver version 8.383.

Yes, I truly hope it will get better too.

It seems to do well in OpenGL too, and there's a new version of that on its way this or next month they say (2.x), and late this year 3.0.


With ATITool you can clock the 2D settings down to at least 300MHz, which will cut down on heat and power usage.

I play all my games @ 1600x1200 with my 2900XT and am very happy with it.  One main thing that put me off the NV's was the hot spots that they have.  And the problem I had with the 7900 range was that it heated my southbridge badly if extra card cooling was not applied.

I like NV too, just wish they would cool their cards better.


----------



## TylerZambori (Jul 3, 2007)

mandelore said:


> yes yes.. totally off topic, but tickled sum1 took an interest, the hand is actually clipping with  the right boobie and is a lill twisted, that happened when i made the ragdoll a statue.. oh well



She doesn't really look like she's into it.


----------



## jagjitnatt (Jul 3, 2007)

Actually, you know what guys, you don't need those kilowatt power supplies. If you have a branded power supply and it's over 400 watts, you can run a 2900XT. I am not saying that; ATI says that, and I know it's true.
Check the power input of all the devices in your PC:

AMD CPU - 65 watts or 90 watts (check your version on the AMD site)
Intel CPU - 65 watts to 110 watts (same here)
2900XT - 200 watts (75 watts from the PCIe slot + 75x2 from the two 6-pins is the max, but it requires about 25 watts less)
8800GTX - 175 watts (same maths here)
RAM - 15 watts per GB stick
Motherboard - 25 watts
PCI device - 10 watts
HDD - 15 watts
CD drive - 20 watts
Fans - 2 to 3 watts each

Sum it up.
This is the power requirement when your PC is being maxed out, i.e. all your processor's cores are at 100% usage, your RAM is full, the GPU is doing some GPGPU thing, the motherboard is like frying, PCI devices are being used to the max, cd drives are spinning at highest speed, and fans too at highest speed.
This situation is impossible to achieve.
Add up your complete power specs and then calculate 75% of it.
That is the total power usage of your PC. Surprised at how low it is???
But a PSU degrades with age, by as much as 15%. Also they are only about 75% efficient.
So double your final calculation and that should be a good future proof PSU for you.
In fact I have run Crossfire using 2 X850XTs on a Vantec Ion2 460 watts with an X2 4600+, which requires 110 watts at load.
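The sizing rule above (sum the maximums, take 75%, then double it) is easy to sketch in code. The example component wattages below are just one hypothetical build pulled from the post's table, not a recommendation:

```python
# Rough PSU sizing sketch based on the per-component wattages quoted above.
# The figures and the "take 75%, then double" rule are the poster's rule of
# thumb, not official ATI/AMD numbers.

COMPONENT_WATTS = {
    "cpu": 90,          # AMD X2, worst case per the post
    "gpu_2900xt": 200,  # 75 W slot + 2x75 W 6-pin connectors (max)
    "ram_2x1gb": 30,    # 15 W per GB stick
    "motherboard": 25,
    "hdd": 15,
    "cd_drive": 20,
    "fans_x3": 9,       # 3 W each
}

def recommended_psu_watts(components):
    """Apply the post's rule: realistic draw is ~75% of the summed maximums,
    then double that to cover PSU aging (~15%) and ~75% efficiency."""
    theoretical_max = sum(components.values())
    realistic_draw = 0.75 * theoretical_max
    return theoretical_max, realistic_draw, 2 * realistic_draw

maximum, realistic, recommended = recommended_psu_watts(COMPONENT_WATTS)
print(f"theoretical max: {maximum} W")       # 389 W for this build
print(f"realistic draw:  {realistic} W")
print(f"recommended PSU: {recommended} W")
```

So for this hypothetical build the rule lands in the ~580 W range, which matches the thread's consensus that a decent 600 W unit is plenty.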


----------



## tkpenalty (Jul 3, 2007)

jagjitnatt said:


> Actually, you know what guys, you don't need those kilowatt power supplies. If you have a branded power supply and it's over 400 watts, you can run a 2900XT. I am not saying that; ATI says that, and I know it's true.
> Check the power input of all the devices in your PC:
> 
> AMD CPU - 65 watts or 90 watts (check your version on the AMD site)
> ...



^IT'S TRUE.

Yes, manufacturers overemphasize the need for a "powerful" PSU. As he said, a 400W PSU is enough, but barely; if it has a high efficiency of 80%, then... you're 100% safe. I wouldn't go with a 400 though, because they come with one 6-pin PCI-E at most.

My system uses 300W... I didn't do my research and ended up spending more than I should have.


----------



## strick94u (Jul 3, 2007)

m3lisk said:


> I was talking about a gtx, not a gts, and the 640mb gts is about the same price as the 2900xt, and the 2900 still owns it.
> 
> As for the power, boo-hoo, go get a new PSU, lol.


Very little difference between a 640 GTS and a 2900. If the 2900 does own it, why is the price of the 8800 GTS 640MB not dropping faster? I am hoping the 2900XTX drives the price of the nvidia silicon down so I can get a GTX. I used to be AMD only till they dropped behind nvidia in development; it's a short drop to become the next 3dfx Voodoo. I do see the 2900 running stock beating the 8800 GTS in a few benches, but it's a slim line between the two both ways. "OWN"? Let's look at who dominates whom in the marketplace.


----------



## Tatty_One (Jul 3, 2007)

That's true, but unfortunately it does not take overclocking into account.... most of us here overclock, and the additional power requirements when overclocking can be very significant. You do not need to double a CPU's speed to double its power draw, quite the contrary in fact; the same applies to the graphics card, which can at times consume horrific power as it gets stressed. You only need to look at my old X1800XT: 9A idle at stock, 13A idle at max overclock, 15A load at stock, 20A load overclocked..... that's close to double, going from a core speed of 625 to 730MHz.

I do agree with both of you however that manufacturers do tend to go overboard in power consumption requirements.
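To put those rail readings in watts (a quick sketch; the amp figures are the X1800XT measurements quoted above, and on the 12 V rail power is simply P = V × I):

```python
# Converting the quoted 12 V rail current readings to watts (P = V * I).
# The amp figures are the poster's X1800XT measurements, used as-is.

RAIL_VOLTS = 12.0

readings_amps = {
    "idle, stock": 9,
    "idle, max overclock": 13,
    "load, stock": 15,
    "load, overclocked": 20,
}

watts = {state: RAIL_VOLTS * amps for state, amps in readings_amps.items()}
for state, w in watts.items():
    print(f"{state}: {w:.0f} W")
```

That overclocked-load figure (240 W) versus stock idle (108 W) is why PSU headroom for overclocking matters more than the sticker spec suggests.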


----------



## Tatty_One (Jul 3, 2007)

strick94u said:


> Very little difference between a 640 GTS and a 2900. If the 2900 does own it, why is the price of the 8800 GTS 640MB not dropping faster? I am hoping the 2900XTX drives the price of the nvidia silicon down so I can get a GTX. I used to be AMD only till they dropped behind nvidia in development; it's a short drop to become the next 3dfx Voodoo. I do see the 2900 running stock beating the 8800 GTS in a few benches, but it's a slim line between the two both ways. "OWN"? Let's look at who dominates whom in the marketplace.



I think it will. I am guessing that when the 8900 is released.... not sure when... Oct?, the 8800GTX will become mid/upper-mid and will need to have its price reflect that, i.e. a GTX for cheaper than the 2900XT.


----------



## Ketxxx (Jul 3, 2007)

You're looking a bit green today, Tatty


----------



## jagjitnatt (Jul 3, 2007)

You still believe nVIDIA will release an 8900??
I don't think so. What we heard about was the 8800 Ultra. The 8900 doesn't exist and won't exist either.
nVIDIA will move on to the 9000 series sometime next year.
But the 9000 is not coming any time soon. It will take at least a year from here, when DX10 becomes mainstream and a revision to DX10 is made.


----------



## AsRock (Jul 3, 2007)

I noticed this: http://www.fudzilla.com/index.php?option=com_content&task=view&id=1744&Itemid=1

How much you can trust it I don't know, but to be honest I don't trust reviews most of the time.


----------



## Juic3 (Jul 3, 2007)

You're basically saying you don't use AA at all; I guess the R600 is just the thing for you.

Nvidia has no real reason, thanks to ATI, to release an 8900. And Fudzilla?


----------



## GJSNeptune (Jul 3, 2007)

You can't trust fudzilla.

It's *FUD*zilla. C'mon.


----------



## MarcusTaz (Jul 3, 2007)

The new Catalyst 7.6 drivers have dropped my temps, both idle and under 3D load, by 10C.

Logic tells me that power consumption is down too. Whether that comes at the expense of frame rates and performance is hard to say; my 3DMarks and actual in-game performance have not suffered, so I think not. But for raw numbers to measure actual power consumption, someone tell me how to do it. Would I need an amp meter? I wonder....

Also interesting: someone posted this review on Newegg about the 1GB version of the card, and I was wondering if anyone can confirm it. I quote:

"Other Thoughts: Please be advised there appear to be 2 versions of this video card. One comes with Samsung memory IC K4U52324QE BC09, which is 1.1GHz @ 2.2Gbps. The other is Samsung memory IC K4U52324QE BC06, which is 1.6GHz @ 3.2Gbps, according to a Samsung press release dated June 28, 2007. I (and I assume others) would like to buy this card with the "better" RAM, so any information from users who purchased the card or others would be appreciated."

I just sold my Diamond 512MB version and ordered the Sapphire 1GB version. If the above quote is true, I hope I get the better memory....


----------



## mandelore (Jul 3, 2007)

again peeps compare cards based on dx9 tests... c'mon, surely you realise by now that it's in the fields of dx10 where the champ will be decided


----------



## ryboto (Jul 3, 2007)

tkpenalty said:


> ^ITS TRUE.
> 
> Yes, manufacturers over emphasize the need for a "powerful" PSU. As he said a 400W PSU is enough, but barely, if it has a high efficiency of 80% then... your 100% safe. I wouldnt go with 400 because they only come with one 6 pin PCI-E at the max.
> 
> My system uses 300W... I didnt do my research and ended up spending more than I should have.



Your system, if it's the one listed in your specs, doesn't use 300W; maybe at load it uses a little over 200. Manufacturers rate things at their highest possible peak output/consumption. With CPUs, at least with AMD, TDP is set for a range of processors. An X2 3600 is in the 65W category, but do you really think it uses as much power, or produces as much heat, as an X2 5600 65W? As for the HD 2900... I really wish ATI had done a better job making sure everything was up to snuff out the door. Why have such a delayed hardware launch and not even have proper software prepared for it?


----------



## Tatty_One (Jul 3, 2007)

Ketxxx said:


> You're looking a bit green today, Tatty



Thanks, like I didn't notice..... am on antibiotics tho


----------



## Tatty_One (Jul 3, 2007)

jagjitnatt said:


> You still believe nVIDIA will release an 8900??
> I don't think so. What we heard about was the 8800 Ultra. The 8900 doesn't exist and won't exist either.
> nVIDIA will move on to the 9000 series sometime next year.
> But the 9000 is not coming any time soon. It will take at least a year from here, when DX10 becomes mainstream and a revision to DX10 is made.



There are some so-called facts out there which suggest that the 8800 series is a cut-down 8900; in the GTX, something like 160 shader processors have supposedly been disabled, so the technology is already done. If it's true (and I am not saying it is, but from what I can see there is a little credibility in some of what I have read), then it was a pretty smart move: they could leave the 8800GTX clocks exactly as they are, enable the lot, and gain probably 30%.

Like I said, I don't believe everything I read, but it's a possibility I suppose. There is actually a very good and credible article about it in these forums from about 3 months ago.


----------



## petepete (Jul 3, 2007)

The 7800 GTX was ownage, then the 7900 GTX came out..

The 8800 GTX is ownage, THE 8900 WILL COME OUT... do you think they will just skip it?


FX 5800, to FX 5900


it will come out...


----------



## yogurt_21 (Jul 4, 2007)

petepete said:


> 7800 GTX Was ownage, then the 7900 GTX came out..
> 
> 8800 GTX is ownage, THE 8900 WILL COME OUT... do you think they will just skip it?
> 
> ...



um no, the 7800GTX got crushed by the X1800XT, not even close. OK, sure, Nvidia released the uber-rare 512MB version with superclocks, but the X1800 could OC past the performance of the OC'd 7800GTX 512MB, and the X1800 was mainstream.
The 7900GTX got crushed by the X1900XTX; again, not even close.

The FX5800 was crushed by the 9700 Pro, not even close; in fact, the 5800 can't even run DX9 properly.

The 5900 was crushed by the 9800 Pro, and the 5950 by the 9800XT.

So no, ATI has held the lead in performance in most recent launches, save for the 6800 series and now the 8800 series.


----------



## petepete (Jul 4, 2007)

ok dude, where did I talk about AMD/ATI? You are trying to prove a point for no reason, because I'm not comparing ATI vs Nvidia, just Nvidia cards against each other. just do this please


I wasn't even comparing AMD/ATI cards, so your reply is useless


----------



## Tatty_One (Jul 4, 2007)

To be honest, I am surprised this thread is still open; it just invites flames from those less self-disciplined


----------



## Grings (Jul 4, 2007)




----------



## theonetruewill (Jul 4, 2007)

Grings said:


>


----------



## LonGun (Jul 4, 2007)

Sometimes we keep going on and on at something until Mr. (?) comes in and flips the page


----------



## largon (Jul 5, 2007)

jagjitnatt said:


> (...)
> 2900XT - 200 watts (75watts from PCIe slot + 75X2 from 2 6pin is the max, but it requires 25 watts less)
> 8800GTX - 175 watts (same maths here)
> (...)


Your figures are too high. Power connector specs != total consumption.
2900XT peak consumption is "only" ~170W. The 8800GTX peaks at 145W.

edit:
In fact, even these figures appear to be a bit too high, as the 8800GTX maximum 3D peak was measured to be as low as 130W.
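largon's distinction here (connector spec versus actual draw) is simple arithmetic: the PCIe slot is specified for 75W and each 6-pin connector for another 75W, so a card with two 6-pin plugs has a 225W delivery ceiling regardless of what it actually pulls. A sketch of that arithmetic, using the connector counts from the quoted post and the measured peaks quoted in this thread (not my own measurements):

```python
# Power-delivery ceiling (what the connectors are specced for) vs
# measured peak draw (what the card actually pulls), per largon's point.
PCIE_SLOT_W = 75   # PCIe x16 slot spec
SIX_PIN_W = 75     # each 6-pin PCIe connector spec

def power_ceiling(six_pin_count: int) -> int:
    """Maximum specced power for a card with this many 6-pin connectors."""
    return PCIE_SLOT_W + six_pin_count * SIX_PIN_W

# (6-pin connectors per the quoted post, measured peak W per this thread)
cards = {"HD 2900 XT": (2, 170), "8800 GTX": (2, 145)}
for name, (plugs, measured) in cards.items():
    print(f"{name}: ceiling {power_ceiling(plugs)} W, measured peak ~{measured} W")
```

Both cards sit 50W or more below their 225W ceiling, which is why reading consumption off the connector spec overstates the draw.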


----------



## MarcusTaz (Jul 5, 2007)

yea what he said


----------



## m3lisk (Jul 5, 2007)

Tatty_One said:


> To be honest, I am surprised this thread is still open



Wow, me too. I decided to bite the bullet and order an ASUS HD 2900xt STALKER edition card though for like $435 on tigerdirect...


----------



## mrsemi (Jul 6, 2007)

I did just buy an 8800gtx, and of course I don't want to hear that a card almost $200 less can smoke it, but I did do some flipping through articles before I bought mine, and according to some, the ATI doesn't match the performance of the 640 GTS in a lot of cases, a card which is around $150 less than the 2900.

I guess it doesn't matter; my wallet is dented in a way it never has been before. I still have little doubt I'll need an upgrade in the next 2 or 3 years, and that's all that matters to me really.

Couple google links.


http://enthusiast.hardocp.com/article.html?art=MTM1MSwxLCxoZW50aHVzaWFzdA==

http://it-review.net/index.php?option=com_content&task=view&id=1435&Itemid=1

http://forums.techpowerup.com/showthread.php?p=340668


----------



## MarcusTaz (Jul 6, 2007)

Bro, why Tiger? You could get it cheaper and with free shipping from ZipZoomFly or even Newegg..


----------



## mandelore (Jul 6, 2007)

ahhhh......... peeps buying from dx9 benchies and performance ^^

shud hav waited for more dx10 imo, but anyhoo, what is done is done


----------



## Tatty_One (Jul 6, 2007)

mandelore said:


> ahhhh......... peeps buying from dx9 benchies and performance ^^
> 
> shud hav waited for more dx10 imo, but anyhoo, what is done is done



Lol U didnt....why should he?


----------



## mandelore (Jul 6, 2007)

Tatty_One said:


> Lol U didnt....why should he?



coz im a die-hard ati fan, thats why 

well.. that and i wanted to future proof tech on my new card


----------



## Tatty_One (Jul 6, 2007)

mandelore said:


> coz im a die-hard ati fan, thats why
> 
> well.. that and i wanted to future proof tech on my new card



Bah...fanboi excuses


----------



## MarcusTaz (Jul 6, 2007)

mandelore said:


> coz im a die-hard ati fan, thats why
> 
> well.. that and i wanted to future proof tech on my new card



I did not buy it because I am an ATI die-hard, but because I just despise Nvidia at this point with their horrendous drivers and lack of support. Another reason was the future-proof tech. But you know, I am posting because of this "future proof" statement. I think we all get sucked in here a bit. Sure the card (2900) is a techno marvel, but before you know it they will have a new card out, probably the GPU/CPU deal, and we will be drooling again looking to upgrade.


"grandma wanna buy a kick-ass video card?"


----------



## Mussels (Jul 6, 2007)

i bought my GTX before the ATI was released.... 

future-proofing works on both sides of the fence, and even if DX10 sucks, both ATI and NV cards absolutely demolish the current DX9 games.

Since that's settled, it just comes down to preference: ATI uses more power, the Nvidia card is a lot longer. Nvidia is better for widescreens (ATI's ratio scaling is still broken in current drivers) and ATI is better for.... oh yeah, lower power use at idle (despite its massive power use at load)

there, i've done fanboi for both products in the same paragraph


----------



## MarcusTaz (Jul 6, 2007)

Mussels said:


> i bought my GTX before the ATI was released....
> 
> future proof works on both sides of the fence, and even if DX10 sucks - both ATI and NV cards can absolutely rape the current DX9 games.
> 
> ...



Brother, not to burst your bubble here, and with all due respect, Nvidia is not better for widescreens. I can vouch for that personally. I have a Hitachi 61" HD-RPTV and a Westy 37" LCD, and my 2900 rocks in 1920x1080p mode where my Nvidia 7900GT KO was horrible; again, probably the junk drivers. Heck, I have 4 (well, now 2, I sold my SLI rig) of the 7900GT KOs and I cannot even use them in Vista because of this:

http://s31.photobucket.com/albums/c389/Achaleon/?action=view&current=HPIM0390.jpg

That is not my PC, but someone else with the same problem. And Nvidia has failed to address it, even with my 100 bug reports logged to them (shitheads). Maybe things are different now with the 8800 series, but I spent much time on nvnews prior to jumping ship, going ATI, and making techpowerup my new home. I can tell you, if you read their Vista forums under the driver section you will laugh your butt off: threats of lawsuits, driver issues left and right.

Now I am no expert, and maybe you're right that "ATI's ratio scaling is still broken in current drivers", but it looks fine on my screen, and Nvidia is far behind in driver stability with Vista. Anyway, sorry to disagree, but it does make a thread fun if done in a respectful manner...


----------



## Wile E (Jul 6, 2007)

No ratio scaling problems here. Both my 19" 1440x900 monitor and my 32" 720p set work great.


----------



## HellasVagabond (Jul 6, 2007)

Why the X2900 ISN'T better, at least in DX10 games.. at least in the 3 DX10 games so far..
Quote from AnandTech:
"When running with all the DX10 features enabled, the HD 2900 XT falls to just below the performance of the GeForce 8800 GTS. Once again, the low-end NVIDIA and AMD cards are unable to run at playable framerates under DX10, though the NVIDIA cards do lead AMD."


----------



## Wile E (Jul 6, 2007)

HellasVagabond said:


> Why the X2900 ISN'T better, at least in DX10 games.. at least in the 3 DX10 games so far..
> Quote from AnandTech:
> "When running with all the DX10 features enabled, the HD 2900 XT falls to just below the performance of the GeForce 8800 GTS. Once again, the low-end NVIDIA and AMD cards are unable to run at playable framerates under DX10, though the NVIDIA cards do lead AMD."


*Sigh* And again, how does testing with betas and immature drivers prove anything?



> For now, AMD does seem to have an advantage in Call of Juarez, while NVIDIA leads the way in Company of Heroes and Lost Planet. But as far as NVIDIA vs. AMD in DirectX 10 performance, we really don't want to call a winner right now. It's just way too early, and there are many different factors behind what we are seeing here. As the dust settles and everyone gets fully optimized DirectX 10 drivers out the door with a wider variety of games, then we'll be happy to take a second look


----------



## MarcusTaz (Jul 6, 2007)

HellasVagabond said:


> Why the X2900 ISN'T better, at least in DX10 games.. at least in the 3 DX10 games so far..
> Quote from AnandTech:
> "When running with all the DX10 features enabled, the HD 2900 XT falls to just below the performance of the GeForce 8800 GTS. Once again, the low-end NVIDIA and AMD cards are unable to run at playable framerates under DX10, though the NVIDIA cards do lead AMD."



Dude, solve all your bickering and go buy a ATI 2900XT....


----------



## HellasVagabond (Jul 6, 2007)

You are all seeing the tree but missing the forest, as usual...
So I should start calling a victor in DX10 when? 2, 3, 4, 5 years from now?
THESE ARE DX10-COMPATIBLE CARDS.... If you can't play a DX10 game with it, then don't call it a DX10 card... These are the LATEST BETA drivers from both brands.
Bottom line, it's weird that the people that bought the X2900 struggle every day to excuse its failure by saying the drivers aren't good because it's early, while at the same time the same people say that ATI makes better drivers...
Crazy man, crazy...


----------



## mandelore (Jul 6, 2007)

huh, i don't need any excuses coz my card roxors. we are in the process of doing our OWN tpu dx10 benchies, and so far my card is doing well; granted there's virtually zero nvidia input, but go get that corrected!! 

hellas, have u tried the Call of J benchy yet?

if so post your results here:

http://forums.techpowerup.com/showthread.php?p=384409#post384409


----------



## HellasVagabond (Jul 6, 2007)

I thought for the bench you need to have the game, and I don't have it.


----------



## mandelore (Jul 6, 2007)

just out of curiosity, i know the 8800 has an adjustable (overclockable) shader engine; is the 2900's shader clock linked to the core speed, so when u overclock the core u overclock the shader engine?

wonder if there could be a way to ramp this up independently


----------



## mandelore (Jul 6, 2007)

HellasVagabond said:


> I thought for the bench you need to have the game and i dont have it.



nonono, u can download just the dx10 bench program:

http://downloads.guru3d.com/download.php?det=1642#download

please give it a go and add to the thread; a good profile of different rigs will be awesome for comparisons


----------



## Tatty_One (Jul 6, 2007)

Mussels said:


> i bought my GTX before the ATI was released....
> 
> future proof works on both sides of the fence, and even if DX10 sucks - both ATI and NV cards can absolutely rape the current DX9 games.
> 
> ...



Impressed.....I am putting you forward for a job with the diplomatic corps!


----------



## Tatty_One (Jul 6, 2007)

mandelore said:


> just out of curiosity, i know the 8800 has an adjustable (overclockable) shader engine; is the 2900's shader clock linked to the core speed, so when u overclock the core u overclock the shader engine?
> 
> wonder if there could be a way to ramp this up independently



In a word...... no, I believe it's locked. TBH, and truly not doing the fanboi stuff, it's a largely unappreciated strong point of NVidia's architecture IMO. I say that because it really matters most for the mid-range cards: on paper you look at, say, the quantity of shader processors and you would naturally think the 2600XT would be the performance choice, but in reality it's not so important how many you have as how fast they run, and that's why the 8600GTS and GT jump to life when overclocked.

Having said that, it really does go to show how good the 2900XT's architecture is: although it has more shaders, and they run slower, it still competes with the best NVidia has to offer...... damn, saying nice things about the 2900XT now!


----------



## m3lisk (Jul 6, 2007)

Oddly enough, frames in Oblivion are higher at widescreen resolutions. At 1440 x whatever, you are getting better frames in most cases than at 1280 x 1024, according to some benchies at least. Yeah, I'm going to wait till more DX10 cards come out. Anyone know of someone selling an X1950XT?


----------



## mandelore (Jul 6, 2007)

haha, dont worry Tatty, you will have more nice things to say in time to come hehehe


----------



## Chewy (Jul 6, 2007)

Mussels said:


> Nvidia is better for widescreens (ATI's ratio scaling is still broken in current drivers) and ATI is better for.... oh yeah, lower power use at idle (despite its massive power use at load)
> 
> there, i've done fanboi for both products in the same paragraph




I think, after checking out the DX10 game benches that were done here on TPU, the 2900xt will end up being like most of the 19xx series cards vs the 7900 series.. the 2900xt has a lot of overclocking headroom and beats out the GTS in DX10 fairly well; look at mal's score with his 800MHz-core 2900xt.. I'm sure the 512MB version can get up to that clock pretty easily as well.. it has even reached a 1144.5MHz core here: http://www.nordichardware.com/Reviews/?page=10&skrivelse=510 and the GTX or GTS can't get near that speed even on liquid nitrogen.

 I would like to see that Call of Juarez game tested at high res with AA etc. cranked up.. will the demo or the performance test in the full game let you do that?

 I'm even closer to buying a 2900xt now, even though I think video cards are too expensive and I'm trying to save $$ so I can take some courses; a driver's licence with driver training ($700), for one. but there's always more paychecks 

 So what I'm saying here is we should see if the performance increase currently shown in DX10/Call of Juarez carries over to higher resolutions as well.


----------



## HellasVagabond (Jul 6, 2007)

There's an entire gap between running a benchmark and running a stable system.
I can push my GTS to 650/2100 for benchmarks, but for games anything above 620/1970 isn't advised. And I only benchmark what's stable.


----------



## Chewy (Jul 6, 2007)

his 2900xt is 800MHz stock though (I think with a bit of BIOS voltage modding it could go higher and stay stable).. 

I think it uses the same core as the 512MB version? So if I get a 512MB version I should be able to hit 800MHz fairly easily.. not sure if the 1GB version has the same stock voltage.. I don't understand why it's clocked so much higher stock than the 512MB version. Ain't it the same core?


----------



## mandelore (Jul 6, 2007)

i think its a better performing core


----------



## Chewy (Jul 6, 2007)

mandelore said:


> i think its a better performing core



 you're right, probably hand-picked maybe? I thought the 512MB and 1GB 2900xts use the same cores on paper, right? with just a higher clock? guess I'll go try Google and add what I find to this post. actually it's suppertime right now, so brb.


----------



## mandelore (Jul 6, 2007)

i think the initial low-yield samples were an issue, hence the 1GB was late coming. maybe they were stockpiling? or even just found a way to get good yields on high-performance cores


----------



## MarcusTaz (Jul 6, 2007)

I will have a 2900XT 1GB version here early next week; my 2900XT 512 is sold. Will bench both with my Q6600 then and let you all know the difference...


----------



## mandelore (Jul 6, 2007)

MarcusTaz said:


> I will have a 2900XT 1GB version here early next week; my 2900XT 512 is sold. Will bench both with my Q6600 then and let you all know the difference...



ooooooohhhh.... lucky git, care to give me your cpu? hahahaha, thought not...


----------



## Wile E (Jul 7, 2007)

HellasVagabond said:


> You are all seeing the tree but missing the forest, as usual...
> So I should start calling a victor in DX10 when? 2, 3, 4, 5 years from now?
> THESE ARE DX10-COMPATIBLE CARDS.... If you can't play a DX10 game with it, then don't call it a DX10 card... These are the LATEST BETA drivers from both brands.
> Bottom line, it's weird that the people that bought the X2900 struggle every day to excuse its failure by saying the drivers aren't good because it's early, while at the same time the same people say that ATI makes better drivers...
> Crazy man, crazy...


They say ATI has better drivers because in 2 updates they improved their performance (without cutting features) by an amount it took nVidia months to accomplish.

And I like how you constantly refer to the 2900 as a failure. Are you getting nervous that it's catching up? What other reason do you have to constantly bash it?

As far as how long to wait: umm, how about when there are more full, non-beta DX10 games.

After reading the Anandtech article, I think it may not matter anyway; it looks as though all of us early adopters may have been pwned, regardless of card maker.


@everyone talking about the 2900 core speed: my 512MB will run at 830MHz on stock volts 24/7. I think they just clocked the 1GB higher because the GPU has no problem doing it.


----------



## HellasVagabond (Jul 7, 2007)

Like I said in another post, some people buy a card to keep for a couple of years, and from the looks of it NOW, the 8800 IS the better future-proof card, and it was released 7 months before the X2900. I don't think I can be more clear than that. 
To me it doesn't apply because I change cards every 5-6 months, but like I said, others don't.


----------



## mandelore (Jul 7, 2007)

but why wouldn't the 2900xt be good for the long term? you never say??

it's just as good in dx10, if not better, so with more and more dx10 games coming in years to come, wouldn't that make the 2900 the more logical choice?


----------



## HellasVagabond (Jul 7, 2007)

I fail to see what DX10 benchmarks you are referring to in which the X2900 is better... I only see it get in front in Company of Heroes.


----------



## Chewy (Jul 7, 2007)

The Call of Juarez review/test going on in the forums: the 2900xt is leading in that over an overclocked GTS or two. I'll find you the link.

 I don't even consider the COH benches from a while ago all that in favour of the 2900xt.. I, and I'm sure most, like to game with graphics high at 1280x1024 (or whatever it is) resolution, and the 2900xt lost in that bench. but I think they just got a new driver release again, so who knows.. I could get a card this week, but it seems these cards are just barely cutting good frames in DX10 games with max everything.. so I'll wait more and not go broke this weekend lol.

Link: http://forums.techpowerup.com/showthread.php?t=32937


----------

