# Full Review: 9800GX2 vs HD3870X2!!!



## reviewhunter (Mar 13, 2008)

Seems like the 9800GX2 wins almost every benchmark!
Although there were a few games the NV card couldn't run with AA on.
Both websites cover the same review:

in English:
http://lly316.blogspot.com/2008/03/geforce-9800-gx2-vs-radeon-hd-3870-x2.html

in Chinese:
http://www.pconline.com.cn/diy...iews/0803/1241871.html




----------



## das müffin mann (Mar 13, 2008)

thanks man, i've been looking for something like this. it's a bit early yet to really see how the card performs, but it gives us a good idea. still not sure if it's worth the extra money vs the 3870x2, but we'll see. i would still probably go with the 3870x2 due to the price, but maybe this card won't be the titanic failure i initially thought


----------



## Kreij (Mar 13, 2008)

Thought I would add this from the review. I think it's of some importance ...


> On the Gigabyte X48, the R680 had no problem running through all tests. But the 9800 GX2 could only pass 3DMark06 on the very same board; running any other games would result in errors or, worse, a BSoD. Since we know from the previous test that SLI needs to be enabled for maximum performance, yet Intel chipset based boards do not support SLI, we might be looking at a compatibility issue here.
> 
> We then moved the GeForce 9800GX2 to the Striker II Formula platform, where it managed to perform all tests. The 3870 X2, on the other hand, maintained its stability on both platforms. We have to give AMD credit for its compatibility advantage. In conclusion, we would advise those considering the 9800 GX2 to get an Nvidia chipset based motherboard.


----------



## reviewhunter (Mar 13, 2008)

das müffin mann said:


> thanks man, i've been looking for something like this. it's a bit early yet to really see how the card performs, but it gives us a good idea. still not sure if it's worth the extra money vs the 3870x2, but we'll see. i would still probably go with the 3870x2 due to the price, but maybe this card won't be the titanic failure i initially thought



you're welcome bro 

If you look at the price/performance ratio, I'd pick the 3870X2. Moreover, the 3870X2 is getting a further price cut some time soon. And don't forget the GDDR4 version!


----------



## das müffin mann (Mar 13, 2008)

i'm really looking forward to the GDDR4 version


----------



## farid (Mar 13, 2008)

Looks like nVidia is gonna pwn ATI again, but if ATI cuts the price of the X2 I'll go for one of those


----------



## MilkyWay (Mar 13, 2008)

the ati card isn't far off, and we all know a single 3870 could never beat a 9800 card

so good for AMD/ATi, because the performance is about right and it almost matches the nvidia

it will boil down to price and chipset i think, because you could crossfire a single 3870 with a 3870x2 for a triple-GPU setup and the price is way cheaper


----------



## EastCoasthandle (Mar 13, 2008)

Yeah I already posted about it here


----------



## newtekie1 (Mar 13, 2008)

70% faster in Crysis...wow...too bad AA kills it.   Not that you really need AA at 1920x1200.

I hope some driver maturing fixes the AA issues.


----------



## sheps999 (Mar 13, 2008)

But games still _look_ better with the ATi, right? If the ATi gets 40fps, and the Nvidia gets 60fps, but the game looks better on the Ati, I'd still go for the 3870X2.


----------



## reviewhunter (Mar 13, 2008)

EastCoasthandle said:


> Yeah I already posted about it here



sorry dude, didn't see it


----------



## calvary1980 (Mar 13, 2008)

sheps999 said:


> But games still _look_ better with the ATi, right? If the ATi gets 40fps, and the Nvidia gets 60fps, but the game looks better on the Ati, I'd still go for the 3870X2.



you're talking in the past tense.

- Christine


----------



## [I.R.A]_FBi (Mar 13, 2008)

calvary1980 said:


> you're talking in the past tense.
> 
> - Christine



i thought breast awareness month was october?


----------



## Scrizz (Mar 13, 2008)

wth


----------



## calvary1980 (Mar 13, 2008)

Scrizz 4 Prez lol

- Christine


----------



## EastCoasthandle (Mar 13, 2008)

reviewhunter said:


> sorry dude, didn't see it



No prob.  It happens.


----------



## sheps999 (Mar 13, 2008)

calvary1980 said:


> you're talking in the past tense.
> 
> - Christine


----------



## reviewhunter (Mar 13, 2008)

the difference in image quality is barely noticeable these days, so much so that it wouldn't affect my purchase decision.


----------



## DarkMatter (Mar 13, 2008)

sheps999 said:


> But games still _look_ better with the ATi, right? If the ATi gets 40fps, and the Nvidia gets 60fps, but the game looks better on the Ati, I'd still go for the 3870X2.



http://forums.techpowerup.com/showpost.php?p=696331&postcount=67

Don't forget to follow the link and read the entire article.  



reviewhunter said:


> the difference in image quality is barely noticeable these days, so much so that it wouldn't affect my purchase decision.



Agreed. There should be a sticky in the graphics card section about this, with at least a link to that experiment they did at Maximum PC (posted in the link above). I'm getting tired of correcting that false belief.


----------



## yogurt_21 (Mar 13, 2008)

DarkMatter said:


> Agreed. There should be a sticky in the graphics card section about this, with at least a link to that experiment they did at Maximum PC (posted in the link above). I'm getting tired of correcting that false belief.


Actually i thought the g80/g92 had better image quality than the r600/rv670; at least the reviews that went in depth found that, especially with sample AA enabled.


----------



## DarkMatter (Mar 13, 2008)

yogurt_21 said:


> Actually i thought the g80/g92 had better image quality than the r600/rv670; at least the reviews that went in depth found that, especially with sample AA enabled.



Huh! I haven't seen any in-depth review that said one is over the other; most said they are pretty similar. Well, I haven't seen many in-depth IQ reviews anyway, except the ones at TechReport. Do you have a link, please?


----------



## OrbitzXT (Mar 13, 2008)

sheps999 said:


> But games still _look_ better with the ATi, right?



This statement is false and has been for quite some time. Prior to the release of nVidia's 8-series, the common wisdom was that ATI had superior image quality whereas nVidia sacrificed some of that quality to push out more frames. That hasn't been true for a while now. In fact, when the 8800 GTX came out it had far superior performance to anything ATI was offering, obviously, and a superior image as well.

Honestly the 9800 GX2 exceeded my expectations, only because I heard some earlier numbers a couple of weeks ago where it seemed the 3870 X2 was ahead. Even though it doesn't blow away the 3870 X2, the fact that it still wins overall will make it the card of choice for enthusiasts who completely ignore price. This almost reminds me of the Democratic Primaries with the way campaigns will downplay expectations as an election draws near, then if they win even by 1 point will come out claiming an outstanding victory. I personally don't think I'll be buying one though, the only card from nVidia that has my interest right now is the 9600 GT.


----------



## wolf (Mar 14, 2008)

no matter what ATi throws at them, NV seems to come out on top! im seriously considering selling my Vmodded 8800GT on ebay and getting one of these puppies now 

and +1 OrbitzXT, ATi diehards have been throwing IQ in our faces for a while now and it's just not true, and even when it was, the difference was so negligible that it would not affect my purchase decision either.


----------



## das müffin mann (Mar 14, 2008)

atm that card's a bit expensive for me to consider buying it for a minimal performance gain over ati's solution, but hey, if nvidia lowers the price i'll pick one up


----------



## Water Drop (Mar 14, 2008)

Why, was anyone expecting the HD3870X2 to beat the 9800GX2?

They are not meant to be competitors as far as I know.


----------



## das müffin mann (Mar 14, 2008)

yes they are, the gx2 is nvidia's answer to the x2. we were expecting the x2 to beat the gx2 because early benches of the gx2 didn't look too promising


----------



## Sasqui (Mar 14, 2008)

Funny, about 1.5 years ago, I wrote a post wondering when they were going to make dual GPU cards...  SLI in a card or XFire in a card, whatever!  Took a while, and they aren't anywhere near as pretty as I imagined... they are beasts.

Question:  When are they going to make *dual-core* GPUs?

My guess = 1 yr.  Has anyone heard of any plans released by AMD or NVDA or INTC?


----------



## reviewhunter (Mar 14, 2008)

Water Drop said:


> Why, was anyone expecting the HD3870X2 to beat the 9800GX2?
> 
> They are not meant to be competitors as far as I know.



Nah, not even with GDDR4.

Why do you say so?


----------



## erocker (Mar 14, 2008)

Water Drop said:


> Why, was anyone expecting the HD3870X2 to beat the 9800GX2?
> 
> They are not meant to be competitors as far as I know.



Very good point.  I sure in the heck didn't.  Good for Nvidia for making a powerful card; I hope this time around they will continue to support their dual-PCB wonder.  It's going to be, and has always been, a back and forth battle.  The one plus of the 3870x2 is that it works on most motherboards, while the GX2 will only work with SLI boards, so far anyway.



reviewhunter said:


> Nah, not even with GDDR4.
> 
> Why do you say so?



Both price and performance don't match up.


----------



## yogurt_21 (Mar 14, 2008)

DarkMatter said:


> Huh! I haven't seen any in-depth review that said one is over the other; most said they are pretty similar. Well, I haven't seen many in-depth IQ reviews anyway, except the ones at TechReport. Do you have a link, please?





> Even this is really hard to pick from. If you want to nit-pick, you may find that in one small part the filtering looks a little sharper on the NVIDIA:


http://sg.vr-zone.com/articles/ATi_Radeon_2000_Series_Launch:_X2900XT_Review/4946-15.html
http://sg.vr-zone.com/articles/ATi_Radeon_2000_Series_Launch:_X2900XT_Review/4946-16.html


----------



## imperialreign (Mar 14, 2008)

Sasqui said:


> Funny, about 1.5 years ago, I wrote a post wondering when they were going to make dual GPU cards...  SLI in a card or XFire in a card, whatever!  Took a while, and they aren't anywhere near as pretty as I imagined... they are beasts.
> 
> Question:  When are they going to make *dual-core* GPUs?
> 
> My guess = 1 yr.  Has anyone heard of any plans released by AMD or NVDA or INTC?



yes, AMD/ATI has had plans for a dual-core GPU since at least late '07, possibly earlier.  It's supposed to be the R700 GPU, and will feature two cores on the same die, supposedly communicating over AMD's HyperTransport; the GPU is also rumored to support GDDR5.  This GPU has been rumored to launch with the HD5000 series, which, IIRC, is slated for 4Q 08 or 1Q 09.

after the initial rumor of the 3870x2, I got the feeling ATI was testing the waters with that card, and that its success would determine their road map.  Sure enough, the card went over better than anyone expected, and we started getting our first rumors of the HD4000 series, which will also include a 4870x2.

I foresee the HD5000 series also including a 5870x2 - which would mean two dual-core GPUs on one card . . . single-PCB quad fire.


----------



## reviewhunter (Mar 14, 2008)

hey, read the review again; it seems the author of the English one updated it. 
http://lly316.blogspot.com/2008/03/geforce-9800-gx2-vs-radeon-hd-3870-x2.html

Now it says:



> CoJ --> Although the 9800 GX2 is 1 fps faster than the AMD, it was unable to run the benchmark with 4xAA turned on. *The moment the benchmark began, it bounced right back to the desktop.* Therefore, the score is recorded as zero.
> 
> 
> UT3 --> Correction:
> ...


----------



## gOJDO (Mar 14, 2008)

Expect significant performance improvements and a lot of fixes with a new driver soon.


----------



## kenkickr (Mar 14, 2008)

I watch a lot of movies on my system, so the 3870X2 wins my money.


----------



## Sasqui (Mar 14, 2008)

imperialreign said:


> yes, AMD/ATI has had plans for a dual-core GPU since at least late '07, possibly earlier.  It's supposed to be the R700 GPU, and will feature two cores on the same die, supposedly communicating over AMD's HyperTransport; the GPU is also rumored to support GDDR5.  This GPU has been rumored to launch with the HD5000 series, which, IIRC, is slated for 4Q 08 or 1Q 09.



Cool thanks, just curious where you picked that up...


----------



## DarkMatter (Mar 14, 2008)

yogurt_21 said:


> http://sg.vr-zone.com/articles/ATi_Radeon_2000_Series_Launch:_X2900XT_Review/4946-15.html
> http://sg.vr-zone.com/articles/ATi_Radeon_2000_Series_Launch:_X2900XT_Review/4946-16.html



Hmm, in the review they say that when maximum quality settings are enabled, Nvidia has better anisotropic filtering and Ati better AA. That makes the game look a bit sharper on Nvidia's card when using lower settings, but the two are a lot closer in appearance there. (The highest settings are not usually playable anyway.)
Interestingly enough, that was the same conclusion I drew from my personal experience. My monitor has two inputs and I can switch between them, so I plugged one of my friends' PCs in alongside mine and we did some comparisons. TBH the differences were really small, and I could only see them clearly when I pumped the resolution up to the max (2048x1536) and got really close to the screen. Even in that article it's clear they are using massive zoom to show the differences. At normal size there are still subtle differences, but not in favor of either card; I liked the 8800 more, while my friend preferred the HD3870's appearance. Some areas can even look worse zoomed in than they do zoomed out. 

http://en.wikipedia.org/wiki/Anti-aliasing Look at figure 2 and what they say about it.

Bottom line: both are at the same level, with subtle differences. Some people like Ati's output and others like Nvidia's, but the difference is not enough to drive a purchase decision. When someone buys one instead of the other strongly believing he is getting better IQ, well, he is buying a chimera.
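For anyone who doesn't want to dig through the wiki article: AA works by taking several sub-samples per pixel and averaging them, which is exactly why a zoomed-in anti-aliased edge looks like a blurry grey ramp while the zoomed-out image looks smooth. A toy sketch of the idea (the grid size, sample counts, and function names here are made up for illustration, not either card's actual AA modes):

```python
# Toy supersampling: estimate how much of each pixel is covered by the
# half-plane below the diagonal line y = x, by averaging sub-samples.

def coverage(px, py, samples=4):
    """Fraction of samples*samples sub-samples of pixel (px, py)
    that land in the covered region y < x."""
    hits = 0
    for sy in range(samples):
        for sx in range(samples):
            # Sub-sample at the centre of each sub-cell.
            x = px + (sx + 0.5) / samples
            y = py + (sy + 0.5) / samples
            if y < x:
                hits += 1
    return hits / (samples * samples)

def render(size=4, samples=4):
    """Render a size x size image of the diagonal edge."""
    return [[coverage(px, py, samples) for px in range(size)]
            for py in range(size)]

aliased = render(samples=1)   # one sample per pixel: only 0.0 or 1.0 (jaggies)
smoothed = render(samples=4)  # 4x4 supersampling: fractional greys on the edge

for row in smoothed:
    print(" ".join(f"{v:.2f}" for v in row))
```

The diagonal pixels come out as intermediate greys instead of hard black/white steps; zoom into them and they look "wrong", step back and the edge looks smooth.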


----------



## gOJDO (Mar 14, 2008)

kenkickr said:


> I watch a lot of movies on my system, so the 3870X2 wins my money.


You can watch movies on any integrated graphics and there won't be any difference compared to any other graphics card (actually, you don't need a 3D accelerator for that purpose). For HD, you need a decent CPU and a Radeon 3850 or GeForce 8800GS.


----------



## EastCoasthandle (Mar 14, 2008)

gOJDO said:


> You can watch movies on any integrated graphics and there won't be any difference compared to any other graphics card (actually, you don't need a 3D accelerator for that purpose). For HD, you need a decent CPU and a Radeon 3850 or GeForce 8800GS.



That's where you are wrong. The HD does offer better IQ when it comes to movies over the GeForce.  Anyone who (like myself) has had the opportunity to view both cards can tell you that.  No reason to lie about this.


----------



## DarkMatter (Mar 14, 2008)

EastCoasthandle said:


> That's where you are wrong. The HD does offer better IQ when it comes to movies over the GeForce.  Anyone who (like myself) has had the opportunity to view both cards can tell you that.  No reason to lie about this.



But IMO it's almost negligible; I can barely notice a worthwhile difference. And not only IMO; look at post #19, MaximumPC. If nearly 100% of specialists can't say one is better than the other...

As stated like 100000 times: the difference in IQ can't justify the purchase of one over the other among performance/enthusiast parts. You are free to buy an HD3450 for movies, because there that minimal IQ edge can drive the purchase, since there are no performance concerns. But on $250+ hardware, sacrificing 20% of performance for an IQ difference in movies (TBH you don't pay that much for movies, but for games) that not even specialists can notice is plain stupid.


----------



## newtekie1 (Mar 14, 2008)

Agreed, the IQ difference is 100% negligible, if there really even is an IQ difference.


----------



## EastCoasthandle (Mar 14, 2008)

DarkMatter said:


> But IMO it's almost negligible; I can barely notice a worthwhile difference. And not only IMO; look at post #19, MaximumPC. If nearly 100% of specialists can't say one is better than the other...
> 
> As stated like 100000 times: the difference in IQ can't justify the purchase of one over the other among performance/enthusiast parts. You are free to buy an HD3450 for movies, because there that minimal IQ edge can drive the purchase, since there are no performance concerns. But on $250+ hardware, sacrificing 20% of performance for an IQ difference in movies (TBH you don't pay that much for movies, but for games) that not even specialists can notice is plain stupid.


Video playback on the two cards is very distinct; negligible is simply a play on words.  The differences in IQ between the two are obvious, unlike the differences in AF. 
Let's not draw comparisons between video playback and video games.


----------



## newtekie1 (Mar 14, 2008)

EastCoasthandle said:


> Video playback on the two cards is very distinct; negligible is simply a play on words.  The differences in IQ between the two are obvious, unlike the differences in AF.
> Let's not draw comparisons between video playback and video games.



Really, if they are that different, how come 5 people picked nVidia as having better IQ?  Yes, 7 picked ATI, but if the difference is so obvious, not a single person should have picked nVidia.  The IQ difference is gone; admit it, move on, and find some other reason to justify buying overpriced hardware.
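For what it's worth, a 7-to-5 split is about what random guessing produces. A quick back-of-the-envelope check, treating the 12 picks as independent fair-coin votes (my simplification, not Maximum PC's methodology):

```python
from math import comb

n, k = 12, 7  # 12 votes total: 7 for ATI, 5 for nVidia

# Two-sided binomial tail: probability of a split at least as lopsided
# as 7-5 when every voter is effectively flipping a fair coin.
p = sum(comb(n, i) for i in range(n + 1)
        if abs(i - n / 2) >= abs(k - n / 2)) / 2 ** n
print(f"chance of a split at least as lopsided as 7-5: {p:.3f}")
```

That works out to roughly 77%, i.e. a 7-5 result is entirely unremarkable if nobody can actually tell the two apart.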


----------



## EastCoasthandle (Mar 14, 2008)

newtekie1 said:


> Really, if they are that different, how come 5 people picked nVidia as having better IQ?  Yes, 7 picked ATI, but if the difference is so obvious, not a single person should have picked nVidia.  The IQ difference is gone; admit it, move on, and find some other reason to justify buying overpriced hardware.



Acting in fanbois fashion is not the best way to rebut a point.


----------



## newtekie1 (Mar 14, 2008)

EastCoasthandle said:


> Acting in fanbois fashion is not the best way to rebut a point.



It seems to be working pretty well for you so far.  And you don't have to be a fanboy to say ATi's hardware is overpriced, you just have to be able to see the obvious.


----------



## DarkMatter (Mar 14, 2008)

newtekie1 said:


> Really, if they are that different, how come 5 people picked nVidia as having better IQ?  Yes, 7 picked ATI, but if the difference is so obvious, not a single person should have picked nVidia.  The IQ difference is gone; admit it and move on.



It all comes down to this:

You have two cards, A and B, at the same price of $300; A is 10% faster than B, and B is better than A at movie IQ by the same margin the HD is better than Nvidia.

Are you telling me that you would choose B, the slower one, just because it plays movies a little bit better? I don't want to insult anybody, but that's stupid.



EastCoasthandle said:


> Acting in fanbois fashion is not the best way to rebut a point.



God!! What we have to hear... Who is acting like a fanboi? No one but you, my friend. You say that for you Ati's IQ is better, we say for us it's the same, actually newtekie says for him Nvidia is better. I provide a link that expresses this same variance in opinions, which demonstrates that there isn't any noticeable difference. Yet we are the fanbois? Come on... Grow up.



> The differences in IQ between the two are obvious, unlike the differences in AF.



Also, who are you to decide whether that difference is bigger or not? And more specifically, whether it's more important in the purchase decision for an enthusiast graphics card (AKA gaming hardware)?


----------



## newtekie1 (Mar 14, 2008)

I'm with you, even if ATi was better at HD movies, it isn't worth the price premium and performance hit.


----------



## EastCoasthandle (Mar 14, 2008)

DarkMatter said:


> It all comes down to this:
> 
> You have two cards, A and B, at the same price of $300; A is 10% faster than B, and B is better than A at movie IQ by the same margin the HD is better than Nvidia.
> 
> ...



I think the growing up starts when your posts come off a bit more mature than what I have read today.  I posted based on my own experience, something you have not demonstrated.  Therefore, before you continue with your rabid responses, I suggest a bit of discretion that at least shows you comprehend the context of my post, rather than fueling your own with "my card is better" responses.


----------



## newtekie1 (Mar 14, 2008)

Your own experiences don't mean anything.  A bunch of independent specialists have looked at both camps and the conclusion was that the IQ difference doesn't exist.

Oh, and you are not the only person with personal experience; don't ever assume that.  A good argument doesn't involve personal experience, which is why we have left that part out of the IQ argument.  You should be able to prove your point without saying "well, my personal experience is... so that is how it is".  You can't do that, so the only logical conclusion is that your argument is false.  Helping that conclusion along is the fact that it has been proven false by independent studies.


----------



## gOJDO (Mar 14, 2008)

EastCoasthandle said:


> That's were you are wrong. The HD does offer better IQ when it comes to moves over the Geforce.  Anyone who (like myself) had the opportunity to view both cards should tell you that.  No reason to lie about this.


Maybe I didn't express myself clearly enough, so you missed my point. Leaving aside the better-IQ argument, do you really need a 3870X2 or a 9800GX2 to watch movies?


----------



## DarkMatter (Mar 14, 2008)

EastCoasthandle said:


> I think the growing up starts when your posts come off a bit more mature than what I have read today.  I posted based on my own experience, something you have not demonstrated.  Therefore, before you continue with your rabid responses, I suggest a bit of discretion that at least shows you comprehend the context of my post, rather than fueling your own with "my card is better" responses.



I won't continue with this. You accuse me of being a fanboi and infantile, when it is you acting like that and forgetting about the facts (links). I say what my opinion is, you say what yours is. You say there's a big difference, I say there isn't, and I provide a link that backs me up. Your opinion is not better than mine. Hell, it's not better than those specialists' opinion, that's for sure...
I have as much personal experience with both cards as you, post #36, and I have demonstrated as much as you, because I don't see any demonstration on your part...
And lastly, since my whole point is that both cards have the *same IQ*, as opposed to one being better, it is you who is insisting on the "my card is better" argument. 

As I said, grow up.


----------



## EastCoasthandle (Mar 14, 2008)

newtekie1 said:


> Your own experiences don't mean anything.  A bunch of independent specialists have looked at both camps and the conclusion was that the IQ difference doesn't exist.
> 
> Oh, and you are not the only person with personal experience; don't ever assume that.  A good argument doesn't involve personal experience, which is why we have left that part out of the IQ argument.  You should be able to prove your point without saying "well, my personal experience is... so that is how it is".  You can't do that, so the only logical conclusion is that your argument is false.  Helping that conclusion along is the fact that it has been proven false by independent studies.



I can offer my opinion on the subject

AND

I have offered my opinion on the subject




DarkMatter said:


> I won't continue with this. You accuse me of being a fanboi and infantile, when it is you acting like that and forgetting about the facts (links). I say what my opinion is, you say what yours is. You say there's a big difference, I say there isn't, and I provide a link that backs me up.
> I have as much personal experience with both cards as you, post #36, and I have demonstrated as much as you, because I don't see any demonstration on your part...
> And lastly, since my whole point is that both cards have the *same IQ*, as opposed to one being better, it is you who is insisting on the "my card is better" argument.
> 
> As I said, grow up.



Your only basis for accusing me of being a fanboi is that I offered an opinion different from your own.  That is not the definition of a fanboi.  You, on the other hand, clearly show otherwise through your rabid responses.  Per the mentioned Maximum PC review, 7 found ATI's offering better than nvidia's (while 5 preferred nvidia).  If you read the article, they are referencing CF vs SLI, which is why I take no real interest in it.  However, there was a distinct difference in the color of the fire (whatever that means; there is no additional information regarding it).

Read the chart from the link:
Source


----------



## newtekie1 (Mar 14, 2008)

DarkMatter said:


> actually newtekie says for him Nvidia is better



Actually, I'm with you, they are the same.



EastCoasthandle said:


> I can offer my opinion on the subject
> 
> AND
> 
> I have offered my opinion on the subject



Yes, and your opinion doesn't mean anything unless you can back it up, and you can't.  So offer away; it doesn't mean anything.  And we are allowed to argue with your opinion and prove it false (which we have; actually, it was proven false before you even offered it).

And the person that resorts to insults is usually the person losing the argument, and usually the least mature of the group.  So it is ironic that you talk about maturity when you seem to be the least mature.


----------



## EastCoasthandle (Mar 14, 2008)

newtekie1 said:


> Actually, I'm with you, they are the same.
> 
> 
> 
> ...



If my opinion doesn't mean anything to you, there is no need to offer an opinion about it.  This sort of self-reassurance isn't needed if you truly felt that way.  As the saying goes, actions speak louder than words.

As for being proven false and insults (because I saw for myself that the IQ is not the same, which is not an insult): I'll let the chart below speak for me; that way I don't insult anyone.

As I said before, anyone who has had the opportunity to use both cards can clearly see a distinct difference.


----------



## driver66 (Mar 14, 2008)

OMG it never fails /facepalm  :shadedshu


----------



## newtekie1 (Mar 14, 2008)

EastCoasthandle said:


> If my opinion doesn't mean anything to you, there is no need to offer an opinion about it.  This sort of self-reassurance isn't needed if you truly felt that way.  As the saying goes, actions speak louder than words.
> 
> As for being proven false and insults (because I saw for myself that the IQ is not the same, which is not an insult): I'll let the chart below speak for me; that way I don't insult anyone.
> 
> ...



Sure there is a reason: you present your opinion as fact, a fact that is actually false.

I don't see how you can post a chart showing people claiming both sides look better, and then claim it is proof that ATi has better IQ and the "difference is obvious".  Really?  If ATi is that much better, and the difference is so obvious, why did anyone pick nVidia at all?


----------



## EastCoasthandle (Mar 14, 2008)

newtekie1 said:


> Sure there is a reason: you present your opinion as fact, a fact that is actually false.



No, I specifically said "opinion" as found in post 53, 51, etc.  As a matter of fact, you quoted me saying it.


----------



## newtekie1 (Mar 14, 2008)

EastCoasthandle said:


> No, I specifically said "opinion" as found in post 53, 51, etc.  As a matter of fact, you quoted me saying it.



Read post #38, your post.  The first post on the subject we are arguing about.  No mention of it being opinion, just direct statements of fact - false ones.  You continued with no mention of opinion in post #41.  You only claimed those were your own personal opinions later on, after we had already started arguing them.



EastCoasthandle said:


> That's where you are wrong. The HD does offer better IQ when it comes to movies over the GeForce.  Anyone who (like myself) has had the opportunity to view both cards can tell you that.  No reason to lie about this.



No statement of opinion there.



EastCoasthandle said:


> Video playback on the two cards is very distinct; negligible is simply a play on words.  The differences in IQ between the two are obvious, unlike the differences in AF.
> Let's not draw comparisons between video playback and video games.



None there either.

You presented your opinion that the IQ difference is obvious as fact, not as opinion, and that is what started the argument.  You only went back and called it opinion later on; you did not originally present it as opinion.


----------



## DarkMatter (Mar 14, 2008)

EastCoasthandle said:


> Your only accusation of me being a fanboi is offering an opinion that is different then your own.  This is not the definition of being a fanboi.  However you on the other hand through your rabid responses clearly show otherwise.  Per the mentioned PC Maximum review 7 found ATI's offering better then nvidia (which 5 took interest).  If you read the article they are referencing CF vs SLI which is why I take no real interest in it.  However, their was a distinct difference in the color of the fire (whatever that means there is no additional information regarding this).
> 
> 
> Source



In the profound fanboism in which you are submerged, you are not even able to notice that disagreeing with your opinion that "Ati IQ is better" is not the same as saying "Nvidia is better". Mine is a neutral opinion; if anything I would be a "fanboi of neutralism". But let's forget about this, because you will never see the difference. We have had the same discussion in the past, and you only ever listen to one person: you.

Now let's look at the charts and do some math:

15 + 9 = 24 > 21

So there are more people with an opinion different from yours than backing you up. And if you had bothered to read the article, which clearly you didn't, you would realize this:

1- There should be a lot more people saying Ati is better if it actually was better.
2- The control group (shown the same image, they still saw a difference) demonstrated that the evaluators felt forced to choose one. In that situation, the fact that one side gets a few more votes than the other is fortuitous; they were seeing the "same image" just as in the control group, yet they still chose one over the other:



> THE CONTROL GROUP
> 
> We were surprised that only three of the six people in our control group expressed no preference between display A and display B—and in only one category each at that. Since they were unknowingly comparing identical rigs, we thought nearly everyone would admit there was no difference between the two displays. Since most of our subjects are professional critics, we suspect that they felt an inherent obligation to discern some difference between the two displays they were staring at (despite our assurances to the contrary).
> 
> The control group did help eliminate the display itself as a variable: If the monitors had colored our evaluators’ opinions, the votes would have been lopsided in favor of one or the other. Of the control group’s 18 opinions, nine favored monitor A, six favored monitor B, and three expressed no preference.



Explaining why that happened is as easy as throwing a coin and looking at the results: throw it 10 times and you won't get 5/5 even though the theoretical probability is 50%; throw it 50 times and it will be closer to 50%; throw it 10,000,000 times and there you have it, almost exactly 50%.
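To put rough numbers on the coin analogy (a quick illustrative sketch of my own, not from the Maximum PC article; the 9-vs-6 split is the control group's monitor preferences quoted above):

```python
import random
from math import comb

# Exact binomial check: among the control group's 15 expressed preferences
# (9 for monitor A, 6 for monitor B), how likely is a split at least this
# lopsided if each vote were really a fair coin flip?
p_lopsided = sum(comb(15, k) for k in range(9, 16)) / 2**15
print(f"P(9 or more of 15 favor one side by chance) = {p_lopsided:.3f}")  # ~0.304

# Law of large numbers: small samples wander far from 50%, big ones don't.
rng = random.Random(42)
for n in (10, 50, 10_000):
    heads = sum(rng.random() < 0.5 for _ in range(n))
    print(f"{n:>6} flips: {heads / n:.3f} heads")
```

With roughly a 30% chance of a 9-6 (or worse) split arising between identical monitors, the control group's "preference" is entirely consistent with noise.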

Don't be a fanboi, you are the only one who thinks that saying "not better than" == saying "worse than". You are the only one here who feels we are attacking your beloved card when we say it has the same IQ as the competition. That's fanboism.


----------



## EastCoasthandle (Mar 14, 2008)

newtekie1 said:


> Read post #38, your post.  The first post on the subject we are arguing about.  NO mention of it being opinion.  Just direct statements of fact, false facts.  You then continue with no mention of opinion in post #41.  You only claimed those were your own personal opinions later on, when we had already started arguing them.
> 
> 
> 
> ...


If I at a later point explain my posts (for clarity), then that is how they should be interpreted. You cannot define what I mean beyond that.  However, it's obvious you are simply posting to argue, and this is getting off topic.




DarkMatter said:


> Regarding the profound fanboism in which you are submerged, you are not even able to notice that going against your opinion "Ati IQ is better" is not the same as saying "Nvidia is better". Mine is a neutral opinion; if anything, I would be a "fanboi of neutralism", but let's forget about this because you will never see the difference. We have had the same discussion in the past and you always listen to only one person: you.
> 
> Now let's look at the charts and do some math:
> 
> ...


You are really grasping at straws and showing your preference more and more.  The article clearly shows the results of their tests, and they are conclusive, regardless of how you feel about them.  The "what ifs" and "supposed to bes" are more excuses than an explanation of the results.   This all goes to show how easily offended you are in situations like this.




> Yes, but the problem is that your original post hints in no way that they are your opinions. You present them in ways that make people believe they are facts. So if they are presented as facts, we will argue them if they are incorrect. Saying later that they were only opinions doesn't mean we shouldn't have argued them in the first place. You can't expect us to be mind readers and know that when you present things in ways that make them seem factual, they are really opinions, so we shouldn't argue them.
> 
> When you say "The HD DOES have better IQ than the GeForce" that is a statement of fact, not opinion. A statement you need to be able to back up with evidence. You only later realized your "fact" was false, and began claiming it was only your opinion.



It's obvious you are not very astute and are now "making it up" at this point in an attempt to save face.  This alone discredits your entire argument regarding the IQ results and your interpretation of my posts.  Moreover, you haven't offered one shred of evidence to the contrary.  For example, you have not used both video cards, yet you argue with me using baseless accusations.  LOL


----------



## newtekie1 (Mar 14, 2008)

DarkMatter said:


> Don't be a fanboi, you are the only one who thinks that saying "not better than" == saying "worse than". You are the only one here who feels we are attacking your beloved card when we say it has the same IQ as the competition. That's fanboism.



+1



EastCoasthandle said:


> If I at a later point explain my posts (for clarity), then that is how they should be interpreted. You cannot define what I mean beyond that.  However, it's obvious you are simply posting to argue, and this is getting off topic.



Yes, but the problem is that your original post hints in no way that they are your opinions.  You present them in ways that make people believe they are facts.  So if they are presented as facts, we will argue them if they are incorrect.  Saying later that they were only opinions doesn't mean we shouldn't have argued them in the first place.  You can't expect us to be mind readers and know that when you present things in ways that make them seem factual, they are really opinions, so we shouldn't argue them.

When you say "The HD *DOES* have better IQ than the GeForce" that is a statement of fact, not opinion.  A statement you need to be able to back up with evidence.  You only later realized your "fact" was false, and began claiming it was only your opinion.


----------



## DarkMatter (Mar 14, 2008)

newtekie1 said:


> Yes, but the problem is that your original post hints in no way that they are your opinions.  You present them in ways that make people believe they are facts.  So if they are presented as facts, we will argue them if they are incorrect.  Saying later that they were only opinions doesn't mean we shouldn't have argued them in the first place.  You can't expect us to be mind readers and know that when you present things in ways that make them seem factual, they are really opinions, so we shouldn't argue them.
> 
> When you say "The HD *DOES* have better IQ than the GeForce" that is a statement of fact, not opinion.  A statement you need to be able to back up with evidence.  You only later realized your "fact" was false, and began claiming it was only your opinion.



+1

Now let's get back on topic please; he will never concede.


----------



## reviewhunter (Mar 14, 2008)

Gosh, what are you guys arguing about?


----------



## niko084 (Mar 14, 2008)

newtekie1 said:


> 70% faster in Crysis... wow... too bad AA kills it.   Not that you really need AA at 1920x1200.
> 
> I hope some driver maturing fixes the AA issues.



This really shouldn't be surprising, considering Crysis is programmed directly for Nvidia cards... namely the 8800s, which are not much different from the 9800...

Our Ati cards are still not using half our shaders..... :shadedshu on Crytek.


----------



## driver66 (Mar 14, 2008)

Seriously guys let it rest


----------



## DarkMatter (Mar 14, 2008)

EastCoasthandle said:


> You are really grasping for straws and showing more and more your preference.  The article clearly show the results of their tests and they are conclusive, regardless of how you feel about them.  The "what ifs" and "suppose to be" are more excuses then an explanation to the results.   This all goes to show how easily offended you are regarding situations like this.



You don't know how to read the results. But you didn't have to anyway: READ the actual article!! It says it all!!

According to your "conclusive" opinions, the win of the HP LP3065 30-inch LCD over the HP LP3065 30-inch LCD (yeah, it's the same monitor, just in case you didn't notice) is even clearer than Ati vs. Nvidia. That win is more "conclusive" (the numbers are conclusive, right?) than the Ati vs. Nvidia win, but yeah, I am grasping at straws here too... 



> The control group did help eliminate the display itself as a variable: If the monitors had colored our evaluators’ opinions, the votes would have been lopsided in favor of one or the other. Of the control group’s 18 opinions, nine favored monitor A, six favored monitor B, and three expressed no preference.


----------



## rampage (Mar 14, 2008)

since the thread has been hijacked, why doesn't someone create a new thread and we can have a vote on which card has the goddamn prettiest picture? but then hey, i guess it also comes down to what monitor you are using as well, so meh, who really gives a rat's ass, it's not like 1-2 years ago with image quality


----------



## DarkMatter (Mar 14, 2008)

rampage said:


> since the thread has been hijacked, why doesn't someone create a new thread and we can have a vote on which card has the goddamn prettiest picture? but then hey, i guess it also comes down to what monitor you are using as well, so meh, who really gives a rat's ass, it's not like 1-2 years ago with image quality



I already proposed that, but I just didn't want to start one myself. But looking at the circumstances, I'm going to do it.


----------



## rampage (Mar 14, 2008)

DarkMatter said:


> I already proposed that, but I just didn't want to start one myself. But looking at the circumstances, I'm going to do it.



ah ok, soz, i skimmed over the thread. anywho, it seems as things are the gx2 has a nice lead, but with drivers to come and the ati boys with gddr4, i still think things will be pretty close


----------



## mandelore (Mar 14, 2008)

wow, i've wandered into a cat fight. 

*throws catnip about and chills everyone out


----------



## newtekie1 (Mar 14, 2008)

niko084 said:


> This should not be surprising seriously considering Crysis is programmed directly for Nvidia cards... Namely the 8800's, which is not much different from the 9800...
> 
> Our Ati cards are still not using half our shaders..... :shadedshu on Crytek.



I don't know where you got that it is programmed directly for nVidia cards.  Just because it has the nVidia logo at the beginning doesn't mean it is programmed directly for nVidia cards, it just means nVidia paid them to put it there.  It has Intel's logo at the beginning too, do you really think it was programmed directly for Intel CPUs too?  It is just a marketing tool.  3DMark06 has advertisements for Alienware computers in it, do you really think they programmed it directly for Alienware computers?

If the Crytek engine isn't using half the shaders, it is because there are too many shaders, though I doubt that is the case.


----------



## xfire (Mar 14, 2008)

You guys need to relax. Let more benchmark tests of the GX2 come out, especially TPU's.


----------



## zOaib (Mar 14, 2008)

i love this place, we have almost an equal number of opposing factions, i.e. the ATi Clan and the Nvidia Clan, dueling it out .................. it's sometimes fun when it's a more mature argument, not when it comes down to fanboism. =)


----------



## TooFast (Mar 14, 2008)

http://www.computerbase.de/news/har...8/februar/benchmarks_bilder_geforce_9800_gx2/

looks like nvidia is in trouble


----------



## calvary1980 (Mar 14, 2008)

that article is dated Feb 26th, a full review was done yesterday.

- Christine


----------



## niko084 (Mar 14, 2008)

newtekie1 said:


> I don't know where you got that it is programmed directly for nVidia cards.  Just because it has the nVidia logo at the beginning doesn't mean it is programmed directly for nVidia cards, it just means nVidia paid them to put it there.  It has Intel's logo at the beginning too, do you really think it was programmed directly for Intel CPUs too?  It is just a marketing tool.  3DMark06 has advertisements for Alienware computers in it, do you really think they programmed it directly for Alienware computers?
> 
> If the Crytek engine isn't using half the shaders, it is because there are too many shaders, though I doubt that is the case.



Check their advertisements...

*BUILT ON THE 8800*
It is programmed directly to use ONLY 128 shaders.

So that's where I get my info. And yes, it is programmed directly FOR nVidia cards!
That doesn't mean their ATI support is nonexistent, it just means it's not *tweaked* for them.

It gives a slight advantage off the start. It shows when you compare how close these cards run elsewhere and then hit Crysis....


----------



## cdawall (Mar 14, 2008)

i'd put good money on the fact that a month from now the scores will have changed completely due to new drivers for *both* cards.


and if i was getting one, it would be the 3870X2, because cheaper + better design = purchase in my book, even if the 9800 has a little bit of an advantage. let's face it, NV drivers *SUCK!*


----------



## niko084 (Mar 14, 2008)

cdawall said:


> i'd put good money on the fact that a month from now the scores will have changed completely due to new drivers for *both* cards.
> 
> 
> and if i was getting one, it would be the 3870X2, because cheaper + better design = purchase in my book, even if the 9800 has a little bit of an advantage. let's face it, NV drivers *SUCK!*



True.... Honestly I don't care who comes out on top, as long as technology moves forward and we get away from ultra high power requirements and space heaters.


----------



## imperialreign (Mar 14, 2008)

Sasqui said:


> Cool thanks, just curious where you picked that up...



It's been reported on Fudzilla, The Inquirer, DailyTech, and a couple of other "grain-of-salt" sites.


Anyhow - as to the rest of this thread over the last 12+ hours I've been away . . . :shadedshu

yet another ATI/nVidia thread reduced to kindergarten antics


----------



## rick22 (Mar 14, 2008)

One big thing Nvidia has over AMD/ATI: money..... for the people who were saying Nvidia is scared of ATI... nice try


----------



## DarkMatter (Mar 14, 2008)

cdawall said:


> i'd put good money on the fact that a month from now the scores will have changed completely due to new drivers for *both* cards.
> 
> 
> and if i was getting one, it would be the 3870X2, because cheaper + better design = purchase in my book, even if the 9800 has a little bit of an advantage. let's face it, NV drivers *SUCK!*



Why do they suck? That's something that I read a lot in the forums, but I have never had a problem with them and I have used both WHQL and betas.


----------



## das müffin mann (Mar 14, 2008)

ok, can we please drop all this fanboy shit? it's amazing what happens while you're gone. anywho, the reason for the "nvidia drivers suck" remark (i believe) is based off of early drivers for a card (ati is just as bad); i've never had problems with either camp, except a few of their earlier drivers. does it really matter at all which camp has the highest-performing card? the difference in game is very minimal, if any. as niko said, who cares, as long as technology advances and they use less power


----------



## OrbitzXT (Mar 14, 2008)

When it comes to drivers, nVidia's have always worked better for me than ATI's. With ATI I've had a number of issues, and one of my gripes is that Image Scaling doesn't work all the time. When someone brings up nVidia's bad drivers, the only thing I ever think about, especially lately, is the ill-fated 7950 GX2. On paper it should have performed well, but drivers killed it. For the most part, both companies are good enough when it comes to drivers. Now if you want to talk about bad drivers, let's all reach under our desks for our Creative-bashing club.


----------



## Tatty_One (Mar 14, 2008)

Damn, and I thought I could be a bit of a fanboi! Both cards look good. Top-of-the-range cards account for only around 5% of the market; in the case of ATi, with their pricing policy for the HD3870X2, that will probably stretch to about 8% of the market, and in the case of NVidia, with their 9800GX2 pricing policy, that will probably shrink to around 2% 

There are always some who want the best and are willing to pay almost anything for it; then again, there are a lot more who are sensible enough to know where to invest their hard-earned cash. In this case, I'm afraid..... IMO that's in the direction of the HD3870X2 if the GX2 is really going to hit the shelves at near $600 ....... and that's coming from a semi NVidia fanboi........ Me!


----------



## tkpenalty (Mar 14, 2008)

The reviewers never mentioned how hot the card got... I have to say the fact that it only works on 780i boards makes it very undesirable...


----------



## yogurt_21 (Mar 15, 2008)

DarkMatter said:


> Bottom line is both are at the same level with some subtle differences. Some people like Ati's while others like Nvidia's, but the difference is not enough to drive a purchase decision. When someone buys one instead of the other strongly believing he is buying better IQ, well, he is buying a chimera.



actually it should, when you factor in ati's horrible scaling with aa enabled. IQ is really an argument for nvidia these days. sure, ati can do 24x edge-detect aa, but I've yet to see that run smoothly at any resolution above 1024x768 in games made since 2005. (well, at stock anyway; at 940 core it's a different matter, and even then the offerings are few)

so the bottom line should be: IQ and performance = nvidia, cheaper = ati. 

sad really, I remember the 9800 days when ati had IQ, performance, and the cheaper price (the 128mb pros; the xt was ridiculously priced at that time).


----------



## cdawall (Mar 15, 2008)

DarkMatter said:


> Why do they suck? That's something that I read a lot in the forums, but I have never had a problem with them and I have used both WHQL and betas.



you want to know why they suck? because NV can't offer support for cards they still fucking sell! their flagship top-of-the-line agp card (the 7800GS) crashes in almost every goddamn game and benchmark. i for one see that as complete BS. it's 2 gens old now, not even a full year in my PC, and it's lost driver support? WTF NV, that's shit! oh, and i hope you don't want DX10 on agp, because that's just not happening from the nvidia camp! let's just ostracize a huge chunk of users and force them to purchase new boards. well guess what NV, as soon as my 939 sells i'm done. say hello ATi 580X board and maybe a 3850!


----------



## wolf (Mar 15, 2008)

cmon man, fair enough you're super p-o'd that it's not working out for ya, but could you tone it down a tad?


----------



## cdawall (Mar 15, 2008)

wolf said:


> cmon man, fair enough you're super p-o'd that it's not working out for ya, but could you tone it down a tad?



ugh, they go for $250 or so on ebay as of a month ago, that's more than an 8800GTS! it's stupid they won't do a good driver for them.... oh well, it's going into a backup rig so i'm past it. i have an 8600GTS now cause it was cheap, well actually i got paid to take it  thanks xazax


----------



## DarkMatter (Mar 15, 2008)

cdawall said:


> you want to know why they suck? because NV can't offer support for cards they still fucking sell! their flagship top-of-the-line agp card (the 7800GS) crashes in almost every goddamn game and benchmark. i for one see that as complete BS. it's 2 gens old now, not even a full year in my PC, and it's lost driver support? WTF NV, that's shit! oh, and i hope you don't want DX10 on agp, because that's just not happening from the nvidia camp! let's just ostracize a huge chunk of users and force them to purchase new boards. well guess what NV, as soon as my 939 sells i'm done. say hello ATi 580X board and maybe a 3850!



Well, I have my 7900GTX running very well on my second rig, and it's from the same generation. All games run pretty well except Crysis, and that's because of the CPU, not the card. Not a single crash in my entire life, not from Nvidia nor from Ati cards. I still don't think their drivers suck.


----------



## cdawall (Mar 15, 2008)

DarkMatter said:


> Well I have my 7900GTX running on my second rig very well and it's from the same generation. All games run pretty well except Crysis and that's because of the CPU not the card. Not a single crash in my entire life, not from Nvidia nor from Ati cards. I still don't think their drivers suck.



PCI-e drivers are fine but the AGP 7800GS doesn't work worth a shit


----------



## EastCoasthandle (Mar 15, 2008)

cdawall said:


> PCI-e drivers are fine but the AGP 7800GS doesn't work worth a shit



I thought they stopped supporting AGP?  The last driver that I recall is the official 169.21, back in Dec 2007.

In any case, I have to wonder if their driver support for the 9800 GX2 will be anything like the 7950 GX2's.  We will soon find out!


----------



## newtekie1 (Mar 15, 2008)

niko084 said:


> Check their advertisements...
> 
> *BUILT ON THE 8800*
> It is programmed directly to ONLY use 128 shaders.
> ...



Once again, advertising for nVidia because nVidia pays you to promote their cards doesn't mean the game is built directly for nVidia cards, and it certainly doesn't mean it is designed to use only 128 shaders.  The game simply sends commands to the card; how many shaders are used to complete those commands is up to the card.  You can't tell software to only use so many shaders, it just isn't possible.


----------



## EastCoasthandle (Mar 15, 2008)

newtekie1 said:


> Once again, advertising for nVidia because nVidia pays you to promote their cards doesn't mean the game is built directly for nVidia cards, and it certainly doesn't mean it is designed to use only 128 shaders.  The game simply sends commands to the card; how many shaders are used to complete those commands is up to the card.  You can't tell software to only use so many shaders, it just isn't possible.



That's not entirely true.  I recall you posting about this before, suggesting that it's simple advertising:


newtekie1 said:


> I don't know where you got that it is programmed directly for nVidia cards.  Just because it has the nVidia logo at the beginning doesn't mean it is programmed directly for nVidia cards, it just means nVidia paid them to put it there.  It has Intel's logo at the beginning too, do you really think it was programmed directly for Intel CPUs too?  It is just a marketing tool.  3DMark06 has advertisements for Alienware computers in it, do you really think they programmed it directly for Alienware computers?
> 
> If the Crytek engine isn't using half the shaders, it is because there are too many shaders, though I doubt that is the case.




Read the quote below to get a correct understanding of how the TWIMTBP program works.  


> ...One point that Roy really wanted to hammer home was the fact that The Way It’s Meant To Be Played is not just a marketing programme like some would like you to believe. Instead, it’s about supporting the development community and helping them to create great content.
> 
> He talked to us about the developer tools that Nvidia creates to help developers optimise their code, things like NVPerfHUD and FX Composer. “These tools cost Nvidia millions of dollars every year to develop and maintain. We provide them to developers for free and they even work on our competitor’s hardware.”
> 
> ...


source

In any case, I find it odd that this card doesn't do well in TWIMTBP games when AA is applied.  I would think it wouldn't take a lot of R&D to get drivers up and running.  Apparently, that doesn't seem to be the case right now, per this review.


----------



## cdawall (Mar 15, 2008)

EastCoasthandle said:


> I thought they stopped supporting AGP?  The last driver that I recall is the official 169.21 back in Dec 2007.
> 
> In any case I have to wonder if their driver support for the 9800 GX2 will be anything like the 7950GX2?  We will soon find out!



they stopped supporting anything that isn't an 8X00 series or higher card...bastards


----------



## newtekie1 (Mar 15, 2008)

cdawall said:


> they stopped supporting anything that isn't an 8X00 series or higher card...bastards



That isn't even remotely true.  That was the case for a while after the 8800 cards were first released, but not now.  As they release drivers for the 8800 series cards, they also release the same drivers for the 7 and 6 series cards.  All the cards currently have the same 169.21 driver; your 7800GS uses it too.  They have released several beta drivers that also work with all the cards.  Currently I have 174.12 running on all my cards.  The 174.16 drivers work with all the 6, 7, and 8 series cards; however, they are only WHQL certified for the 9600GT, which is why they only support that card by default.  However, a hacked INF file exists that makes them work with all cards, but it really doesn't offer anything over the 174.12 drivers other than 9600GT support, so there is no reason to hassle with hacking it.


----------



## DarkMatter (Mar 15, 2008)

EastCoasthandle said:


> That's not entirely true.  I recall you posting about this before suggesting that it's simple advertising:
> 
> 
> 
> ...



Every game has different rendering paths and optimizations for almost every card. What Nvidia does with TWIMTBP is ensure that the path used for their cards is as good as possible. That doesn't hurt Ati's cards in any way.


----------



## TooFast (Mar 15, 2008)

nvidia $600 = 9800 GX2 
ATI $600 = 3870X2 + 3870. nuff said!


----------



## EastCoasthandle (Mar 15, 2008)

DarkMatter said:


> Every game has different rendering paths and optimizations for almost every card. What Nvidia does with TWIMTBP is ensure that the path used for their cards is as good as possible. That doesn't hurt Ati's cards in any way.



This is not relevant to the fact that TWIMTBP is more than just marketing hype, as implied by newtekie1.  Also, the fact that they go on record saying:
-





> The Way It’s Meant To Be Played is not just a marketing programme



-





> developer tools that Nvidia creates to help developers optimise their code



-





> we want to make the gaming experience as good we possibly can on all of our hardware



specifically lets the reader know that the TWIMTBP program is there to give nvidia-based video cards the edge.  Also, AMD/ATI told Bit-Tech that they are being locked out of working with developers in the TWIMTBP program, especially for CF development.  


> The long of the short is that AMD believes that Nvidia is locking it out of the market with its TWIMTBP programme—something I’m sure Nvidia would disagree with—and that developers working with Nvidia often make it difficult for AMD to get access to code early enough to develop CrossFire drivers in time for a game’s launch.


Source

Based on this information, that can and will hurt ATI.  In hindsight, your post comes off as biased when the facts contradict your comment.  Not only does this show a pattern in how you post, it also shows that you are not neutral regarding the subject.  Therefore, I have to ask if you are part of the nvidia focus group?  In any case, it's just odd the drivers are not ready for release according to this review, especially when it comes to AA and TWIMTBP games.


----------



## gOJDO (Mar 15, 2008)




----------



## warhammer (Mar 15, 2008)

================================================================================
Supported display modes for NV_DISP.INF               Version 169.28, 12/18/2007
================================================================================

Format:
[INFSectionName]
  ////////////////////////////////////////////////////////////////////
  // PCI ID(hex) - Device Name
  ////////////////////////////////////////////////////////////////////
  ; Spanning Type
    XRes x YRes  bpp  refresh1 refresh2 refresh3 ...

================================================================================

[nv_SoftwareDeviceSettings]

  ////////////////////////////////////////////////////////////////////
  // 0040 - NVIDIA GeForce 6800 Series GPU/GeForce 6800 Ultra
  // 0041 - NVIDIA GeForce 6800
  // 0042 - NVIDIA GeForce 6800 LE
  // 0043 - NVIDIA GeForce 6800 XE
  // 0044 - NVIDIA GeForce 6800 XT
  // 0045 - NVIDIA GeForce 6800 GT
  // 0046 - NVIDIA GeForce 6800 GT
  // 0047 - NVIDIA GeForce 6800 GS
  // 0048 - NVIDIA GeForce 6800 XT
  // 004D - NVIDIA Quadro FX 3400/4400
  // 004E - NVIDIA Quadro FX 4000
  // 0090 - NVIDIA GeForce 7800 GTX
  // 0091 - NVIDIA GeForce 7800 GTX
  // 0092 - NVIDIA GeForce 7800 GT
  // 0093 - NVIDIA GeForce 7800 GS <<<<<<<<<<============ LOOK
  // 0095 - NVIDIA GeForce 7800 SLI
  // 009D - NVIDIA Quadro FX 4500
  // 00C0 - NVIDIA GeForce 6800 GS/XT
  // 00C1 - NVIDIA GeForce 6800
  // 00C2 - NVIDIA GeForce 6800 LE
  // 00C3 - NVIDIA GeForce 6800 XT
  // 00CD - NVIDIA Quadro FX 3450/4000 SDI
  // 00CE - NVIDIA Quadro FX 1400
  // 0140 - NVIDIA GeForce 6600 GT
  // 0141 - NVIDIA GeForce 6600
  // 0142 - NVIDIA GeForce 6600 LE
  // 0143 - NVIDIA GeForce 6600 VE
  // 0145 - NVIDIA GeForce 6610 XL
  // 0147 - NVIDIA GeForce 6700 XL
  // 014A - NVIDIA Quadro NVS 440
  // 014C - NVIDIA Quadro FX 540M
  // 014D - NVIDIA Quadro FX 550
  // 014E - NVIDIA Quadro FX 540
  // 014F - NVIDIA GeForce 6200
  // 0160 - NVIDIA GeForce 6500
  // 0161 - NVIDIA GeForce 6200 TurboCache(TM)
  // 0162 - NVIDIA GeForce 6200SE TurboCache(TM)
  // 0163 - NVIDIA GeForce 6200 LE
  // 0165 - NVIDIA Quadro NVS 285
  // 0169 - NVIDIA GeForce 6250
  // 016A - NVIDIA GeForce 7100 GS
  // 01D0 - NVIDIA GeForce 7350 LE
  // 01D1 - NVIDIA GeForce 7300 LE
  // 01D3 - NVIDIA GeForce 7300 SE/7200 GS
  // 01DD - NVIDIA GeForce 7500 LE
  // 01DE - NVIDIA Quadro FX 350
  // 01DF - NVIDIA GeForce 7300 GS
  // 0211 - NVIDIA GeForce 6800
  // 0212 - NVIDIA GeForce 6800 LE
  // 0215 - NVIDIA GeForce 6800 GT
  // 0218 - NVIDIA GeForce 6800 XT
  // 0221 - NVIDIA GeForce 6200
  // 0222 - NVIDIA GeForce 6200 A-LE
  // 0240 - NVIDIA GeForce 6150
  // 0241 - NVIDIA GeForce 6150 LE
  // 0242 - NVIDIA GeForce 6100
  // 0245 - NVIDIA Quadro NVS 210S / NVIDIA GeForce 6150LE
  // 0290 - NVIDIA GeForce 7900 GTX
  // 0291 - NVIDIA GeForce 7900 GT/GTO
  // 0292 - NVIDIA GeForce 7900 GS
  // 0293 - NVIDIA GeForce 7950 GX2
  // 0294 - NVIDIA GeForce 7950 GX2
  // 0295 - NVIDIA GeForce 7950 GT
  // 029C - NVIDIA Quadro FX 5500
  // 029D - NVIDIA Quadro FX 3500
  // 029E - NVIDIA Quadro FX 1500
  // 029F - NVIDIA Quadro FX 4500 X2
  // 0390 - NVIDIA GeForce 7650 GS
  // 0391 - NVIDIA GeForce 7600 GT
  // 0392 - NVIDIA GeForce 7600 GS
  // 0393 - NVIDIA GeForce 7300 GT
  // 0394 - NVIDIA GeForce 7600 LE
  // 0395 - NVIDIA GeForce 7300 GT
  // 039E - NVIDIA Quadro FX 560
  // 03D0 - NVIDIA GeForce 6150SE nForce 430
  // 03D1 - NVIDIA GeForce 6100 nForce 405
  // 03D2 - NVIDIA GeForce 6100 nForce 400
  // 03D5 - NVIDIA GeForce 6100 nForce 420
  // 053A - NVIDIA GeForce 7050 PV / NVIDIA nForce 630a
  // 053B - NVIDIA GeForce 7050 PV / NVIDIA nForce 630a
  // 053E - NVIDIA GeForce 7025 / NVIDIA nForce 630a
  // 07E0 - NVIDIA GeForce 7150 / NVIDIA nForce 630i
  // 07E1 - NVIDIA GeForce 7100 / NVIDIA nForce 630i
  // 07E2 - NVIDIA GeForce 7050 / NVIDIA nForce 630i
  // 07E3 - NVIDIA GeForce 7050 / NVIDIA nForce 610i
  // 07E5 - NVIDIA GeForce 7100 / NVIDIA nForce 620i
  ////////////////////////////////////////////////////////////////////


My old card, the 7800GS 512, worked well with NVIDIA drivers on both XP and Vista.
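For anyone who wants to check a card against a driver's INF dump like the one above, here is a minimal sketch (the `find_device` helper and the sample string are mine, not part of the driver package) that scans for a PCI device ID:

```python
import re

def find_device(inf_text: str, device_id: str):
    """Return the device name for a hex PCI ID like '0093', or None if absent."""
    pattern = re.compile(r"//\s*" + re.escape(device_id) + r"\s*-\s*(.+)")
    for line in inf_text.splitlines():
        m = pattern.search(line)
        if m:
            # Trim trailing annotations such as the "<<== LOOK" marker above.
            return m.group(1).split("<")[0].strip()
    return None

sample = "  // 0093 - NVIDIA GeForce 7800 GS <<<<<<<<<<============ LOOK"
print(find_device(sample, "0093"))  # NVIDIA GeForce 7800 GS
print(find_device(sample, "0290"))  # None
```

So per the 169.28 listing above, device ID 0093 (the AGP 7800 GS) is still present in the supported-modes table.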


----------



## warhammer (Mar 15, 2008)

It would be interesting to see how the 9800GX2 performs with AA


----------



## Nitro-Max (Mar 15, 2008)

The X2 and GX2 drivers are still very premature; I'd like to see results over time. The X2 did OK with 4x AA enabled, as far as I can see.
But the GX2 pricing scares me, as it did when the GTX was released. I think I'd rather save the money and crossfire two X2 cards.


----------



## intel igent (Mar 15, 2008)

Nitro-Max said:


> The X2 and GX2 drivers are still very premature; I'd like to see results over time. The X2 did OK with 4x AA enabled, as far as I can see.
> But the GX2 pricing scares me, as it did when the GTX was released. I think I'd rather save the money and crossfire two X2 cards.



i agree with you about the drivers, and i'm kind of surprised that the Nvidia card is doing as well as it is ATM, but i'd personally still go for the X2 based on price if i was in the market for either of those


----------



## Nitro-Max (Mar 15, 2008)

intel igent said:


> i agree with you about the drivers, and i'm kind of surprised that the Nvidia card is doing as well as it is ATM, but i'd personally still go for the X2 based on price if i was in the market for either of those



Well, considering the GDDR4 X2 is coming out alongside the GX2, I'm hoping to see a drop in X2 pricing for the GDDR3 versions, and then I'm gonna buy another.

Is anyone running quad X2 yet? Got any benchies?


----------



## cdawall (Mar 15, 2008)

warhammer said:


> ================================================================================
> Supported display modes for NV_DISP.INF               Version 169.28, 12/18/2007
> ================================================================================
> 
> ...






newtekie1 said:


> That isn't even remotely true. That was the case for a while after the 8800 cards were first released, but not now. As they release drivers for the 8800 series cards, they also release the same drivers for the 7 and 6 series cards. All the cards currently have the same 169.21 driver; your 7800GS uses it too. They have released several beta drivers that also work with all the cards. Currently I have 174.12 running on all my cards. The 174.16 drivers work with all the 6, 7, and 8 series cards, but they are only WHQL certified for the 9600GT, which is why they only support that card by default. However, a hacked INF file exists that makes them work with all cards, though it really doesn't offer anything over the 174.12 drivers other than 9600GT support, so there is no reason to hassle with hacking it.



Maybe I should just take a picture of the BSOD I get in any game on XP? I tried a Ti4200 and it ran fine, no issues, in this rig, but the 7800GS + any NV driver = crap.


----------



## Nitro-Max (Mar 15, 2008)

I previously put this in another post, but I'll repeat it again here.

I don't wish to discredit any info posted; I do believe it to be genuine.

But I feel both sides still need to work on their drivers more to give better comparisons.

I really want to see how the GDDR4 X2 version compares with Nvidia's GX2; we are forgetting the GX2 runs GDDR4 already {unless this has changed??} while the current X2 doesn't.

But it soon will: http://www.tweaktown.com/news/9079/index.html


----------



## newtekie1 (Mar 15, 2008)

cdawall said:


> Maybe I should just take a picture of the BSOD I get in any game on XP? I tried a Ti4200 and it ran fine, no issues, in this rig, but the 7800GS + any NV driver = crap.



Odd how others run 7800GS cards with nVidia drivers just fine, though. Did you ever consider that it wasn't the drivers' fault? There are a lot of things that could cause a BSOD; a broken card is a strong possibility.


----------



## niko084 (Mar 16, 2008)

You need to read up on what happens when games are programmed and optimized for certain video cards....

There has been direct proof of up to 300% performance increases in the worst of cases... so in reality, what's even 50%?
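To put those percentages in perspective, here's a minimal sketch of how an optimization uplift maps to frame rate. The base figure is purely hypothetical for illustration, not a number from any review in this thread:

```python
# Map a claimed percentage uplift to an actual frame rate.
# base_fps is a made-up starting point, chosen only for illustration.
base_fps = 30

for uplift_pct in (50, 300):
    optimized = base_fps * (1 + uplift_pct / 100)
    print(f"+{uplift_pct}% -> {optimized:.0f} fps")  # 50% -> 45 fps, 300% -> 120 fps
```

Even the "small" 50% case turns a borderline 30fps into a comfortable 45fps, which is why per-game optimization can matter more than raw hardware differences.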


----------



## cdawall (Mar 16, 2008)

newtekie1 said:


> Odd how others run 7800GS cards with nVidia drivers just fine, though. Did you ever consider that it wasn't the drivers' fault? There are a lot of things that could cause a BSOD; a broken card is a strong possibility.



It's actually just the XP drivers; I just ran it in Linux and it works fine, 3D and all...


----------



## TooFast (Mar 17, 2008)

http://www.tweaktown.com/articles/1332/1/page_1_introduction/index.html

http://www.tweaktown.com/articles/1331/2/page_2_the_card/index.html


----------



## mandelore (Mar 17, 2008)

Both cards appear great.

But consider this:

Every ATI card has seen DRAMATIC improvements in performance in the months after release due to driver tweaking. If you add up all the boosts the 2900XT got since its release as a percentage, the difference between the card at launch and the card now would make you think they were two separate cards!!

So to really see how these compare, I know it sucks, but we've got to wait for at least a couple of driver revisions from both sides, then compare the X2 and GX2 with more mature drivers and greater performance/compatibility.

It's just too early to tell. Like the start of a horse race, it's all up for grabs until further down the track, when the winners start to become apparent. 

My thoughts though: price-wise, get the X2, and you know that months down the line the card you purchased will still be getting faster and better. I don't have as much experience with Nvidia cards, so I'm not certain how their driver work affects card performance.

And as for gaming, does it REALLY matter if you're playing at 90fps or 110fps? Is the extra cost of a GX2, if it turns out in the long run to be slightly faster, worth it? 

If both can play at 1920x1200 at very high settings with great FPS, then they are both winning cards. Now, for the benchmark junkie, that extra cost may be worth it 

Personally, I'm going to skip these X2/GX2 cards with their current architecture and wait for a truly next-gen dual-GPU card. My 2900XT does me fine at 1920x1200.


----------



## Darren (Mar 19, 2008)

cdawall said:


> Maybe I should just take a picture of the BSOD I get in any game on XP? I tried a Ti4200 and it ran fine, no issues, in this rig, but the 7800GS + any NV driver = crap.



That doesn't necessarily mean it's Nvidia's fault. Your blue screen of death could be related to a number of variables; perhaps your 7800 GS is conflicting with your motherboard? In that case you could argue that it's your motherboard manufacturer's responsibility to provide drivers. Or maybe your card is just faulty. Nvidia's drivers are not always perfect, but you can't blame every issue on Nvidia; clearly other people on this forum have the same card, or a similar generation of card, and don't share your problem. 

You said your Ti4200 was fine; well, wouldn't that be a clear indication that your 7800 GS is faulty or damaged?


----------



## zOaib (Mar 19, 2008)

Just ordered the XFX version of the 9800 GX2; I'll relay benchmarks comparing it to my current setup with an HD 3870 X2, and then decide which one to keep ......


I know, don't throw that chair at me; I couldn't help but check the damn card out


----------



## Megasty (Mar 19, 2008)

Nitro-Max said:


> Well, considering the GDDR4 X2 and the GX2 are coming out, I'm hoping to see a drop in X2 pricing for the GDDR3 versions; then I'm gonna buy another.
> 
> Is anyone running quad X2 yet? Got any benchies?



Yeah, it's nothing to write home about when it comes to 3DMark, though. You'll need like an 8-10GHz CPU to see what two cards can do. Also, since the drivers are immature, they really don't do much over one card either, except for...

*Best*
Call of Juarez - everything maxed out, 4x AA, 8x AF @ 1920x1200,
min-avg-max
15-20-27 - 1 card
33-38-60 - CF

*Worst*
Crysis - everything on Very High, 2x AA @ 1920x1200,
min-avg-max
5-15-52 - 1 card
8-19-61 - CF

CoJ was very playable @ 20fps and great @ 38fps. The frame drops were few to none 

Crysis was what you expected: jacked up from start to finish. But I did manage to finish the game both times; it just took all day
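For anyone who wants the CrossFire scaling as a single number, a quick sketch using the average FPS figures from the post above (best and worst case only):

```python
# CrossFire scaling computed from the average FPS figures quoted above
# (the min-avg-max triples; only the averages are used here).
benchmarks = {
    "Call of Juarez": {"single": 20, "crossfire": 38},
    "Crysis":         {"single": 15, "crossfire": 19},
}

for game, fps in benchmarks.items():
    gain = (fps["crossfire"] / fps["single"] - 1) * 100
    print(f"{game}: {gain:.0f}% uplift from the second GPU")
```

That works out to roughly a 90% uplift in Call of Juarez but only about 27% in Crysis, which matches the point about driver maturity: the second GPU helps a lot in some titles and barely at all in others.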


----------



## cdawall (Mar 19, 2008)

Darren said:


> That doesn't necessarily mean it's Nvidia's fault. Your blue screen of death could be related to a number of variables; perhaps your 7800 GS is conflicting with your motherboard? In that case you could argue that it's your motherboard manufacturer's responsibility to provide drivers. Or maybe your card is just faulty. Nvidia's drivers are not always perfect, but you can't blame every issue on Nvidia; clearly other people on this forum have the same card, or a similar generation of card, and don't share your problem.
> 
> You said your Ti4200 was fine; well, wouldn't that be a clear indication that your 7800 GS is faulty or damaged?



Meh, good point... it's going away soon, so TBH IDC.


----------



## VroomBang (Mar 21, 2008)

There's no way I'm buying a card that costs more than a good mobo + a Q6600 + 4GB of good DDR2.


----------



## VroomBang (Mar 21, 2008)

niko084 said:


> True.... Honestly I don't care who comes out on top, as long as technology moves forward and we get away from ultra high power requirements and space heaters.



I'll second that. I'm sick of the increasing PSU requirements with each new generation of cards. We'll soon need an entire power plant to run one.


----------



## Blacklash (Mar 21, 2008)

I am keeping my overclocked HD 3850s in my AMD rig until the next round of truly next-gen products. They do fine on a 1680x monitor.

If you think the GX2 costs a lot, imagine going quad on 790i. That would be about 500 USD for 2x2GB of DDR3 and 349 USD for the mobo, plus even more cash if you didn't have an appropriate PSU.


----------

