# NVIDIA GeForce 9800 GX2 Reaches EOL in Three Months?



## malware (Apr 8, 2008)

This information from Expreview may disappoint many GeForce 9800 GX2 owners if true. NVIDIA is about to EOL (end-of-life) the GeForce 9800 GX2 line-up in just three months, as a result of two new GT200 cards - the single-GPU GeForce 9900 GTX and the dual-GPU GeForce 9900 GX2. One of the GT200 cards will have similar performance and production cost to the GeForce 9800 GX2, which will force the manufacturer to cut the "older" card. There will be no rebranding for the 9800 GX2, like the GeForce 8800 GS which will become the 9600 GSO, but just a sudden death. Meanwhile, details of the new GT200 graphics cards are still unknown.

*View at TechPowerUp Main Site*


----------



## farlex85 (Apr 8, 2008)

I wonder how reliable this is, since this is the second time we've heard it. I don't know though, just second hand it seems like the GX2 is selling ok despite its poor value, and if they're making money from it I very much doubt they will EOL it. That is, unless ATI brings up something very soon to spank it, which may be what they're counting on.


----------



## DanishDevil (Apr 8, 2008)

Wow.  That's insane.  A two month lifespan for a card.

Hah and some video card's "lifetime warranty" is for the lifetime of the card.  That's hilarious.  Could you imagine, a 2-month warranty for a $500+ card? ROFL!


----------



## MKmods (Apr 8, 2008)

DanishDevil said:


> Wow.  That's insane.  A two month lifespan for a card.
> 
> Hah and some video card's "lifetime warranty" is for the lifetime of the card.  That's hilarious.  Could you imagine, a 2-month warranty for a $500+ card? ROFL!



LOL, a 2 month lifespan for a $600 card


----------



## cjoyce1980 (Apr 8, 2008)

not a good time to be an nvidia fan boy


----------



## candle_86 (Apr 8, 2008)

lol what's so funny? the 7800GTX 512 had a 3 month lifespan. The 9800GX2 is a stop gap anyway.


----------



## cjoyce1980 (Apr 8, 2008)

candle_86 said:


> lol what's so funny? the 7800GTX 512 had a 3 month lifespan. The 9800GX2 is a stop gap anyway.



really? a GX2 a stop gap? this card should be a flagship, not a "it will do till we get bored!"

this is a major kick in the nuts to any 9800GX2 owners out there

bad nvidia


----------



## candle_86 (Apr 8, 2008)

what did you expect? ATI put pressure on Nvidia with their little toy X2. Nvidia said ok, here, have this to deal with, so we can ignore you while we finish our work. Everyone knew GT200, aka GT100 aka G100 aka G90, was around the corner, it's just a matter of how long. The 98xx line are stop gaps, and once the 9900 cards arrive I'd expect the 9600GT, for all its merits, to fold and be comparable to the 8600GTS of the 8 series, not the 7600GT or 6600GT of old


----------



## Wile E (Apr 8, 2008)

Didn't the 7950GX2 have a similarly short lifespan before the release of the 8800?


----------



## csendesmark (Apr 8, 2008)

L.O.L.
These monster cards don't have long lifetimes


----------



## AddSub (Apr 8, 2008)

G92 = stop gap, period.


----------



## Nanyang (Apr 8, 2008)

Wow.... too fast for an upgrade.... just got my 9800GX2 this month, then in 3 months I'll upgrade again... damn... But Nvidia rocks.....


----------



## LiveOrDie (Apr 8, 2008)

Nanyang said:


> Wow.... too fast for an upgrade.... just got my 9800GX2 this month, then in 3 months I'll upgrade again... damn... But Nvidia rocks.....



lol give it 6-8 months and a new board will be out, 870i-890i. I never buy a new card in its first 3 months, because faster models come out, just like the Ultra did


----------



## Bjorn_Of_Iceland (Apr 8, 2008)

DanishDevil said:


> Wow.  That's insane.  A two month lifespan for a card.
> 
> Hah and some video card's "lifetime warranty" is for the lifetime of the card.  That's hilarious.  Could you imagine, a 2-month warranty for a $500+ card? ROFL!



yeah! haha. pretty useless for expensive brands like BFG or the likes.. hehe. Go Palit or Inno3d guys


----------



## Th3-R3as0n (Apr 8, 2008)

Wow.. 
There should be some good rivalry between AMD and nVidia in the next few months I think.. with the new 4xxx series on the brink and the 9900GTX said to be the same performance as a 9800GX2? Gonna be some pwnage of Crysis sooner than we think..


----------



## Kei (Apr 8, 2008)

Wow...simply wow. If this is true then...wow I really can't think of anything except wow!

K


----------



## DarkMatter (Apr 8, 2008)

Wile E said:


> Didn't the 7950GX2 have a similarly short lifespan before the release of the 8800?



I think you are right. 

Anyway, it seems the 9900 will be of similar performance and price to the 9800 GX2. This means that, unlike the 7950 GX2, the 9800 GX2 will still be valuable. Only the new GX2 will be faster, but a lot more expensive, I guess. OK, it will piss off all those 9800 GX2 owners who bought it for the braggin' rights, but those who bought it for the performance will be ok.



Th3-R3as0n said:


> Wow..
> There should be some good rivalry between AMD and nVidia in the next few months I think.. with the new 4xxx series on the brink and the 9900GTX said to be the same performance as a 9800GX2? Gonna be some pwnage of Crysis sooner than we think..



Rivalry rocks, dude! 

I have always thought that Crytek made Crysis expecting that the new cards would be out in Q4 2007, as they first said. But since Crysis ended up as the only game that needed such a card, both Ati and Nvidia held back their chips until they were needed, with hopes of making them better and/or cheaper: lower process, better optimization, GDDR5 at cheaper prices, etc. Remember that before the G92 GX2 and RV670 X2, every new faster card launched at a higher price point (usually $50 more) than the previous champ, but with this last gen they have taken a step in the other direction.


----------



## TheMailMan78 (Apr 8, 2008)

HD2900XT = stop gap, period. I own one. Paid 400 bucks for it just to have the 3870 come out for half the price with the same performance.


----------



## btarunr (Apr 8, 2008)

HD2900 XT didn't meet EOL. It still is on sale and is supported. Let's differentiate 'stop-gaps' from products meeting EOL. I bet if its wonderful specs translated into the product it was anticipated to be, it would still have been the flagship. Too bad for the current owners of 9800 GX2, knowing their product would no longer be sold/supported in just three more months' time.


----------



## EastCoasthandle (Apr 8, 2008)

If true, it's the 7950GX2 all over again.  The concern/issue isn't really the card reaching EOL so much as driver support after the card reaches EOL.  :shadedshu


----------



## TheMailMan78 (Apr 8, 2008)

btarunr said:


> HD2900 XT didn't meet EOL. It still is on sale and is supported. Let's differentiate 'stop-gaps' from products meeting EOL. I bet if its wonderful specs translated into the product it was anticipated to be, it would still have been the flagship. Too bad for the current owners of 9800 GX2, knowing their product would no longer be sold/supported in just three more months' time.



I stand corrected. I didn't realize that EOL meant no more support at all. That should be against the law for such a short lifespan. Even automakers have to produce parts for a car 15 years after it's been discontinued. Nvidia should at least support it with drivers for 3 years, maybe?


----------



## btarunr (Apr 8, 2008)

It's somewhat different with IT. For example, the moment a version of Microsoft Windows hits EOL, they discontinue updates/hotfixes/patches to it, and also stop production/sales.


----------



## Nanyang (Apr 8, 2008)

Hope the 9900GX2 or GTX are GDDR4 or GDDR5 models with 256SP or 512SP per core... 

By the way, does anyone here know the 9900 specs?


----------



## Xaser04 (Apr 8, 2008)

EastCoasthandle said:


> If true, it's the 7950GX2 all over again.  The concern/issue isn't really the card reaching EOL so much as driver support after the card reaches EOL.  :shadedshu



Hopefully 'history' will repeat itself (the timing may be slightly out, but this is how I remember it):

- 7950GX2 was discontinued - Nvidia 'bruiser' aka 8800GTX launched, totally obliterating the competition. 

- 9800GX2 discontinued - 9900GTX / GX2 launched, completely obliterating everything . . . . . 

If the report is true and the 9900GTX does manage to match the 9800GX2 in terms of performance (without losing out to SLI inefficiencies), then we could be looking at one very powerful card indeed (albeit a VERY expensive one unless AMD / ATI pull their finger out).


----------



## newconroer (Apr 8, 2008)

cjoyce1980 said:


> really? a GX2 a stop gap? this card should be a flagship, not a "it will do till we get bored!"
> 
> this is a major kick in the nuts to any 9800GX2 owners out there
> 
> bad nvidia



Why? If the following cards are not that much of an upgrade, what does it matter?

If they are better, then..well it's not like the GX2 is a POS, and considering the price...I'd say people are getting a decent deal, if we look back at the cost of some of the similarly tiered cards from the recent past.


----------



## jocksteeluk (Apr 8, 2008)

I think the turnover of video cards is simply astounding, and then companies like Nvidia try to blame everyone else for the current downward trend in PC game sales.


----------



## indybird (Apr 8, 2008)

If the BFG 9800GX2 gets to about $500 by April 20th or later, then I'll probably get that and then (100-day) step up to either the 9900GTX or 9900GX2 (whichever has more performance than the 9800GX2).

If this is true, that's really too bad for the early adopters of the 9800GX2.  Even if they got EVGA or BFG, their step-up will run out before the new cards are released.  And everyone else is just stuck with it.  These 9900 cards had better be amazing, otherwise Nvidia will be in a little bit of trouble in the high-end market.

-Indybird


----------



## SpookyWillow (Apr 8, 2008)

fake, they updated their story



> Update: According to sources, 9900GX2 do not exist. We are sorry for the mistakes.


----------



## GSG-9 (Apr 8, 2008)

SpookyWillow said:


> fake, they updated their story


----------



## Tatty_One (Apr 8, 2008)

ffs I am dizzy again, if this continues I am over to the "Darkside" to get two X300's for XFire.  NVidia needs to be capped at releasing no more than 10 models/generations/cards per year, that way they may have a bit more time to support their existing ones properly.


----------



## mascaras (Apr 8, 2008)

Update: According to sources, 9900GX2 do not exist for now. We are sorry for the mistakes.


http://en.expreview.com/2008/04/08/geforce-9800gx2-will-be-eoled-in-three-months/


regards


----------



## mab1376 (Apr 8, 2008)

I think Nvidia is trying to capitalize on their "beta cards"

the 9800GX2 is probably identical to the 9900GX2 other than the GPU itself.

does anyone know the specs for the new GT200?


----------



## Silverel (Apr 8, 2008)

Wasn't the 9800 GX2 just two 8800GTs on a board? G92 was just a die shrink with little performance gain, but still 8 series tech? The 9600GT was a rebranded and underclocked 8800GT?

Yeah, I kinda feel bad for nVidia fans over the past couple months. They've been getting a lot of rehash instead of actual advancement. Not that Ati has done any better on the tech side of things...


----------



## ShadowFold (Apr 8, 2008)

I told you it was gonna be just like the 7900GX2.. I TOLD YOU SO.


----------



## webwizard (Apr 8, 2008)

This is unreal. I just bought 2 XFX 9600GT XXX Alpha Dog cards. I wonder how long they will be around. It almost pays to stay with the 8800's until Nvidia figures out what they are doing next.


----------



## das müffin mann (Apr 8, 2008)

atm if you have a newer 8800 you should be fine for a while. tbh Nvidia really has released way too much way too soon, it's kinda annoying


----------



## candle_86 (Apr 8, 2008)

EOL doesn't mean they stop making drivers for it.

They made drivers for the Geforce4 Ti cards right until late 2006. GeforceFX is just now being dropped as well. The newest drivers also support the 7950GX2; where you heard it doesn't, I don't know. EOL means they aren't making it, it doesn't mean they stop supporting it.

Also, the 9600 is not an 8800GT that's had half the crap turned off, it's a brand new core that never had the crap built in


----------



## Exavier (Apr 8, 2008)

I think I'm going to stick with either an 8800GTS or a 9800GTX, to be fair; even if the 99xx series is amazing, I'm pretty sure it won't be the 79xx > 88xx series jump people hope it will be.


----------



## evil bill (Apr 8, 2008)

newconroer said:


> Why? If the following cards are not that much of an upgrade, what does it matter?
> 
> If they are better, then..well it's not like the GX2 is a POS, and considering the price...I'd say people are getting a decent deal, if we look back in retrospect at the cost of some of the similar tiered cards from the recent past.




As has been said already, it's not that this suddenly makes the 9800GX2 a bad card, but rather that driver support and game profiles for it will dry up, meaning the card potentially does not get recognised for what it is.


----------



## webwizard (Apr 8, 2008)

Update: According to sources, 9900GX2 do not exist.


----------



## Megasty (Apr 8, 2008)

It was doomed from the start. It's just two 8800GTs stuck together. The price point for the thing is just too high for the performance it provides. But that also means the 9900GTX should be a _real_ GTX, like the beast that the 8800GTX was. The G94 chip is the true performer. If they'll just put 128 shaders on that thing along with 768MB of GDDR3 or GDDR4, then that's your real GTX.


----------



## brian.ca (Apr 8, 2008)

webwizard said:


> Update: According to sources, 9900GX2 do not exist.



I'm not sure how much stock I'd put into that.. I can't remember the source, but there were others pointing towards the 9900GX2 coming out in July before this story, and ultimately it would make sense... If ATI uses an X2 card with their new chip in the summer to take the performance crown, the only reason Nv wouldn't use the same tactic to take it back (exactly like they did with the 9800GX2) would be if there were significant issues with doubling up the chips/cards into a sandwich that made it not worth it.


----------



## Bjorn_Of_Iceland (Apr 8, 2008)

> EOL doesnt mean they stop making drivers for it.



Lesser priority for fixing certain issues for it though.


----------



## webwizard (Apr 8, 2008)

Here's where that came from.

http://en.expreview.com/


----------



## AddSub (Apr 8, 2008)

> Wasn't the 9800 GX2 just two 8800GTs on a board? G92 was just a die shrink with little performance gain, but still 8 series tech? The 9600GT was a rebranded and underclocked 8800GT?
> 
> Yeah, I kinda feel bad for nVidia fans over the past couple months. They've been getting a lot of rehash instead of actual advancement. Not that Ati has done any better on the tech side of things...



Pretty much. Nvidia is at this point competing against itself on almost every tier. So they have no reason to release any 32 ROP, 320 SP monster parts; hence the G92 was born, a tamer, severely cut down and tightened up offspring of G80 that is cheap to produce. People jumped on these because they offered decent performance (albeit at perhaps lower resolutions with AA/AF turned down a bit vs. high-end G80 parts) and they were affordable compared to previous G80 parts. People who couldn't budget for a G80 product finally had something they could pick up without having to sell their kidneys.

As for the next series, I don't know. From what I've been reading on Expreview, NordicHardware, and other places, it seems like it will be an even further tweaked G92, only with a die shrink (55nm?) which will allow even higher clocks, but the rest will remain the same (paltry 16 ROPs, 256-bit memory bus), which will quite probably be counter-balanced by higher clocks (again, thanks to the even smaller process) and the use of super-clocked GDDR4/5 VRAM, but nothing revolutionary. 

It really depends on what AMD puts on the shelves this summer. Why release anything serious when the competition (AMD) is having trouble on every level? A struggling competitor is better than a bankrupt competitor, from a business perspective anyway. nVidia was one of the most profitable corporations last year and was designated by Forbes as "Company of the Year" for 2007. With AMD in such condition, I don't see any reason they would change their tactics.


----------



## Darkrealms (Apr 8, 2008)

AddSub said:


> Pretty much. Nvidia is at this point competing against itself on almost every tier. So they have no reason to release any 32 ROP, 320 SP monster parts; hence the G92 was born, a tamer, severely cut down and tightened up offspring of G80 that is cheap to produce. People jumped on these because they offered decent performance (albeit at perhaps lower resolutions with AA/AF turned down a bit vs. high-end G80 parts) and they were affordable compared to previous G80 parts. People who couldn't budget for a G80 product finally had something they could pick up without having to sell their kidneys.
> 
> As for the next series, I don't know. From what I've been reading on Expreview, NordicHardware, and other places, it seems like it will be an even further tweaked G92, only with a die shrink (55nm?) which will allow even higher clocks, but the rest will remain the same (paltry 16 ROPs, 256-bit memory bus), which will quite probably be counter-balanced by higher clocks (again, thanks to the even smaller process) and the use of super-clocked GDDR4/5 VRAM, but nothing revolutionary.
> 
> It really depends on what AMD puts on the shelves this summer. Why release anything serious when the competition (AMD) is having trouble on every level? A struggling competitor is better than a bankrupt competitor, from a business perspective anyway. nVidia was one of the most profitable corporations last year and was designated by Forbes as "Company of the Year" for 2007. With AMD in such condition, I don't see any reason they would change their tactics.


Unfortunately I have to agree with you on your third paragraph.  
Sadly for us, though, I think there is more performance available, but if ATI doesn't put up, Nvidia won't be able to, for fear of becoming the "monopoly" (conceptually, on the high end).

Being an Nvidia/AMD fan, I would honestly like to see ATI come out with something that kicks @$$.  I feel that Nvidia is getting lax with all these releases.

*On topic.*
I think almost everyone knew the fate of the GX2 but few of us wanted to believe it (including me).  _Not that it is a bad card, just that it isn't "the" card._


----------



## Tatty_One (Apr 8, 2008)

Silverel said:


> Wasn't the 9800 GX2 just two 8800GTs on a board? G92 was just a die shrink with little performance gain, but still 8 series tech? The 9600GT was a rebranded and underclocked 8800GT?
> 
> Yeah, I kinda feel bad for nVidia fans over the past couple months. They've been getting a lot of rehash instead of actual advancement. Not that Ati has done any better on the tech side of things...



Two 8800GTSs.  Rehash?....agreed, but don't just feel bad for NVidia owners, feel sorry for ATI owners too......2900XT>>>HD3870


----------



## yogurt_21 (Apr 8, 2008)

MKmods said:


> LOL, a 2 month lifespan for a $600 card



X1800XT anyone? Except that one was more like a month and a half. I bought it for $600 on launch in December, and then in mid-January the X1900XTX came out. lol


----------



## candle_86 (Apr 8, 2008)

lol the X1800XT, that was a super failure right there, as was the X1950XTX for all of the two months it was top dog.


----------



## imperialreign (Apr 9, 2008)

yogurt_21 said:


> X1800XT anyone? Except that one was more like a month and a half. I bought it for $600 on launch in December, and then in mid-January the X1900XTX came out. lol



yeah, but the 1800XTs were still around for a long while afterwards . . . they were just overshadowed real quick by the 1900 series.


As to the OP - if true, there's gonna be a TON of pissed off nvidia users and fanbois


----------



## v-zero (Apr 9, 2008)

All this says to me is that the GT200 is going to be nothing special at all...


----------



## farlex85 (Apr 9, 2008)

v-zero said:


> All this says to me is that the GT200 is going to be nothing special at all...



How does this tell you that?


----------



## brian.ca (Apr 9, 2008)

webwizard said:


> Here's where that came from.
> 
> http://en.expreview.com/



If that was a reply to my post above that wasn't what I was referring to.. I went back looking for it and found the original article(s), http://www.techpowerup.com/56608/NVIDIA_GeForce_9900_Series_Set_for_July_Launch?.html



AddSub said:


> Pretty much. Nvidia is at this point competing against itself on almost every tier. So they have no reason to release any 32 ROP 320 SP monster parts, hence the G92 was born, a tamer, severely cut down and tightened up offspring of G80 that is cheap to produce. People jumped on these because they offered decent performance (albeit at perhaps lower resolutions with AA/AF turned a bit down vs. high end G80 parts) and they were affordable compared to previous G80 parts. People who couldn't budget a G80 product finally had something they could pick up without having to sell their kidneys.
> 
> As for the next series, I don't know. From what I've been reading on Expreview, NordicHardware, and other places, it seems like it will be a even further tweaked G92, only with a die shrink (55nm?) which will allow it even higher clocks but the rest will remain the same (paltry 16 ROP's, 256-bit memory bus) which will quite probably be counter-balanced by higher clocks (again, thanks to even smaller process) and use of super-clocked GDDR4/5 VRAM, but nothing revolutionary.
> 
> It really depends on what AMD puts on the shelves this summer. Why release anything serious when the competition (AMD) is having trouble on every level. A struggling competitor is better than a bankrupt competitor, from a business perspective anyways. nVidia was one of the most profitable corporations last year and was designated by Forbes as "Company of the Year" for 2007. With AMD in such condition, I don't seen any reason they would change their tactics.



I wouldn't be too quick to write off ATI as a competitor.  Remember the rumors that the original G92 cards were pushed out early as a first strike vs. ATI's new 38x0 cards?  Given the shortage at the time and the recent reiterations of the chip, those rumors probably held truth.  

Then there was the 3870 X2, which took the performance crown, and Nvidia pushed out the 9800 GX2.   Now these new cards are supposed to come out at the same time as ATI's 4000 series.   There shouldn't be any doubt that there's still competition here, since Nv is clearly reacting (pretty successfully, though they are catching flak for putting out so many cards and their naming schemes) to ATI's movements.


----------



## imperialreign (Apr 9, 2008)

brian.ca said:


> If that was a reply to my post above that wasn't what I was referring to.. I went back looking for it and found the original article(s), http://www.techpowerup.com/56608/NVIDIA_GeForce_9900_Series_Set_for_July_Launch?.html
> 
> 
> 
> ...




The biggest tactic I've noticed nVidia use - which, IMO, is a big cause behind their hardware supply shortages - is that they love to flood the market with new cards in one shot.  All their licensed manufacturers release the same card on the same day.  The problem being that for all their licensed brands to meet that release date, there now have to be _x_ number of GPUs manufactured by date _n_ for everything to go well.  But when _x-y_ GPUs are produced, we see a shortage in the supply . . . even more so if the demand is high.

ATI's model, which they've been following for quite some time, is, IMO, what has helped them keep their supply up to exceed demand.  They stagger their licensed brands out, so not everyone is releasing new hardware on the same day; typically it's about a week apart. Top-tier releases first, followed by mid-tier and bottom-tier; and usually following that are the top-tier "variations" . . . the specialty cards.

But what really buggers nVidia even more is the fact that anytime they make a minuscule change to the hardware, they want to release it as another card line within the same series, which is why we see the likes of 8800 GS, 8800 GT, 8800 GTS, 8800 GTX - and then you have the mid-range and lower cards from the same series, the 8300, 8500, 8600, + all their suffix-laden varieties as well.  To the average consumer, the choices can be extremely confusing, because to them there doesn't appear to be that much of a difference between card models, so they just buy something.

Brute force tactics are, IMO, a defining trait of nVidia.


----------



## Bluefox1115 (Apr 9, 2008)

I can honestly see the card being discontinued. All it is, is an 8800GTS G92, only with 2 GPUs on one card... and a box of a cooler.. same with the 9800GTX..


----------



## BumbRush (Apr 9, 2008)

btarunr said:


> It's somewhat different with IT. For example, the moment a version of Microsoft Windows hits EOL, they discontinue updates/hotfixes/patches to it, and also stop production/sales.



MS EOLs stuff weird (EOL for distribution), then they give a few years of support after that. I know 2K still gets critical updates for IE and such......


----------



## btarunr (Apr 9, 2008)

That's because IE is now treated as a product separate from Windows. IE 6 has to be supported for longer, irrespective of which OS it runs on. Can you run IE 7 on Win 2K? If not, then IE 6 should get its security patches.


----------



## eidairaman1 (Apr 9, 2008)

I wouldn't say too much about AMD having trouble in every aspect, as the HD3 line is selling well, from low end to top, and the only reason Nvidia released another 2x GPU card was due to the fact that ATI had one out on the market before them.


AddSub said:


> Pretty much. Nvidia is at this point competing against itself on almost every tier. So they have no reason to release any 32 ROP 320 SP monster parts, hence the G92 was born, a tamer, severely cut down and tightened up offspring of G80 that is cheap to produce. People jumped on these because they offered decent performance (albeit at perhaps lower resolutions with AA/AF turned a bit down vs. high end G80 parts) and they were affordable compared to previous G80 parts. People who couldn't budget a G80 product finally had something they could pick up without having to sell their kidneys.
> 
> As for the next series, I don't know. From what I've been reading on Expreview, NordicHardware, and other places, it seems like it will be a even further tweaked G92, only with a die shrink (55nm?) which will allow it even higher clocks but the rest will remain the same (paltry 16 ROP's, 256-bit memory bus) which will quite probably be counter-balanced by higher clocks (again, thanks to even smaller process) and use of super-clocked GDDR4/5 VRAM, but nothing revolutionary.
> 
> It really depends on what AMD puts on the shelves this summer. Why release anything serious when the competition (AMD) is having trouble on every level. A struggling competitor is better than a bankrupt competitor, from a business perspective anyways. nVidia was one of the most profitable corporations last year and was designated by Forbes as "Company of the Year" for 2007. With AMD in such condition, I don't seen any reason they would change their tactics.


----------



## jaydeejohn (Apr 9, 2008)

nVidia is treading a thin line here, after all the naming schemes, such as someone buying a 9500 and then finding out it's really an 8600, just renamed. Now if this is true, this isn't good from nVidia. They're flooding too much, too fast, mixing all their naming schemes up, and having way too short EOLs. I'm hoping for both nVidia and ATI to actually put out a new arch, one that has a real lifespan, has real-world improvements in both fps and eye candy, and is worthy of a new naming scheme


----------



## DarkMatter (Apr 9, 2008)

imperialreign said:


> But what really buggers nVidia even more is the fact that anytime they make a minuscule change to the hardware, they want to release it as another card line within the same series, which is why we see the likes of 8800 GS, 8800 GT, 8800 GTS, 8800 GTX - and then you have the mid-range and lower cards from the same series, the 8300, 8500, 8600, + all their suffix-laden varieties as well.  To the average consumer, the choices can be extremely confusing, because to them there doesn't appear to be that much of a difference between card models, so they just buy something.
> 
> Brute force tactics are, IMO, a defining trait of nVidia.



Sometimes I feel like the only one with some memory, although memory isn't really needed when you have Wikipedia at hand:

*Nvidia 6 series:*

6200, 6200 TC2, 6500, 6600 LE, 6600, 6600 GT, 6600 XL, 6800 LE, 6800 XT, 6800, 6800 GTO, 6800 GS, 6800 GT, 6800 Ultra.

Total: 14 cards.

*Ati 10 series:*

X300 SE, X300, X550 SE, X550, X600 Pro, AIW X600 Pro, X600 XT, X700, X700 Pro, X700 XT, X800 SE, X800 GT128, X800 GT 256, X800 GTO, X800, X800 GTO2, X800 GTO-16, X800 Pro, X800 Pro VIVO, X800 XL, AIW X800 XL, X800 XT, X800 XT VIVO, AIW X800 XT, X800 XT PE, X850 Pro, X850 XT, X850 XT PE.

Total: 28 cards.

Let's see the next generation.

*Nvidia 7 series:*

7100 GS, 7200 GS, 7300 SE, 7300 LE, 7300 GS, 7300 GT, 7600 GS, 7600 GT, 7600 GT Rev 2, 7800 GS, 7800 GT, 7800 GTX, 7800 GTX 512, 7900 GS, 7900 GT, 7900 GTO, 7900 GTX, 7950 GT, 7950 GX2.

Total: 19 cards.

*Ati 11 series:*

X1300, X1300 Pro, X1300 XT, X1550 SE, X1550, X1600 Pro, X1600 XT, X1650, X1650 Pro, X1650 GT, X1650 XT, X1800 GTO, X1800 GTO Rev. 2, X1800 XL, AIW X1800 XL, X1800 XT, X1900 GT, X1900 GT Rev. 2, AIW X1900, X1900 CrossFire, X1900 XT, X1900 XTX, X1950 GT, X1950 Pro, X1950 XT, X1950 XTX.

Total: 26 cards.

*Nvidia 8 series:*

8400 GS, 8500 GT, 8600 GT, 8600 GTS, 8800 GS, 8800 GTS G80, 8800 GT, 8800 GTS G92, 8800 GTX, 8800Ultra.

Total: 10 cards.

But let's add OEM and 9 series since it's based on the same chip (though I could do the same in the above lists and add quite some more).

8600 GS, 9500 GT, 9600 GT, 9800 GT (GTS? Is this one even going to be launched?), 9800 GTX, 9800 GX2

Total: 16 cards.

I could go with Ati series 9 vs. Nvidia series 5 too, but I think I have proven my point with this... (Huh! I didn't make any point? Guess it. lol)

That's what happens when you are in the lead with a strong architecture that can scale well.
And TBH I don't think that's bad; actually, I think it's good for the consumer, because you have many cards at different price points with small differences in performance. You can spend as much as you want and you'll get performance accordingly; you don't have to settle for a slow card (slower than what you want) or spend big $ for a card that is more than what you need. Cough* HD series *cough.


----------



## jaydeejohn (Apr 9, 2008)

I'm no fanboy. I was wondering, is the 8800GS worse than all the GTSs?


----------



## newconroer (Apr 9, 2008)

evil bill said:


> As has been said already, its not that this suddenly makes the 9800GX2 a bad card, but rather that driver support and game profiles for it will dry up, meaning potentially the card does not get recognised for what it is.



Apparently you didn't read above. Drivers will not necessarily lose support. With the number of drivers they dish out nearly every week, and how new the GX2 is, they'd have to abruptly and completely drop support in order for owners to be significantly affected.

You don't have any proof of that, and I don't remember them doing it in the past..so where are you getting your information?


----------



## Xaser04 (Apr 9, 2008)

jaydeejohn said:


> I'm no fanboy. I was wondering: is an 8800GS worse than all the GTS's?



Yes. The GS, in simple terms, is like a crippled GT (fewer shaders (96 vs 112) and a narrower memory interface (192-bit vs 256-bit)). It is, however, still a good card and performs around the same as a 512MB HD3850 in most games (although its smaller memory can become a problem in certain titles once you raise the resolution / AA/AF settings).

Performance wise it would go:

GTS 512mb > GT > GS


----------



## DarkMatter (Apr 9, 2008)

jaydeejohn said:


> I'm no fanboy. I was wondering: is an 8800GS worse than all the GTS's?



I think it's definitely faster than the 8800 GTS 320, and also the GTS 640 when AA is disabled. Because of its 192-bit memory interface, ROP count (12) and frame buffer (384 MB), the GS was never designed to run at high resolutions or AA levels, but it seems to handle AF pretty well. The GS is fast when AA is disabled and you don't go as high as 1920x1200. At, say, 1680x1050 with 0x AA / 16x AF it's faster than the 8800 GTS 640, HD2900 XT and HD3870 in most games. But if you are ever going to use higher settings, I wouldn't recommend the card. Overall the HD3870 or 9600 GT are better deals. None of the aforementioned are worth an upgrade over a GTS 640, though.


----------



## Darren (Apr 9, 2008)

jaydeejohn said:


> I'm no fanboy. I was wondering: is an 8800GS worse than all the GTS's?



The GS is the bottom of the 8800 series and is slightly faster than ATI's 3850. The 8800 GS is slower than the 3870; in fact, the ATI 3870 is roughly equivalent to Nvidia's 9600 GT. Technically, in order it should be GS, GT, and then GTS. However, on later cards the GT models are proving equivalent to or faster than the GTS models. But to answer your question completely: both the GT and GTS models are faster than the GS.


----------



## btarunr (Apr 9, 2008)

DarkMatter,

You missed X1950 Crossfire


----------



## Tatty_One (Apr 9, 2008)

brian.ca said:


> If that was a reply to my post above that wasn't what I was referring to.. I went back looking for it and found the original article(s), http://www.techpowerup.com/56608/NVIDIA_GeForce_9900_Series_Set_for_July_Launch?.html
> 
> 
> Then there was the 3870 X2, which took the performance crown, and Nvidia pushed out the 9800 GX2. Now these new cards are supposed to come out at the same time as ATI's 4000 series. There shouldn't be any doubt that there's still competition here, since Nv is clearly reacting (pretty successfully, though they are catching flak for putting out so many cards and their naming schemes) to ATI's movements.



I believe in the case of the GX2, it was estimated for a March release as far back as late October 2007, before the 3870 X2's release date was officially conveyed to us consumers. I don't think the GX2 was ever intended to be a deliberate and direct competitor for the HD4000 series; I think the 9900s were always supposed to be that. I might be wrong there, but the very reason why some of us call the G92 a "stopgap" is that the GT200, or whatever it's called, was going to bring the greatest architectural/performance development... as I said, just my thoughts, not necessarily fact.


----------



## candle_86 (Apr 9, 2008)

if ATI brings back the R200 to compete with the GT200, Nvidia wins by default


----------



## newtekie1 (Apr 9, 2008)

btarunr said:


> It's somewhat different with IT. For example, the moment a version of Microsoft Windows hits EOL, they discontinue updates/hotfixes/patches to it, and also stop production/sales.



Wait, since when has EOL meant support stopped?  Even in Windows, EOL didn't mean support stopped, it just meant production and sales stopped.

Just look at Windows 98, Microsoft EOL'd it in 2004, but continued support into 2006.  EOL does not mean support for the product ends, it just means the product isn't being produced anymore.

The 7 series cards have long been EOL'd, and they still get driver updates and support.


----------



## [I.R.A]_FBi (Apr 9, 2008)

newtekie1 said:


> Wait, since when has EOL meant support stopped?  Even in Windows, EOL didn't mean support stopped, it just meant production and sales stopped.
> 
> Just look at Windows 98, Microsoft EOL'd it in 2004, but continued support into 2006.  EOL does not mean support for the product ends, it just means the product isn't being produced anymore.
> 
> The 7 series cards have long been EOL'd, and they still get driver updates and support.



But what about the 7 series GX2?


----------



## newtekie1 (Apr 9, 2008)

[I.R.A]_FBi said:


> But what about the 7 series GX2?



What about it? Just because driver support for the 7 series GX2 "died" around the time the card was EOL'd doesn't mean that is always the case. It wasn't even the case with the 7 series GX2: it is still supported in the latest drivers for the 7 series; 174.74 supports the 7950 GX2.


----------



## Darren (Apr 9, 2008)

+1 to newtekie1

My Auzentech X-Meridian sound card reached EOL last year and I'm still getting driver support; the latest driver release was a few weeks ago. I bet Creative don't treat their customers this well.




[I.R.A]_FBi said:


> Customers? What customers... you mean the people they fleece?



Agreed! I used to be a big Creative fanboy a few years back. I didn't mind paying three times the cost just for EAX support, until I started researching home cinema systems and read a lot of forums with pissed-off customers complaining because Creative told customers they could get Dolby encoding on the fly over SPDIF. They actually used to market their cards as Dolby-authentic products, with stickers and logos claiming encoding abilities. Ever since then I lost all respect for Creative and decided to hold onto my Hercules Muse 5.1 until I could afford a good non-Creative card. That's when Auzentech became my choice.


----------



## [I.R.A]_FBi (Apr 9, 2008)

Darren said:


> +1 to newtekie1
> 
> My Auzentech X-Meridian sound card reached EOL last year and I'm still getting driver support; the latest driver release was a few weeks ago. I bet Creative don't treat their customers this well.



Customers? What customers... you mean the people they fleece?


----------



## candle_86 (Apr 9, 2008)

newtekie1 said:


> Wait, since when has EOL meant support stopped?  Even in Windows, EOL didn't mean support stopped, it just meant production and sales stopped.
> 
> Just look at Windows 98, Microsoft EOL'd it in 2004, but continued support into 2006.  EOL does not mean support for the product ends, it just means the product isn't being produced anymore.
> 
> The 7 series cards have long been EOL'd, and they still get driver updates and support.



lol

NV34, NV44, and G72 are not EOL yet.


----------



## newtekie1 (Apr 9, 2008)

candle_86 said:


> lol
> 
> NV34, NV44, and G72 are not EOL yet.



What does that have to do with anything?

G70(7800GT/GTX), G71(7900GS/GT/GTX/GX2, 7950GT/GX2), and G73(7300GT, 7600GS, 7600GT) are all EOL, and still supported.


----------



## candle_86 (Apr 9, 2008)

Just making a point here. Also, I find it rather funny that NV34 isn't EOL yet the newest drivers don't work with the FX line of cards, lol.

But always remember that the FX, 6, and 7 lines are not dead, as these cards are still being made:

GeForceFX 5200, GeForceFX 5500, GeForce 6200TC, GeForce 7300LE.


----------



## newtekie1 (Apr 9, 2008)

174.74 works just fine with the FX cards; I have them installed on my FX5200 right now.


----------



## candle_86 (Apr 9, 2008)

I can't get them to install on my FX5600 Ultra FC card


----------



## BumbRush (Apr 9, 2008)

hax!!!


----------



## imperialreign (Apr 9, 2008)

DarkMatter said:


> Sometimes I feel like the only one with some memory, although memory it's not really needed when you have wikipedia at hand:
> 
> *Nvidia 6 series:*
> 
> ...



I'm sorry if I wasn't all that clear in that earlier post about the series generations - yeah, you can bunch all of ATI's cards into the X100 and X1000 series; but what I was trying to get across is that, for many, ATI's card naming schemes are a little easier to interpret. The higher the number, the better they think the card is, and there was typically a noticeable difference between two "sub-series" (e.g. 1000 vs 1300, 1300 vs 1500, 1500 vs 1600, etc.). But you knew that they considered a 1950 to be better than a 1900, the 1900 better than the 1800, the 1650 better than the 1600, etc. I guess that was the biggest difference between nVidia's and ATI's naming schemes a few years back: ATI had "sub-series" lineups. It's only been since the HD2000 series that they've gone to a more relaxed naming scheme similar to what nVidia has used (2400, 2600, 2900, etc.), even going so far as to drop the suffixes in place of the ~50 and ~70 tags.

Now, I'll definitely give you the fact that ATI went hog-wild-and-a-half with the 1900 lineup: there were two revisions of the GT, two revisions of the PRO, two revisions of the XT, then the XTX, the CrossFire edition, then the 1950 lineup. ATI beat that sub-series to a dead horse, buried it, brought it back and beat it some more . . . and they loved the crap out of their name suffixes, too.


----------



## candle_86 (Apr 9, 2008)

Well, let's compare the Radeon 9 series to the Nvidia FX series, just for kicks.

FX

5200, 5200 Ultra, 5300, 5500, 5600XT, 5600, 5600 Ultra, 5600 Ultra FC, 5700LE, 5700, 5700 Ultra, 5700 Ultra DDR3, 5750, 5800, 5800 Ultra, 5900XT, 5900, 5900 Ultra, 5950 Ultra

Radeon 9

9000SE, 9000, 9000Pro, 9200SE, 9200, 9200 Pro, 9500, 9500Pro, 9550, 9600SE, 9600, 9600Pro, 9600XT, 9700TX, 9700, 9700Pro, 9800SE, 9800, 9800Pro, 9800Pro 256, 9800XT

Last time around it was about even.


----------



## newtekie1 (Apr 9, 2008)

candle_86 said:


> I can't get them to install on my FX5600 Ultra FC card




Are you using the WHQL version or the beta version? The WHQL version only supports a few cards, but the beta version supports pretty much every card ever released. The FX 5600 Ultra is listed under the supported cards for the beta version.


----------



## btarunr (Apr 9, 2008)

newtekie1 said:


> Wait, since when has EOL meant support stopped?  Even in Windows, EOL didn't mean support stopped, it just meant production and sales stopped.
> 
> Just look at Windows 98, Microsoft EOL'd it in 2004, but continued support into 2006.  EOL does not mean support for the product ends, it just means the product isn't being produced anymore.
> 
> The 7 series cards have long been EOL'd, and they still get driver updates and support.



Sure, you have drivers from NVidia for the FX 5200 too, BUT support in the sense of driver updates that fix issues or 'enhance performance' doesn't happen. You get the latest driver, but apart from the Control Panel the driver has nothing new for the FX 5200, as with all other NVidia products that hit EOL. Besides, it was extremely shameful of them to abandon the 7950 GX2. It's sort of like throwing a 4-month-old baby into a dumpster and driving away.


----------



## newtekie1 (Apr 9, 2008)

btarunr said:


> Sure, you have drivers from NVidia for the FX 5200 too, BUT support in the sense of driver updates that fix issues or 'enhance performance' doesn't happen. You get the latest driver, but apart from the Control Panel the driver has nothing new for the FX 5200, as with all other NVidia products that hit EOL. Besides, it was extremely shameful of them to abandon the 7950 GX2. It's sort of like throwing a 4-month-old baby into a dumpster and driving away.



The products still get the general enhancements that come with updated drivers, but you are correct that there are generally no real enhancements aimed directly at the product. That happens with pretty much every old piece of hardware, though, and it doesn't happen just because it is EOL'd; it happens because new products need more work. The FX series was as good as it was going to get - it wasn't getting any better via driver updates. The same goes for the 7950GX2. When hardware gets old, companies don't waste time reworking drivers to improve it.


----------



## btarunr (Apr 10, 2008)

newtekie1 said:


> The same goes for the 7950GX2.  When hardware gets old, companies don't waste time reworking drivers to improve them.



Give me a driver that lets me run that card under Vista64; give me a driver that allows me to use one of its marketed features, SLI (of two 7950 GX2 cards), under any OS - you choose. It was pretty much an abandonment.


----------



## DarkMatter (Apr 10, 2008)

imperialreign said:


> I'm sorry if I wasn't all that clear, in that earlier post about the series generations - yeah, you can bunch all ATI's cards into the X100 and X1000 series; but, what I was trying to get across is that for many, ATI's card naming schemes are a little easier to interpret.  The higher the number, the better they think the card is, and there was typically a noticeable difference between two "sub-series" (i.e. a 1000 vs 1300, 1300 vs 1500, 1500 vs 1600, etc).  But, you knew that they considered a 1950 to be better than a 1900, and the 1900 was better than the 1800, the 1650 better than the 1600, etc.  I guess that was the biggest difference between nVidia's and ATI's naming schemes a few years back, ATI had "sub-series" lineups.  It's only been since the HD2000 series that they've gone to a more relaxed naming scheme similar to what nVidia has used (2400, 2600, 2900, etc), even going so far as to drop the suffixes in place of the ~50 and ~70 tags.
> 
> Now, I'll defi give you the fact that ATI went hog-wild-and-a-half with the 1900 lineup, there were tow revisions of the GT, two revisions of the PRO, two revisions of the XT, then the XTX, the crossfire edition, then the 1950 lineup.  ATI beat that sub-series to a dead horse, buried it, brought it back and beat it some more . . . and they loved the crap out of their name suffixes, too.



You say that for many, Ati's card naming scheme is easier to interpret. I can assure you that for many, Nvidia's is easier. It's that simple; I don't see any difficulty in either of the two. I can admit that I like the scheme Ati is using with its HD3000 series, and that xx30, xx50, xx70 is better than suffixes. We can also say that there's been all this naming confusion on Nvidia's part lately, but remember that Ati and its jump to the HD3000 series is guilty of starting that confusion. Well, maybe not with the 8800 GTS, but yes with the jump to the 9 series. Before these series, both Ati and Nvidia followed pretty much the same scheme.

And frankly, I don't know what you mean by "a few years back". It has always been the same naming scheme: the first number for the series; the second for the model (usually using three numbers, the 8 for the high end, and using the others in a way that best conveys the performance difference between them; if there's a big revision, +1 to all the sub-series numbers); and the last two for a small revision. The suffixes were usually for differences that came out of the same chip during production to improve yields (such as disabled pipelines or simply different clocks). This was used by both companies up until now, so I can't see how you can argue one was clearer than the other, because the two had the same scheme. With Nvidia cards you also know that a 7600 is better than a 7500, a 7900 better than a 7800, and so on. And then you have the suffix, just as in Ati before the HD3000.

And just so you know that a higher number wasn't always a sign of a better card, you have the X1900 GT vs. the X1950 GT. I couldn't find an Nvidia card with the same problem.

What I'm saying is that you can't claim what you said as a uniquely Nvidia trait when both companies have done the same for years. Indeed, it's Ati who liked flooding the market with many different models, until R600. And if they are not doing it right now, it's because their architecture is not flexible enough to permit it.
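The decoding rule described above (series digit, model digit, revision digits, then suffix) can be sketched as a toy parser. This is purely illustrative: the suffix ranking below is my own rough reading of the thread's consensus, not anything official from either company.

```python
# Toy decoder for the pre-HD3000 naming scheme described above.
# The suffix ranking is an assumption (thread consensus), not an official spec.
SUFFIX_RANK = {"LE": 0, "SE": 0, "GS": 1, "GT": 2, "GTS": 3, "GTX": 4, "Ultra": 5}

def decode(name: str):
    """Split a card name like '7900 GTX' or 'X1950 Pro' into its parts."""
    num, _, suffix = name.lstrip("X").partition(" ")
    series = int(num[0])      # first digit: the generation/series
    model = int(num[1])       # second digit: position within the lineup
    revision = int(num[2:])   # last two digits: a small revision (e.g. the 50 in xx50)
    return series, model, revision, suffix

def sort_key(name: str):
    series, model, revision, suffix = decode(name)
    # Per the post: model number dominates, then the suffix, then the xx50 revision,
    # so an xx50 GT outranks an xx00 GT but never an xx00 GTX.
    return (model, SUFFIX_RANK.get(suffix, 2), revision)

print(decode("7900 GTX"))  # -> (7, 9, 0, 'GTX')
print(sorted(["7900 GS", "7950 GT", "7900 GTX", "7600 GT"], key=sort_key))
# -> ['7600 GT', '7900 GS', '7950 GT', '7900 GTX']
```

Note how the sorted output reproduces the rule in the post: the 7950 GT lands above the 7900 GS but below the 7900 GTX.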


----------



## Wile E (Apr 10, 2008)

The 7950GX2 is still supported by the latest drivers, but hasn't seen any fixes or optimizations targeted to it. Just because it's on the supported card list, doesn't mean they've done anything meaningful for it in terms of performance or bug fixes or features. REAL support for that card has been abysmal.


----------



## btarunr (Apr 10, 2008)

Looking at NVidia's track record of dealing with products that hit EOL way too soon (e.g. the 7950 GX2), I would stay light years away from buying a 9800 GX2 now.


----------



## BumbRush (Apr 10, 2008)

btarunr said:


> Sure, you have drivers from NVidia for the FX 5200 too, BUT support in the sense of driver updates that fix issues or 'enhance performance' doesn't happen. You get the latest driver, but apart from the Control Panel the driver has nothing new for the FX 5200, as with all other NVidia products that hit EOL. Besides, it was extremely shameful of them to abandon the 7950 GX2. It's sort of like throwing a 4-month-old baby into a dumpster and driving away.



First time I have seen you say anything that wasn't "Nvidia rocks, ATI sucks", and you actually made sense this time.

Nvidia really screwed the people who bought the GX2 back then. I know people who had them, and every single person got pissed and dumped it as soon as it was clear driver support was NOT going to happen (after the 8800 came out it was never going to get fixed...).


----------



## BumbRush (Apr 10, 2008)

DarkMatter said:


> You say that for many, Ati's card naming scheme is easier to interpret. I can assure you that for many, Nvidia's is easier. It's that simple; I don't see any difficulty in either of the two. I can admit that I like the scheme Ati is using with its HD3000 series, and that xx30, xx50, xx70 is better than suffixes. We can also say that there's been all this naming confusion on Nvidia's part lately, but remember that Ati and its jump to the HD3000 series is guilty of starting that confusion. Well, maybe not with the 8800 GTS, but yes with the jump to the 9 series. Before these series, both Ati and Nvidia followed pretty much the same scheme, so I can't see how you can argue one was clearer than the other.
> 
> And just so you know that a higher number wasn't always a sign of a better card, you have the X1900 GT vs. the X1950 GT. I couldn't find an Nvidia card with the same problem.
> 
> What I'm saying is that you can't claim what you said as a uniquely Nvidia trait when both companies have done the same for years. Indeed, it's Ati who liked flooding the market with many different models, until R600. And if they are not doing it right now, it's because their architecture is not flexible enough to permit it.



5700 Ultra vs 5950 XT: the 5700 Ultra was faster in any modern game (DX9/OGL2). Hell, look at the 5900 Ultra vs the 5950 XT - XT on Nvidia meant the worst of that particular number line, poor clockers that ran hot.

I could go on and on. 5500 vs 5500 XT: in this case they used the suffix, BUT it's the same problem - they copied ATI's suffixes but reversed them; instead of XT being the best, they made it the worst (dirty trick!!!).

Or the 5700 Ultra vs 5750 XT (yes, I have seen these cards, and in every case had to replace them with an ATI card to give the person decent DX9 performance). The 5750 was not a well-known run; it was just a 5700 with worse RAM in most cases - horrible cards. The 5750 XT was close, in my experience, to the higher-end 5200s... horrible cards... horrible :shadedshu


----------



## calvary1980 (Apr 10, 2008)

You can add the 8800GTS to the EOL list as of today, according to Fudzilla and NordicHardware. What's interesting is that about 3 days ago, in another thread, I posted a link from HOCP about the *9800GTX* and 9800GX2 going EOL in 3 months; however, this news bit from Expreview only mentions the 9800GX2.

- Christine


----------



## candle_86 (Apr 10, 2008)

BumbRush said:


> 5700 Ultra vs 5950 XT: the 5700 Ultra was faster in any modern game (DX9/OGL2). Hell, look at the 5900 Ultra vs the 5950 XT - XT on Nvidia meant the worst of that particular number line, poor clockers that ran hot.
> 
> I could go on and on. 5500 vs 5500 XT: in this case they used the suffix, BUT it's the same problem - they copied ATI's suffixes but reversed them; instead of XT being the best, they made it the worst (dirty trick!!!).
> 
> Or the 5700 Ultra vs 5750 XT (yes, I have seen these cards, and in every case had to replace them with an ATI card to give the person decent DX9 performance). The 5750 was not a well-known run; it was just a 5700 with worse RAM in most cases - horrible cards. The 5750 XT was close, in my experience, to the higher-end 5200s... horrible cards... horrible :shadedshu



Umm, the 5600XT was out before the 9800XT and 9600XT; it dates to the launch of the 9800 Pro. Nvidia has never stolen ATI's naming crap.

As for the cards themselves, they did what they were supposed to do: run DX8 games. Not even an ATI card from that time could run a DX9 game at full tilt. Back in 2003, who cared if it handled SM2 right? The first game to use it was FarCry, and the 6800 Ultra beat it out the door. As for poor clockers: is that why my Prolink PixelView 5900XT Limited Golden Edition clocked to 475/1000 from 350/700, because it was a bad overclocker? And saying the 5750 was a crappier card than the normal one is total BS too - the 5750 was the PCX 5700. Go read, BumbRush, and stop spouting your stupid crap.


----------



## BumbRush (Apr 10, 2008)

The 5600 XT was not out sooner; the 5600 was. They added the XT later, after 9800 XT sales were so high. I was a huge Nvidia fan back then and was watching the 5600s because I fully expected Nvidia to fix the DX9 performance with drivers; they didn't at all.

And the 9500 Pro/9700 could run DX9 games from back then JUST FINE - I know, I had a 9800SE 256-bit (modded to a true Pro). And no, FarCry wasn't the first to use SM2; there were others. I don't have a list, but I was in a bunch of betas and many of them came out before FarCry did.

As to the 6800 killing the X800: the X800 was faster and had better IQ. When FarCry got an SM3 patch, it brought the GF6 line to THE SAME QUALITY that ATI's cards already had, including the 9500/9600/9700/9800 cards.

The 5750 XT was worse than the normal 5700, period.

Now compare the 5800 Ultra to the 59x0 cards, for example: in many benches it's a couple of fps faster. It's a shit card, as are all the FX line cards, but still - it's 5800 vs 5900/5950, and you see that the 5800 is slower in some cases and faster in others. By name it should NEVER be faster... EVER!!!!!


----------



## newtekie1 (Apr 10, 2008)

btarunr said:


> Give me a driver that lets me run that card under Vista64, give me a driver that allows me to use one of its marketing features, SLI (of two 7950 GX2 cards) under any OS, you choose. It was pretty much an abandon.



174.74 allows both. It supports the 7950 GX2 under Vista64, and supports Quad-SLI; most of the drivers released have been like this. Have you actually tried it? I have a customer that comes into my shop regularly who bought two 7950 GX2s through me. He still uses them in Quad-SLI and runs Vista64; 174.74 has been working wonders for him, as have several previous driver releases.



Wile E said:


> The 7950GX2 is still supported by the latest drivers, but hasn't seen any fixes or optimizations targeted to it. Just because it's on the supported card list, doesn't mean they've done anything meaningful for it in terms of performance or bug fixes or features. REAL support for that card has been abysmal.



Real support for any of the 7 series cards, even the ones that are not EOL, has been abysmal. Just like real support for the X1k series has been non-existent. Once a new series comes out, both graphics camps pretty much drop real support for their older cards. Usually it isn't a problem, since most of the cards have had more than enough time to mature before the new series was released. However, in the case of cards released at the very end of a series' lifespan, support is usually dropped rather quickly - but the cards still work and still get the general benefits of the new drivers. ATi did the same thing with their dual X1950 Pro; there haven't been driver improvements directly for that card since the day it was released.


----------



## newtekie1 (Apr 10, 2008)

candle_86 said:


> Go read bombrush, stop spouting your stupid crap.



He is an ATi fanboy, and generally doesn't have a clue what he is talking about.  Just add him to your ignore list and move on.


----------



## DarkMatter (Apr 10, 2008)

BumbRush said:


> 5700 Ultra vs 5950 XT: the 5700 Ultra was faster in any modern game (DX9/OGL2). Hell, look at the 5900 Ultra vs the 5950 XT - XT on Nvidia meant the worst of that particular number line, poor clockers that ran hot.
> 
> I could go on and on. 5500 vs 5500 XT: in this case they used the suffix, BUT it's the same problem - they copied ATI's suffixes but reversed them; instead of XT being the best, they made it the worst (dirty trick!!!).
> 
> Or the 5700 Ultra vs 5750 XT (yes, I have seen these cards, and in every case had to replace them with an ATI card to give the person decent DX9 performance). The 5750 was not a well-known run; it was just a 5700 with worse RAM in most cases - horrible cards. The 5750 XT was close, in my experience, to the higher-end 5200s... horrible cards... horrible :shadedshu



So in the end you can't understand the naming scheme? :shadedshu It's easy...

xx50 is the slightly revamped xx00 and is always faster (or as fast) IF THE SUFFIX IS THE SAME, because the suffix is (and was on Ati) the greatest indicator of performance after the second number (x7xx), the model or sub-series number, call it what you will. For example, an xx50 GT is supposed to be faster than an xx00 GT but NEVER faster than an xx00 GTX. It's easy.

To my knowledge a 5950 XT and 5750 XT never existed as Nvidia cards. Maybe they were partner-made special cards. I know of many others like that: when a card was already out of production, Nvidia would let some of its partners make crippled cards, using slower memory and such, to sell off the remaining stock. Maybe that's the case here.

The 5800 was NEVER and nowhere faster than the 5900 or 5950. In some fantasy world of yours, maybe. Or in a very specific game and level, at a specific setting and resolution, maybe; otherwise the 5900 and up were a hell of a lot faster. LOL :shadedshu

And AFAIK the 5600 XT launched some months before the 9800 XT, so I have to agree with candle. And BTW, only an Ati _lover_ would think that XT must be an indicator of a faster card outside of Ati's lineup.

On the other hand, I have to agree with you that Nvidia's FX series was crap.


----------



## Tatty_One (Apr 10, 2008)

BumbRush said:


> as to the 6800 killing the x800, the x800 was faster and had better IQ, when farcry updated to a sm3 patch it brought the gf6 line to THE SAME QUILITY as ati's cards already had includint the 9500-9600-9700-9800 cards.



I find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an X800XT, that the IQ on the X800 was better. The X800, if I remember rightly, didn't support SM3, whereas the 6800 did if my memory serves me correctly - a big jump in graphics quality and effects between SM2 and SM3. There were many things, IMO, that the X800 was better at than the 6800, but IQ was not one of them... but as I said, that's just my opinion.


----------



## btarunr (Apr 10, 2008)

Yes, Radeon X series lacked Shader Model 3.0 support.


----------



## newtekie1 (Apr 10, 2008)

Tatty_One said:


> I find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an X800XT, that the IQ on the X800 was better. The X800, if I remember rightly, didn't support SM3, whereas the 6800 did if my memory serves me correctly - a big jump in graphics quality and effects between SM2 and SM3. There were many things, IMO, that the X800 was better at than the 6800, but IMO IQ was not one of them... but as I said, that's just my opinion.



Agreed. The X800 series were great cards, but they were definitely not better in the IQ department; that was one of their few weak points. I still regret selling my two X800GTO2s. But the X800XL is doing its job amazingly - I even play Crysis on it.


----------



## BumbRush (Apr 10, 2008)

Tatty_One said:


> I find it hard to beleive, having owned both the 6800 vanilla and Ultra as well as an 800XT that the IQ in the 800 was better, the 800 if I remember rightly didnt support SM3 where as the 6800 did if my memory serves me correctly, a big jump in gfx quality and effects between SM2 and SM3.  There were many things IMO that the X800 was better at than the 6800 but IQ was not one of them.....but as I said, thats just my opinion.



http://www.hardocp.com/article.html?art=Njc4LDUsLGhlbnRodXNpYXN0


> *We did notice shader quality improvements from Patch 1.1 to Patch 1.3, which now make the image quality on the GeForce 6800GT comparable to the shader image quality as seen on Radeon X800 video cards. The shader quality with Shader Model 3.0 is not better than Shader Model 2.0, it is now equal, where it wasn’t before in this game with Patch 1.1.*



http://www.anandtech.com/video/showdoc.aspx?i=2102&p=11


> Image quality of both SM2.0 paths are on par with each other, and the SM3.0 path on NVIDIA hardware shows negligible differences. The very slight variations are most likely just small fluctuations between the mathematical output of a single pass and a multipass lighting shader. The difference is honestly so tiny that you can't call either rendering lower quality from a visual standpoint. We will still try to learn what exactly causes the differences we noticed from CryTek.




So yeah, basically the X800's IQ was better; the SM3 path sped up the 6800 and gave it quality equal to the X800 cards, but did not make it look better.

I had both cards. The X800's AA and AF looked far better, and until games optimized for Nvidia's SM3 support, IQ and performance were FAR worse on the 6800GT@Ultra I had than on the X800 Pro VIVO@XT PE (flashed) - and the X800 Pro VIVO cost me less yet was faster... lol

im on an 8800gt now it was best deal i could get at the time, and after alot of tweaking the drivers are ok, still not as good IQ wise PER SETTING as my x1900xtx was but at least it works, im just wondering if they will abandon updates for the 8800gt's once they move beyond the g92 core as they did with the 7 seirse, thats something i was alwase impressed by since i moved from nvidia to Ati back in the FX line days, (tho i have owned nvidia cards from each gen in those times) ati updates even their older cards drivers to fix issues, i know somebody told me recently that ati's 8 drivers fixed a problem with a game on his x800gto@xt pe(flash mod) thats far better then my experiance with nvidia have bene over the years, even back when i was a huge nvidia fan i knew that my older nvidia cards wouldnt be getting bug fixes, after the gf2 came out the tnt cards didnt even get bug fixes for common games that had seirous issues, and yet they where still selling the tnt/tnt2 based cards as budget seirse cards to OEM's(the gfmx was mid range the full gf cards where high end and the tnt cards where value line) 

Sorry, that last rant was a bit long. Hope people can deal with more than two lines of text in a row; if not, I will go back to double-spacing my posts......

Neither ATI nor NVIDIA is golden when it comes to rebranding older parts or supporting some older parts, though really nobody supports DX8-and-older cards anymore. But I can say this: NVIDIA cut driver updates/fixes for their DX8 cards sooner than ATI did. The 8500/9100/9200 and such all got driver support up until ATI cut support for all the sub-9500 cards.

The GF4 and older cards all stopped getting meaningful updates not long after the FX line hit.
I know because at the time I had a Ti4400 (it had a better cooler than the 4600s did and was able to clock higher in my case), and I was effectively prodded into buying a 5800 Ultra by the hype NVIDIA put out about it. Jesus, that card sucked, though........it drove me to try ATI again after years of HATING them because of their terrible Rage Pro/Rage 2/Rage 128 drivers.

I'd better stop before I spend another page ranting about why I hate ATI and why I hate NVIDIA.   I like them both in ways, but both also piss me off at times, stupid bastages.......oh well, at least if you buy a card from either today you will still get something you can use for a couple of years (maybe not for gaming, but gaming's a small part of the PC market really)

This is BS though. Think about it: they put out the 7950GX2 and NEVER gave it proper support, and now the 9800GX2 gets EOL'd just after it comes out. I'm SURE they won't give it proper support now either. I would also bet they are regretting their dual-PCB design, as it's far more costly to make than AMD/ATI's 3870X2 cards.

Now, before any of you try to say I'm full of it, use logic here.

You have the 3870X2: that's one board, and it can use a modded version of the cooler they use on the normal cards, OR most third-party coolers will fit.

Then you have the 9800GX2, where they have to design and order special coolers, and pay more to assemble the cards because of the dual PCBs with flexible links and such, each PCB being quite long/large as well as quite complex. Basically they made it overly complex and more of a PITA to deal with. Hell, look at the price compared to the X2 card.......nasty!!!

If I had bought one of these I would be returning it ASAP or selling it on eBay or something, because if they EOL it this quickly you KNOW you're gonna get screwed on driver support, just as they did with the last GX2 card.......

At least ATI's first X2 card got support despite being very little known, but then again it doesn't really need special drivers; it's just seen as a CrossFire setup and gets enhancements from any CrossFire-based update

Blah, let's not fight about it. Can't we all agree that we would be pissed if we owned one of these?


----------



## BumbRush (Apr 10, 2008)

Tatty_One said:


> I find it hard to beleive, having owned both the 6800 vanilla and Ultra as well as an 800XT that the IQ in the 800 was better, the 800 if I remember rightly didnt support SM3 where as the 6800 did if my memory serves me correctly, a big jump in gfx quality and effects between SM2 and SM3.  There were many things IMO that the X800 was better at than the 6800 but IQ was not one of them.....but as I said, thats just my opinion.



Oh, forgot to say: the only real difference you will see in most games between the X800 and 6800 is HDR support, and even then the 6/7-series cards have to choose between AA and HDR because they can't do both at the same time if it's SM3 HDR. The 8800 and X1K cards can (in fact, on the X1K cards there's no performance penalty for having both enabled in games like Far Cry and Oblivion)

HDR could be done under SM2.0b; it just required different coding that took more time and skill. Check out HL2: Lost Coast and the HL2 episodes. IMHO, with current patches it looks just as good as any other HDR implementation even though it's SM2.0-based, not SM3

Blah, I did it again; I ranted more than intended.


----------



## Tatty_One (Apr 10, 2008)

BumbRush said:


> oh forgot to say the only real diffrance you will see in most games is HDR support between the x800 and 6800 and even then the 6/7 range cards have to choose between AA and HDR because they cant do both at the same time if its SM3 hdr, the 8800 and x1k cards can(infact for the x1k cards theres no perf penilty to have both enabled in games like farcry and oblivion)
> 
> HDR could be done under sm2.0c, it just requiered diffrent coding that took more time and skill, check out HL2 lost coast and the HL2 expantions, IMHO with current patches it looks just as good as any other HDR use even tho its sm2.0 based not sm3
> 
> blah, i did it again, i ranted more then intended



Agreed, but one or the other is much better than none (X800), and there is more to SM3 than HDR.   And yes, you are right, it took NVIDIA far too long to develop a card that could simultaneously deliver both HDR and AA; in contrast, ATI just don't have a card now that can effectively deliver AA!!!!  (sorry, that was uncalled for.......I just couldn't resist )


----------



## BumbRush (Apr 10, 2008)

lol, well, you know why the 2900/3800 use the method they do for AA? Because ATI STUPIDLY went with what MICROSOFT wanted for DX10/10.1: they wanted AA done with shaders instead of dedicated hardware, and that was part of the requirements for 10.1. I think MS has since changed that, but still.....dumb idea if you ask me.......ATI should have supported shader-based AA as well as a hardware AA unit (not run AA in software on the shaders)

But hey, at least when you choose 2xAA on an ATI card it looks as good as 4x or 8x NVIDIA AA (tested it myself with my X1900XTX vs. 8800GT). Kind of disappointing that, per setting, they can't outdo ATI with all the brute force they put into their cards.......


----------



## Tatty_One (Apr 10, 2008)

BumbRush said:


> lol, well u know why the 2900/3800 use the method they do for aa? because ati STUPIDLY went with what MICROSOFT wanted for dx10/10.1 they wanted AA to be done with shaders insted of detocated hardware, that was part of the requierments for 10.1, i think ms has since changed that, but still.....dumb idea if you ask me.......still ati should have just supported shader based as well as using a hardware AA unit(not run aa in software on shaders)
> 
> but hey at least when you choose 2xAA on an ati card it looks as good as 4x or 8x nvidia aa(tested it myself with my 1900xtx vs 8800gt) kinda dissapointing that per setting they cant out do ati with all the bruit force they put into their cards.......



Obviously our eyesight differs. I bought an HD3870 at launch, and measured against my old G92 8800GTS I actually thought the GTS IQ looked better. But in my experience, more often than not, ATI owners seem to believe that ATI IQ is best, where....strangely enough.....NVIDIA owners think just the opposite......wonder why that is?   For me, I am kind of predictable, so I tend to take the word of the majority, and as that is probably NVIDIA owners ATM, enough said!!


----------



## AddSub (Apr 10, 2008)

Wow, this topic has completely derailed.



> I find it hard to beleive, having owned both the 6800 vanilla and Ultra as well as an 800XT that the IQ in the 800 was better



Pretty much every Radeon card I owned had better image quality than any nVidia card I ever owned, both in 2D/desktop and in 3D. Well, except maybe my current 8800GTX.  I actually owned a GeForce 6800 vanilla (Apollo brand). It was the worst experience I ever had with a video card, in addition to being the only card to date that I had to RMA the same day I received it from Newegg. It was artifacting at stock settings, both in 2D and 3D, although I guess that has more to do with Apollo's quality control than anything else. Anyway, I went from that card to a BFG GeForce 6600GT, which I had for about 2-3 months, until a DAC unit on the BFG went apeshit and killed one of my CRTs. After that I temporarily went to a backup GeForce 3 Ti 500 for a week or two. The old GeForce 3 had better image quality than the newer 6800/6600 cards, and after using the GF3 for a few weeks I received a Radeon X800GTO (my first Radeon card ever) and my eyes were amazed. It had better IQ than any nVidia card I had owned to that point (and I had owned about a dozen by then). I replaced all my nVidia cards with Radeon/ATI alternatives in a single month: X850XTs, a few X800GTOs, and even a lowly X700 Pro.

And before anyone calls me an ATI/AMD fanboi, please take into consideration that I'm currently running a GeForce 8800GTX and an nForce-based motherboard in my primary machine.



> a big jump in gfx quality and effects between SM2 and SM3.



SM3 was really SM2.5, feature-wise. But we all know how marketing works. The difference was not that big of a ...well, "big jump". SM3 was really more about increased performance vs. SM2 than about introducing new features (which it did, admittedly).

Here is a great and informative article at HardOCP, written back in 2004, comparing the new features of SM3 vs. SM2 and SM1.1, in Far Cry no less.

http://www.hardocp.com/article.html?art=NjA5


----------



## eidairaman1 (Apr 10, 2008)

You know, it could have been Newegg's fault, because no one knows how these retailers store and handle their products.


AddSub said:


> Wow, this topic has completely derailed.
> 
> 
> 
> ...


----------



## Tatty_One (Apr 10, 2008)

AddSub said:


> Wow, this topic has completely derailed.
> 
> 
> 
> ...



Yup, can't argue with that, and it showed: in AA, the X800XT outperformed the 6800 Ultra at max settings, partially because the Ultra's AA range was 2x, 4x or 8x where the X800's was 2x, 4x and 6x.  But that same article from May 2004, regarding IQ specifically, which was my original point....I quote:

*Comparing IQ Technology*:


*Looking at the Anti-Aliasing and Anisotropic image quality between the X800 series and the GeForce 6800Ultra we find them to be very comparable*. There is one difference though. The X800 is so powerful, 6XAA is actually a useable Anti-Aliasing setting on the X800 series whereas comparable 8XAA on the 6800Ultra, is basically not usable, as it is too demanding in terms of performance because it is a super-sampling + multi-sampling technique. 


The only shader quality differences we noticed were in FarCry where the X800 series is providing much better image quality. Compared to the 9800XT the X800 series have identical AA, AF and shader quality.


----------



## AddSub (Apr 10, 2008)

> You know it could of been Neweggs Fault because no one knows how these retailers store and handle their products.



I actually mentioned this before in a few other topics, but the Apollo 6800 card was cut down. What I mean is, it had a 128-bit memory interface vs. 256-bit on other 6800v/reference cards, and it had some other discrepancies as well, which I will not go into now. (Upon closer examination I noticed right away that the arrangement and count of the VRAM ICs clearly indicated a 128-bit part, something that was confirmed by RivaTuner as well.) Something else to consider is that GeCube and Apollo are different branches of the same corporation, and GeCube has had a tendency to release cards that are quite different from reference models. For example, the GeCube X800GTO a few years back, which was the only 128-bit GTO part on the market to my knowledge (vs. 256-bit on reference/others), and the most recent fiasco on Newegg, where they advertised a GeCube 2600XT with a 256-bit interface when in reality all 2600XT cards have 128-bit, including theirs. I have more examples, but that's another topic. Apollo/GeCube = shady.


----------



## BumbRush (Apr 11, 2008)

Reposting this since I think it got missed due to the second one being on a new page

-----------------------------------------------------------------------------------------------------------





Tatty_One said:


> I find it hard to beleive, having owned both the 6800 vanilla and Ultra as well as an 800XT that the IQ in the 800 was better, the 800 if I remember rightly didnt support SM3 where as the 6800 did if my memory serves me correctly, a big jump in gfx quality and effects between SM2 and SM3.  There were many things IMO that the X800 was better at than the 6800 but IQ was not one of them.....but as I said, thats just my opinion.



http://www.hardocp.com/article.html?art=Njc4LDUsLGhlbnRodXNpYXN0


> *We did notice shader quality improvements from Patch 1.1 to Patch 1.3, which now make the image quality on the GeForce 6800GT comparable to the shader image quality as seen on Radeon X800 video cards. The shader quality with Shader Model 3.0 is not better than Shader Model 2.0, it is now equal, where it wasn’t before in this game with Patch 1.1.*



http://www.anandtech.com/video/showdoc.aspx?i=2102&p=11


> Image quality of both SM2.0 paths are on par with eachother, and the SM3.0 path on NVIDIA hardware shows negligable differences. The very slight variations are most likely just small fluctuations between the mathematical output of a single pass and a multipass lighting shader. The difference is honestly so tiny that you can't call either rendering lower quality from a visual standpoint. We will still try to learn what exactly causes the differences we noticed from CryTek.






----------



## Wile E (Apr 11, 2008)

newtekie1 said:


> 174.74 allows both.  It supports the 7950 GX2 under Vista64, and supports Quad-SLI, most of the drivers released have been like this.  Have you actually tried it?  I have a customer that comes in my shop regulary that bought two 7950 GX2's though me, he still uses them in Quad-SLI and runs Vista64, 174.74 has been working wonders for him, so have several previous driver releases.
> 
> 
> 
> Real support for any of the 7 series cards, even the ones that are not EOL, has been abysmal.  Just like real support for the x1k series has been non-existant also.  Once a new series comes out, both graphics camps pretty much drop real support for their older cards.  Usually, it isn't a problem since most of the cards have had more than enough time to mature before the new series was released.  However, in the cases of cards released at the very end of a series lifespan, support is usually dropped rather quickly, but the cards still work and still get the general benefits of the new drivers.  ATi did the same thing with their Dual x1950Pro, there haven't been driver improvemnts directly for the cards since the day it was released.


Yeah, the 7950GX2 is one of those cards that suffered from a lack of development time. Hell, it took nVidia months before they even bothered to get it working acceptably in Vista.

And ATI didn't make a dual-GPU 1950 Pro. That was an independent design and release by Sapphire.


----------



## DaedalusHelios (Apr 11, 2008)

My 9800GX2 will step up to the 9900GX2 when the time comes. I might get two if they sort out the drivers..... so I will wait. 

It already plays "Very High" Crysis well. I wonder if it will do well with Alan Wake? I want that game badly.


Sorry to derail your thread guys.


----------



## Tatty_One (Apr 11, 2008)

BumbRush said:


> reposting this since i think it got missed due to the 2nd one being on a new page
> 
> -----------------------------------------------------------------------------------------------------------
> 
> ...



You ever thought of becoming an author?....War and Peace springs to mind!  You have gone off on a bit of a tangent there. I never said the 6800 was a better card; in fact I preferred the X800. My point was that, IMO, IQ was the same in MY experience. Some of us have linked articles/reviews that partly agree and partly disagree with that: the very HardOCP review that said anti-aliasing performance on the X800 was superior, and that IQ in Far Cry was better on the X800, also went on to say that IQ across the board was comparable (even if it only eventually became comparable). We are on a no-win here (or a no-lose, depending which way you look at it), as IQ is very subjective, depending on the user's eyes, perception and quality settings.

I went from a 7900GTO to an X1950XT briefly, and I DID see better IQ from the X1950XT, but I think that once NVIDIA released the G80 and finally sorted out simultaneous HDR/AA, the days of superior IQ on one side or the other more or less disappeared. But again, that is my subjective opinion.


----------



## AddSub (Apr 11, 2008)

> G80 and finally sorted out simultaneous HDR/AA that the days of superior IQ in one or the other sides has more or less disappeared but again that is my subjective opinion.



I agree. With the arrival of the G80 the IQ seems to have gotten better, or at least to the point where you can't really notice much difference between the red and the green. (Or is it green and green at this point? I can't keep track of all the corporate colors. )

The IQ issues with nVidia cards, at least as far as my own experience goes (and at this point I've owned at least one nVidia card from each generation except the 9xxx), really started with the 5xxx series, and IQ seemed to get worse in the 6xxx and 7xxx series. I'm not sure if it was an architectural problem tied to the GPU design or just driver issues (my own guess would be drivers), but once I started using ATI cards for the first time, the difference, to my own eyes at least, became even more noticeable.


----------



## erocker (Apr 11, 2008)

Well, this thread has me completely intrigued to buy a card with a G92 core on it.  I might as well, as I'm letting my other rig borrow my two 3870s for a while.  I really want to see for myself.  I expect nothing.

*Edit:  Oh, wait a minute... I thought this was the IQ thread, but I am mistaken.  Fooled by off-topic posts; stay on track, folks.


----------



## AddSub (Apr 11, 2008)

Yeah erocker, this topic =

*(image of a train wreck)*

about 60 posts ago...


----------



## VroomBang (Apr 11, 2008)

malware said:


> This information from Expreview may dissapoing many GeForce 9800 GX2 owners if true. NVIDIA is about to EOL (end-of-life) the GeForce 9800 GX2 line-up in just three months, as a result of two new GT200 cards - the single GPU GeForce 9900GTX and the dual GPU GeForce 9900 GX2. One of the GT200 cards will have similar performance and production cost as the GeForce 9800 GX2, which will force the manufacturer to cut down the "older" card. There will be no rebranding for 9800 GX2, like the GeForce 8800 GS which will become 9600 GSO, but just a sudden death. Meanwhile, details of the new GT200 graphics are still unknown.
> 
> Source: Expreview.com



Clearly not the best time to upgrade the graphics card. I'd wait till ATI's HD4xxx and NVIDIA's 99xx cards are released and fully tested, hopefully by mid-year?


----------



## newtekie1 (Apr 11, 2008)

Wile E said:


> Yeah, the 7950GX2 is one of those cards that suffer from lack of development time. Hell, it took nVidia months before they even bothered to get it working acceptably in Vista.
> 
> And ATI didn't make a dual gpu 1950 pro. That was an independent design and release by Sapphire.



It doesn't matter who designed the card; most of ATi's cards are designed by Sapphire. What matters is that ATi allowed their partners to produce and sell the card, so ATi is responsible for providing driver support for it.


----------



## BumbRush (Apr 11, 2008)

AddSub said:


> Yeah erocker, this topic =
> 
> 
> 
> ...



What fool let you drive a train?......jeebus........look what you did!!!!


----------



## GSG-9 (Apr 11, 2008)

newtekie1 said:


> It doesn't matter who designed the card, most of ATi's cards are designed by Sapphire, what matters is that ATi allowed their partners to produce and sell the card, so ATi is responsible for providing driver support for it.


Really? I did not know that; I wouldn't have suspected Sapphire to be the company behind it.


----------



## candle_86 (Apr 11, 2008)

ATI doesn't provide special support; the drivers read it as a 1950 Pro CrossFire setup, same as the ASUS 7800GT Dual, or the Gigabyte 6600GT and 6800GT 3D1.


----------



## asb2106 (Apr 11, 2008)

GSG-9 said:


> Really? I did not know that, I did not suspect Sapphire to be the company to make that.



Sapphire and ATI paired up a few years ago.

It still shocks me a little, but it's funny: I've bought about 7 different ATI cards since the X1K series release, and I have had the best luck OCing with Sapphire cards.  Their cooling isn't the greatest, but if you like to water cool or upgrade the cooling, they are nice and cheap and perform great!


**9800GX2:**

If this is true, it really would not surprise me. The release of the new 200-series cores will make this card very hard to sell, and continuing to produce them would not be a good idea.

Plus, I don't feel NVIDIA ever really had good luck putting two GPUs into one card. The whole two-PCB idea never seemed to work right...


----------



## eidairaman1 (Apr 12, 2008)

The cooling is fine for stock applications; that's what it was originally meant for. Now, if you could mount the cooler to, say, the northbridge and southbridge, it'd be killer, bro.


----------



## GSG-9 (Apr 12, 2008)

asb2106 said:


> Sapphire and ATI paired up a few years ago,
> 
> It shocks me alittle still, but its funny, Ive bought about 7 different cards from ATI since the 1k series release, and I have had the best luck OCing with sapphire cards.  Their cooling isnt the greatest but if you like to water cool, or upgrade the cooling, they are nice and cheap and perform great!



The last card I had from them was a 9800 Pro flashed to XT. One time while I was on a family vacation (this was a long time ago, lol), one of the push-pins on the cooler somehow came off, and the 9800 sat there running with the cooler hanging off until we came home a week later. They let me RMA it, no questions asked, but I was trying to move to PCIe and never did get around to sending it in.


----------

