Tuesday, April 8th 2008

NVIDIA GeForce 9800 GX2 Reaches EOL in Three Months?

This information from Expreview may disappoint many GeForce 9800 GX2 owners if true. NVIDIA is about to EOL (end-of-life) the GeForce 9800 GX2 line-up in just three months, as a result of two new GT200 cards: the single-GPU GeForce 9900 GTX and the dual-GPU GeForce 9900 GX2. One of the GT200 cards will have similar performance and production cost to the GeForce 9800 GX2, which will force the manufacturer to cut the "older" card. There will be no rebranding for the 9800 GX2, like the GeForce 8800 GS which will become the 9600 GSO, but just a sudden death. Meanwhile, details of the new GT200 graphics cards are still unknown.
Source: Expreview.com

122 Comments on NVIDIA GeForce 9800 GX2 Reaches EOL in Three Months?

#101
Tatty_Two
Gone Fishing
BumbRushoh, forgot to say: the only real difference you will see in most games is HDR support between the X800 and 6800, and even then the 6/7 range cards have to choose between AA and HDR because they can't do both at the same time if it's SM3 HDR. The 8800 and X1K cards can (in fact for the X1K cards there's no performance penalty to having both enabled in games like Far Cry and Oblivion).

HDR could be done under SM2.0c, it just required different coding that took more time and skill. Check out HL2: Lost Coast and the HL2 expansions; IMHO with current patches it looks just as good as any other HDR implementation even though it's SM2.0 based, not SM3 :)

blah, I did it again, I ranted more than intended :P
Agreed, but one or the other is much better than none (X800), and there is more to SM3 than HDR :p And yes, you are right, it took NVIDIA far too long to develop a card that could simultaneously deliver both HDR and AA. In contrast, ATI just don't have a card now that can effectively deliver AA!!!! (sorry, that was uncalled for.......I just couldn't resist :o)
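To make the HDR point above concrete: the trick is rendering to a high-range buffer and then tone-mapping it down for display, which doesn't inherently require SM3. A minimal CPU-side sketch in Python, purely illustrative and not the actual Source engine code, using the Reinhard operator:

# Minimal sketch of HDR tone mapping (Reinhard operator), purely illustrative.
# The real work happens in pixel shaders; SM2.0 titles like HL2: Lost Coast got
# the same visual result with different shader plumbing than the SM3/FP16 paths.
def tonemap(hdr_pixels, exposure=1.0):
    ldr = []
    for lum in hdr_pixels:            # linear scene luminance, can exceed 1.0
        v = lum * exposure
        ldr.append(round(v / (1.0 + v) * 255))   # compress highlights smoothly into 0-255
    return ldr

print(tonemap([0.05, 0.5, 1.0, 4.0, 16.0]))   # bright values still land in displayable range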
Posted on Reply
#102
BumbRush
lol, well you know why the 2900/3800 use the method they do for AA? Because ATI STUPIDLY went with what MICROSOFT wanted for DX10/10.1: they wanted AA to be done with shaders instead of dedicated hardware, and that was part of the requirements for 10.1. I think MS has since changed that, but still.....dumb idea if you ask me.......ATI should have just supported shader-based AA as well as using a hardware AA unit (not run AA in software on shaders).

But hey, at least when you choose 2xAA on an ATI card it looks as good as 4x or 8x NVIDIA AA (tested it myself with my 1900XTX vs 8800GT). Kind of disappointing that per setting they can't outdo ATI with all the brute force they put into their cards.......
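For anyone wondering what "doing AA in the shaders" actually means: the resolve step is basically averaging the stored sub-samples of each pixel, which DX10.1 lets a pixel shader do itself instead of the fixed-function ROP hardware. A rough CPU-side sketch in Python with a made-up data layout (one list of RGB samples per pixel), not real driver or shader code:

# Rough sketch of an MSAA resolve done "in software": box-filter the sub-samples.
# Data layout is hypothetical: each pixel is a list of RGB sample tuples.
def resolve_msaa(sample_buffer):
    resolved = []
    for samples in sample_buffer:                          # one entry per pixel
        n = len(samples)
        resolved.append(tuple(sum(ch) / n for ch in zip(*samples)))
    return resolved

edge = [(255, 0, 0), (255, 0, 0), (0, 0, 0), (0, 0, 0)]    # edge pixel: half red, half black
flat = [(40, 40, 40)] * 4                                  # interior pixel: all samples equal
print(resolve_msaa([edge, flat]))                          # -> [(127.5, 0.0, 0.0), (40.0, 40.0, 40.0)]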
Posted on Reply
#103
Tatty_Two
Gone Fishing
BumbRushlol, well you know why the 2900/3800 use the method they do for AA? Because ATI STUPIDLY went with what MICROSOFT wanted for DX10/10.1: they wanted AA to be done with shaders instead of dedicated hardware, and that was part of the requirements for 10.1. I think MS has since changed that, but still.....dumb idea if you ask me.......ATI should have just supported shader-based AA as well as using a hardware AA unit (not run AA in software on shaders).

But hey, at least when you choose 2xAA on an ATI card it looks as good as 4x or 8x NVIDIA AA (tested it myself with my 1900XTX vs 8800GT). Kind of disappointing that per setting they can't outdo ATI with all the brute force they put into their cards.......
Obviously our eyesight differs. I bought an HD3870 at launch and when measured against my old G92 8800GTS I actually thought the GTS IQ looked better, but in my experience, more often than not, ATI owners seem to believe that ATI IQ is the best, whereas....strangely enough.....NVIDIA owners think just the opposite......wonder why that is? :D As for me, I am kind of predictable, so I tend to take the word of the majority, and as that is probably NVIDIA owners ATM, enough said!!
Posted on Reply
#104
AddSub
Wow, this topic has completely derailed.
I find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an 800XT, that the IQ on the 800 was better
Pretty much every Radeon card I owned had better image quality than any nVidia card I ever owned, both in 2D/desktop and 3D. Well, except maybe my current 8800GTX. I actually owned a GeForce 6800 vanilla (Apollo brand). It was the worst experience I ever had with a video card, in addition to being the only card to date that I had to RMA on the same day I received it from Newegg. It was artifacting at stock settings, both in 2D and 3D, although I guess that has more to do with Apollo's quality control than anything else. Anyway, I went from that card to a BFG GeForce 6600GT, which I had for about 2-3 months, until a DAC unit on the BFG went apeshit and killed one of my CRTs. After that I temporarily went to a backup GeForce 3 Ti 500 for a week or two. The old GeForce 3 had better image quality than the newer 6800/6600 cards, and after using the GF3 for a few weeks I received a Radeon X800GTO (my first Radeon card ever) and my eyes were amazed. It had better IQ than any nVidia card I had owned by that point (and I had owned about a dozen by then). I replaced all my nVidia cards with Radeon/ATI alternatives in a single month: X850XTs, a few X800GTOs and even a lowly X700 Pro card.

And before anyone calls me an ATI/AMD fanboi, please take into consideration that currently I'm running a GeForce 8800GTX and an nForce-based motherboard on my primary machine.
a big jump in gfx quality and effects between SM2 and SM3.
SM3 was really SM2.5, feature-wise. But we all know how marketing works. The difference was not that much of a ...well, "big jump". SM3 was more about allowing increased performance vs. SM2 than it was about introducing new features (which it did, admittedly).

Here is a great and informative article at HardOCP, written back in 2004, comparing the new features of SM3 vs. SM2 and SM1.1, in Far Cry no less. :)

www.hardocp.com/article.html?art=NjA5
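Rough feel for what SM3 actually added in practice: real flow control (dynamic branching and loops), which mostly lets a shader skip work rather than draw anything new. A CPU-side analogy in Python with made-up numbers, not anything from the HardOCP article:

# CPU-side analogy for SM3 dynamic branching vs. an SM2-style fixed path.
# Same visual result; the SM3-style path just skips lights that contribute nothing.
def shade_fixed(pixel_distances, light_intensities):
    # SM2-style: every light is evaluated for every pixel, no early-out.
    return len(pixel_distances) * len(light_intensities)

def shade_branching(pixel_distances, light_intensities, cutoff=0.01):
    # SM3-style: branch past lights whose attenuated contribution is below the cutoff.
    work = 0
    for d in pixel_distances:
        for i in light_intensities:
            if i / (1.0 + d) ** 2 >= cutoff:
                work += 1
    return work

pixels = [x * 0.5 for x in range(200)]          # pixels at increasing distance from the lights
lights = [1.0, 0.5, 0.1, 0.05]
print(shade_fixed(pixels, lights), shade_branching(pixels, lights))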
Posted on Reply
#105
eidairaman1
The Exiled Airman
You know, it could have been Newegg's fault, because no one knows how these retailers store and handle their products.
AddSubWow, this topic has completely derailed.

Pretty much every Radeon card I owned had better image quality than any nVidia card I ever owned, both in 2D/desktop and 3D. Well, except maybe my current 8800GTX. I actually owned a GeForce 6800 vanilla (Apollo brand). It was the worst experience I ever had with a video card, in addition to being the only card to date that I had to RMA on the same day I received it from Newegg. It was artifacting at stock settings, both in 2D and 3D, although I guess that has more to do with Apollo's quality control than anything else. Anyway, I went from that card to a BFG GeForce 6600GT, which I had for about 2-3 months, until a DAC unit on the BFG went apeshit and killed one of my CRTs. After that I temporarily went to a backup GeForce 3 Ti 500 for a week or two. The old GeForce 3 had better image quality than the newer 6800/6600 cards, and after using the GF3 for a few weeks I received a Radeon X800GTO (my first Radeon card ever) and my eyes were amazed. It had better IQ than any nVidia card I had owned by that point (and I had owned about a dozen by then). I replaced all my nVidia cards with Radeon/ATI alternatives in a single month: X850XTs, a few X800GTOs and even a lowly X700 Pro card.

And before anyone calls me an ATI/AMD fanboi, please take into consideration that currently I'm running a GeForce 8800GTX and an nForce-based motherboard on my primary machine.

SM3 was really SM2.5, feature-wise. But we all know how marketing works. The difference was not that much of a ...well, "big jump". SM3 was more about allowing increased performance vs. SM2 than it was about introducing new features (which it did, admittedly).

Here is a great and informative article at HardOCP, written back in 2004, comparing the new features of SM3 vs. SM2 and SM1.1, in Far Cry no less. :)

www.hardocp.com/article.html?art=NjA5
Posted on Reply
#106
Tatty_Two
Gone Fishing
AddSubWow, this topic has completely derailed.

Pretty much every Radeon card I owned had better image quality than any nVidia card I ever owned, both in 2D/desktop and 3D. Well, except maybe my current 8800GTX. I actually owned a GeForce 6800 vanilla (Apollo brand). It was the worst experience I ever had with a video card, in addition to being the only card to date that I had to RMA on the same day I received it from Newegg. It was artifacting at stock settings, both in 2D and 3D, although I guess that has more to do with Apollo's quality control than anything else. Anyway, I went from that card to a BFG GeForce 6600GT, which I had for about 2-3 months, until a DAC unit on the BFG went apeshit and killed one of my CRTs. After that I temporarily went to a backup GeForce 3 Ti 500 for a week or two. The old GeForce 3 had better image quality than the newer 6800/6600 cards, and after using the GF3 for a few weeks I received a Radeon X800GTO (my first Radeon card ever) and my eyes were amazed. It had better IQ than any nVidia card I had owned by that point (and I had owned about a dozen by then). I replaced all my nVidia cards with Radeon/ATI alternatives in a single month: X850XTs, a few X800GTOs and even a lowly X700 Pro card.

And before anyone calls me an ATI/AMD fanboi, please take into consideration that currently I'm running a GeForce 8800GTX and an nForce-based motherboard on my primary machine.

SM3 was really SM2.5, feature-wise. But we all know how marketing works. The difference was not that much of a ...well, "big jump". SM3 was more about allowing increased performance vs. SM2 than it was about introducing new features (which it did, admittedly).

Here is a great and informative article at HardOCP, written back in 2004, comparing the new features of SM3 vs. SM2 and SM1.1, in Far Cry no less. :)

www.hardocp.com/article.html?art=NjA5
Yup, can't argue with that, and it showed: in AA the 800XT outperformed the 6800 Ultra at max settings, partially due to the fact that the Ultra's AA range was 2x, 4x or 8x whereas the 800's was 2x, 4x and 6x. But that same article from May 2004 also addressed IQ specifically, which was my original point....I quote:

Comparing IQ Technology:


Looking at the Anti-Aliasing and Anisotropic image quality between the X800 series and the GeForce 6800Ultra we find them to be very comparable. There is one difference though. The X800 is so powerful, 6XAA is actually a useable Anti-Aliasing setting on the X800 series whereas comparable 8XAA on the 6800Ultra, is basically not usable, as it is too demanding in terms of performance because it is a super-sampling + multi-sampling technique.


The only shader quality differences we noticed were in FarCry where the X800 series is providing much better image quality. Compared to the 9800XT the X800 series have identical AA, AF and shader quality.
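The performance gap that quote describes falls out of simple arithmetic: multisampling shades each pixel once and just stores extra samples, while any supersampled component actually shades every extra sample. A back-of-envelope sketch in Python with assumed numbers (1600x1200 frame, a hypothetical 2x super + 4x multi mode), not figures from the review:

# Back-of-envelope AA cost: (pixel-shader invocations, framebuffer samples touched).
# Assumed 1600x1200 frame and a hypothetical mixed 2x-super + 4x-multi mode.
PIXELS = 1600 * 1200

def cost(shading_rate, samples_per_pixel):
    return PIXELS * shading_rate, PIXELS * samples_per_pixel

print("no AA  :", cost(1, 1))
print("4x MSAA:", cost(1, 4))                      # shade once per pixel, store 4 samples
print("mixed 8x (2x SS * 4x MS):", cost(2, 8))     # the supersampled part doubles shading work too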
Posted on Reply
#107
AddSub
You know, it could have been Newegg's fault, because no one knows how these retailers store and handle their products.
I actually mentioned this before in a few other topics, but the Apollo 6800 card was cut down. What I mean is, it had a 128-bit memory interface vs. 256-bit on other 6800 vanilla/reference cards, and it had some other discrepancies as well, which I will not go into now. (Upon closer examination I noticed right away that the arrangement and the count of the VRAM ICs clearly indicated a 128-bit part, something that was confirmed by RivaTuner as well.) Something else to consider is that GeCube and Apollo are different branches of the same corporation, and GeCube has had a tendency to release cards that are quite different from reference models. For example, the GeCube X800GTO from a few years back, which was the only 128-bit GTO part on the market to my knowledge (vs. 256-bit on reference/others), and the most recent fiasco on Newegg where they advertised a GeCube 2600XT with a 256-bit interface when in reality all 2600XT cards have 128-bit, including theirs. I have more examples, but that's another topic. Apollo/GeCube = shady.
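For anyone wondering why the quietly halved bus width matters (and how counting the VRAM ICs gives it away): typical GDDR chips contribute 32 bits each, so the chip count tells you the bus width, and bandwidth scales directly with it. A quick worked example in Python with an assumed 700 MHz effective memory clock, not the Apollo card's actual specs:

# Bus width from VRAM chip count, and the bandwidth hit of a 128-bit part.
# Clock below is assumed for illustration, not the Apollo card's actual spec.
def bus_width_bits(chip_count, bits_per_chip=32):
    return chip_count * bits_per_chip            # 8 chips -> 256-bit, 4 chips -> 128-bit

def bandwidth_gb_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9   # bytes per transfer * transfers per second

for chips in (8, 4):
    bus = bus_width_bits(chips)
    print(f"{chips} chips -> {bus}-bit bus -> {bandwidth_gb_s(bus, 700):.1f} GB/s @ 700 MHz effective")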
Posted on Reply
#108
BumbRush
Reposting this since I think it got missed due to the second one being on a new page.

-----------------------------------------------------------------------------------------------------------
Tatty_OneI find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an 800XT, that the IQ on the 800 was better. The 800, if I remember rightly, didn't support SM3 whereas the 6800 did, if my memory serves me correctly, a big jump in gfx quality and effects between SM2 and SM3. There were many things IMO that the X800 was better at than the 6800, but IQ was not one of them.....but as I said, that's just my opinion.
www.hardocp.com/article.html?art=Njc4LDUsLGhlbnRodXNpYXN0
We did notice shader quality improvements from Patch 1.1 to Patch 1.3, which now make the image quality on the GeForce 6800GT comparable to the shader image quality as seen on Radeon X800 video cards. The shader quality with Shader Model 3.0 is not better than Shader Model 2.0, it is now equal, where it wasn’t before in this game with Patch 1.1.
www.anandtech.com/video/showdoc.aspx?i=2102&p=11
Image quality of both SM2.0 paths are on par with eachother, and the SM3.0 path on NVIDIA hardware shows negligable differences. The very slight variations are most likely just small fluctuations between the mathematical output of a single pass and a multipass lighting shader. The difference is honestly so tiny that you can't call either rendering lower quality from a visual standpoint. We will still try to learn what exactly causes the differences we noticed from CryTek.
So yeah, basically the X800's IQ was better; the PS3/SM3 path sped up the 6800 and gave it quality equal to the X800 cards, but it did not make it look better.

I had both cards; the X800's AA and AF looked far better, and until games were optimized for NVIDIA's PS3 support, IQ and performance were FAR worse on the 6800GT@Ultra I had than on the X800 Pro VIVO@XT PE (flashed), and the X800 Pro VIVO cost me less yet was faster.....lol

I'm on an 8800GT now, it was the best deal I could get at the time, and after a lot of tweaking the drivers are OK, still not as good IQ-wise PER SETTING as my X1900XTX was, but at least it works. I'm just wondering if they will abandon updates for the 8800GTs once they move beyond the G92 core, as they did with the 7 series. That's something I was always impressed by since I moved from NVIDIA to ATI back in the FX line days (though I have owned NVIDIA cards from each gen in those times): ATI updates even their older cards' drivers to fix issues. Somebody told me recently that ATI's Catalyst 8 drivers fixed a problem with a game on his X800GTO@XT PE (flash mod). That's far better than my experience with NVIDIA has been over the years. Even back when I was a huge NVIDIA fan I knew that my older NVIDIA cards wouldn't be getting bug fixes; after the GF2 came out, the TNT cards didn't even get bug fixes for common games that had serious issues, and yet they were still selling the TNT/TNT2-based cards as budget series cards to OEMs (the GF MX was mid-range, the full GF cards were high end and the TNT cards were the value line).

Sorry, the last rant was a bit long; hope people can deal with more than two lines of text in a row, if not I will go back to double-spacing my posts......

Neither ATI nor NVIDIA is golden when it comes to remarking older parts or supporting some older parts, though really nobody supports DX8 and older cards anymore. But I can say this: NVIDIA cut driver updates/fixes for their DX8 cards sooner than ATI did (the 8500/9100/9200 and such all got driver support up until they cut support for all the sub-9500 cards).

The GF4 and older cards all stopped getting meaningful updates not long after the FX line hit.
I know because at the time I had a Ti4400 (it had a better cooler than the 4600s did and was able to clock higher in my case), and I was effectively prodded into buying a 5800 Ultra by the hype NVIDIA put out about it. Jesus, that card sucked though........drove me to try ATI again after years of HATING them due to their shitty Rage Pro/Rage 2/Rage 128 drivers sucking ass.

I'd better stop before I spend another page ranting about why I hate ATI and why I hate NVIDIA :P I like them both in ways, but both also piss me off at times, stupid bastages.......oh well, at least if you buy a card from either today you will still get something you can use for a couple of years (maybe not for gaming, but gaming's a small part of the PC market really).

This is BS though, think about it: they put out the 7950GX2 and NEVER gave it proper support, then they put out the 9800GX2 and EOL it just after it comes out. I'm SURE they won't give it proper support now either, and I would also bet they are regretting their dual-PCB design, as it's far more costly to make than AMD/ATI's 3870X2 cards.

Now before any of you try and say I'm full of it, use logic here.

You have the 3870X2: that's one card, and it can use a modded version of the cooler they use on the normal cards, OR most third-party coolers will fit.

Then you have the 9800GX2, which you have to design and order special coolers for, as well as paying more to assemble the cards because it's dual-PCB with flexible links and such, each PCB being quite long/large as well as quite complex. Basically they made it overly complex and more of a PITA to deal with; hell, look at the price compared to the X2 card.......nasty!!!

If I had bought one of these I would be returning it ASAP or selling it on eBay or something, because if they EOL it this quickly you KNOW you're gonna get screwed on driver support, just as they did with the last GX2 card.......

At least ATI's first X2 card got support despite being very poorly known, but then again it doesn't really need special drivers; it's just seen as a CrossFire setup and gets enhancements from any CrossFire-based update :)

Blah, let's not fight about it, can't we all agree that we would be pissed if we owned one of these?
Posted on Reply
#109
Wile E
Power User
newtekie1174.74 allows both. It supports the 7950 GX2 under Vista64, and supports Quad-SLI; most of the drivers released have been like this. Have you actually tried it? I have a customer who comes into my shop regularly who bought two 7950 GX2s through me; he still uses them in Quad-SLI and runs Vista64, and 174.74 has been working wonders for him, as have several previous driver releases.



Real support for any of the 7 series cards, even the ones that are not EOL, has been abysmal, just like real support for the X1K series has been non-existent. Once a new series comes out, both graphics camps pretty much drop real support for their older cards. Usually it isn't a problem, since most of the cards have had more than enough time to mature before the new series was released. However, in the case of cards released at the very end of a series' lifespan, support is usually dropped rather quickly, but the cards still work and still get the general benefits of the new drivers. ATi did the same thing with their dual X1950 Pro; there haven't been driver improvements directly for the card since the day it was released.
Yeah, the 7950GX2 is one of those cards that suffer from lack of development time. Hell, it took nVidia months before they even bothered to get it working acceptably in Vista.

And ATI didn't make a dual-GPU 1950 Pro. That was an independent design and release by Sapphire.
Posted on Reply
#110
DaedalusHelios
My 9800GX2 will step up to the 9900GX2 when the time comes. I might get two if they sort out the drivers..... so I will wait. :)

It already plays "Very High" Crysis well. I wonder if it will do well with Alan Wake? I want that game badly.


Sorry to derail your thread guys. :laugh:
Posted on Reply
#111
Tatty_Two
Gone Fishing
BumbRushReposting this since I think it got missed due to the second one being on a new page.

-----------------------------------------------------------------------------------------------------------

www.hardocp.com/article.html?art=Njc4LDUsLGhlbnRodXNpYXN0

www.anandtech.com/video/showdoc.aspx?i=2102&p=11

So yeah, basically the X800's IQ was better; the PS3/SM3 path sped up the 6800 and gave it quality equal to the X800 cards, but it did not make it look better.

I had both cards; the X800's AA and AF looked far better, and until games were optimized for NVIDIA's PS3 support, IQ and performance were FAR worse on the 6800GT@Ultra I had than on the X800 Pro VIVO@XT PE (flashed), and the X800 Pro VIVO cost me less yet was faster.....lol

I'm on an 8800GT now, it was the best deal I could get at the time, and after a lot of tweaking the drivers are OK, still not as good IQ-wise PER SETTING as my X1900XTX was, but at least it works. I'm just wondering if they will abandon updates for the 8800GTs once they move beyond the G92 core, as they did with the 7 series. That's something I was always impressed by since I moved from NVIDIA to ATI back in the FX line days (though I have owned NVIDIA cards from each gen in those times): ATI updates even their older cards' drivers to fix issues. Somebody told me recently that ATI's Catalyst 8 drivers fixed a problem with a game on his X800GTO@XT PE (flash mod). That's far better than my experience with NVIDIA has been over the years. Even back when I was a huge NVIDIA fan I knew that my older NVIDIA cards wouldn't be getting bug fixes; after the GF2 came out, the TNT cards didn't even get bug fixes for common games that had serious issues, and yet they were still selling the TNT/TNT2-based cards as budget series cards to OEMs (the GF MX was mid-range, the full GF cards were high end and the TNT cards were the value line).

Sorry, the last rant was a bit long; hope people can deal with more than two lines of text in a row, if not I will go back to double-spacing my posts......

Neither ATI nor NVIDIA is golden when it comes to remarking older parts or supporting some older parts, though really nobody supports DX8 and older cards anymore. But I can say this: NVIDIA cut driver updates/fixes for their DX8 cards sooner than ATI did (the 8500/9100/9200 and such all got driver support up until they cut support for all the sub-9500 cards).

The GF4 and older cards all stopped getting meaningful updates not long after the FX line hit.
I know because at the time I had a Ti4400 (it had a better cooler than the 4600s did and was able to clock higher in my case), and I was effectively prodded into buying a 5800 Ultra by the hype NVIDIA put out about it. Jesus, that card sucked though........drove me to try ATI again after years of HATING them due to their shitty Rage Pro/Rage 2/Rage 128 drivers sucking ass.

I'd better stop before I spend another page ranting about why I hate ATI and why I hate NVIDIA :P I like them both in ways, but both also piss me off at times, stupid bastages.......oh well, at least if you buy a card from either today you will still get something you can use for a couple of years (maybe not for gaming, but gaming's a small part of the PC market really).

This is BS though, think about it: they put out the 7950GX2 and NEVER gave it proper support, then they put out the 9800GX2 and EOL it just after it comes out. I'm SURE they won't give it proper support now either, and I would also bet they are regretting their dual-PCB design, as it's far more costly to make than AMD/ATI's 3870X2 cards.

Now before any of you try and say I'm full of it, use logic here.

You have the 3870X2: that's one card, and it can use a modded version of the cooler they use on the normal cards, OR most third-party coolers will fit.

Then you have the 9800GX2, which you have to design and order special coolers for, as well as paying more to assemble the cards because it's dual-PCB with flexible links and such, each PCB being quite long/large as well as quite complex. Basically they made it overly complex and more of a PITA to deal with; hell, look at the price compared to the X2 card.......nasty!!!

If I had bought one of these I would be returning it ASAP or selling it on eBay or something, because if they EOL it this quickly you KNOW you're gonna get screwed on driver support, just as they did with the last GX2 card.......

At least ATI's first X2 card got support despite being very poorly known, but then again it doesn't really need special drivers; it's just seen as a CrossFire setup and gets enhancements from any CrossFire-based update :)

Blah, let's not fight about it, can't we all agree that we would be pissed if we owned one of these?
Have you ever thought of becoming an author?....War and Peace springs to mind! You have gone off on a bit of a tangent there. I never said the 6800 was a better card, in fact I did prefer the X800; my point was that IMO IQ was the same in MY experience. Some of us have linked articles/reviews that partly agree with that, and partly disagree; the very HardOCP review that said that anti-aliasing performance on the X800 was superior and that IQ in Far Cry was better on the X800 also went on to say that IQ across the board was comparable (even if that was eventually comparable). We are on a no-win here (or no-lose, depending which way you look at it), as IQ is very subjective, depending on the user's eyes, perception and quality settings.

I went from a 7900GTO to a 1950XT briefly, and I DID see better IQ from the 1950XT, but I think that once NVIDIA released the G80 and finally sorted out simultaneous HDR/AA, the days of superior IQ on one side or the other more or less disappeared, but again that is my subjective opinion.
Posted on Reply
#112
AddSub
G80 and finally sorted out simultaneous HDR/AA, the days of superior IQ on one side or the other more or less disappeared, but again that is my subjective opinion.
I agree. With the arrival of G80 the IQ seems to have gotten better, or at least up to the point where you can't really notice that much difference between the red and green. (Or is it green and green at this point? I can't keep track of all the corporate colors. :D)

The IQ issues with nVidia cards, at least as far as my own experiences go (and at this point I've owned at least one nVidia card from each generation except the 9xxx), really started with the 5xxx series, and IQ seemed to get worse in the 6xxx and 7xxx series. I'm not sure if it was architectural problems tied to the GPU design or just driver issues (my own guess would be drivers), but once I started using ATI cards for the first time, the difference, in my own eyes at least, became even more noticeable.
Posted on Reply
#113
erocker
*
Well, this thread has me completely intrigued to buy a card with a G92 core on it. I might as well, as I'm letting my other rig borrow my two 3870s for a while. I really want to see for myself. I expect nothing. :)

*Edit: Oh, wait a minute... I thought this was the IQ thread but am mistaken. Fooled by off-topic posts; stay on track, folks.
Posted on Reply
#114
AddSub
Yeah erocker, this topic =

[derailed-train picture]

about 60 posts ago...
Posted on Reply
#115
VroomBang
malwareThis information from Expreview may disappoint many GeForce 9800 GX2 owners if true. NVIDIA is about to EOL (end-of-life) the GeForce 9800 GX2 line-up in just three months, as a result of two new GT200 cards: the single-GPU GeForce 9900 GTX and the dual-GPU GeForce 9900 GX2. One of the GT200 cards will have similar performance and production cost to the GeForce 9800 GX2, which will force the manufacturer to cut the "older" card. There will be no rebranding for the 9800 GX2, like the GeForce 8800 GS which will become the 9600 GSO, but just a sudden death. Meanwhile, details of the new GT200 graphics cards are still unknown.

Source: Expreview.com
Clearly not the best time to upgrade the graphics card. I'd wait till ATI's HD 4xxx and NVIDIA's 99xx cards are released and fully tested, hopefully by mid-year?
Posted on Reply
#116
newtekie1
Semi-Retired Folder
Wile EYeah, the 7950GX2 is one of those cards that suffer from lack of development time. Hell, it took nVidia months before they even bothered to get it working acceptably in Vista.

And ATI didn't make a dual-GPU 1950 Pro. That was an independent design and release by Sapphire.
It doesn't matter who designed the card; most of ATi's cards are designed by Sapphire. What matters is that ATi allowed their partners to produce and sell the card, so ATi is responsible for providing driver support for it.
Posted on Reply
#117
BumbRush
AddSubYeah erocker, this topic =

[derailed-train picture]

about 60 posts ago...
What fool let you drive a train?......jeebus........look what you did!!!! :D
Posted on Reply
#118
GSG-9
newtekie1It doesn't matter who designed the card; most of ATi's cards are designed by Sapphire. What matters is that ATi allowed their partners to produce and sell the card, so ATi is responsible for providing driver support for it.
Really? I did not know that; I wouldn't have suspected Sapphire to be the company that made it.
Posted on Reply
#119
candle_86
ATI doesn't provide special support; the drivers read it as a 1950 Pro CrossFire setup, same as the ASUS 7800GT Dual, or the Gigabyte 6600GT and 6800GT 3D1.
Posted on Reply
#120
asb2106
GSG-9Really? I did not know that; I wouldn't have suspected Sapphire to be the company that made it.
Sapphire and ATI paired up a few years ago.

It still shocks me a little, but it's funny: I've bought about 7 different cards from ATI since the 1K series release, and I have had the best luck OCing with Sapphire cards. Their cooling isn't the greatest, but if you like to water-cool or upgrade the cooling, they are nice and cheap and perform great!


**9800GX2

If this is true, it really would not surprise me; the release of the new 200 series cores will make this card very hard to sell, and continuing to produce them would not be a good idea.

Plus I don't feel NVIDIA ever really had good luck putting two GPUs into one card. The whole two-PCB idea never seemed to work right...
Posted on Reply
#121
eidairaman1
The Exiled Airman
The cooling is fine for stock applications, that's what it was originally meant for. Now if you could mount the cooler to, say, the northbridge and southbridge, it'd be killer, bro.
Posted on Reply
#122
GSG-9
asb2106Sapphire and ATI paired up a few years ago.

It still shocks me a little, but it's funny: I've bought about 7 different cards from ATI since the 1K series release, and I have had the best luck OCing with Sapphire cards. Their cooling isn't the greatest, but if you like to water-cool or upgrade the cooling, they are nice and cheap and perform great!
The last card I had from them was a 9800 Pro flashed to XT. One time while I was on a family vacation (this was a long time ago lol) one of the push pins on the cooler somehow came off, and the 9800 sat there running with the cooler hanging off until we came home a week later. They let me RMA it, no questions asked, but I was trying to move to PCIe and never did get around to sending it in.
Posted on Reply