
NVIDIA GeForce 9800 GX2 Reaches EOL in Three Months?

BumbRush

New Member
Joined
Mar 5, 2008
Messages
225 (0.04/day)
I find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an X800XT, that the IQ on the X800 was better. If I remember rightly, the X800 didn't support SM3 whereas the 6800 did, and there was a big jump in gfx quality and effects between SM2 and SM3. There were many things IMO that the X800 did better than the 6800, but IQ was not one of them... but as I said, that's just my opinion.

Oh, forgot to say: the only real difference you will see in most games between the X800 and 6800 is HDR support, and even then the 6/7 series cards have to choose between AA and HDR, because they can't do both at the same time if it's SM3 (FP16) HDR. The 8800 and X1K cards can (in fact, for the X1K cards there's no performance penalty for having both enabled in games like Far Cry and Oblivion).
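
To see why the 6/7 series forces that choice: those chips can't multisample an FP16 render target, and a game can ask D3D9 about exactly that in one call. A minimal sketch of the check, assuming a standard D3D9 SDK setup (the printout wording is mine):

Code:
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

// Ask Direct3D 9 whether the FP16 render-target format used for
// SM3-style HDR (D3DFMT_A16B16G16R16F) can take 4x multisampling.
// NV4x/G7x parts fail this check, which is the HDR-or-AA choice;
// the X1K series and the 8800 (G80) pass it.
int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    DWORD quality = 0;
    HRESULT hr = d3d->CheckDeviceMultiSampleType(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_A16B16G16R16F,        // 64-bit FP16 HDR surface
        FALSE,                       // full-screen mode
        D3DMULTISAMPLE_4_SAMPLES, &quality);

    std::printf("4x MSAA on an FP16 HDR target: %s\n",
                SUCCEEDED(hr) ? "supported" : "not supported");
    d3d->Release();
    return 0;
}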

HDR could be done under SM2.0 too; it just required different coding that took more time and skill. Check out HL2: Lost Coast and the HL2 expansions; IMHO, with current patches it looks just as good as any other HDR implementation, even though it's SM2.0-based, not SM3. :)

Blah, I did it again, I ranted more than intended :p
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,945 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
Oh, forgot to say: the only real difference you will see in most games between the X800 and 6800 is HDR support, and even then the 6/7 series cards have to choose between AA and HDR, because they can't do both at the same time if it's SM3 (FP16) HDR. The 8800 and X1K cards can (in fact, for the X1K cards there's no performance penalty for having both enabled in games like Far Cry and Oblivion).

HDR could be done under SM2.0 too; it just required different coding that took more time and skill. Check out HL2: Lost Coast and the HL2 expansions; IMHO, with current patches it looks just as good as any other HDR implementation, even though it's SM2.0-based, not SM3. :)

Blah, I did it again, I ranted more than intended :p

Agreed, but one or the other is much better than neither (X800), and there is more to SM3 than HDR :p And yes, you are right, it took NVidia far too long to develop a card that could simultaneously deliver both HDR and AA. In contrast, ATI just don't have a card now that can effectively deliver AA!!!! (sorry, that was uncalled for... I just couldn't resist :eek:)
 

BumbRush

New Member
Joined
Mar 5, 2008
Messages
225 (0.04/day)
lol, well you know why the 2900/3800 use the method they do for AA? Because ATI STUPIDLY went with what MICROSOFT wanted for DX10/10.1: they wanted AA to be done with shaders instead of dedicated hardware, and that was part of the requirements for 10.1. I think MS has since changed that, but still... dumb idea if you ask me. ATI should have supported shader-based AA as well as keeping a hardware AA unit (not run AA in software on the shaders).
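
For anyone wondering what "AA on the shaders" means in practice: the multisamples still get stored either way; the difference is whether a fixed-function unit averages them down to one pixel or a filter program running on the shader ALUs does it. A rough CPU-side sketch of that resolve step (illustrative only, not actual R600 code):

Code:
#include <cstdio>

struct Color { float r, g, b; };

// MSAA resolve: average the N stored samples of a pixel down to one color.
// A hardware AA unit does this in fixed-function logic essentially for free;
// a shader-based resolve (R600/RV670 style) runs the same filter on the
// shader ALUs, which costs shader throughput but allows custom filters
// (that is how the custom-filter AA modes work).
Color resolve(const Color* samples, int count) {
    Color out = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < count; ++i) {
        out.r += samples[i].r;
        out.g += samples[i].g;
        out.b += samples[i].b;
    }
    out.r /= count; out.g /= count; out.b /= count;
    return out;
}

int main() {
    // An edge pixel with 4x sampling: half the samples hit a red triangle.
    Color px[4] = { {1, 0, 0}, {1, 0, 0}, {0, 0, 0}, {0, 0, 0} };
    Color c = resolve(px, 4);
    std::printf("resolved edge pixel: %.2f %.2f %.2f\n", c.r, c.g, c.b);
    return 0;
}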

But hey, at least when you choose 2xAA on an ATI card it looks as good as 4x or 8x NVidia AA (tested it myself with my X1900XTX vs. 8800GT). Kind of disappointing that, per setting, they can't outdo ATI with all the brute force they put into their cards...
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,945 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
lol, well you know why the 2900/3800 use the method they do for AA? Because ATI STUPIDLY went with what MICROSOFT wanted for DX10/10.1: they wanted AA to be done with shaders instead of dedicated hardware, and that was part of the requirements for 10.1. I think MS has since changed that, but still... dumb idea if you ask me. ATI should have supported shader-based AA as well as keeping a hardware AA unit (not run AA in software on the shaders).

But hey, at least when you choose 2xAA on an ATI card it looks as good as 4x or 8x NVidia AA (tested it myself with my X1900XTX vs. 8800GT). Kind of disappointing that, per setting, they can't outdo ATI with all the brute force they put into their cards...

Obviously our eyesight differs. I bought an HD3870 at launch, and when measured against my old G92 8800GTS I actually thought the GTS IQ looked better. But in my experience, more often than not, ATI owners seem to believe that ATI IQ is the best, where... strangely enough... NVidia owners think just the opposite... wonder why that is? :D For me, I am kind of predictable, so I tend to take the word of the majority, and as that is probably NVidia owners ATM, enough said!!
 
Joined
Aug 9, 2006
Messages
1,065 (0.16/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
Wow, this topic has completely derailed.

I find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an X800XT, that the IQ on the X800 was better

Pretty much every Radeon card I owned had better image quality than any nVidia card I ever owned, both in 2D/desktop and 3D. Well, except maybe my current 8800GTX. I actually owned a GeForce 6800 vanilla (Apollo brand). It was the worst experience I ever had with a video card, in addition to being the only card to date that I had to RMA on the same day I received it from Newegg. It was artifacting at stock settings, both in 2D and 3D, although I guess that has more to do with Apollo's quality control than anything else. Anyway, I went from that card to a BFG GeForce 6600GT, which I had for about 2-3 months, until a DAC unit on the BFG went apeshit and killed one of my CRTs. After that I temporarily went to a backup GeForce 3 Ti 500 for a week or two. The old GeForce 3 had better image quality than the newer 6800/6600 cards, and after using the GF3 for a few weeks I received a Radeon X800GTO (my first Radeon card ever) and my eyes were amazed. It had better IQ than any nVidia card I had owned to that point (and I had owned about a dozen by then). I replaced all my nVidia cards with Radeon/ATI alternatives in a single month: X850XTs, a few X800GTOs, and even a lowly X700 Pro card.

And before anyone calls me an ATI/AMD fanboi, please take into consideration that I'm currently running a GeForce 8800GTX and an nForce-based motherboard in my primary machine.

a big jump in gfx quality and effects between SM2 and SM3.

SM3 was really SM2.5, feature-wise. But we all know how marketing works. The difference was not that much of a... well, "big jump." SM3 was more about increased performance vs. SM2 than about introducing new features (which it did, admittedly).
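
The headline SM3 additions (real dynamic branching, longer programs, instancing) are mostly about skipping work, not drawing new things. A CPU-side analogy with made-up light values (illustrative only, not shader code):

Code:
#include <cstdio>

// Why SM3 was mostly a speed feature: ps_2_0 has no real dynamic flow
// control, so every light gets fully evaluated (or you pay for extra
// passes); ps_3_0 can branch per pixel and skip negligible work.
// The resulting picture is visually the same; only the cost differs.
float shade_sm2_style(const float* lights, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i)
        sum += lights[i] * 0.5f;         // always does the full math
    return sum;
}

float shade_sm3_style(const float* lights, int n) {
    float sum = 0.0f;
    for (int i = 0; i < n; ++i) {
        if (lights[i] < 0.01f) continue; // dynamic branch: skip dim lights
        sum += lights[i] * 0.5f;
    }
    return sum;
}

int main() {
    const float lights[4] = {0.8f, 0.001f, 0.4f, 0.002f};
    std::printf("sm2-style %.4f vs sm3-style %.4f\n",
                shade_sm2_style(lights, 4), shade_sm3_style(lights, 4));
    return 0;
}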

Here is a great and informative article at HardOCP, written back in 2004, comparing the new features of SM3 vs. SM2 and SM1.1, in Far Cry no less. :)

http://www.hardocp.com/article.html?art=NjA5
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,758 (6.69/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
You know, it could have been Newegg's fault, because no one knows how these retailers store and handle their products.
Wow, this topic has completely derailed.



Pretty much every Radeon card I owned had better image quality than any nVidia card I ever owned, both in 2D/desktop and 3D. Well, except maybe my current 8800GTX. I actually owned a GeForce 6800 vanilla (Apollo brand). It was the worst experience I ever had with a video card, in addition to being the only card to date that I had to RMA on the same day I received it from Newegg. It was artifacting at stock settings, both in 2D and 3D, although I guess that has more to do with Apollo's quality control than anything else. Anyway, I went from that card to a BFG GeForce 6600GT, which I had for about 2-3 months, until a DAC unit on the BFG went apeshit and killed one of my CRTs. After that I temporarily went to a backup GeForce 3 Ti 500 for a week or two. The old GeForce 3 had better image quality than the newer 6800/6600 cards, and after using the GF3 for a few weeks I received a Radeon X800GTO (my first Radeon card ever) and my eyes were amazed. It had better IQ than any nVidia card I had owned to that point (and I had owned about a dozen by then). I replaced all my nVidia cards with Radeon/ATI alternatives in a single month: X850XTs, a few X800GTOs, and even a lowly X700 Pro card.

And before anyone calls me an ATI/AMD fanboi, please take into consideration that I'm currently running a GeForce 8800GTX and an nForce-based motherboard in my primary machine.



SM3 was really SM2.5, feature-wise. But we all know how marketing works. The difference was not that much of a... well, "big jump." SM3 was more about increased performance vs. SM2 than about introducing new features (which it did, admittedly).

Here is a great and informative article at HardOCP, written back in 2004, comparing the new features of SM3 vs. SM2 and SM1.1, in Far Cry no less. :)

http://www.hardocp.com/article.html?art=NjA5
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,945 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
Wow, this topic has completely derailed.



Pretty much every Radeon card I owned had better image quality than any nVidia card I ever owned, both in 2D/desktop and 3D. Well, except maybe my current 8800GTX. I actually owned a GeForce 6800 vanilla (Apollo brand). It was the worst experience I ever had with a video card, in addition to being the only card to date that I had to RMA on the same day I received it from Newegg. It was artifacting at stock settings, both in 2D and 3D, although I guess that has more to do with Apollo's quality control than anything else. Anyway, I went from that card to a BFG GeForce 6600GT, which I had for about 2-3 months, until a DAC unit on the BFG went apeshit and killed one of my CRTs. After that I temporarily went to a backup GeForce 3 Ti 500 for a week or two. The old GeForce 3 had better image quality than the newer 6800/6600 cards, and after using the GF3 for a few weeks I received a Radeon X800GTO (my first Radeon card ever) and my eyes were amazed. It had better IQ than any nVidia card I had owned to that point (and I had owned about a dozen by then). I replaced all my nVidia cards with Radeon/ATI alternatives in a single month: X850XTs, a few X800GTOs, and even a lowly X700 Pro card.

And before anyone calls me an ATI/AMD fanboi, please take into consideration that I'm currently running a GeForce 8800GTX and an nForce-based motherboard in my primary machine.



SM3 was really SM2.5, feature-wise. But we all know how marketing works. The difference was not that much of a... well, "big jump." SM3 was more about increased performance vs. SM2 than about introducing new features (which it did, admittedly).

Here is a great and informative article at HardOCP, written back in 2004, comparing the new features of SM3 vs. SM2 and SM1.1, in Far Cry no less. :)

http://www.hardocp.com/article.html?art=NjA5

Yup, can't argue with that, and it showed: in AA, the X800XT outperformed the 6800 Ultra at max settings, partially due to the fact that the Ultra's AA range was 2x, 4x or 8x, whereas the X800's was 2x, 4x and 6x. But that same article from May 2004 also said this regarding IQ specifically, which was my original point... I quote:

Comparing IQ Technology:


Looking at the Anti-Aliasing and Anisotropic image quality between the X800 series and the GeForce 6800 Ultra, we find them to be very comparable. There is one difference though. The X800 is so powerful that 6XAA is actually a usable Anti-Aliasing setting on the X800 series, whereas the comparable 8XAA on the 6800 Ultra is basically not usable, as it is too demanding in terms of performance because it is a super-sampling + multi-sampling technique.


The only shader quality differences we noticed were in FarCry, where the X800 series is providing much better image quality. Compared to the 9800XT, the X800 series has identical AA, AF and shader quality.
 
Joined
Aug 9, 2006
Messages
1,065 (0.16/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
You know, it could have been Newegg's fault, because no one knows how these retailers store and handle their products.

I actually mentioned this before in a few other topics, but the Apollo 6800 card was cut down. What I mean is, it had a 128-bit memory interface vs. 256-bit on other 6800 vanilla/reference cards, and it had some other discrepancies as well, which I will not go into now. (Upon closer examination I noticed right away that the arrangement and count of VRAM ICs clearly indicated a 128-bit part; with typical 32-bit chips, four ICs means a 128-bit bus where eight would make 256-bit, and RivaTuner confirmed it as well.) Something else to consider is that GeCube and Apollo are different branches of the same corporation, and GeCube has had a tendency to release cards that are quite different from the reference models. For example, the GeCube X800GTO a few years back, which was the only 128-bit GTO part on the market to my knowledge (vs. 256-bit on reference/others), and the most recent fiasco on Newegg, where they advertised a GeCube 2600XT with a 256-bit interface when in reality all 2600XT cards, including theirs, have 128-bit. I have more examples, but that's another topic. Apollo/GeCube = shady.
 

BumbRush

New Member
Joined
Mar 5, 2008
Messages
225 (0.04/day)
Reposting this since I think it got missed due to the second one being on a new page.

-----------------------------------------------------------------------------------------------------------
I find it hard to believe, having owned both the 6800 vanilla and Ultra as well as an X800XT, that the IQ on the X800 was better. If I remember rightly, the X800 didn't support SM3 whereas the 6800 did, and there was a big jump in gfx quality and effects between SM2 and SM3. There were many things IMO that the X800 did better than the 6800, but IQ was not one of them... but as I said, that's just my opinion.

http://www.hardocp.com/article.html?art=Njc4LDUsLGhlbnRodXNpYXN0
We did notice shader quality improvements from Patch 1.1 to Patch 1.3, which now make the image quality on the GeForce 6800GT comparable to the shader image quality as seen on Radeon X800 video cards. The shader quality with Shader Model 3.0 is not better than Shader Model 2.0; it is now equal, where it wasn't before in this game with Patch 1.1.

http://www.anandtech.com/video/showdoc.aspx?i=2102&p=11
Image quality of both SM2.0 paths is on par with each other, and the SM3.0 path on NVIDIA hardware shows negligible differences. The very slight variations are most likely just small fluctuations between the mathematical output of a single-pass and a multipass lighting shader. The difference is honestly so tiny that you can't call either rendering lower quality from a visual standpoint. We will still try to learn what exactly causes the differences we noticed from CryTek.
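
Those "fluctuations" are just arithmetic bookkeeping: a single-pass shader sums every light term at full float precision and writes once, while a multipass renderer quantizes into the framebuffer after each pass. A quick sketch with made-up light values shows the drift (plain C++, illustrative only):

Code:
#include <cmath>
#include <cstdio>

// Quantize a value to an 8-bit-per-channel framebuffer,
// the way each rendering pass's output gets stored.
float to_fb(float v) { return std::round(v * 255.0f) / 255.0f; }

int main() {
    const float lights[3] = {0.30f, 0.21f, 0.17f};  // made-up light terms

    // Single pass: sum at full float precision, quantize once on write.
    float single = to_fb(lights[0] + lights[1] + lights[2]);

    // Multipass: additively blend into the framebuffer, quantizing each pass.
    float multi = 0.0f;
    for (float l : lights) multi = to_fb(multi + l);

    std::printf("single pass: %.6f  multipass: %.6f\n", single, multi);
    return 0;  // the two results differ by about one display LSB
}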


So yeah, basically the X800's IQ was better; the PS3.0/SM3 path sped up the 6800 and gave it quality equal to the X800 cards, but did not make it look better.

I had both cards; the X800's AA and AF looked far better, and until games were optimized for NVidia's PS3.0 support, IQ and performance were FAR worse on the 6800GT@Ultra I had than on the X800 Pro VIVO@XT PE (flashed), and the X800 Pro VIVO cost me less yet was faster... lol

I'm on an 8800GT now; it was the best deal I could get at the time, and after a lot of tweaking the drivers are OK, though still not as good IQ-wise PER SETTING as my X1900XTX was. At least it works. I'm just wondering if they will abandon updates for the 8800GTs once they move beyond the G92 core, as they did with the 7 series. That's something that has always impressed me since I moved from NVidia to ATI back in the FX days (though I have owned NVidia cards from each gen since): ATI updates even their older cards' drivers to fix issues. Somebody told me recently that ATI's Catalyst 8.x drivers fixed a problem with a game on his X800GTO@XT PE (flash mod). That's far better than my experience with NVidia has been over the years. Even back when I was a huge NVidia fan I knew that my older NVidia cards wouldn't be getting bug fixes; after the GF2 came out, the TNT cards didn't even get bug fixes for common games that had serious issues, and yet they were still selling TNT/TNT2-based cards as budget series cards to OEMs (the GF MX was mid-range, the full GF cards were high-end, and the TNT cards were the value line).

Sorry, that last rant was a bit long; hope people can deal with more than two lines of text in a row. If not, I will go back to double-spacing my posts...

Neither ATI nor NVidia are golden when it comes to rebranding older parts or supporting some older parts, though really nobody supports DX8 and older cards anymore. But I can say this: NVidia cut driver updates/fixes for their DX8 cards sooner than ATI did; the 8500/9100/9200 and such all got driver support right up until ATI cut support for all the sub-9500 cards.

The GF4 and older cards all stopped getting meaningful updates not long after the FX line hit. I know because at the time I had a Ti4400 (it had a better cooler than the 4600s did and was able to clock higher in my case), and I was effectively prodded into buying a 5800 Ultra by the hype NVidia put out about it. Jesus, that card sucked though... drove me to try ATI again after years of HATING them due to their shitty Rage Pro/Rage II/Rage 128 drivers sucking ass.

I'd better stop before I spend another page ranting about why I hate ATI and why I hate NVidia :p I like them both in ways, but both also piss me off at times, stupid bastages... oh well, at least if you buy a card from either today you will still get something you can use for a couple of years (maybe not for gaming, but gaming's a small part of the PC market really).

This is BS though; think about it: they put out the 7950GX2 and NEVER gave it proper support, and now the 9800GX2 gets EOL'd just after it comes out. I'm SURE they won't give it proper support now either. I would also bet they are regretting their dual-PCB design, as it's far more costly to make than AMD/ATI's 3870X2 cards.

Now, before any of you try to say I'm full of it, use logic here.

You have the 3870X2: that's one card, one PCB, and it can use a modded version of the cooler they use on the normal cards, OR most third-party coolers will fit.

Then you have the 9800GX2, where you have to design and order special coolers, as well as pay more to assemble the card because it's dual-PCB with flexible links and such, each PCB being quite long/large as well as quite complex. Basically they made it overly complex and more of a PITA to deal with; hell, look at the price compared to the X2 card... nasty!!!

If I had bought one of these I would be returning it ASAP or selling it on eBay or something, because if they EOL it this quickly you KNOW you're gonna get screwed on driver support, just as they did with the last GX2 card...

At least ATI's first X2 card got support despite being very little known; but then again, it doesn't really need special drivers, since it's just seen as a CrossFire setup and gets enhancements from any CrossFire-based update :)

Blah, let's not fight about it; can't we all agree that we would be pissed if we owned one of these?
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.65/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
174.74 allows both. It supports the 7950 GX2 under Vista64 and supports Quad-SLI; most of the drivers released have been like this. Have you actually tried it? I have a customer who comes into my shop regularly who bought two 7950 GX2s through me; he still uses them in Quad-SLI and runs Vista64. 174.74 has been working wonders for him, as have several previous driver releases.



Real support for any of the 7 series cards, even the ones that are not EOL, has been abysmal, just like real support for the X1K series has been non-existent. Once a new series comes out, both graphics camps pretty much drop real support for their older cards. Usually it isn't a problem, since most of the cards have had more than enough time to mature before the new series was released. However, in the case of cards released at the very end of a series' lifespan, support is usually dropped rather quickly, but the cards still work and still get the general benefits of the new drivers. ATi did the same thing with their dual X1950 Pro; there haven't been driver improvements directly for the card since the day it was released.
Yeah, the 7950GX2 is one of those cards that suffers from a lack of development time. Hell, it took nVidia months before they even bothered to get it working acceptably in Vista.

And ATI didn't make a dual-GPU 1950 Pro. That was an independent design and release by Sapphire.
 
Joined
Feb 21, 2008
Messages
5,004 (0.81/day)
Location
NC, USA
System Name Cosmos F1000
Processor Ryzen 9 7950X3D
Motherboard MSI PRO B650-S WIFI AM5
Cooling Corsair H100x, Panaflo's on case
Memory G.Skill DDR5 Trident 64GB (32GBx2)
Video Card(s) MSI Gaming Radeon RX 7900 XTX 24GB GDDR6
Storage 4TB Firecuda M.2 2280
Display(s) 32" OLED 4k 240Hz ASUS ROG Swift PG32UCD
Case CM Cosmos 1000
Audio Device(s) logitech 5.1 system (midrange quality)
Power Supply CORSAIR RM1000e 1000watt
Mouse G400s Logitech, white Razor Mamba
Keyboard K65 RGB Corsair Tenkeyless Cherry Red MX
VR HMD Steam Valve Index
Software Win10 Pro, Win11
My 9800GX2 will step up to the 9900GX2 when the time comes. I might get two if they sort out the drivers..... so I will wait. :)

It already plays "Very High" Crysis well. I wonder if it will do well with Alan Wake? I want that game badly.


Sorry to derail your thread guys. :laugh:
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,945 (3.75/day)
Location
Worcestershire, UK
Processor Intel Core i9 11900KF @ -.080mV PL max @220w
Motherboard MSI MAG Z490 TOMAHAWK
Cooling DeepCool LS520SE Liquid + 3 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel Bdie @ 3600Mhz CL14 1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC + 8% PL
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Software Win 11 Home x64
Reposting this since I think it got missed due to the second one being on a new page.

-----------------------------------------------------------------------------------------------------------

http://www.hardocp.com/article.html?art=Njc4LDUsLGhlbnRodXNpYXN0


http://www.anandtech.com/video/showdoc.aspx?i=2102&p=11



So yeah, basically the X800's IQ was better; the PS3.0/SM3 path sped up the 6800 and gave it quality equal to the X800 cards, but did not make it look better.

I had both cards; the X800's AA and AF looked far better, and until games were optimized for NVidia's PS3.0 support, IQ and performance were FAR worse on the 6800GT@Ultra I had than on the X800 Pro VIVO@XT PE (flashed), and the X800 Pro VIVO cost me less yet was faster... lol

I'm on an 8800GT now; it was the best deal I could get at the time, and after a lot of tweaking the drivers are OK, though still not as good IQ-wise PER SETTING as my X1900XTX was. At least it works. I'm just wondering if they will abandon updates for the 8800GTs once they move beyond the G92 core, as they did with the 7 series. That's something that has always impressed me since I moved from NVidia to ATI back in the FX days (though I have owned NVidia cards from each gen since): ATI updates even their older cards' drivers to fix issues. Somebody told me recently that ATI's Catalyst 8.x drivers fixed a problem with a game on his X800GTO@XT PE (flash mod). That's far better than my experience with NVidia has been over the years. Even back when I was a huge NVidia fan I knew that my older NVidia cards wouldn't be getting bug fixes; after the GF2 came out, the TNT cards didn't even get bug fixes for common games that had serious issues, and yet they were still selling TNT/TNT2-based cards as budget series cards to OEMs (the GF MX was mid-range, the full GF cards were high-end, and the TNT cards were the value line).

Sorry, that last rant was a bit long; hope people can deal with more than two lines of text in a row. If not, I will go back to double-spacing my posts...

Neither ATI nor NVidia are golden when it comes to rebranding older parts or supporting some older parts, though really nobody supports DX8 and older cards anymore. But I can say this: NVidia cut driver updates/fixes for their DX8 cards sooner than ATI did; the 8500/9100/9200 and such all got driver support right up until ATI cut support for all the sub-9500 cards.

The GF4 and older cards all stopped getting meaningful updates not long after the FX line hit. I know because at the time I had a Ti4400 (it had a better cooler than the 4600s did and was able to clock higher in my case), and I was effectively prodded into buying a 5800 Ultra by the hype NVidia put out about it. Jesus, that card sucked though... drove me to try ATI again after years of HATING them due to their shitty Rage Pro/Rage II/Rage 128 drivers sucking ass.

I'd better stop before I spend another page ranting about why I hate ATI and why I hate NVidia :p I like them both in ways, but both also piss me off at times, stupid bastages... oh well, at least if you buy a card from either today you will still get something you can use for a couple of years (maybe not for gaming, but gaming's a small part of the PC market really).

This is BS though; think about it: they put out the 7950GX2 and NEVER gave it proper support, and now the 9800GX2 gets EOL'd just after it comes out. I'm SURE they won't give it proper support now either. I would also bet they are regretting their dual-PCB design, as it's far more costly to make than AMD/ATI's 3870X2 cards.

Now, before any of you try to say I'm full of it, use logic here.

You have the 3870X2: that's one card, one PCB, and it can use a modded version of the cooler they use on the normal cards, OR most third-party coolers will fit.

Then you have the 9800GX2, where you have to design and order special coolers, as well as pay more to assemble the card because it's dual-PCB with flexible links and such, each PCB being quite long/large as well as quite complex. Basically they made it overly complex and more of a PITA to deal with; hell, look at the price compared to the X2 card... nasty!!!

If I had bought one of these I would be returning it ASAP or selling it on eBay or something, because if they EOL it this quickly you KNOW you're gonna get screwed on driver support, just as they did with the last GX2 card...

At least ATI's first X2 card got support despite being very little known; but then again, it doesn't really need special drivers, since it's just seen as a CrossFire setup and gets enhancements from any CrossFire-based update :)

Blah, let's not fight about it; can't we all agree that we would be pissed if we owned one of these?

You ever thought of becoming an author? War and Peace springs to mind! You have gone off on a bit of a tangent there; I never said the 6800 was a better card, in fact I preferred the X800. My point was that IMO the IQ was the same, in MY experience. Some of us have linked articles/reviews that partly agree with that and partly disagree; the very HardOCP review that said antialiasing performance on the X800 was superior, and that IQ in Far Cry was better on the X800, also went on to say that IQ across the board was comparable (even if it only became comparable eventually). We are on a no-win here (or a no-lose, depending which way you look at it), as IQ is very subjective, depending on the user's eyes, perception and quality settings.

I went from a 7900GTO to a 1950XT briefly, and I DID see better IQ from the 1950XT, but I think that once NVidia released the G80 and finally sorted out simultaneous HDR/AA, the days of superior IQ on one side or the other more or less disappeared, but again, that is my subjective opinion.
 
Joined
Aug 9, 2006
Messages
1,065 (0.16/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
G80 and finally sorted out simultaneous HDR/AA, the days of superior IQ on one side or the other more or less disappeared, but again, that is my subjective opinion.

I agree. With the arrival of the G80, the IQ seems to have gotten better, or at least to the point where you can't really notice that much difference between the red and the green. (Or is it green and green at this point? I can't keep track of all the corporate colors. :D)

The IQ issues with nVidia cards, at least as far as my own experiences go (and at this point I've owned at least one nVidia card from each generation except the 9xxx), really started with the 5xxx series, and IQ seemed to get worse with the 6xxx and 7xxx series. I'm not sure if it was an architectural problem tied to the GPU design or just driver issues (my own guess would be drivers), but once I started using ATI cards for the first time, the difference, to my own eyes at least, became even more noticeable.
 
Joined
Jul 19, 2006
Messages
43,611 (6.47/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e-Plus Wifi
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD/Samsung m.2's
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Amp, Adam Audio T5V's, Hifiman Sundara's.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Epomaker 84 key
Software Windows 11 Pro
Well, this thread has me completely intrigued to buy a card with a G92 core on it. I might as well, as I'm letting my other rig borrow my two 3870s for a while. I really want to see for myself. I expect nothing. :)

*Edit: Oh, wait a minute... I thought this was the IQ thread, but I am mistaken. Fooled by off-topic posts; stay on track, folks.
 
Joined
Aug 9, 2006
Messages
1,065 (0.16/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
Yeah erocker, this topic =

Wow, this topic has completely derailed.

about 60 posts ago...
 

VroomBang

New Member
Joined
Mar 8, 2008
Messages
183 (0.03/day)
Location
Spain
System Name Computer
Processor Wolfdale E8400 stock
Motherboard Gigabyte GA-P35-DS3 FSB 1600MHz
Cooling CPU Xigmatek HDT-S1283 - front stock 12cm - rear Xig XSF-F1251 12cm - side Tacens Aura Pro 12cm
Memory Mushkin Redline DDR2 1000MHz 2x2GB 4-4-4-12 @1.8V
Video Card(s) Sapphire HD 3850 512MB stock
Storage SATA2 Seagate 500MB + IDE WesternDigital 500MB
Display(s) Samsung 19" LCD SyncMaster 193v
Case Jeantech NJX ATX gaming tower case
Audio Device(s) onboard
Power Supply Jeantech 600W Arctic Modular JN-600-AP (dual 19a +12v rails )
Software Win XP SP3
Benchmark Scores 3DMark06: 9906 / Super Pi 1M: 15.641 s
This information from Expreview may disappoint many GeForce 9800 GX2 owners if true. NVIDIA is about to EOL (end-of-life) the GeForce 9800 GX2 line-up in just three months, as a result of two new GT200 cards: the single-GPU GeForce 9900GTX and the dual-GPU GeForce 9900 GX2. One of the GT200 cards will have similar performance and production cost to the GeForce 9800 GX2, which will force the manufacturer to cut the "older" card. There will be no rebranding for the 9800 GX2, like the GeForce 8800 GS which will become the 9600 GSO, just a sudden death. Meanwhile, details of the new GT200 graphics cards are still unknown.

Source: Expreview.com

Clearly not the best time to upgrade the graphics card. I'd wait till ATI's HD4xxx and NVidia's 99xx cards are released and fully tested, hopefully by mid-year?
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,473 (4.08/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Yeah, the 7950GX2 is one of those cards that suffers from a lack of development time. Hell, it took nVidia months before they even bothered to get it working acceptably in Vista.

And ATI didn't make a dual-GPU 1950 Pro. That was an independent design and release by Sapphire.

It doesn't matter who designed the card; most of ATi's cards are designed by Sapphire. What matters is that ATi allowed their partners to produce and sell the card, so ATi is responsible for providing driver support for it.
 
Joined
Feb 8, 2005
Messages
1,675 (0.23/day)
Location
Minneapolis, Mn
System Name Livingston
Processor i7-4960HQ
Motherboard MacBook Pro Retina
Cooling Alphacool NexXxoS Monsta (240mm x 120mm x 80mm)
Memory 16Gb
Video Card(s) Zotac Arctic Storm Nvidia 980ti
Display(s) 1x Acer XB270HU, 1x Catleap, 1x Oculus
Benchmark Scores http://www.3dmark.com/fs/770087
It doesn't matter who designed the card; most of ATi's cards are designed by Sapphire. What matters is that ATi allowed their partners to produce and sell the card, so ATi is responsible for providing driver support for it.
Really? I did not know that; I didn't suspect Sapphire was the company that made it.
 
Joined
Dec 28, 2006
Messages
4,378 (0.67/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
ATI doesn't provide special support; the drivers read it as a 1950 Pro CrossFire setup, same as the ASUS 7800GT Dual or the Gigabyte 6600GT and 6800GT 3D1.
 
Joined
Jan 8, 2008
Messages
1,941 (0.31/day)
Location
Pleasant Prairie, WI
System Name File Server
Processor 2600k
Motherboard idk
Cooling H110
Memory 20gb of something
Video Card(s) onboard!
Storage 2x120 SSD, (5x8tb)+(3x4tb) = 35 TB Z1 pool
Display(s) couple 4k 32s
Case cooler master something i think
Audio Device(s) does anyone bother with anything but onboard?
Power Supply Whatever OCZ PC Powercooling became. 1200 modular setup
Mouse MX Master 2x
Keyboard G710
Software FreeNAS
Really? I did not know that; I didn't suspect Sapphire was the company that made it.

Sapphire and ATI paired up a few years ago.

It still shocks me a little, but it's funny: I've bought about 7 different cards from ATI since the X1K series release, and I have had the best luck OCing with Sapphire cards. Their cooling isn't the greatest, but if you like to water-cool or upgrade the cooling, they are nice and cheap and perform great!


**9800GX2

If this is true, it really would not surprise me; the release of the new 200 series cores will make this card very hard to sell, and to continue producing it would not be a good idea.

Plus, I don't feel NVidia ever really had good luck with putting 2 GPUs into one card. The whole 2-PCB idea never seemed to work right...
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
42,758 (6.69/day)
Location
Republic of Texas (True Patriot)
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
The cooling is fine for stock applications, that's what it was originally meant for. Now, if you could mount the cooler to, say, the northbridge and southbridge, it'd be killer, bro.
 
Joined
Feb 8, 2005
Messages
1,675 (0.23/day)
Location
Minneapolis, Mn
System Name Livingston
Processor i7-4960HQ
Motherboard MacBook Pro Retina
Cooling Alphacool NexXxoS Monsta (240mm x 120mm x 80mm)
Memory 16Gb
Video Card(s) Zotac Arctic Storm Nvidia 980ti
Display(s) 1x Acer XB270HU, 1x Catleap, 1x Oculus
Benchmark Scores http://www.3dmark.com/fs/770087
Sapphire and ATI paired up a few years ago.

It still shocks me a little, but it's funny: I've bought about 7 different cards from ATI since the X1K series release, and I have had the best luck OCing with Sapphire cards. Their cooling isn't the greatest, but if you like to water-cool or upgrade the cooling, they are nice and cheap and perform great!

The last card I had from them was a 9800 Pro flashed to XT. One time while I was on a family vacation (this was a long time ago, lol), one of the push pins on the cooler somehow came off, and the 9800 sat there running with the cooler hanging off until we came home a week later. They let me RMA it, no questions asked, but I was trying to move to PCIe and never got around to sending it in.
 