Monday, January 28th 2008

NVIDIA GeForce 9800 GX2 Dismantled

Pictures of the NVIDIA GeForce 9800 GX2 dual card have popped up online again, showing some exclusive details never seen before. More pictures can be found here.
Source: CHIPHELL

84 Comments on NVIDIA GeForce 9800 GX2 Dismantled

#51
candle_86
A few did something similar with the Radeon 8500 - I know a few people who thought they were getting a rare 8500GT AGP card.
#52
newtekie1
Semi-Retired Folder
Funny how so many of you are willing to start bashing nVidia so quickly.

The 7900 GX2 (7950GX2) was not made to combat the x1950XTX; it was to experiment with Quad-SLI. The 7900GTX did perfectly fine competing with the x1950XTX - it wasn't quite as fast, but it did the job, and there were overclocked versions that were on par with the x1950XTX.

This isn't simply a thrown-together solution to compete with the 3870 X2; I wouldn't be surprised if there was just as much planning in this card as in the 3870 X2. This isn't the first time nVidia has done it this way, and it worked in the past (with the exception of poor driver support), so why change the way it is done? Why manufacture one extremely complicated PCB (and the 3870 X2's PCB is extremely complicated) when you can manufacture 2 PCBs that aren't that much more complicated than a normal video card?

Besides that, the 3870 X2 was just a solution to compete with nVidia's 8800GTS (G92) and 8800 Ultra. So making the argument that the 7950GX2 was just thrown together to compete with the x1950XTX, while making a big deal out of the fact that nVidia could get a single GPU to compete with the x1950XTX, is kind of hypocritical, since that is exactly the problem ATI is facing right now.

The move to use 2 PCBs has its advantages. One major one I can see is that if one of the PCBs is bad, it is cheaper for nVidia to simply replace that one instead of the whole card.

Yes, there are advantages to both designs. I'm not saying one is better than the other. ATI chose their method and nVidia stuck with the method that has worked for them in the past. By the way, which one of the two has had a dual-GPU card that was actually successful before? And what design did it use? Yeah, I can see why sticking with that design was such a bad move.
#53
Mussels
Freshwater Moderator
newtekie1: Funny how so many of you are willing to start bashing nVidia so quickly.

The 7900 GX2 (7950GX2) was not made to combat the x1950XTX; it was to experiment with Quad-SLI. The 7900GTX did perfectly fine competing with the x1950XTX - it wasn't quite as fast, but it did the job, and there were overclocked versions that were on par with the x1950XTX.

This isn't simply a thrown-together solution to compete with the 3870 X2; I wouldn't be surprised if there was just as much planning in this card as in the 3870 X2. This isn't the first time nVidia has done it this way, and it worked in the past (with the exception of poor driver support), so why change the way it is done? Why manufacture one extremely complicated PCB (and the 3870 X2's PCB is extremely complicated) when you can manufacture 2 PCBs that aren't that much more complicated than a normal video card?

Besides that, the 3870 X2 was just a solution to compete with nVidia's 8800GTS (G92) and 8800 Ultra. So making the argument that the 7950GX2 was just thrown together to compete with the x1950XTX, while making a big deal out of the fact that nVidia could get a single GPU to compete with the x1950XTX, is kind of hypocritical, since that is exactly the problem ATI is facing right now.

The move to use 2 PCBs has its advantages. One major one I can see is that if one of the PCBs is bad, it is cheaper for nVidia to simply replace that one instead of the whole card.

Yes, there are advantages to both designs. I'm not saying one is better than the other. ATI chose their method and nVidia stuck with the method that has worked for them in the past. By the way, which one of the two has had a dual-GPU card that was actually successful before? And what design did it use? Yeah, I can see why sticking with that design was such a bad move.
and as another thing... what's so wrong with sandwiching the cooler in between the cards?

The ATI solution has one GPU running a lot hotter than the other, while the NV one gets away with a single fan and one heatsink cooling both GPUs - all the components that need cooling are on the inside, so you've essentially got yourself a duct. Seal it up and get some CFM passing through, and the whole card (both cards' GPUs, RAM, voltage chips, etc.) is going to get some good cooling.
#54
newtekie1
Semi-Retired Folder
Mussels: and as another thing... what's so wrong with sandwiching the cooler in between the cards?

The ATI solution has one GPU running a lot hotter than the other, while the NV one gets away with a single fan and one heatsink cooling both GPUs - all the components that need cooling are on the inside, so you've essentially got yourself a duct. Seal it up and get some CFM passing through, and the whole card (both cards' GPUs, RAM, voltage chips, etc.) is going to get some good cooling.
Exactly! The reviews of the 3870 X2 have the first GPU running at ~65C under load, while the second one reaches ~80C. :eek: Not dangerous temperatures, but certainly a little concerning. It seems to me like nVidia took their proven design and improved upon it.
#55
mR Yellow
phanbuey: I don't think you guys realise the performance advantage nVidia has had until now, and for how long... and how long the R600 got pushed back in the beginning. Let's face it, the 8800 series were the best... this X2 doesn't "dominate" by any means, only in a few games and at extremely high res... ATI still doesn't have all their garbage in one bag - and the fact that they recycled their R600 architecture from the HD 2900 to the HD 3800 is no different than what nVidia did with their "milking" of the G80 core... still, the X2, which is 2 new chips, gets beaten in some games by NV cards that are over a year old now - that should not happen, regardless of who wrote the game and blah blah. The VLIW architecture is just very hit and miss depending on the application, which ultimately makes the R600 cards unreliable performers.

In reality, I think the next gen is going to be ludicrously fast... ATi and Nvidia are both buying time with their X2 and GX2 cards.

I love ATI, and I think this X2 is amazing even with crap drivers, but don't forget that Nvidia was Forbes' company of the year out of EVERY industry - those guys are rolling in money. Just pray for ATI that the 9800GX2 is not twice as fast as the X2. Also, I thought the GX2 was announced a Looooong time ago... like months before the G92 core was even out.
Well said! nVidia is on top atm. The X2 is more like ATi's only hope of beating nVidia... and then only in some games.

I personally won't be buying either card. I'm waiting for the next-gen cards. These dual-chip solutions are just weak-ass... from both sides.
#56
candle_86
newtekie1: Funny how so many of you are willing to start bashing nVidia so quickly.

The 7900 GX2 (7950GX2) was not made to combat the x1950XTX; it was to experiment with Quad-SLI. The 7900GTX did perfectly fine competing with the x1950XTX - it wasn't quite as fast, but it did the job, and there were overclocked versions that were on par with the x1950XTX.

This isn't simply a thrown-together solution to compete with the 3870 X2; I wouldn't be surprised if there was just as much planning in this card as in the 3870 X2. This isn't the first time nVidia has done it this way, and it worked in the past (with the exception of poor driver support), so why change the way it is done? Why manufacture one extremely complicated PCB (and the 3870 X2's PCB is extremely complicated) when you can manufacture 2 PCBs that aren't that much more complicated than a normal video card?

Besides that, the 3870 X2 was just a solution to compete with nVidia's 8800GTS (G92) and 8800 Ultra. So making the argument that the 7950GX2 was just thrown together to compete with the x1950XTX, while making a big deal out of the fact that nVidia could get a single GPU to compete with the x1950XTX, is kind of hypocritical, since that is exactly the problem ATI is facing right now.

The move to use 2 PCBs has its advantages. One major one I can see is that if one of the PCBs is bad, it is cheaper for nVidia to simply replace that one instead of the whole card.

Yes, there are advantages to both designs. I'm not saying one is better than the other. ATI chose their method and nVidia stuck with the method that has worked for them in the past. By the way, which one of the two has had a dual-GPU card that was actually successful before? And what design did it use? Yeah, I can see why sticking with that design was such a bad move.
See, I agree here - and the 9800GX2 appeared in the 165.01 beta drivers back in May 07 as the 8800GX2, so nVidia had been playing with the idea for a while.
#57
Tamin
mR Yellow: Well said! nVidia is on top atm. The X2 is more like ATi's only hope of beating nVidia... and then only in some games.

I personally won't be buying either card. I'm waiting for the next-gen cards. These dual-chip solutions are just weak-ass... from both sides.
enlighten us pls, what will u buy? :) the next next? like past the 9800s n 3800s? :eek:
#58
Mussels
Freshwater Moderator
Tamin: enlighten us pls, what will u buy? :) the next next? like past the 9800s n 3800s? :eek:
He'd probably go for the next single-card solution, like I would.

These cards aren't slow - but neither are they a big leap. Compare a 7900GTX or x1900XTX to the 8800GTX. We all want another leap like that, not these 5-10% gains.
#59
Tamin
yes sir! hallelujah :rockout: and I wanted to sell my "2900 card PC" and get 2x 3870s; guess I'll wait... thank you!
#60
candle_86
that doesn't happen often in this world.

It's only happened a few times, actually:

Voodoo 1 to Voodoo 2

Radeon 8500 to Radeon 9700 Pro

GeForce FX 5950 Ultra to GeForce 6800 Ultra

GeForce 7900GTX to GeForce 8800GTX
#61
Mussels
Freshwater Moderator
Tamin: yes sir! hallelujah :rockout:
I've had my GTX for over a year. I paid $700 AU for it.

Even after a year, with only a swap to a quieter cooler, this card still fights for 3rd place in games (8800 Ultra vs 3870 X2 for 1st and 2nd, with the GTX third).

If you compare that to the people getting 'the best' all the time - they spend $400-500 on mid-range hardware every 6 months. The GTX certainly has huge value for money, since it's still going strong so long after it came out.

In all the years I've been a gamer, I've not seen a card hold up this well since the Radeon 9700 PRO - and if you think of the 9800 as the Ultra version, it's quite similar in its dominance to the 8800GTX/Ultra.

edit: candle beat me to it. damn.
#62
Tamin
imagine if all these companies would care for us and release 1 card per year! haha sweet!
#63
candle_86
Mussels: I've had my GTX for over a year. I paid $700 AU for it.

Even after a year, with only a swap to a quieter cooler, this card still fights for 3rd place in games (8800 Ultra vs 3870 X2 for 1st and 2nd, with the GTX third).

If you compare that to the people getting 'the best' all the time - they spend $400-500 on mid-range hardware every 6 months. The GTX certainly has huge value for money, since it's still going strong so long after it came out.

In all the years I've been a gamer, I've not seen a card hold up this well since the Radeon 9700 PRO - and if you think of the 9800 as the Ultra version, it's quite similar in its dominance to the 8800GTX/Ultra.

edit: candle beat me to it. damn.
I beg to differ there. The GeForce2 was the longest supported: it came out in 2000, and it will run Call of Duty 2 - I know, I had to do it for a few days lol. That's 6 years of life from them. The 9700 Pro went from 2003 till 2007, so it's only got 4 years under its belt.
#64
Ripper3
@Candle: The FX5950 Ultra to the 6800 Ultra is certainly true, and to illustrate the huge gap in performance, even the 6600GT was capable of kicking the ass of an FX5950 at times. I'm still waiting for that to happen again. The 7600GT almost did the same thing to the 6800 Ultra, at stock speeds. The 8600GT/GTS could have been a bit better, but did keep up with the 7900s, and could beat the 7800s.

Oh, and I do agree with Newtekie1 - I think I was a bit hypocritical in my comparison, but I guess I was sleepy, heh.
I did always think the 7950GX2 was made to give Nvidia a fighting chance against the X1950XTX; that's how it looked to me, at least, when it was released.
candle_86: I beg to differ there. The GeForce2 was the longest supported: it came out in 2000, and it will run Call of Duty 2 - I know, I had to do it for a few days lol. That's 6 years of life from them. The 9700 Pro went from 2003 till 2007, so it's only got 4 years under its belt.
If I had known my old GF2 was capable of Call of Duty 2, I would have used it when I switched graphics cards about a year back. I was _SO_ bored... but anyhow, come to think of it, what DX version did the GF2 support? If it was DX7, then a lot of games have had good support for it - Call of Duty 2, as mentioned, and HL2 and all the variations thereof (since they support DX7, AFAIK). Certainly right about the support for it.
The 9700 still has support in games, if running slowly is fine by the user. Many games can still use basic DX9 instead of DX9c for rendering.
#65
Hawk1
Ripper3: I did always think the 7950GX2 was made to give Nvidia a fighting chance against the X1950XTX; that's how it looked to me, at least, when it was released.
I think the companies release things like this as a stopgap before their next big release. The 1950XTX with GDDR4 was not necessary, as it only provided a marginal improvement over the 1900XTX (although it held the crown of "top spot" (arguably) for the month or two it was out before the 8800s, and it was a bit quieter than its predecessor).

The 7950GX2 was an experiment for Nvidia and was mostly a high-priced novelty item for the big spenders before the 8800s. It was a great concept, but never got the driver support it (the owners) deserved - I think, due to the 8800 driver issues with Vista, all NV resources were focused on getting that straight, which left the GX2 owners blowing in the wind.

I think it's the same with these current releases of the X2 and GX2 (well, maybe less so for ATI - they just wanted to say they had the fastest for a while, like the 1950XTX). They are stopgaps for the R700/G100 that will come later this year, which, if the rumours have it right, will be the next "big" leap in performance.
#66
newtekie1
Semi-Retired Folder
The 7900 GX2 and 7950 GX2 were made to allow quad-SLI, not to compete with the x1950XTX. The 7900GTX did just fine competing against the x1950XTX, as did the 8800 series, which came very shortly after the release of the x1950XTX. In case you guys forgot, the 7900GX2 came out in January of 06; the x1950XTX didn't come out until August of 06. So you are saying nVidia released a card 8 months before the x1950XTX because they wanted to compete with it? What kind of logic is that? The x1950XTX was just hitting the drawing boards when the 7900GX2 came out, and by the time the x1950XTX hit, nVidia only had 2 months before they were going to release the 8800 series.

And to all the people bashing nVidia's design: At least nVidia came up with their own design instead of just stealing it from ASUS like ATi did.
#67
Hawk1
newtekie1And to all the people bashing nVidia's design: At least nVidia came up with their own design instead of just stealing it from ASUS like ATi did.
They did? Link?
#68
newtekie1
Semi-Retired Folder
Hawk1They did? Link?
ASUS GeForce 7800GT Dual

There you go - ASUS came up with the dual-GPU-on-a-single-PCB design, which ATi pretty much stole and improved upon to give us the 3870 X2 (and Sapphire did the same thing with their x1950 Pro Dual card). And yes, ASUS wasn't actually the first to do it; it was just the best known. Gigabyte did it with 6600GT cores.

Edit: I actually commend nVidia for coming up with its own design instead of just copying what has already been done (in the nVidia camp, at that :))
#69
Hawk1
newtekie1: ASUS GeForce 7800GT Dual

There you go - ASUS came up with the dual-GPU-on-a-single-PCB design, which ATi pretty much stole and improved upon to give us the 3870 X2 (and Sapphire did the same thing with their x1950 Pro Dual card). And yes, ASUS wasn't actually the first to do it; it was just the best known.
No, I meant that Nvidia came up with its own design :p jk, thanks for the link. :toast:
#70
candle_86
no, ASUS didn't:

Voodoo 2

Rage Fury Maxx

Voodoo 5 5500

Gigabyte 3D1 6600GT

Gigabyte 3D1 6800GT

ASUS 7800GT Dual

Sapphire 1950 Pro Dual

so as you can see from these, it's common
#71
AddSub
You forgot XGI's Volari Duo.

Wow, this thread has gone off the rails.
#72
newtekie1
Semi-Retired Folder
candle_86: so as you can see from these, it's common
Correct - I stated that ASUS wasn't actually the first to do it. My point was that nVidia designed its own method with the 7900GX2 instead of just reusing previous designs.
#73
Tamin
look how they evolved, the bastards ;)
#74
imperialreign
Mussels: and as another thing... what's so wrong with sandwiching the cooler in between the cards?

The ATI solution has one GPU running a lot hotter than the other, while the NV one gets away with a single fan and one heatsink cooling both GPUs - all the components that need cooling are on the inside, so you've essentially got yourself a duct. Seal it up and get some CFM passing through, and the whole card (both cards' GPUs, RAM, voltage chips, etc.) is going to get some good cooling.
I'm not so sure about the thought that the 3870 X2 will have one GPU running hotter than the other - based on the fact that one GPU has an aluminum-based cooler whereas the second uses copper, both should theoretically stay very close to the same temp.

We'll have to wait and see, though, as I don't think I've read that being touched upon in any reviews yet.


On the whole dual-GPU thing - 3DFX, IIRC, were the first to go that route; they were also the first to implement a multiple-card setup through SLI (not just dual-card). But even 3DFX had severe limitations with this technology, and performance was bleh compared to the stoopid heat output of those GPUs. nVidia put the SLI tech on the back burner after their acquisition of 3DFX, and didn't start investing in it again until ATI started developing Crossfire, and for some reason nVidia has lagged behind in performance in multiple-card setups (when comparing percentage-for-percentage improvement versus ATI's offerings). Maybe, building off of 3DFX's start with SLI, a lot of the limitations that were inherent then have carried over to now - who knows?

Although, if nVidia gets SLI working as solidly as ATI's Crossfire, these dual-PCB setups will be a nightmare to beat, as each GPU has its own resources and doesn't have to "share". TBH, I think that will end up being the biggest limitation of ATI's 3870 X2.
#75
newtekie1
Semi-Retired Folder
imperialreign: I'm not so sure about the thought that the 3870 X2 will have one GPU running hotter than the other - based on the fact that one GPU has an aluminum-based cooler whereas the second uses copper, both should theoretically stay very close to the same temp.

We'll have to wait and see, though, as I don't think I've read that being touched upon in any reviews yet.
Read the review of the 3870 X2 here at TPU. One of the cores ran at 65°C under load, and the other ran at 80°C.

And wasn't it Alienware that first went with a dual-card solution using nVidia GPUs (talking modern GPUs here)? That is the first I ever remember hearing about multiple-GPU setups, and nVidia soon released their SLI. I don't even remember Crossfire being mentioned until after SLI was already on the market. In fact, SLI was on the market in June of 2004, and Crossfire wasn't on the market until September of 2005, more than a year later.

I think you have your timelines, and who created what to compete with whom, confused. Crossfire was developed to compete with nVidia's SLI, and it only recently reached the level of performance improvement that SLI gives. ATI just finally got Crossfire working as solidly as SLI.