Monday, January 28th 2008
NVIDIA GeForce 9800 GX2 Dismantled
Pictures of the NVIDIA GeForce 9800 GX2 dual-GPU card have popped up online again, showing exclusive details not seen before. More pictures can be found here.
Source:
CHIPHELL
84 Comments on NVIDIA GeForce 9800 GX2 Dismantled
The 7900 GX2 (7950GX2) was not meant to combat the x1950XTX; it was to experiment with Quad-SLI. The 7900GTX did perfectly fine competing with the x1950XTX. It wasn't quite as fast, but it did the job, and there were overclocked versions that were on par with the x1950XTX.
This isn't simply a thrown-together solution to compete with the 3870 X2; I wouldn't be surprised if there was just as much planning in this card as in the 3870 X2. This isn't the first time nVidia has done it this way, and it worked in the past (with the exception of poor driver support), so why change the way it is done? Why manufacture one extremely complicated PCB (and the 3870 X2's PCB is extremely complicated) when you can manufacture 2 PCBs that aren't much more complicated than a normal video card?
Besides that, the 3870 X2 was just a solution to compete with nVidia's 8800GTS (G92) and 8800 Ultra. So arguing that the 7950GX2 was just thrown together to compete with the x1950XTX, and making a big deal out of the claim that nVidia couldn't get a single GPU to compete with the x1950XTX, is kind of hypocritical, since that is exactly the problem ATI is facing right now.
The move to use 2 PCBs has its advantages. One major one I can see is that if one of the PCBs is bad, it is cheaper for nVidia to simply replace that one instead of replacing the whole card.
Yes, there are advantages to both designs. I'm not saying one is better than the other. ATI chose their method and nVidia stuck with the method that has worked for them in the past. By the way, which one of the two has had a dual-GPU card that was actually successful before? And what design did it use? Yeah, I can see why sticking with that design was such a bad move.
The ATI solution has one GPU running a lot hotter than the other, while the NV one gets away with a single fan and one heatsink cooling both GPUs - all the components that need cooling are on the inside, so you've essentially got yourself a duct. Seal it up and get some CFM passing through, and the whole card (both boards' GPUs, RAM, voltage chips, etc.) is going to get some good cooling.
I personally won't be buying either card. I'm waiting for the next-gen cards. These dual-chip solutions are just weak-ass... from both sides.
These cards aren't slow - but neither are they a big leap. Compare a 7900GTX or x1900XTX to the 8800GTX. We all want another leap like that, not these 5-10% gains.
It's only happened a few times actually:
Voodoo 1 to Voodoo 2
Radeon 8500 to Radeon 9700 PRO
GeForce FX 5950 Ultra to 6800 Ultra
GeForce 7900GTX to GeForce 8800GTX
Even after a year, with only a swap to a quieter cooler, this card still fights for 3rd place in games (8800 Ultra vs. 3870 X2 for 1st and 2nd, with the GTX third).
Compare that to the people getting 'the best' all the time - they spend $400-500 on mid-range hardware every 6 months. The GTX certainly has huge value for money since it's still going strong so long after it came out.
In all the years I've been a gamer, I've not seen a card hold up as well since the Radeon 9700 PRO - and if you think of the 9800 as the Ultra version, it's quite similar in its dominance to the 8800GTX/Ultra.
edit: candle beat me to it. damn.
Oh, and I do agree with Newtekie1, I think I was a bit hypocritical in my comparison, but I guess I was sleepy, heh.
I did always think the 7950GX2 was made to give Nvidia a fighting chance against the X1950XTX; that's how it looked to me at least when it was released. If I had known my old GF2 was capable of Call of Duty 2, I would have used it when I switched graphics cards about a year back. I was _SO_ bored... but anyhow, come to think of it, what DX version did the GF2 support? If it was DX7, then a lot of games have had good support for it: Call of Duty 2, as mentioned, and HL2 and all the variations thereof (since they support DX7, AFAIK). You're certainly right about the support for it.
The 9700 still has support in games, if running slowly is fine with the user. Many games can still use basic DX9, instead of DX9c, for rendering.
The 7950GX2 was an experiment for Nvidia and was mostly a high-priced novelty item for the big spenders before the 8800s. It was a great concept, but it never got the driver support it (and the owners) deserved - I think due to the 8800 driver issues with Vista, all NV resources were focused on getting that straight, which left the GX2 owners twisting in the wind.
I think it's the same with these current releases of the X2 and GX2 (well, maybe less so for ATI - they just wanted to say they had the fastest for a while, like the 1950XTX). They are stopgaps for R700/G100, which will come later this year and, if the rumours are right, will be the next "big" leap in performance.
And to all the people bashing nVidia's design: At least nVidia came up with their own design instead of just stealing it from ASUS like ATi did.
There you go, ASUS came up with the dual-GPU-on-a-single-PCB design, which ATi pretty much stole and improved upon to give us the 3870 X2 (and Sapphire did the same thing with their x1950 Pro Dual card). And yes, ASUS wasn't actually the first to do it, just the most well known; Gigabyte did it with 6600GT cores.
Edit: I actually commend nVidia for coming up with its own design instead of just copying what has already been done (in the nVidia camp at that :))
Voodoo 2
Rage Fury Maxx
Voodoo 5 5500
Gigabyte 3D1 6600GT
Gigabyte 3D1 6800GT
ASUS 7800GT Dual
Sapphire X1950 Pro Dual
So as you can see from these, it's common.
Wow, this thread has gone off the rails.
We'll definitely have to see, though, as I don't think I've read that being touched upon in any reviews yet.
On the whole dual-GPU thing - 3DFX, IIRC, were the first to go that route; they were also the first to implement a multiple-card setup through SLI (not just dual card). But even 3DFX had severe limitations with this technology, and performance was bleh compared to the stoopid heat output of those GPUs. nVidia put the SLI tech on the back burner after their acquisition of 3DFX, and didn't start investing in it again until ATI started developing Crossfire, and for some reason nVidia has lagged behind in performance in a multiple-card setup (when comparing percentage improvement against percentage improvement versus ATI's offerings). Maybe, building off of 3DFX's start with SLI, a lot of the limitations that were inherent then have carried over to now; who knows?
Although, if nVidia gets SLI working as solidly as ATI's Crossfire, these dual-PCB setups will be a nightmare to beat, as each GPU has its own resources and doesn't have to "share". TBH, I think that will end up being the biggest limitation of ATI's 3870 X2.
And wasn't it Alienware that first went with a dual-card solution using nVidia GPUs (talking modern GPUs here)? That is the first I ever remember hearing about multiple-GPU setups, and nVidia soon released their SLI. I don't even remember Crossfire being mentioned until after SLI was already on the market. In fact, SLI was on the market in June of 2004, and Crossfire wasn't on the market until September of 2005, more than a year later.
I think you have your timelines, and who created what to compete with what, confused. Crossfire was developed to compete with nVidia's SLI, and it only recently reached the level of performance improvement that SLI gives. ATI just finally got Crossfire working as solidly as SLI.