# NVIDIA GeForce 9800 GX2 Dismantled



## malware (Jan 28, 2008)

Pictures of the NVIDIA GeForce 9800 GX2 dual card have popped up online again, showing some exclusive details never seen before. More pictures can be found here.

*View at TechPowerUp Main Site*


----------



## GSG-9 (Jan 28, 2008)

Only one core? I thought the 9800 was physically two 8800s, so it's a dual-core chip.


----------



## Sasqui (Jan 28, 2008)

^Yea, same thoughts here...  "why only one GPU?"


----------



## DaMulta (Jan 28, 2008)

GSG-9 said:


> Only one core? I thought the 9800 was physically two 8800s, so it's a dual-core chip.



No, they didn't put both cores on the same PCB. The PCBs are bolted together, and the card runs on one PCI-E slot.

There isn't a shot of it put together.


----------



## INSTG8R (Jan 28, 2008)

GSG-9 said:


> Only one core? I thought the 9800 was physically two 8800s, so it's a dual-core chip.



I'm a bit confused on that now too, except I thought it was a two-PCB layout, so I think they must just be showing one. If you look at the third pic, it appears to have the bridge for the second PCB.


----------



## AsRock (Jan 28, 2008)

GSG-9 said:


> Only one core? I thought the 9800 was physically two 8800s, so it's a dual-core chip.



They left the other PCB on the heatsink, by the looks of it. The pics don't show up on that other site for me, so maybe they show it there.


----------



## malware (Jan 28, 2008)

Yes, there are two PCBs, but the pictures show them one by one... which creates the illusion that there's only one core. I'll add a picture of the second PCB in a few minutes.


----------



## Xaser04 (Jan 28, 2008)

malware said:


> Yes, there are two PCBs, but the pictures show them one by one... which creates the illusion that there's only one core. I'll add a picture of the second PCB in a few minutes.



I take it then that each PCB fits onto one side of the cooler (kind of like sandwiching the cooler?!).


----------



## DaMulta (Jan 28, 2008)

Toms has a pic with the cooler on


----------



## btarunr (Jan 28, 2008)

Wow, since the release of the HD 3870 X2, it's now NVIDIA's turn for all the viral-marketing drama (pre-release reviews, mysterious pictures, etc.), just to get people interested in the product.


----------



## Hawk1 (Jan 28, 2008)

Wonder if temps are going to be an issue with the cores sandwiched? Aftermarket cooler/waterblock makers will have their work cut out for them on this one (if even possible).


----------



## Mattgal (Jan 28, 2008)

Of course! Take a look at the pic. There are two HDMI ports on top of each other.


----------



## btarunr (Jan 28, 2008)

GSG-9 said:


> Only one core? I thought the 9800 was physically two 8800s, so it's a dual-core chip.



Pics #2 and #3 show two different PCBs. The second PCB has no PCI-E expansion interface.


----------



## newtekie1 (Jan 28, 2008)

So the cores face each other, and the cooler is placed in between. I like this design a lot better than the 7950 GX2's.


----------



## MicroUnC (Jan 28, 2008)

GSG-9 said:


> Only one core? I thought the 9800 was physically two 8800s, so it's a dual-core chip.




It has two PCBs; it's going to be a sandwich: two PCBs with a fan and heatsink between them. Very tasty.


----------



## zOaib (Jan 28, 2008)

newtekie1 said:


> So the cores face each other, and the cooler is placed in between. I like this design a lot better than the 7950 GX2's.



The idea is great, but we have to wait and see real temps and other numbers once this thing gets reviewed... but on the other hand, ATI did a better job just putting two cores on one PCB. I guess NVIDIA is just too lazy to do that instead... just my opinion.


----------



## phanbuey (Jan 28, 2008)

zOaib said:


> But on the other hand, ATI did a better job just putting two cores on one PCB. I guess NVIDIA is just too lazy to do that instead... just my opinion.



Haha... maybe true... Can't wait to see how this stacks up against the 3870 X2... seems like NVIDIA may actually have some trouble with this one, as SLI doesn't scale as well... Personally I would rather have seen a single-chip solution... like a 9800GTX with 256 SPs, but that may be asking for a bit too much atm.


----------



## newtekie1 (Jan 28, 2008)

zOaib said:


> The idea is great, but we have to wait and see real temps and other numbers once this thing gets reviewed... but on the other hand, ATI did a better job just putting two cores on one PCB. I guess NVIDIA is just too lazy to do that instead... just my opinion.



There are ups and downs to both setups.  With ATI's design, one core is getting cooled by the hot air from the other core.  So one core is guaranteed to run a lot hotter than the other.  The other option is to use two heatsinks and fans, but then the hot air isn't being exhausted out of the case, which is also bad.

I don't really think one design is better than the other.


----------



## mab1376 (Jan 28, 2008)

does anyone know the msrp?


----------



## CH33T03S (Jan 28, 2008)

mab1376 said:


> does anyone know the msrp?



Yeah, that's what I want to know too!


----------



## EnergyFX (Jan 28, 2008)

mab1376 said:


> does anyone know the msrp?



How much do you make?


----------



## OnBoard (Jan 28, 2008)

The 3870 X2 wins in the aftermarket-cooler department; this one you can't even put back together after you dismantle it - there's going to be that one left-over part. But couldn't they at least use some quality thermal pads instead of that gooey fibre stuff on the high-end model :/

That sandwich idea is indeed tasty. Comes with instructions:

Remove packaging
Heat up 30min with favorite FPS
Enjoy the crispy silvery taste


----------



## candle_86 (Jan 28, 2008)

Hawk1 said:


> Wonder if temps are going to be an issue with the cores sandwiched? Aftermarket cooler/waterblock makers will have their work cut out for them on this one (if even possible).




Shouldn't the bolting holes from the 7950GX2 carry over?


----------



## newtekie1 (Jan 28, 2008)

This should actually be easier to watercool than the 7950GX2. With this you can just use one sandwiched waterblock. And from the looks of the stock cooler, it shouldn't be too hard to build a waterblock that fits in that space.


----------



## Hawk1 (Jan 28, 2008)

newtekie1 said:


> This should actually be easier to watercool than the 7950GX2. With this you can just use one sandwiched waterblock. And from the looks of the stock cooler, it shouldn't be too hard to build a waterblock that fits in that space.



Yeah, now that I have a second look at it, it may be easier to WC this thing compared to the 7950's. It looks like it may be difficult to take apart/put back together without undue damage/scratches to the outer shell (thinking for warranty purposes, if you have any issues with the card).


----------



## candle_86 (Jan 28, 2008)

never heard of RMA refusal for a scratch


----------



## Hawk1 (Jan 28, 2008)

candle_86 said:


> never heard of RMA refusal for a scratch



No, but if it's clearly been tampered with/opened up, you may get some issues - I think EVGA is the only one that allows aftermarket cooling on their cards without voiding the warranty.


----------



## candle_86 (Jan 28, 2008)

EVGA and XFX.


----------



## TheLostSwede (Jan 28, 2008)

Well, this card seems to use some kind of ribbon cable to connect the two PCBs, by the looks of this image - http://www.techpowerup.com/img/08-01-28/1-3.jpg
This suggests that this really is just a pair of cards in SLI using a single PCIe interface.
It seems like the only reason the second PCB has an eight-pin power connector is because it can't draw any power from the PCIe interface, as the ribbon cable doesn't allow for power between the two cards. Seems like a strange setup and a bit of a bodge job in all honesty.
It really looks like Nvidia likes to do "quick fixes" these days like the nForce 780i chipset...


----------



## shoman24v (Jan 28, 2008)

This card(s) will definitely be over $1000.


----------



## Hawk1 (Jan 28, 2008)

shoman24v said:


> This card(s) will definitely be over $1000.



LOL, I don't think they would get many buyers (maybe Candle) at that price. Even if it does a million FPS in Crysis on a 30" screen, it would not be a great seller.


----------



## candle_86 (Jan 28, 2008)

TheLostSwede said:


> Well, this card seems to use some kind of ribbon cable to connect the two PCBs, by the looks of this image - http://www.techpowerup.com/img/08-01-28/1-3.jpg
> This suggests that this really is just a pair of cards in SLI using a single PCIe interface.
> It seems like the only reason the second PCB has an eight-pin power connector is because it can't draw any power from the PCIe interface, as the ribbon cable doesn't allow for power between the two cards. Seems like a strange setup and a bit of a bodge job in all honesty.
> It really looks like Nvidia likes to do "quick fixes" these days like the nForce 780i chipset...



Actually, it makes perfect sense how they did it: if they do it like ATI did, there are length concerns. The 7950GX2 is built very similarly and sold decently well. Also, don't expect a 1000-dollar card; expect, say, 500.


----------



## AddSub (Jan 28, 2008)

> This card(s) will definitely be over $1000.



I wouldn't be surprised. Not that this card is actually worth that much. Or half that much, since multi-GPU solutions in general are feeble and unimpressive performers, if similar products in the past are any kind of indicator. But this is nvidia we are talking about, and they know how to harness and exploit the hype around their products. Right now, nvidia is the Antec of GPUs. In other words, actual performance has nothing to do with price.

Since ATI is pricing their 3870 X2's at $450, these GX2's will go for about $550-$700, depending on whether it's an OC/ultra/extreme/supa-dupa version or not.


----------



## pentastar111 (Jan 28, 2008)

Hawk1 said:


> Wonder if temps are going to be an issue with the cores sandwiched? Aftermarket cooler/waterblock makers will have their work cut out for them on this one (if even possible).


 My thoughts exactly...Have fun cooling that m/f down...looks like a thermal nightmare.


----------



## pentastar111 (Jan 28, 2008)

I think this card/cards will perform well... provided the drivers are up to snuff (cough, cough) - anyone remember the 7950 fiasco? nVidia has and does have some really good stuff out there... I have three of their cards: a 7600 GT (best bang for the buck ever) and two 640MB 8800GTS's. All of these products have performed flawlessly for over a year now. I don't believe this X2 card will be one of them... Heat will probably be just one of the factors making this card destroy all fun within a 3-mile radius... drivers will be the next, and the third, of course, is going to be price... While I don't think it will top a grand, I don't think $600 to $750 a card will be uncommon. I could be wrong... we shall see... right now those ATIs are looking pretty sweet... And that is good... more competition spurs more innovation and better products AND (cough, cough) lower prices


----------



## PVTCaboose1337 (Jan 28, 2008)

I'm thinking about $600 or they will have no market.


----------



## WOutZoR (Jan 28, 2008)

pentastar111 said:


> My thoughts exactly...Have fun cooling that m/f down...looks like a thermal nightmare.


Let's wait until ViperJohn gets his hands on the 9800 GX2.

He was the first to watercool the 7900 GX2, remember?


----------



## erocker (Jan 28, 2008)

WOutZoR said:


> Let's wait until ViperJohn gets his hands on the 9800 GX2.
> 
> He was the first to watercool the 7900 GX2, remember?



Yeah, that's nice and all, but it's too bad the drivers weren't any good.


----------



## tkpenalty (Jan 28, 2008)

That cooler is just an 8800GT's cooler multiplied by two and soldered together... I smell overheating. Unless these are the 8800GT G92s and underclocked, both of these cards will run around 90°C, unless that fan is super loud.

I'd still prefer the HD3870X2, a far easier option if you want aftermarket cooling. Yes, water cooling is possible with this, but it's not easy.

That definitely looks nice, however.

I can expect this card to perform somewhat slower than the HD3870 in some cases, with the disadvantage that SLI brings. Moreover, will our PCs detect this as one or two GPUs? If it's two... this card won't sell.


----------



## imperialreign (Jan 28, 2008)

TBH, to me this card looks like a last minute panic response to the 3870x2 that's been hyped up over the last month or two. 

Just my opinion, unless nVidia gets their SLI on-par with ATI's Crossfire, nVidia is going to be out of their league in multi-GPU setups.  This is going to become very interesting - I'm looking forward to seeing how this card performs when compared to ATI's new monstrosity.


I like how it looks, though.  That casing around it is sweet.


----------



## TooFast (Jan 28, 2008)

lol, that's the best NVIDIA could do - slap two cards together. lol, let me guess, you need an SLI board to run it! And all that heat in the case. OMG


----------



## Ripper3 (Jan 29, 2008)

This is what Nvidia does for a living. The 7950GX2 was their answer to the X1950XTX... they couldn't get a decent card with a single GPU, so they stuck two together instead.
Seems they're doing the same with the 8800.
The cooling won't be a problem, methinks. The 65nm 8800 doesn't seem to be bothered by heat at all, so this cooling solution will still work. I'm sure that cooler might even get better temps than the stock 8800GT one, since that fan looks very big and chunky. If the new bigger fan on the single-slot stock 8800GT cooler is anything to go by, this might be a slow-spinning fan, trying to keep the noise down and only just keeping the cores alive (the 60mm ran at extremely high speeds to keep the GPU at, say, 83°C and made a hell of a racket, while the newer 70mm model can keep the GPU a tiny bit cooler, at ~80°C, at much lower RPMs and hardly audible, according to the released material).
Replacing the cooling may be trouble, since the two GPUs face each other, and that ribbon cable connecting the PCBs doesn't look like it likes stretching (hell, it looks flimsy enough that it might come out while putting on the stock cooler, much less an after-market one).
Watercooling with a single replacement block in the middle is definitely a possibility, and someone's probably already thinking it up.

I must say the ATi arrangement for the 3870 X2 looks much better in terms of simplicity and of having a tidier layout (no chance of misplacing that second PCB layer when doing maintenance). Yes, on air cooling the second GPU will be receiving the hotter air, but I'm unsure if it will make such a large difference. Watercooling, meanwhile, should be a piece of cake: just get two separate waterblocks, like the Maze 5, along with two packs of RAM 'sinks and a low-profile chipset 'sink for the PCI-E switch between the cores.


----------



## Mussels (Jan 29, 2008)

DaMulta said:


> Toms has a pic with the cooler on



Even if it's dual-PCB, that looks quite good, and it's still only two slots WITH the cooler.


----------



## AddSub (Jan 29, 2008)

> I must say that the ATi arrangement for the 3870 X2 looks to be much better



Yup. X2 looks much more accessible as far as custom cooling solutions go. Heck, you could fix a water block on one GPU, and go passive with the other one. Not that it would be a good idea to do so, but it shows how much room you got to maneuver. 

GX2 smells like desperation on part of nvidia. And the latest delay for the 9xxx lineup doesn’t help their image, an image they try so hard to keep up. Let’s face it, a large part of nvidia is in fact their “GeForce” branding, and in smaller part their “nVidia” branding. They could take some 10 year old Voodoo boards and rename them GeForce VoDo and people would eat em up. Certain type of people that is.

Anyways, this whole “delay” in order to work out some bugs, or whatever, seems like pure unfiltered guano to begin with. I mean, the only time a firm like nvidia would admit to having some serious bugs in their latest product lineup is when there is some bigger issue to hide, like, oh I don’t know, maybe performance issues?


----------



## indybird (Jan 29, 2008)

Hawk1 said:


> Yeah, now that I have a second look at it, it may be easier to WC this thing compared to the 7950's. It looks like it may be difficult to take apart/put back together without undue damage/scratches to the outer shell (thinking for warranty purposes, if you have any issues with the card).



Well, if you're watercooling, you'll be voiding the warranty anyway... I think there will actually be some after-market waterblocks for these because of the simplicity of it.

Oh boy, I can't wait for this card. Since they're delaying because of driver issues, it makes me confident that this card will be pretty solid by its release. And I am very sure this card will be around $450 to $550. Those are very affordable prices for a card of this power.

Maybe I'm being over-optimistic, but we'll see...

-Indybird


----------



## imperialreign (Jan 29, 2008)

AddSub said:


> Yup. X2 looks much more accessible as far as custom cooling solutions go. Heck, you could fix a water block on one GPU, and go passive with the other one. Not that it would be a good idea to do so, but it shows how much room you got to maneuver.
> 
> GX2 smells like desperation on part of nvidia. And the latest delay for the 9xxx lineup doesn’t help their image, an image they try so hard to keep up. Let’s face it, a large part of nvidia is in fact their “GeForce” branding, and in smaller part their “nVidia” branding. They could take some 10 year old Voodoo boards and rename them GeForce VoDo and people would eat em up. Certain type of people that is.
> 
> Anyways, this whole “delay” in order to work out some bugs, or whatever, seems like pure unfiltered guano to begin with. I mean, the only time a firm like nvidia would admit to having some serious bugs in their latest product lineup is when there is some bigger issue to hide, like, oh I don’t know, maybe performance issues?



not so sure on the performance thing . . . but their kicker is staying ahead of ATI as far as performance goes.

The 8800 GTX/Ultra cards are going to be very, very hard for nVidia to top out performance wise - you can only go so far with processor architecture.  The last few models of 8800s being pumped out varied more in the amount of DRAM than anything else, it seemed - almost like they were milking the architecture for all it's worth.

But this product is typical of nVidia, too: whenever they start to feel threatened by ATI, they literally slap something together and throw it out the door. TBH, this card setup really doesn't look like much thought was put into it - two PCBs, two GPUs, one cooler . . .

If it wasn't for that pretty chassis enshrouding the two PCBs, it'd be one fugly VGA adapter - a very hack-and-slash approach.

I'm curious to see if they're actually cooking up a single PCB/dual GPU offering for the second revision 9800 GX2s or 9800 Ultras (if they go the "Ultra" route again).

It wouldn't surprise me, though, if we start seeing more jimmy-rigged looking setups as we get closer to a release of the secretive R700.


----------



## phanbuey (Jan 29, 2008)

I don't think you guys realise the performance advantage nVidia had until now, and for how long... and how long the R600 got pushed back in the beginning. Let's face it, the 8800 series were the best... this X2 doesn't "dominate" by any means, only in a few games and at extremely high res... ATI still doesn't have all their garbage in one bag - and the fact that they recycled their R600 architecture from the HD 2900 to the HD 3800 is no different from what nvidia did with their "milking" of the G80 core... still, the X2, which is two new chips, gets beaten by NV cards that are over a year old now in some games - that should not happen, irrelevant of who wrote the game and blah blah. The VLIW architecture is just very hit and miss depending on the application, which ultimately makes the R600 cards unreliable performers.

In reality, I think the next gen is going to be ludicrously fast... ATi and Nvidia are both buying time with their X2 and GX2 cards.

I love ATI, and I think this X2 is amazing even with crap drivers, but don't forget that Nvidia was Forbes' company of the year out of EVERY industry - those guys are rolling in money. Just pray for ATI that the 9800GX2 is not twice as fast as the X2. Also, I thought the GX2 was announced a looooong time ago... like months before the G92 core was even out.


----------



## candle_86 (Jan 29, 2008)

TooFast said:


> lol, that's the best NVIDIA could do - slap two cards together. lol, let me guess, you need an SLI board to run it! And all that heat in the case. OMG



And all AMD could do was slap two GPUs on one PCB, make it longer, and have them share the air. Both are good cards - stop being a hater.


As for performance, remember this isn't 8800GT SLI; this is 8800GTS G92, so 2x 128 SPs, meaning simply that it will give the HD3870 a KO.


----------



## candle_86 (Jan 29, 2008)

phanbuey said:


> I don't think you guys realise the performance advantage nVidia had until now, and for how long... and how long the R600 got pushed back in the beginning.
> 
> ...




That's because of the design of the chip: it has 5 groups of 64 shaders, while NVIDIA has 8 groups of 16 shaders. So NVIDIA can basically load the card better: if, say, you need 13 shader instructions, one 16-unit group covers it and leaves NVIDIA 7 more groups (112 shaders), but AMD has to commit a whole 64-unit group to do 32, leaving it with only 4 groups. AMD's could be used better, but they were so late that most game devs didn't optimize for the R600 GPU and thus access it the same way as the G80, which leaves the R600 at a major disadvantage.
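The grouping argument above can be sketched with a toy occupancy model. This is a deliberate simplification, not real scheduler behavior: the group sizes are the ones claimed in the post (not verified specs), and `units_tied_up` is a hypothetical helper.

```python
# Toy model of the shader-grouping argument: a scheduler that can only hand
# out whole groups of shader units strands more of them when the group is
# coarse. Group sizes below are the ones claimed in the post above.

def units_tied_up(group_size: int, work_items: int) -> int:
    """Shader units reserved when work is assigned in whole groups."""
    groups_used = -(-work_items // group_size)  # ceiling division
    return groups_used * group_size

# A small batch of 32 work items on the two claimed layouts:
print(units_tied_up(group_size=16, work_items=32))  # 8x16 layout: 32 units busy, none idle
print(units_tied_up(group_size=64, work_items=32))  # 5x64 layout: 64 units reserved, 32 idle
```

With the finer 16-unit groups, a small batch leaves all remaining units free for other work; with 64-unit groups, the same batch strands half a group, which is the utilization edge the post describes.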


----------



## TooFast (Jan 29, 2008)

candle_86 said:


> And all AMD could do was slap two GPUs on one PCB, make it longer, and have them share the air. Both are good cards - stop being a hater.
> 
> 
> As for performance, remember this isn't 8800GT SLI; this is 8800GTS G92, so 2x 128 SPs, meaning simply that it will give the HD3870 a KO.




Hater! It's the truth - it's two cards glued together! The X2 is a true single card, even if it had 16 GPUs on it. As for the glued NVIDIA card, it will surely be a $600+ card with LOWER CLOCKS. IT MIGHT NOT EVEN BEAT THE X2


----------



## xvi (Jan 29, 2008)

I wonder if I could go eBay my old ATI Radeon 9800 for $500 when this comes out.
"9800 GRAPHICS CARD".. "Play games faster than ever before!"


----------



## candle_86 (Jan 29, 2008)

A few did something similar with the Radeon 8500; I know a few people who thought they were getting a rare 8500GT AGP card.


----------



## newtekie1 (Jan 29, 2008)

Funny how there are so many of you that are willing to start bashing nVidia so quickly.

The 7900 GX2 (7950GX2) was not made to combat the x1950XTX; it was to experiment with Quad-SLI. The 7900GTX did perfectly fine competing with the x1950XTX - it wasn't quite as fast, but it did the job, and there were overclocked versions that were on par with the x1950XTX.

This isn't simply a thrown-together solution to compete with the 3870 X2; I wouldn't be surprised if there was just as much planning in this card as in the 3870 X2. This isn't the first time nVidia has done it this way, and it worked in the past (with the exception of poor driver support), so why change the way it is done? Why manufacture one extremely complicated PCB (and the 3870 X2's PCB is extremely complicated) when you can manufacture two PCBs that aren't much more complicated than a normal video card's?

Besides that, the 3870 X2 was just a solution to compete with nVidia's 8800GTS(G92) and 8800 Ultra.  So making the argument that the 7950GX2 was just thrown together to compete with the x1950XTX and making a big deal out of the fact that nVidia could get a single GPU to compete with the x1950XTX is kind of hypocritical since that is exactly the problem ATI is facing right now.

The move to use two PCBs has its advantages. One major one I can see is that if one of the PCBs is bad, it is cheaper for nVidia to simply replace that one instead of replacing the whole card.

Yes, there are advantages to both designs. I'm not saying one is better than the other. ATI chose their method, and nVidia stuck with the method that has worked for them in the past. By the way, which one of the two has had a dual-GPU card that was actually successful before? And what design did it use? Yeah, I can see why sticking with that design was such a bad move.


----------



## Mussels (Jan 29, 2008)

newtekie1 said:


> Funny how there are so many of you that are willing to start bashing nVidia so quickly.
> 
> The 7900 GX2 (7950GX2) was not made to combat the x1950XTX; it was to experiment with Quad-SLI. The 7900GTX did perfectly fine competing with the x1950XTX - it wasn't quite as fast, but it did the job, and there were overclocked versions that were on par with the x1950XTX.
> 
> ...



And as another thing... what's so wrong with sandwiching the cooler in between the cards?

The ATI solution has one GPU running a lot hotter than the other, while the NV one gets away with a single fan and one heatsink cooling both GPUs - all components that need cooling are on the inside, so you've essentially got yourself a duct. Seal it up and get some CFM passing through, and the whole card (both cards' GPUs, RAM, voltage circuitry, etc.) is going to get some good cooling.


----------



## newtekie1 (Jan 29, 2008)

Mussels said:


> And as another thing... what's so wrong with sandwiching the cooler in between the cards?
> 
> The ATI solution has one GPU running a lot hotter than the other, while the NV one gets away with a single fan and one heatsink cooling both GPUs - all components that need cooling are on the inside, so you've essentially got yourself a duct. Seal it up and get some CFM passing through, and the whole card (both cards' GPUs, RAM, voltage circuitry, etc.) is going to get some good cooling.



Exactly! The reviews of the 3870 X2 have the first GPU running at ~65°C under load, while the second one reaches ~80°C. Not dangerous temperatures, but certainly a little concerning. It seems to me like nVidia took their proven design and improved upon it.


----------



## mR Yellow (Jan 29, 2008)

phanbuey said:


> I don't think you guys realise the performance advantage nVidia had until now, and for how long... and how long the R600 got pushed back in the beginning.
> 
> ...



Well said! nVidia is on top atm. The X2 is more like ATi's only hope of beating nVidia... and then only in some games.

I personally won't be buying either card; I'm waiting for the next-gen cards. These dual-chip solutions are just weak... from both sides.


----------



## candle_86 (Jan 29, 2008)

newtekie1 said:


> Funny how there are so many of you that are willing to start bashing nVidia so quickly.
> 
> The 7900 GX2(7950GX2) was not to combat the x1950XTX, it was to experiment with Quad-SLI.  The 7900GTX did perfectly fine competing with the x1950XTX, it wasn't quite as fast, but it did the job, and there were overclocked version that were on par with the x1950XTX.
> 
> ...



See, I agree here - and the 9800GX2 appeared in the 165.01 beta drivers back in May '07 as the 8800GX2, so NVIDIA had been playing with the idea for a while.


----------



## Tamin (Jan 29, 2008)

mR Yellow said:


> Well said! nVidia is on top atm. The X2 is more like ATi's only hope of beating nVidia... and then only in some games.
> 
> I personally won't be buying either card; I'm waiting for the next-gen cards. These dual-chip solutions are just weak... from both sides.



Enlighten us please: what will you buy? The next next? Like past the 9800s and 3800s?


----------



## Mussels (Jan 29, 2008)

Tamin said:


> Enlighten us please: what will you buy? The next next? Like past the 9800s and 3800s?



He'd probably go for the next single-card solution, like I would.

These cards aren't slow - but neither are they a big leap. Compare a 7900GTX or X1900XTX to the 8800GTX. We all want another leap like that, not these 5-10% gains.


----------



## Tamin (Jan 29, 2008)

Yes sir! Hallelujah. And I wanted to sell my "2900 card PC" and get 2x 3870s; guess I'll wait... thank you!


----------



## candle_86 (Jan 29, 2008)

That doesn't happen often in this world.

It's only happened a few times, actually:

Voodoo 1 to Voodoo 2

Radeon 8500 to Radeon 9700 Pro

GeForce FX 5950 Ultra to 6800 Ultra

GeForce 7900GTX to GeForce 8800GTX


----------



## Mussels (Jan 29, 2008)

Tamin said:


> Yes sir! Hallelujah.



I've had my GTX for over a year. I paid AU$700 for it.

Even after a year, with only a swap to a quieter cooler, this card still fights for third place in games (8800 Ultra vs 3870 X2 for first and second, with the GTX third).

If you compare that to the people getting 'the best' all the time - they spend $400-500 on mid-range hardware every 6 months - the GTX certainly has huge value for money, since it's still going strong so long after it came out.

In all the years I've been a gamer, I've not seen a card hold up this well since the Radeon 9700 Pro - and if you think of the 9800 as the Ultra version, it's quite similar in its dominance to the 8800GTX/Ultra.

Edit: candle beat me to it. Damn.


----------



## Tamin (Jan 29, 2008)

Imagine if all these companies would care for us and release one card per year! Haha, sweet!


----------



## candle_86 (Jan 29, 2008)

Mussels said:


> I've had my GTX for over a year. I paid $700 AU for it.
> 
> Even after a year, with only a swap to a quieter cooler, this card still fights for 3rd place in games (8800 Ultra vs 3870 X2 for 1st and 2nd, with the GTX third).
> 
> ...



I beg to differ there. The GeForce2 was the longest supported: it came out in 2000 and will run Call of Duty 2, I know, I had to do it for a few days lol. That's 6 years of life from it. The 9700 Pro went from 2003 till 2007, so it's only got 4 years under its belt.


----------



## Ripper3 (Jan 29, 2008)

@ Candle: The FX 5950 Ultra to the 6800 Ultra is certainly true, and to illustrate the huge gap in performance, even the 6600GT was capable of kicking the ass of an FX 5950 at times. I'm still waiting for that to happen again. The 7600GT almost did the same thing to the 6800 Ultra at stock speeds. The 8600GT/GTS could have been a bit better, but did keep up with the 7900s and could beat the 7800s.

Oh, and I do agree with Newtekie1, I think I was a bit hypocritical in my comparison, but I guess I was sleepy, heh.
I did always think the 7950GX2 was made to give Nvidia a fighting chance against the X1950XTX, that's how it looked to me at least, when it was released.



candle_86 said:


> I beg to differ there. The GeForce2 was the longest supported: it came out in 2000 and will run Call of Duty 2, I know, I had to do it for a few days lol. That's 6 years of life from it. The 9700 Pro went from 2003 till 2007, so it's only got 4 years under its belt.



If I had known my old GF2 was capable of Call of Duty 2, I would have used it when I switched graphics cards about a year back. I was _SO_ bored... but anyhow, come to think of it, what DX version did the GF2 support? If it was DX7, then a lot of games have had good support for it. Call of Duty 2, as mentioned, and HL2 and all the variations thereof (since they support DX7 AFAIK). Certainly right about the support for it.
The 9700 still has support in games, if running slowly is fine by the user. Many games can still use basic DX9 instead of DX9c for rendering.


----------



## Hawk1 (Jan 29, 2008)

Ripper3 said:


> I did always think the 7950GX2 was made to give Nvidia a fighting chance against the X1950XTX, that's how it looked to me at least, when it was released.



I think the companies release things like this as a stop-gap before their next big release. The X1950XTX with GDDR4 was not necessary, as it only provided a marginal improvement over the X1900XTX (although it held the crown of "top spot" (arguably) for the month or two it was out before the 8800s, and it was a bit quieter than its predecessor).

The 7950GX2 was an experiment for Nvidia and mostly a high-priced novelty item for the big spenders before the 8800s. It was a great concept, but never got the driver support it (the owners) deserved - I think due to the 8800 driver issues with Vista, all NV resources were focused on getting that straight, which left the GX2 owners blowing in the wind.

I think it's the same with these current releases of the X2 and GX2 (well, maybe less so for ATI - they just wanted to say they had the fastest for a while, like with the X1950XTX). They are stop-gaps for R700/G100, which will come later this year and which, if rumour has it right, will be the next "big" leap in performance.


----------



## newtekie1 (Jan 29, 2008)

The 7900 GX2 and 7950 GX2 were made to allow quad-SLI, not to compete with the X1950XTX.  The 7900GTX did just fine competing against the X1950XTX, as did the 8800 series, which came very shortly after the release of the X1950XTX.  In case you guys forgot, the 7900GX2 came out in January of 06; the X1950XTX didn't come out until August of 06.  So you are saying nVidia released a card 8 months before the X1950XTX because they wanted to compete with it?  What kind of logic is that?  The X1950XTX was just hitting the drawing boards when the 7900GX2 came out, and by the time the X1950XTX hit, nVidia was only 2 months from releasing the 8800 series.

And to all the people bashing nVidia's design:  At least nVidia came up with their own design instead of just stealing it from ASUS like ATi did.


----------



## Hawk1 (Jan 29, 2008)

newtekie1 said:


> And to all the people bashing nVidia's design:  At least nVidia came up with their own design instead of just stealing it from ASUS like ATi did.



They did? Link?


----------



## newtekie1 (Jan 29, 2008)

Hawk1 said:


> They did? Link?



ASUS GeForce 7800GT Dual

There you go: ASUS came up with the dual-GPU-on-a-single-PCB design, which ATi pretty much stole and improved upon to give us the 3870 X2 (and Sapphire did the same thing with their X1950 Pro Dual card).  And yes, ASUS wasn't actually the first to do it, just the most well known.  Gigabyte did it with 6600GT cores.

Edit: I actually commend nVidia for coming up with its own design instead of just copying what has already been done (in the nVidia camp, at that )


----------



## Hawk1 (Jan 29, 2008)

newtekie1 said:


> ASUS GeForce 7800GT Dual
> 
> There you go: ASUS came up with the dual-GPU-on-a-single-PCB design, which ATi pretty much stole and improved upon to give us the 3870 X2 (and Sapphire did the same thing with their X1950 Pro Dual card).  And yes, ASUS wasn't actually the first to do it, just the most well known.



No, I meant that Nvidia came up with its own design... jk, thanks for the link.


----------



## candle_86 (Jan 29, 2008)

No, ASUS didn't.

Voodoo 2

Rage Fury Maxx

Voodoo 5 5500

Gigabyte 3D1 6600GT

Gigabyte 3D1 6800GT

ASUS 7800GT Dual

Sapphire X1950 Pro Dual

So as you can see from these, it's common.


----------



## AddSub (Jan 29, 2008)

You forgot XGI's Volari Duo.

Wow, this thread has gone off the rails.


----------



## newtekie1 (Jan 29, 2008)

candle_86 said:


> So as you can see from these, it's common.



Correct, I stated that ASUS wasn't actually the first to do it.  My point was that nVidia designed their own method with the 7900GX2 instead of just reusing previous designs.


----------



## Tamin (Jan 29, 2008)

Look how they evolved, the bastards.


----------



## imperialreign (Jan 29, 2008)

Mussels said:


> And as another thing... what's so wrong with sandwiching the cooler in between the cards?
> 
> The ATI solution has one GPU running a lot hotter than the other, while the Nv one gets away with a single fan and one heatsink cooling both GPUs - all components that need cooling are on the inside, so you've essentially got yourself a duct. Seal it up and get some CFM passing through, and the whole card (both cards' GPUs, RAM, voltage chips, etc.) is going to get some good cooling.



I'm not so sure about the thought that the 3870 X2 will have one GPU running hotter than the other - given that one GPU has an aluminum-based cooler whereas the second uses copper, both should theoretically stay very close to the same temp.

We'll have to definitely see, though, as I don't think I've read that being touched on in any reviews yet.


On the whole dual-GPU thing - 3DFX, IIRC, were the first to go that route; also the first to implement a multiple-card setup through SLI (not just dual-card).  But even 3DFX had severe limitations with this technology, and performance was bleh compared to the stoopid heat output of those GPUs.  nVidia put the SLI tech on the back burner after their acquisition of 3DFX and didn't start investing in it again until ATI started developing Crossfire, and for some reason nVidia has lagged behind in multi-card performance (comparing percentage-for-percentage improvement against ATI's offerings).  Maybe, building off of 3DFX's start with SLI, a lot of the limitations that were inherent then have carried over to now, who knows?

Although, if nVidia gets SLI working as solidly as ATI's Crossfire, these dual-PCB setups will be a nightmare to beat, as each GPU has its own resources and doesn't have to "share".  TBH, I think that will end up being the biggest limitation of ATI's 3870 X2.


----------



## newtekie1 (Jan 30, 2008)

imperialreign said:


> I'm not so sure about the thought that the 3870 X2 will have one GPU running hotter than the other - given that one GPU has an aluminum-based cooler whereas the second uses copper, both should theoretically stay very close to the same temp.
> 
> We'll have to definitely see, though, as I don't think I've read that being touched on in any reviews yet.



Read the review on the 3870X2 here at TPU.  One of the cores ran at 65°C under load, and the other ran at 80°C.

And wasn't it Alienware that first went with a dual-card solution using nVidia GPUs (talking modern GPUs here)?  That is the first I ever remember hearing about multiple-GPU setups, and nVidia soon released their SLI.  I don't even remember Crossfire being mentioned until after SLI was already on the market.  In fact, SLI was on the market in June of 2004, and Crossfire wasn't on the market until September of 2005, more than a year later.

I think you have your timelines, and who created what to compete with whom, confused.  Crossfire was developed to compete with nVidia's SLI, and it only recently reached the level of performance improvement that SLI gives.  ATI just finally got Crossfire working as solidly as SLI.


----------



## PVTCaboose1337 (Jan 30, 2008)

newtekie1 said:


> Read the review on the 3870X2 here at TPU.  One of the cores ran at 65°C under load, and the other ran at 80°C.



Yeah, the aluminum HS was on one core and the copper HS on the other core...  weird cooler design.


----------



## Mussels (Jan 30, 2008)

PVTCaboose1337 said:


> Yeah, the aluminum HS was on one core and the copper HS on the other core...  weird cooler design.



It was designed so that the first GPU's heat wasn't entirely dumped into the second one. It didn't work so well, as the heat is still really unbalanced.


----------



## Hawk1 (Jan 30, 2008)

Mussels said:


> It was designed so that the first GPU's heat wasn't entirely dumped into the second one. It didn't work so well, as the heat is still really unbalanced.



That's why I'm waiting to see how the ASUS/GeCube dual-fan versions do as far as cooling each core, and whether it causes any other significant heat problems for the card (obviously the heat remaining in the case will be an issue). I can't wait to see the 9800 and the cooling for it (how stock performs and what aftermarket goodies come about). It will be very interesting.


----------



## imperialreign (Jan 30, 2008)

newtekie1 said:


> Read the review on the 3870X2 here at TPU.  One of the cores ran at 65°C under load, and the other ran at 80°C.





I must have completely missed that reading W1z's review.  That's a ton of difference there!



newtekie1 said:


> And wasn't it Alienware that first went with a dual-card solution using nVidia GPUs (talking modern GPUs here)?  That is the first I ever remember hearing about multiple-GPU setups, and nVidia soon released their SLI.  I don't even remember Crossfire being mentioned until after SLI was already on the market.  In fact, SLI was on the market in June of 2004, and Crossfire wasn't on the market until September of 2005, more than a year later.
> 
> I think you have your timelines, and who created what to compete with whom, confused.  Crossfire was developed to compete with nVidia's SLI, and it only recently reached the level of performance improvement that SLI gives.  ATI just finally got Crossfire working as solidly as SLI.



You're right, Crossfire was designed to _compete_ with SLI, I never said otherwise; but nVidia didn't get on the ball with their technology until rumors were out as to what ATI was up to - and nVidia did not pioneer SLI; 3DFX did.  nVidia originally acquired the technology when they bought 3DFX back in late 2000.  They also acquired multi-GPU-per-PCB and multi-GPU/PCB + SLI, as 3DFX was also the company that pioneered those designs in their quest for supreme performance domination (the Voodoo5 6000, which was never released, was to have 4 GPUs on one PCB and come with its own power supply: http://www.x86-secret.com/articles/divers/v5-6000/v56kgb-6.htm . . . actually, if you get a chance, check that whole site from page 1, lots of interesting info there!).  But, like I said, after the acquisition nVidia didn't re-introduce SLI until '04.  ATI released Crossfire a year later, in '05 - not to get fanboyish here, but look who has come the furthest: nVidia acquired the technology and expanded on it, while ATI designed theirs from the ground up.


----------



## Mussels (Jan 30, 2008)

Crossfire was designed to beat SLI, and it failed miserably at the start. However, once they moved to an internal bridge like SLI's, they finally got it right and have been equal to SLI since - the performance gain is higher than SLI's for whatever reason, but it seems to be compatible with fewer games too (especially DX10 titles).

Both of them have potential; I like where ATI is heading with Fusion and CrossfireX.


----------



## imperialreign (Jan 30, 2008)

Mussels said:


> Crossfire was designed to beat SLI, and it failed miserably at the start. However, once they moved to an internal bridge like SLI's, they finally got it right and have been equal to SLI since - the performance gain is higher than SLI's for whatever reason, but it seems to be compatible with fewer games too (especially DX10 titles).
> 
> Both of them have potential; I like where ATI is heading with Fusion and CrossfireX.



The initial implementations were a little . . . bulky.  That external dongle wasn't that great an idea, and the need for a master/slave card was a little odd too.

I'm not 100% sure about the performance gains, though.  At this point I think it's 50/50 - TBH, I think it also comes down to the game devs; look at the *amazing* Crossfire performance increase everyone saw with the Crysis 1.1 patch.


----------



## btarunr (Jan 30, 2008)

The sole reason behind Crossfire > SLI is this:

The northbridge, be it the AMD 580X, 790FX or Intel X38, supplies all 32 PCI-E lanes to the video cards, which eases inter-GPU communication compared to the nForce 590 SLI and 680i SLI, where the northbridge and southbridge each independently supply a video card with 16 lanes, and the HyperTransport bus between the two chips gets relatively congested when doing multi-GPU rendering. The same factor partly brings down the efficiency of Crossfire setups on Intel P35 boards, where the second video card not only gets just 4 PCI-E lanes, but those 4 lanes come from the southbridge.
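To put rough numbers on those lane splits, here is a back-of-the-envelope sketch: the per-lane figure is the approximate PCIe 1.x rate (~250 MB/s per lane, per direction), and the lane counts follow the layouts described above rather than any chipset datasheet.

```python
# Approximate PCIe 1.x bandwidth per lane, per direction, in MB/s.
MB_PER_LANE = 250

def slot_bandwidth(lanes: int) -> int:
    """Peak one-way bandwidth in MB/s for a slot with the given lane count."""
    return lanes * MB_PER_LANE

# (lanes to slot 1, lanes to slot 2, whether slot 2 hangs off the southbridge)
layouts = {
    "790FX/X38 (all 32 lanes from northbridge)": (16, 16, False),
    "nForce 680i SLI (16 NB + 16 SB)":           (16, 16, True),
    "Intel P35 (16 NB + 4 SB)":                  (16, 4,  True),
}

for name, (lanes1, lanes2, via_southbridge) in layouts.items():
    note = ", shared with southbridge traffic" if via_southbridge else ""
    print(f"{name}: slot 2 = x{lanes2} -> {slot_bandwidth(lanes2)} MB/s{note}")
```

The P35's second slot tops out at a quarter of the bandwidth of a full x16 link, before even counting the southbridge-link contention the post describes.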


----------



## newtekie1 (Jan 30, 2008)

imperialreign said:


> You're right, Crossfire was designed to _compete_ with SLI, I never said otherwise; but nVidia didn't get on the ball with their technology until rumors were out as to what ATI was up to - and nVidia did not pioneer SLI; 3DFX did.  nVidia originally acquired the technology when they bought 3DFX back in late 2000.  They also acquired multi-GPU-per-PCB and multi-GPU/PCB + SLI, as 3DFX was also the company that pioneered those designs in their quest for supreme performance domination (the Voodoo5 6000, which was never released, was to have 4 GPUs on one PCB and come with its own power supply: http://www.x86-secret.com/articles/divers/v5-6000/v56kgb-6.htm . . . actually, if you get a chance, check that whole site from page 1, lots of interesting info there!).  But, like I said, after the acquisition nVidia didn't re-introduce SLI until '04.  ATI released Crossfire a year later, in '05 - not to get fanboyish here, but look who has come the furthest: nVidia acquired the technology and expanded on it, while ATI designed theirs from the ground up.




Nvidia didn't bring SLI out of mothballs because ATi was working on Crossfire.  It is the other way around: ATi started developing Crossfire because nVidia was bringing SLI out of mothballs.  And really, to say that 3DFX's SLI had anything to do with nVidia's current SLI is kind of off.  The only things the two share are the name and the concept; other than that, they are totally different.  nVidia built their current SLI from the ground up, they simply took the concept from 3DFX.  Comparing who came the furthest isn't really worth anything.  Obviously ATi has come the furthest with the technology, because Crossfire was a piece of crap when it was released; they had the furthest to come to be competitive.  SLI was a lot better when it was released, so nVidia hasn't needed to come as far.


----------



## Xaser04 (Jan 30, 2008)

Hawk1 said:


> That's why I'm waiting to see how the ASUS/GeCube dual-fan versions do as far as cooling each core, and whether it causes any other significant heat problems for the card (obviously the heat remaining in the case will be an issue). I can't wait to see the 9800 and the cooling for it (how stock performs and what aftermarket goodies come about). It will be very interesting.



It should be interesting to read a review of the Asus card, as they have added an extra two DVI ports to it, which, whilst good on one hand, is completely dumb on the other, as it blocks up the exhaust vent. This, whilst not such an issue on a single-GPU card, could be more of a concern on a dual-GPU card, especially if the case cooling can't get rid of the extra heat quickly enough.


----------

