# NVIDIA GeForce 9800GX2 New Pictures



## malware (Jan 5, 2008)

Enjoy the pics:

*View at TechPowerUp Main Site*


----------



## freaksavior (Jan 5, 2008)

That's insane


----------



## a111087 (Jan 5, 2008)

are they venting most of the air inside the case???


----------



## erocker (Jan 5, 2008)

Beautiful!


----------



## Snipe343 (Jan 5, 2008)

^+1, How much is this thing going to cost?


----------



## kwchang007 (Jan 5, 2008)

What's the optical jack for? Are they adding sound capability like the Radeons?


----------



## erocker (Jan 5, 2008)

kwchang007 said:


> What's the optical jack for? Are they adding sound capability like the Radeons?



Yes, since it's required for HDMI. Is that DisplayPort though?


----------



## BloodTotal (Jan 5, 2008)

That just looks sweet! Damn, Nvidia outdid themselves with the slick look!


----------



## eidairaman1 (Jan 5, 2008)

looks like a FX5800.


----------



## BloodTotal (Jan 5, 2008)

Now it looks like a graphics card with armor! That means the armor is protecting some good stuff inside!


----------



## jbizzler (Jan 5, 2008)

Do you hook it up to your sound card/chip, or does it process sound on its own?


----------



## imperialreign (Jan 5, 2008)

> Do you hook it up to your sound card/chip, or does it process sound on its own?



I believe it processes it on its own, if it's anything similar to the HDMI connections used on ATI's cards.

Although, between HDMI on both ATI and nVidia, I can't imagine what kind of headaches this will cause with X-Fi audio cards.


----------



## btarunr (Jan 5, 2008)

Bah! Another hot sandwich. I was praying for this to turn out to be a single-PCB card. It's like two PCBs with their components facing toward each other. I wonder how hot that could get. Two of these cards and you could bake bricks.


----------



## GSG-9 (Jan 5, 2008)

It's pretty cool. I wish it was single-slot though; if they had both cores on a single PCB they could (possibly) avoid the bridge bottleneck issues that were on the 7900GX2s. Who knows, they may have already resolved that issue entirely. It does look rather awesome; it's a step up from the dual 7800GT (which I loved the appearance of).


----------



## Joe_tiger27 (Jan 5, 2008)

*Ouch!!!!!*

Daddy Likey!!!!!!


----------



## Judas (Jan 5, 2008)

It's very, very nice


----------



## Tatty_One (Jan 5, 2008)

Snipe343 said:


> ^+1, How much is this thing going to cost?



Try around $900.


----------



## Judas (Jan 5, 2008)

btarunr said:


> Bah! Another hot sandwich. I was praying for this to turn out to be a single-PCB card. It's like two PCBs with their components facing toward each other. I wonder how hot that could get. Two of these cards and you could bake bricks.



Maybe you could make morning toast with it. Just slide your bread in between the cards and add some butter and jam... voilà!


----------



## Agility (Jan 5, 2008)

Lol, it looks like a 2900XT OEM version, with the big fan at the back.


----------



## btarunr (Jan 5, 2008)

Judas said:


> Maybe you could make morning toast with it. Just slide your bread in between the cards and add some butter and jam... voilà!



Nice idea... the toast would also have an embossing of the NVIDIA "eye" logo. Thank God they didn't give this card two gold fingers. Else imagine this in tri-SLI... you'd need a mineral oil immersion tank, a 1300 W PSU and a bank to rob.


----------



## erocker (Jan 5, 2008)

jbizzler said:


> Do you hook it up to your sound card/chip, or does it process sound on its own?



You could hook the card directly into your high-end stereo.


----------



## Darkmag (Jan 5, 2008)

I take it there's no aftermarket cooler compatibility, so you'd have to deal with however hot it runs. Should be interesting to see how it performs and how well the cooler works.


----------



## erocker (Jan 5, 2008)

The fan seems to pull air in from both sides of the card (dual PCB, makes sense), then exhausts it out the back (to the right of the DVI connectors), and by the looks of it, probably quite forcibly!


----------



## ShadowFold (Jan 5, 2008)

Honestly, a HD 2400 looks sexier than that. 

(my god arguing about how sexy a piece of hardware looks!)


----------



## eidairaman1 (Jan 5, 2008)

Darkmag said:


> I take it there's no aftermarket cooler compatibility, so you'd have to deal with however hot it runs. Should be interesting to see how it performs and how well the cooler works.



So would you have taken a 2900?



erocker said:


> The fan seems to pull air in from both sides of the card (dual PCB, makes sense), then exhausts it out the back (to the right of the DVI connectors), and by the looks of it, probably quite forcibly!



More force = more pressure, which in turn equals more sound, so I say it's a repeat of the FX5800.



BloodTotal said:


> Now it looks like a graphics card with armor! That means the armor is protecting some good stuff inside!



Armor is not what makes a warrior a true warrior; it's the spirit. A 30% increase over the 8800 Ultra is nothing to be all giddy about, considering it uses two chips and double the frame buffer.


----------



## erocker (Jan 5, 2008)

Sound isn't an issue for me. Plus, it's a big fan; I doubt it will be audible over my several case fans even at 100%.


----------



## erocker (Jan 5, 2008)

Lol, I just found out it's still GDDR3! 

Expected price is supposed to be $450 USD. Nvidia is stupid for not calling this the 8800GX2.


----------



## btarunr (Jan 5, 2008)

^Big deal. The GDDR3 HD2900 outperforms the GDDR4 HD2900.


----------



## erocker (Jan 5, 2008)

I'm talking GDDR5, and I was hoping that Nvidia could possibly work a little magic with GDDR4, since we haven't seen anything from them with that yet. I guess perhaps I was a bit confused by the name of the card, as it definitely should be the 8800GX2.


----------



## Tatty_One (Jan 5, 2008)

erocker said:


> Lol, I just found out it's still GDDR3!
> 
> Expected price is supposed to be $450 USD. Nvidia is stupid for not calling this the 8800GX2.



Lol, I thought that was the price for a SINGLE forthcoming 9800? Surely there's no way two high-end cards bolted together are only going to cost that. I read somewhere that they cost around $900.


----------



## TheGuruStud (Jan 5, 2008)

The naming convention is misleading and wrong, no doubt.

This will be like all of the "super ultra make your penis bigger" cards (I still like them though). Even if the MSRP is reasonable, the lack of quantity will skyrocket the prices.

It's (as I and others have said before) a stopgap until the real 9xxx series. They can't have ATI release a card that can challenge the performance of the Ultra, so they release this superficial card to keep claiming ultimate dominance.


----------



## xmountainxlionx (Jan 5, 2008)

btarunr said:


> Nice idea... the toast would also have an embossing of the NVIDIA "eye" logo. Thank God they didn't give this card two gold fingers. Else imagine this in tri-SLI... you'd need a mineral oil immersion tank, a 1300 W PSU and a bank to rob.



not to mention a game to utilize them


----------



## EastCoasthandle (Jan 5, 2008)

This doesn't look good at all for an enclosed PC case that isn't heavily ventilated. I'm also concerned about the price of this video card. From memory, the 7950 GX2 wasn't cheap and was in limited quantity when it first came out.

As for the look of it, it's nothing special to me. As a matter of fact, I wonder if the added enclosure around the video card negatively affects proper cooling of both PCBs versus without it.
And finally, what is the shelf life for this product? If the 7950 GX2 had a shelf life of 3 months, will this be the same or will it be on the market longer?


----------



## HaZe303 (Jan 5, 2008)

erocker said:


> I was a bit confused by the name of the card, as it definitely should be the 8800GX2.



That's the whole idea. ATI began it with the rehashed 2900XT and called it the HD3850-70. And those seemed to sell quite nicely, so I guess Nvidia thinks it's now free to use whatever name it wants and call old tech next-gen??? Very low-taste tactics by Nvidia; all the little respect I had for Nvidia flew out the window the minute I heard the news about it having two old 8800 GPUs inside. We need some real competition from ATI; until we get it I doubt we'll see any real "new" tech (50-70+% improvement in performance over the previous gen) from NV.


----------



## Duxx (Jan 5, 2008)

If this thing sells for $450 I'll shit myself. That would be amazing.


----------



## erocker (Jan 5, 2008)

HaZe303 said:


> That's the whole idea. ATI began it with the rehashed 2900XT and called it the HD3850-70. And those seemed to sell quite nicely, so I guess Nvidia thinks it's now free to use whatever name it wants and call old tech next-gen??? Very low-taste tactics by Nvidia; all the little respect I had for Nvidia flew out the window the minute I heard the news about it having two old 8800 GPUs inside. We need some real competition from ATI; until we get it I doubt we'll see any real "new" tech from NV.



ATi needs to drop the R680 bomb on us all!


----------



## snuif09 (Jan 5, 2008)

^Agreed.

And if it's $450, then kids like me can play Crysis on some good settings.


----------



## imperialreign (Jan 5, 2008)

> ATi needs to drop the R680 bomb on us all!



R700, man - if it actually _does_ exist in their R&D departments.

Oh, and I had thought that GDDR4 was ATI-only, as it was technology they came up with?


----------



## Agility (Jan 5, 2008)

Just hope the 3870X2 owns it. I'm waiting to see how much performance is a 3870X2 vs 3870. If it's like GOD DAM WOW, i'mma give my 2900xt to my bro and get 3870X2 baby.


----------



## InfDamarvel (Jan 5, 2008)

Personally, I think they'll be good cards alone, but Nvidia doesn't get much of a performance boost compared to ATI with dual-card setups. I can't wait to see the performance difference between this and the 3870X2.


----------



## Xaser04 (Jan 5, 2008)

eidairaman1 said:


> looks like a FX5800.



It looks nothing like a FX5800. 

FX 5800 Ultra:







9800GX2:






Yes the likeness is uncanny


----------



## imperialreign (Jan 5, 2008)

Xaser04 said:


> It looks nothing like a FX5800.
> 
> FX 5800 Ultra:




the original leafblower!! 


http://www.youtube.com/watch?v=WOVjZqC1AE4


----------



## snuif09 (Jan 5, 2008)

^lol that guy behind the pc


----------



## Xaser04 (Jan 5, 2008)

imperialreign said:


> the original leafblower!!
> 
> 
> http://www.youtube.com/watch?v=WOVjZqC1AE4



Oh yes lol.


----------



## Tatty_One (Jan 5, 2008)

Here is a little more on the card (courtesy of Guru3D/HardOCP). Although no prices are mentioned anywhere, I have read mention of a "2nd mortgage on your home", so I would guess it is going to be near my speculated $900.


http://www.guru3d.com/newsitem.php?id=6274


----------



## Weer (Jan 5, 2008)

7950 GX2 = 7900 GT 512MB SLi
9800 GX2 = 8800 GT 512MB SLi

Would have made much more sense to number the G92-based cards under the "8900" series.
The 9800 doesn't make much sense either, but I guess maybe they want to compete with ATI's crazy new naming scheme.


----------



## CDdude55 (Jan 5, 2008)

Won't fit in my case. Guess it's time to go to consoles. Bye.


----------



## mab1376 (Jan 5, 2008)

Tatty_One said:


> Try around $900



why so much?!?


----------



## Tatty_One (Jan 5, 2008)

mab1376 said:


> why so much?!?



Dunno, maybe I heard wrong


----------



## [I.R.A]_FBi (Jan 5, 2008)

nidiots milking us for all we got.


----------



## HeavyH20 (Jan 5, 2008)

I think they may be right around the $600 mark or so at launch.  I would expect this to be a $499 retail card, however. It is just two 8800 GTS cards in SLI on one slot. The real question is if the 256 bit memory bus is simply carried over.  And, the frame buffer will only be 512 MB. Looks like a product to simply skip over. I will wait for the next gen GTX cards.
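The 256-bit bus question above is easy to put rough numbers on. Here's a back-of-the-envelope sketch (Python); the bus width, memory clock and per-GPU 512 MB buffer are assumptions typical of a GDDR3 card of this era, not confirmed 9800GX2 specs:

```python
# Rough memory-bandwidth arithmetic for a GDDR3 card.
# All figures are illustrative assumptions, not confirmed specs.

BUS_WIDTH_BITS = 256   # per GPU, if the 256-bit bus is simply carried over
MEM_CLOCK_MHZ = 1000   # assumed GDDR3 base clock
DDR_MULTIPLIER = 2     # GDDR3 transfers data twice per clock

bytes_per_second = (BUS_WIDTH_BITS / 8) * MEM_CLOCK_MHZ * 1e6 * DDR_MULTIPLIER
gb_per_second = bytes_per_second / 1e9

# In SLI each GPU mirrors the same data in its own memory, so two
# 512 MB buffers still behave like 512 MB of usable VRAM, not 1 GB.
usable_framebuffer_mb = 512

print(f"per-GPU bandwidth: {gb_per_second:.0f} GB/s")
print(f"usable frame buffer: {usable_framebuffer_mb} MB")
```

Which is why doubling the chips doesn't double the effective memory: both the bandwidth and the frame buffer are per-GPU figures.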


----------



## Mussels (Jan 6, 2008)

xmountainxlionx said:


> not to mention a game to utilize them



every game can use the power, if you turn the settings up.
Hell, my system lags occasionally on the newer titles and it would only be worse if i had a larger screen.


----------



## vaperstylz (Jan 6, 2008)

Looks like it might be a while before we see Nvidia's real next-gen technology. I would hazard a guess that maybe they are just treading water until AMD/ATI releases R700. The problem with that is that it probably won't make an appearance until mid 2008. In any case, having to deal with all this shrunken, rehashed "new" old tech is a bit frustrating. Income tax time is just about here, and I'm itching for some new toys! Depending on the specific specs and price point, this just might be one of the things that helps keep me busy until my next completely new build, and that will be built around an Intel Nehalem multicore. But for now my hardware jones needs to be fed, and if the specs are right this might just have to do. Oh well, EVGA's step-up program has saved me more than once. LOL. This hobby proved to be a real crap shoot a long time ago. You've got to pay to play. There's always a risk; the question is am I willing to take it... we'll see.


----------



## pentastar111 (Jan 6, 2008)

HeavyH20 said:


> I think they may be right around the $600 mark or so at launch.  I would expect this to be a $499 retail card, however. It is just two 8800 GTS cards in SLI on one slot. The real question is if the 256 bit memory bus is simply carried over.  And, the frame buffer will only be 512 MB. Looks like a product to simply skip over. I will wait for the next gen GTX cards.


I'm with you, Heavy... So far I see no real reason to replace my current 8800's... When the 8800's first made their appearance, the performance jump from 7900GTs to 8800GTSs was simply astounding; with these up-and-comers (9800), a 30% increase really doesn't pump my nads. :shadedshu The minimal performance gains do not justify the painful punch in the wallet ($1000+ if you SLI). Just to add, and I hate to say it because I've always gone nVidia (and probably still will), but the best bang for the buck right now appears to be the higher-end ATI cards. :-(


----------



## Ketxxx (Jan 6, 2008)

...... and nvidia expect you to put your soundcard and any other add-in boards you may need where now?


----------



## Mussels (Jan 6, 2008)

Ketxxx said:


> ...... and nvidia expect you to put your soundcard and any other add-in boards you may need where now?



in your OTHER puter, of course.


----------



## zOaib (Jan 6, 2008)

I can bet the 3870X2 will be lower in price and similar or better in performance than this 9800GX2... and the 3870X2 will also be released earlier than this card =)

3870X2 FTW


----------



## CDdude55 (Jan 6, 2008)

Decided not to get this, but to go with BoneTrail. Gonna look at some X38 and P35 mobos and make a thread tomorrow. But gonna upgrade to a quad and an 8800GTS on my current mobo first, then later get a new motherboard.


----------



## broke (Jan 6, 2008)

zOaib said:


> I can bet the 3870X2 will be lower in price and similar or better in performance than this 9800GX2... and the 3870X2 will also be released earlier than this card =)
> 
> 3870X2 FTW



I'm with you on that one, and if my memory serves me right, I think that the 3870x2 will have both cores implemented on one PCB.

Also, one thing that was never answered: will it be possible to run eight cores on Spider if driver and game support were created?

EDIT: I mean, with PCI-E 2.0 effectively doubling the bandwidth, eight GPUs would be a great way to put the new standard to the test.


----------



## hugz (Jan 6, 2008)

I read $499, dunno where.


----------



## Mussels (Jan 6, 2008)

broke said:


> I'm with you on that one, and if my memory serves me right, I think that the 3870x2 will have both cores implemented on one PCB.



If thats right, i'd prefer that.

Great for those on watercooling, and you can still get custom air coolers made for it too.

(Mmmm quad crossfire with dual GPU cards, and a yorkfield....)


----------



## AsRock (Jan 6, 2008)

I seem to be missing what's so good/nice about them apart from the black. I think ATI have done a MUCH better job with theirs, fitting it on one PCB...


----------



## broke (Jan 6, 2008)

Mussels said:


> If thats right, i'd prefer that.
> 
> Great for those on watercooling, and you can still get custom air coolers made for it too.
> 
> (Mmmm quad crossfire with dual GPU cards, and a yorkfield....)



Lol, none of the Intel/Nvidia boards currently have 4 PCI-E slots, so if you want to go CrossFire X you'd have to get an AMD board with the 790FX chipset. I don't even think you could have tri-CrossFire on a non-790FX board.


----------



## zOaib (Jan 6, 2008)

broke said:


> I'm with you on that one, and if my memory serves me right, I think that the 3870x2 will have both cores implemented on one PCB.
> 
> Also, one thing that was never answered: will it be possible to run eight cores on Spider if driver and game support were created?



Nope, it will be CrossFire-capable only, i.e. a quad-GPU config, not a CrossFire X octa config.


----------



## zOaib (Jan 6, 2008)

AsRock said:


> I seem to be missing what's so good/nice about them apart from the black. I think ATI have done a MUCH better job with theirs, fitting it on one PCB...



This is just a slap-and-patch job by Nvidia; they're just putting this out for the sake of having a card called a GX2... if they had their heart in it, it would have been on one PCB with much more to make it attractive.


----------



## broke (Jan 6, 2008)

zOaib said:


> Nope, it will be CrossFire-capable only, i.e. a quad-GPU config, not a CrossFire X octa config.



Well, if that's the case, one 3870X2 should be cheaper than two regular 3870s. Otherwise, what's the point?

EDIT: AHHH, I think I've come up with the answer as to why someone would want two 3870X2s over 4 regular 3870s. 3870s have dual-slot cooling solutions which wouldn't fit on a common 790FX board, but with two 3870X2s you can have CrossFire X no problem.


----------



## mab1376 (Jan 6, 2008)

I'm probably gonna pick one up at the end of May, once it makes its first price drop.

That's my guesstimation.


----------



## v-zero (Jan 6, 2008)

That card is ug-er-ly!


----------



## yogurt_21 (Jan 6, 2008)

HeavyH20 said:


> I think they may be right around the $600 mark or so at launch.  I would expect this to be a $499 retail card, however. It is just two 8800 GTS cards in SLI on one slot. The real question is if the 256 bit memory bus is simply carried over.  And, the frame buffer will only be 512 MB. Looks like a product to simply skip over. I will wait for the next gen GTX cards.



Yeah, after the last GX2 nightmare I see little reason to expect any success out of this card (or the dual 3870). They simply need to work on better tech, not just throw current-gen cards together on one PCB and expect miracle performance.


----------



## tvdang7 (Jan 6, 2008)

yogurt_21 said:


> Yeah, after the last GX2 nightmare I see little reason to expect any success out of this card (or the dual 3870). They simply need to work on better tech, not just throw current-gen cards together on one PCB and expect miracle performance.



The dual 3870 is nothing like the old GX2: one PCB instead of two.


----------



## LiveOrDie (Jan 6, 2008)

Weer said:


> 7950 GX2 = 7900 GT 512MB SLi
> 9800 GX2 = 8800 GT 512MB SLi
> 
> Would have made much more sense to number the G92-based cards under the "8900" series.
> The 9800 doesn't make much sense either, but I guess maybe they want to compete with ATI's crazy new naming scheme.



No, more like 9800 GX2 = 8800 Ultra 768MB in SLI, because two 8800GTs don't give more than a 20% increase in performance when in SLI, but two 8800 Ultras do. Like I said before, all they have changed is 80nm to 65nm for better cooling. It's just like going from the old 80nm 8800GTS to the new 65nm 8800GTS: a bit faster, less memory.


----------



## broke (Jan 6, 2008)

yeah the 3870x2 will be THE dual gpu card to own. I might even get one, since my mobo only has one PCI-E slot


----------



## Mussels (Jan 6, 2008)

Assuming it works in every board. I know Nvidia's GX2 quite often didn't work in SLI in a lot of systems, unless it was an Nvidia chipset to begin with.

(I hope both these new dual cards do, btw)


----------



## Solaris17 (Jan 6, 2008)

bring on the jigawattz


----------



## imperialreign (Jan 6, 2008)

> yeah the 3870x2 will be THE dual gpu card to own. I might even get one, since my mobo only has one PCI-E slot



considering ATI's track record with multi GPU setups, 2 GPUs on one PCB should be gold for them - although, I'm defi looking forward to the rumored R700 - dual core GPUs . . . and I can see ATI taking it a step further with two R700s on one PCB, which would equate to 4 cores on one card . . . xFire 2 cards like that, and that's 8 GPUs - 3 cards would equal 12. Even if xFire were 5-10% more efficient and powerful than any similar nVidia hardware, a 3- or 4-card setup would bury them performance-wise - unless nVidia straightens out their SLI implementation a bit more.

. . . and that is what I think nVidia is being cautious about . . . they're not sure if ATI does intend to follow that route, but it would certainly be ATI's most promising venture at this point.




> bring on the jigawattz




I got the 1.21 all day, man!


----------



## Mussels (Jan 6, 2008)

imperialreign said:


> considering ATI's track record with multi GPU setups, 2 GPUs on one PCB should be gold for them - although, I'm defi looking forward to the rumored R700 - dual core GPUs . . . and I can see ATI taking it a step further with two R700s on one PCB, which would equate to 4 cores on one card . . .
> 
> . . . and that is what I think nVidia is being cautious about . . . they're not sure if ATI does intend to follow that route, but it would certainly be ATI's most promising venture at this point.
> 
> ...


AMD are also working on fusion, with a GPU core in the CPU.
AMD may well just go the GPU power route, and push as hard as they can since they're having CPU performance problems atm.


----------



## imperialreign (Jan 6, 2008)

Mussels said:


> AMD are also working on fusion, with a GPU core in the CPU.
> AMD may well just go the GPU power route, and push as hard as they can since they're having CPU performance problems atm.



+1 on that

Although ATI GPUs at the moment aren't as fast as nVidia's - they easily reclaim that performance difference in Crossfire.  Their GPUs communicate and work together much more efficiently.  Now that motherboard capability, and chipset capability is there, they need to start going the "performance through sheer fire power" route.

I also think that having a GPU core in a CPU will do wonders for ATI. They'd have to clear up a lot of communication bottlenecks between a GPU core and a CPU core for it to work optimally, but you can bet that ATI cards would see a decent performance increase from the improved system communication.


----------



## psyko12 (Jan 6, 2008)

Solaris17 said:


> bring on the jigawattz



That seems like it may be a power-hungry card, wow. Hope in a few years I can own a really good system...


----------



## PVTCaboose1337 (Jan 6, 2008)

That is scary. I remember the R600 pics... we were all thinking it was huge... look at this.


----------



## Mussels (Jan 6, 2008)

imperialreign said:


> +1 on that
> 
> Although ATI GPUs at the moment aren't as fast as nVidia's - they easily reclaim that performance difference in Crossfire.  Their GPUs communicate and work together much more efficiently.  Now that motherboard capability, and chipset capability is there, they need to start going the "performance through sheer fire power" route.
> 
> I also think that having a GPU core in a CPU will do wonders for ATI. They'd have to clear up a lot of communication bottlenecks between a GPU core and a CPU core for it to work optimally, but you can bet that ATI cards would see a decent performance increase from the improved system communication.



They do have HyperTransport, which can move a sh!tload of data.
You'd have a PCI-E card for outputs (HDMI/DisplayPort) and RAM, while the CPU/GPU could have the memory controller and processing power. It's really up to AMD how they want to pan it out; they have a lot of options with HyperTransport, PCI-E 2.0, and integrated memory controllers.


----------



## btarunr (Jan 6, 2008)

^Missing the key point: instructions per clock cycle. Unless AMD steps that up, HyperTransport is as good as null. Dunno what's keeping AMD from fixing that and the ROP count on their GPUs. Does it take a Pyramid of Giza to engineer that?


----------



## Agility (Jan 6, 2008)

Lol to ati and nvidia. Lol to intel and amd. Just shake hands, create one group and bam Nvidiati and aintelmd.


----------



## ShadowFold (Jan 6, 2008)

Agility said:


> Lol to ati and nvidia. Lol to intel and amd. Just shake hands, create one group and bam Nvidiati and aintelmd.



AMD and ATi are already one company; technically an ATi card is made by AMD. So it would be AMD and Nviditell.


----------



## Mussels (Jan 6, 2008)

ShadowFold said:


> AMD and ATi are already one company, technically a ATi card is made by AMD. So it would be AMD and Nviditell.



Intelivid.

Gotta laugh though... didn't he realise they were one company already? Or did he mean something else?


----------



## ShadowFold (Jan 6, 2008)

Mussels said:


> Intelivid.



That does sound better


----------



## vaperstylz (Jan 6, 2008)

The MSRP will be about 449 USD. http://en.expreview.com/?p=166  Now that would be interesting; I can only dare to dream for now, with crossed fingers. Also, why are so many people on this extreme power-hunger issue with this card? :shadedshu The G92 core has already proven to be at least 33W less gluttonous than an 8800GTX (G80): http://enthusiast.hardocp.com/article.html?art=MTQzMSw2LCxoZW50aHVzaWFzdA==
And as it looks like the two cores face each other and will share the same heatsink, I'm having a real problem seeing any real critical flaws in this design (given the current information at hand). IDK, I guess in other words I am willing (for the time being at least) to give Nvidia's engineers the benefit of the doubt. They can't obviously be the real idiots that some would like to take them for, given that they are the same ones that gave us the already critically acclaimed G80.

The only issue that really scares me about this product is whether or not the support will be there long-term. The 7950GX2 debacle still leaves a bad taste in a lot of mouths. If the driver support will be there, and the price point truly will be in the $450 to $530 range, and the memory bus is large enough as well as other specs being decent... this could be a very interesting proposition. But for now... BRING ON THE BENCHIES!!!
All this talk and speculation is really just mental masturbation, especially on my part.


----------



## TooFast (Jan 6, 2008)

I'd rather have the 3870 X2. CrossFire scales better in Vista, plus it's a true dual-GPU board, not just two 8800 boards stuck together with glue... :]
I'm also sure the cards will be cheaper: 55nm on one PCB.


----------



## InnocentCriminal (Jan 6, 2008)

I still think that normal SLi and Crossfire are overkill at the moment. However, I do think ATi have the right idea with bringing the GPUs together on one PCB. Just wish they could make them dual core so it'd only be one GPU. 

Pffft!


----------



## eidairaman1 (Jan 6, 2008)

InnocentCriminal said:


> I still think that normal SLi and Crossfire are overkill at the moment. However, I do think ATi have the right idea with bringing the GPUs together on one PCB. Just wish they could make them dual core so it'd only be one GPU.
> 
> Pffft!



That's the evolutionary path, but they'd better get the drivers working right for the supposed R680 X2 first, before final release.


----------



## InnocentCriminal (Jan 6, 2008)

Well, they've been working on Fusion for a while now so we might see it happen sooner rather than later. But knowing AMD it won't happen, they'll most likely concentrate on Fusion more than going dual core with their GPUs.

ATi need to get the drivers down and developers really need to add far better support for multiple GPUs.


----------



## DarkMatter (Jan 6, 2008)

InnocentCriminal said:


> I still think that normal SLi and Crossfire are overkill at the moment. However, I do think ATi have the right idea with bringing the GPUs together on one PCB. Just wish they could make them dual core so it'd only be one GPU.
> 
> Pffft!



The problem with dual-core GPUs is that current GPUs are already large enough to be a challenge in the fab process. Dual GPUs would be too big right now. And current architectures are so parallelized and internally independent that I'm not sure dual-core GPUs make a lot of sense, really. GPUs already have multiple processors on the same chip, but instead of duplicating the entire core, the parallelism occurs in their different parts. And component parallelism is more desirable than core parallelism, IMHO. For example: what's the point of creating a dual core with 128SP, 64TMU and 16ROPs each, if you can create a single 256SP, 128TMU and 32ROP one while using the same (indeed somewhat less) silicon?
I used the G80/G92 architecture for this example, since it's the most parallelized one from a manufacturing point of view, quite evident if you look at its block diagram. R600, even though more parallel in execution, seems more rigid in its architecture than G80.

http://www.techreport.com/articles.x/11211
http://www.techreport.com/articles.x/13603/2

Of course, dual-core GPUs would make sense sooner or later, for different reasons, the most important ones invisible to consumers, but I don't think this will happen too early. Maybe R800 or G110?


----------



## TooFast (Jan 6, 2008)

AMD has already started working on the technology in its Radeon HD 3000 series, according to sources at graphics card makers. The first product will be the Radeon HD 3870 X2 which will feature two RV670XT GPUs and will launch in January 2008 with a price set between US$299-349, noted the sources


AMD is currently planning to integrate the PCI Express bridge chip into its future GPUs so that it does not need to adopt third-party's chips. This design is expected to appear in AMD's next generation R700 series.

@ this price the green team is in some trouble.


----------



## kwchang007 (Jan 6, 2008)

The problem with just doubling everything is the huge die sizes.  As the dies get more and more complicated there's more risk that they're going to screw up and not work.  I think that's why they are doing multiple dies on either a single pcb or two pcbs.
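The die-size risk above can be made concrete with the textbook Poisson yield model, where yield ≈ e^(−area × defect density). A quick sketch; the die areas, defect density and wafer size below are made-up illustrative numbers, not actual G92 figures:

```python
import math

def poisson_yield(area_cm2, defect_density):
    """Poisson yield model: fraction of dice with zero defects."""
    return math.exp(-area_cm2 * defect_density)

D0 = 0.5            # assumed defect density, defects/cm^2 (illustrative)
WAFER_AREA = 706.9  # 300 mm wafer in cm^2, ignoring edge losses

BIG_DIE = 6.6       # hypothetical "doubled-up" monolithic die, cm^2
SMALL_DIE = 3.3     # hypothetical single-GPU die, cm^2

good_big = (WAFER_AREA / BIG_DIE) * poisson_yield(BIG_DIE, D0)
good_small = (WAFER_AREA / SMALL_DIE) * poisson_yield(SMALL_DIE, D0)

cards_monolithic = good_big   # one die per card
cards_dual = good_small / 2   # pair up dice that tested good

print(f"good monolithic dice per wafer: {good_big:.1f}")
print(f"good small dice per wafer:      {good_small:.1f}")
print(f"cards per wafer, monolithic:    {cards_monolithic:.1f}")
print(f"cards per wafer, dual-die:      {cards_dual:.1f}")
```

Note the subtlety: under this model a specific pair of small dice is no more likely to both be defect-free than one big die, but since dice are tested individually and only good ones are paired, far more working cards come out of each wafer.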


----------



## DarkMatter (Jan 6, 2008)

kwchang007 said:


> The problem with just doubling everything is the huge die sizes.  As the dies get more and more complicated there's more risk that they're going to screw up and not work.  I think that's why they are doing multiple dies on either a single pcb or two pcbs.




Exactly. Isn't that what I said? Damn!  I wish someone (God, Mother Nature... insert your desired Supreme Being here) had graced me with any synthesis skills. You just  me with your two liner.


----------



## imperialreign (Jan 6, 2008)

> Of course dual core GPUs would make sense sooner or later, for different reasons, the most important ones invisible to consumers, but I don't think this will happen too early. Maybe R800 or G110?



ATI has been rumored to be developing the R700 as a dual-core GPU (actually dual-chip: two cores on one die), rumored for release in the last half of this year. Whether that's true or not remains to be seen, as only a couple of sites have been mongering the R700 gossip. The R700 has also been rumored to be on a 45nm process - we shall see... but if ATI gets its dual-GPU PCBs out of the gates without a hitch, and if the R700 is the next GPU we'll see, I can defi foresee ATI going the route of packing two R700s on one PCB sometime in the near future. ATI's GPUs tend to work more efficiently when they handle smaller workloads, unlike nVidia's GPUs, which do their best when faced with massive workloads.


----------



## Tatty_One (Jan 6, 2008)

TooFast said:


> I'd rather have the 3870 X2. Crossfire scales better in Vista, plus it's a true dual-GPU board, not just two 8800 boards stuck together with glue...:]
> I'm also sure the cards will be cheaper! 55nm on one PCB.



Yep, I would tend to agree on that one, but it still needs to be competitive. Let's hope the GX2 is a bit better than the dual 2600XT cards were! At least if it is, it should keep prices down a bit because of the competition.


----------



## kwchang007 (Jan 6, 2008)

DarkMatter said:


> Exactly. Isn't that what I said? Damn!  I wish someone (God, Mother Nature... insert your desired Supreme Being here) had graced me with some synthesis skills. You just owned me with your two-liner.



lol sorry.


----------



## vaperstylz (Jan 7, 2008)

Also, Nvidia seems to have this part scheduled to launch after ATI's dual-GPU part. That will give them a chance to make any tweaks that may be needed to make sure they stay ahead. I'm also sure this thing is being well synched with Crysis and its upcoming patch. Nvidia has to be well aware that poor performance in Crysis at this late juncture would not be tolerated, even by a "Fanboi" like me. Since Nvidia has been on such close and intimate terms with the boys over at Crytek, and sales for Crysis have been considerably less than stellar, it only stands to reason that this product and the late appearance of the Crysis patch may be somehow connected.......... More mental masturbation on my part.


----------



## InnocentCriminal (Jan 7, 2008)

DarkMatter said:


> The problem with dual core GPUs, is that current GPUs are large enough to be a challenge in the fab. process already.
> 
> For example : what's the point of creating a dual core with 128SP, 64TMU and 16ROPs each, if you can create a single 256SP, 128TMU and 32ROP one while using the same (indeed some less) silicon?
> 
> Of course dual core GPUs would make sense sooner or later, for different reasons, the most important ones invisible to consumers, but I don't think this will happen too early. Maybe R800 or G110?



Hence the pffft!


----------



## DarkMatter (Jan 7, 2008)

imperialreign said:


> ATI has been rumored to be developing the R700 as a dual-core GPU (actually dual-chip: two cores on one die), with a release rumored for the last half of this year.  Whether that's true remains to be seen, as only a couple of sites have been mongering the R700 gossip.  The R700 has also been rumored to be on a 45nm process - we shall see . . . but if ATI gets its dual-GPU PCBs out of the gate without a hitch, and if the R700 is the next GPU we'll see, I can definitely foresee ATI going the route of packing two R700s on one PCB sometime in the near future.  ATI's GPUs tend to work more efficiently when they handle smaller workloads - unlike nVidia's GPUs, which do their best when faced with massive workloads.



Yeah, I knew that. But it was rumored to be multi-chip, so I thought they were referring to multiple dies on the same PCB. I've searched a bit and it seems it's dual-core, as you said. Honestly, with the Phenom disaster still fresh in my mind, this is not something I'd be eager to confirm. The same could happen to the R700. Only time will tell.

Anyway, one thing is what they want to do, and another is what they can do. Complex architectures like Phenom come with very poor yields and a wide spread of different "workable products", and that wouldn't work very well for GPUs. They can use dual-core chips for the high end and single-core ones for the mainstream, but what about the others? How would they use defective cores? And two differently defective cores on the same die?
That's what happened with Phenom: "One of the four cores is darn slow - do we make a slow quad, or do we make a fast tri-core?"
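The quad-vs-tri-core binning dilemma can be sketched with a simple binomial model - the 10% per-core defect rate here is an assumption for illustration, not a real Phenom figure:

```python
from math import comb

def bin_fractions(n_cores, p_bad):
    """Fraction of dies with exactly k defective cores, assuming
    independent per-core failures (binomial model)."""
    return [comb(n_cores, k) * p_bad**k * (1 - p_bad)**(n_cores - k)
            for k in range(n_cores + 1)]

fracs = bin_fractions(4, 0.10)    # assumed 10% chance a given core is bad
quad, tri = fracs[0], fracs[1]    # fully working dies vs. tri-core salvage
# quad ~ 0.656, tri ~ 0.292: nearly a third of dies are only sellable
# if a tri-core SKU exists - which is exactly the binning headache described.
```

The same arithmetic scales badly for a dual-core GPU: every SKU the marketing side wants implies another bin the fab side has to fill.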


----------



## imperialreign (Jan 7, 2008)

> Yeah, I knew that. But it was rumored to be multi-chip, so I thought they were referring to multiple dies on the same PCB. I've searched a bit and it seems it's dual-core, as you said. Honestly, with the Phenom disaster still fresh in my mind, this is not something I'd be eager to confirm. The same could happen to the R700. Only time will tell.
> 
> Anyway, one thing is what they want to do, and another is what they can do. Complex architectures like Phenom come with very poor yields and a wide spread of different "workable products", and that wouldn't work very well for GPUs. They can use dual-core chips for the high end and single-core ones for the mainstream, but what about the others? How would they use defective cores? And two differently defective cores on the same die?
> That's what happened with Phenom: "One of the four cores is darn slow - do we make a slow quad, or do we make a fast tri-core?"



Yeah, it's definitely a gamble any way you look at it.  I'm kinda hoping they're bridging two different technologies together with it, though - following ATI's GPU architecture with AMD's die architecture.  ATI have demonstrated in the past that they _can_ design some killer GPUs, and if AMD's technology can tie those two cores together as effectively as they've done with some of their CPUs - they'll be looking good.

But, as you've mentioned, it could go the other way, and the whole project (which looks good on paper and in theory seems stellar in the R&D department) might go belly-up once it's actually out in the hands of consumers, being faced with various hardware and software setups.

Perhaps that's why we've seen very few rumors about the new GPU, and perhaps why AMD/ATI is taking their time with it.  But they've been put in a position where they have very little left to lose anymore, and that can make a company willing to re-tread broken ground and take risks that a more solid company wouldn't even consider.  Hopefully, though, they won't go the way 3DFX did when they started shooting for extreme solutions.


----------



## vampire622003 (Jan 24, 2008)

I don't see why it's called the 9800GX2; all it is is two 8800s in one card. They haven't added any more shaders, unlike ATI's Shader Model 4.1 with DirectX 10.1. Nvidia refuses to put it in their cards. I have no idea why, but that's their choice. ATI ownz.


----------



## CDdude55 (Jan 24, 2008)

Now is a good time to switch to console gaming. lol


----------



## Deleted member 24505 (Jan 24, 2008)

It's a monstrosity, like the 7950GX2 (or whateva the hell it was called) was. Two cards bodged into one.

The ATI 3870 X2 is a better way to do it. More finesse.


----------



## Mussels (Jan 25, 2008)

vampire622003 said:


> I don't see why it's called the 9800GX2; all it is is two 8800s in one card. They haven't added any more shaders, unlike ATI's Shader Model 4.1 with DirectX 10.1. Nvidia refuses to put it in their cards. I have no idea why, but that's their choice. ATI ownz.



Because Google is your friend.

10 and 10.1 have next to nothing different; it's a very minor change.

Hardly anything works in 10, and there's NOTHING in 10.1 - NV can afford to wait a year on 10.1 cards, because they just don't matter right now.


----------

