Saturday, January 5th 2008

NVIDIA GeForce 9800GX2 New Pictures

Enjoy the pics:
Source: tomshardware.tw

104 Comments on NVIDIA GeForce 9800GX2 New Pictures

#76
imperialreign
MusselsAMD are also working on Fusion, with a GPU core in the CPU.
AMD may well just go the GPU power route and push as hard as they can, since they're having CPU performance problems atm.
+1 on that

Although ATI GPUs at the moment aren't as fast as nVidia's, they easily reclaim that performance difference in CrossFire. Their GPUs communicate and work together much more efficiently. Now that the motherboard and chipset capability is there, they need to start going the "performance through sheer firepower" route.

I also think that having a GPU core in a CPU will do wonders for ATI. They'd have to clear up a lot of communication bottlenecks between a GPU core and a CPU core for it to work optimally, but you can bet that ATI cards would see a decent performance increase from the improved system communication.
Posted on Reply
#77
psyko12
Solaris17bring on the jigawattz
:laugh: that looks like one power-hungry card :D wow, hope in a few years I can own a really good system...

:banghead:
Posted on Reply
#78
PVTCaboose1337
Graphical Hacker
That is scary. I remember the R600 pics... we all thought it was huge... now look at this.
Posted on Reply
#79
Mussels
Freshwater Moderator
imperialreign+1 on that

Although ATI GPUs at the moment aren't as fast as nVidia's, they easily reclaim that performance difference in CrossFire. Their GPUs communicate and work together much more efficiently. Now that the motherboard and chipset capability is there, they need to start going the "performance through sheer firepower" route.

I also think that having a GPU core in a CPU will do wonders for ATI. They'd have to clear up a lot of communication bottlenecks between a GPU core and a CPU core for it to work optimally, but you can bet that ATI cards would see a decent performance increase from the improved system communication.
They do have HyperTransport, which can move a sh!tload of data.
You'd have a PCI-E card for the outputs (HDMI/DisplayPort) and RAM, while the CPU/GPU could have the memory controller and the processing power. It's really up to AMD how they want to pan it out; they have a lot of options with HyperTransport, PCI-E 2.0, and integrated memory controllers.
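As a rough sketch of how much data that link can move, assuming HyperTransport 3.0 figures (2.6 GHz link clock, double data rate, 32-bit links); the function name and formatting are just for illustration:

```python
# Back-of-the-envelope HyperTransport bandwidth. Assumes HT 3.0
# figures: 2.6 GHz link clock, double data rate, 32-bit link width.

def ht_bandwidth_gbs(clock_ghz: float, width_bits: int) -> float:
    """Peak GB/s per direction for a double-data-rate HT link."""
    transfers_per_sec = clock_ghz * 1e9 * 2   # DDR: two transfers per clock
    return transfers_per_sec * (width_bits / 8) / 1e9

per_direction = ht_bandwidth_gbs(2.6, 32)     # ~20.8 GB/s each way
print(f"{per_direction:.1f} GB/s per direction, "
      f"{2 * per_direction:.1f} GB/s aggregate")  # links run both ways at once
```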
Posted on Reply
#80
btarunr
Editor & Senior Moderator
^Missing the key point: instructions per clock cycle. Unless AMD steps that up, HyperTransport is as good as null. Dunno what's keeping AMD from fixing this, and the ROP count on their GPUs. Does it take a Pyramid of Giza to engineer that?
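A toy illustration of the point, with invented numbers (not real AMD or Intel figures): raw throughput is roughly instructions-per-clock times frequency, so a fat interconnect can't compensate for a core that retires fewer instructions per cycle.

```python
# Toy model: raw core throughput ~ IPC x clock. The IPC values are
# invented for illustration, not measured figures for any real CPU.

def throughput_gips(ipc: float, clock_ghz: float) -> float:
    """Billions of instructions retired per second."""
    return ipc * clock_ghz

print(f"{throughput_gips(1.0, 2.6):.1f} GIPS")  # lower-IPC core
print(f"{throughput_gips(1.5, 2.6):.1f} GIPS")  # higher-IPC core
# Same clock, same interconnect: the higher-IPC core simply does more work.
```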
Posted on Reply
#81
Agility
Lol to ATI and nVidia. Lol to Intel and AMD. Just shake hands, create one group and bam: Nvidiati and Aintelmd.
Posted on Reply
#82
ShadowFold
AgilityLol to ATI and nVidia. Lol to Intel and AMD. Just shake hands, create one group and bam: Nvidiati and Aintelmd.
AMD and ATi are already one company; technically an ATi card is made by AMD. So it would be AMD and Nviditell.
Posted on Reply
#83
Mussels
Freshwater Moderator
ShadowFoldAMD and ATi are already one company; technically an ATi card is made by AMD. So it would be AMD and Nviditell.
Intelivid.

Gotta laugh though... didn't he realise they were one company already? Or did he mean something else?
Posted on Reply
#84
ShadowFold
MusselsIntelivid.
That does sound better
Posted on Reply
#85
vaperstylz
The MSRP will be about 449 USD. :eek: en.expreview.com/?p=166 Now that would be interesting; I can only dare to dream for now, with crossed fingers. Also, why are so many people on this extreme power-hunger issue with this card? :shadedshu The G92 core has already proven to be at least 33W less gluttonous than an 8800GTX (G80): enthusiast.hardocp.com/article.html?art=MTQzMSw2LCxoZW50aHVzaWFzdA==
And as it looks like the two cores face each other and will share the same heatsink, I'm having a real problem seeing any critical flaws in this design (given the current information at hand). IDK, I guess in other words I am willing (for the time being at least) to give Nvidia's engineers the benefit of the doubt; they can't be the idiots some would like to take them for, given that they are the same ones that gave us the already critically acclaimed G80.

The only issue that really scares me about this product is whether or not the support will be there long term. The 7950GX2 debacle still leaves a bad taste in a lot of mouths. If the driver support is there, the price point truly is in the $450 to $530 range, and the memory bus is large enough, with the other specs being decent... this could be a very interesting proposition. But for now... BRING ON THE BENCHIES!!!
All this talk and speculation is really just mental masturbation, especially on my part. :roll:
Posted on Reply
#86
TooFast
I'd rather have the 3870 X2. CrossFire scales better in Vista, plus it's a true dual-GPU board, not just two 8800 boards stuck together with glue... :]
I'm also sure the cards will be cheaper: 55nm on one PCB.
Posted on Reply
#87
InnocentCriminal
Resident Grammar Amender
I still think that normal SLi and Crossfire are overkill at the moment. However, I do think ATi have the right idea with bringing the GPUs together on one PCB. Just wish they could make them dual core so it'd only be one GPU.

Pffft!
Posted on Reply
#88
eidairaman1
The Exiled Airman
InnocentCriminalI still think that normal SLi and Crossfire are overkill at the moment. However, I do think ATi have the right idea with bringing the GPUs together on one PCB. Just wish they could make them dual core so it'd only be one GPU.

Pffft!
That's the evolutionary path, but they'd better get the drivers working right for the supposed R680 X2 before final release.
Posted on Reply
#89
InnocentCriminal
Resident Grammar Amender
Well, they've been working on Fusion for a while now, so we might see it happen sooner rather than later. But knowing AMD it won't happen; they'll most likely concentrate on Fusion rather than going dual-core with their GPUs.

ATi need to get the drivers down and developers really need to add far better support for multiple GPUs.
Posted on Reply
#90
DarkMatter
InnocentCriminalI still think that normal SLi and Crossfire are overkill at the moment. However, I do think ATi have the right idea with bringing the GPUs together on one PCB. Just wish they could make them dual core so it'd only be one GPU.

Pffft!
The problem with dual-core GPUs is that current GPUs are already large enough to be a challenge in the fab process. Dual-core GPUs would be too big right now. And current architectures are so parallelized, and internally independent enough, that I'm not sure dual-core GPUs make a lot of sense really. GPUs already have multiple processors on the same chip, but instead of each being an entire core, the parallelism occurs across their different parts. And component parallelism is more desirable than core parallelism, IMHO. For example: what's the point of creating a dual-core chip with 128 SPs, 64 TMUs and 16 ROPs per core, if you can create a single core with 256 SPs, 128 TMUs and 32 ROPs while using the same (indeed somewhat less) silicon?
I used the G80/G92 architecture for this example, since it's the most parallelized one from a manufacturing point of view, which is quite evident if you look at its block diagram. R600, even though more parallel in execution, seems more rigid in its architecture than G80.

www.techreport.com/articles.x/11211
www.techreport.com/articles.x/13603/2

Of course dual-core GPUs will make sense sooner or later, for different reasons, the most important ones invisible to consumers, but I don't think it will happen that soon. Maybe R800 or G110?
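A toy silicon tally of that example, assuming made-up relative unit areas and a hypothetical fixed per-core overhead (scheduler, host interface) that a dual-core design would pay twice:

```python
# Toy silicon budget for the example above. Unit "costs" are made-up
# relative areas; the per-core overhead (scheduler, host interface,
# display logic) is the part that gets duplicated in a dual-core design.

UNIT_AREA = {"sp": 1.0, "tmu": 2.0, "rop": 4.0}  # hypothetical relative areas
CORE_OVERHEAD = 80.0                              # hypothetical fixed cost per core

def core_area(sp: int, tmu: int, rop: int) -> float:
    return (sp * UNIT_AREA["sp"] + tmu * UNIT_AREA["tmu"]
            + rop * UNIT_AREA["rop"] + CORE_OVERHEAD)

dual   = 2 * core_area(128, 64, 16)   # two full cores, overhead paid twice
single = core_area(256, 128, 32)      # one core with doubled units

print(f"dual-core: {dual:.0f}, big single core: {single:.0f}")
# The dual-core design is larger by exactly one extra CORE_OVERHEAD.
```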
Posted on Reply
#91
TooFast
AMD has already started working on the technology in its Radeon HD 3000 series, according to sources at graphics card makers. The first product will be the Radeon HD 3870 X2, which will feature two RV670XT GPUs and will launch in January 2008 with a price set between US$299-349, noted the sources.

AMD is currently planning to integrate the PCI Express bridge chip into its future GPUs so that it does not need to adopt third-party chips. This design is expected to appear in AMD's next-generation R700 series.

At this price the green team is in some trouble.
Posted on Reply
#92
kwchang007
The problem with just doubling everything is the huge die sizes. As the dies get more and more complicated, there's more risk that they're going to screw up and not work. I think that's why they are doing multiple dies on either a single PCB or two PCBs.
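A rough sketch of that risk, using the classic Poisson defect-yield model Y = exp(-A x D); the defect density and die areas are illustrative guesses, not real foundry numbers:

```python
import math

# Poisson yield model: the chance a die has zero killer defects falls
# exponentially with die area. D and the areas below are assumptions.

D = 0.5                                   # assumed defects per cm^2

def die_yield(area_cm2: float) -> float:
    return math.exp(-area_cm2 * D)

big   = die_yield(4.8)    # one monolithic ~480 mm^2 die
small = die_yield(2.4)    # one ~240 mm^2 die

print(f"big die yield:   {big:.0%}")    # ~9%
print(f"small die yield: {small:.0%}")  # ~30%
# Two small dies can be tested separately and only known-good ones
# paired on a PCB, so ~30% of the silicon becomes product instead of ~9%.
```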
Posted on Reply
#93
DarkMatter
kwchang007The problem with just doubling everything is the huge die sizes. As the dies get more and more complicated, there's more risk that they're going to screw up and not work. I think that's why they are doing multiple dies on either a single PCB or two PCBs.
Exactly. Isn't that what I said? Damn! :banghead: I wish someone (God, Mother Nature... insert your desired Supreme Being here) had graced me with some synthesis skills. You just :nutkick: me with your two-liner. :p
Posted on Reply
#94
imperialreign
DarkMatterOf course dual-core GPUs will make sense sooner or later, for different reasons, the most important ones invisible to consumers, but I don't think it will happen that soon. Maybe R800 or G110?
ATI has been rumored to be developing the R700 as a dual-core GPU (actually two cores on one die, not two chips), rumored for release in the last half of this year. Whether that's true or not remains to be seen, as only a couple of sites have been mongering the R700 gossip. The R700 has also been rumored to be on a 45nm process - we shall see... but if ATI gets its dual-GPU PCBs out of the gate without a hitch, and if the R700 is the next GPU we'll see, I can definitely foresee ATI going the route of packing two R700s on one PCB sometime in the near future. ATI's GPUs tend to work more efficiently when they handle smaller workloads - unlike nVidia's GPUs, which do their best when faced with massive workloads.
Posted on Reply
#95
Tatty_Two
Gone Fishing
TooFastI'd rather have the 3870 X2. CrossFire scales better in Vista, plus it's a true dual-GPU board, not just two 8800 boards stuck together with glue... :]
I'm also sure the cards will be cheaper: 55nm on one PCB.
Yep, I would tend to agree on that one, but it still needs to be competitive; let's hope the GX2 is a bit better than the dual 2600XT cards were! At least if it is, it should keep prices down a bit because of the competition.
Posted on Reply
#96
kwchang007
DarkMatterExactly. Isn't that what I said? Damn! :banghead: I wish someone (God, Mother Nature... insert your desired Supreme Being here) had graced me with some synthesis skills. You just :nutkick: me with your two-liner. :p
lol sorry.
Posted on Reply
#97
vaperstylz
Also, Nvidia seems to have this part scheduled to launch after ATI's dual-GPU part. That will give them a chance to make any tweaks that may be needed to make sure they stay ahead. I'm also sure this thing is being well synched with Crysis and its upcoming patch. Nvidia has to be well aware that poor performance in Crysis at this late juncture would not be tolerated, even by a "fanboi" like me. Since Nvidia has been on such close and intimate terms with the boys over at Crytek, and sales for Crysis have been considerably less than stellar, it only stands to reason that this product and the late appearance of the Crysis patch may be somehow connected... More mental masturbation on my part.
Posted on Reply
#98
InnocentCriminal
Resident Grammar Amender
DarkMatterThe problem with dual-core GPUs is that current GPUs are already large enough to be a challenge in the fab process.

For example: what's the point of creating a dual-core chip with 128 SPs, 64 TMUs and 16 ROPs per core, if you can create a single core with 256 SPs, 128 TMUs and 32 ROPs while using the same (indeed somewhat less) silicon?

Of course dual-core GPUs will make sense sooner or later, for different reasons, the most important ones invisible to consumers, but I don't think it will happen that soon. Maybe R800 or G110?
Hence the pffft! ;)
Posted on Reply
#99
DarkMatter
imperialreignATI has been rumored to be developing the R700 as a dual-core GPU (actually two cores on one die, not two chips), rumored for release in the last half of this year. Whether that's true or not remains to be seen, as only a couple of sites have been mongering the R700 gossip. The R700 has also been rumored to be on a 45nm process - we shall see... but if ATI gets its dual-GPU PCBs out of the gate without a hitch, and if the R700 is the next GPU we'll see, I can definitely foresee ATI going the route of packing two R700s on one PCB sometime in the near future. ATI's GPUs tend to work more efficiently when they handle smaller workloads - unlike nVidia's GPUs, which do their best when faced with massive workloads.
Yeah, I knew that. But it was rumored to be multi-chip, so I thought they were referring to many dies on the same PCB. I have searched a bit and it seems it's dual-core, as you said. Honestly, with the Phenom disaster still fresh in my mind, this is nothing I would be happy to confirm. The same could happen to the R700. Only time will tell.

Anyway, one thing is what they want to do, and another is what they can do. Complex architectures like Phenom come with very poor yields and a wide spread of different "workable products", and that wouldn't work very well for GPUs. They could use dual-core chips for the high end and single-core ones for the mainstream, but what about the others? How would they use defective cores? And two differently defective cores on the same die?
That's what happened with Phenom: "One of the four cores is darn slow - do we make a slow quad, or do we make a fast tri-core?"
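A toy version of that binning decision, with invented per-core test speeds: ship the die as a slow quad clocked to its worst core, or fuse off the worst core and ship a faster tri-core.

```python
# Toy binning decision in the spirit of the Phenom example. Per-core
# speeds (GHz) are made-up test results, not real silicon data.

def bin_die(core_speeds_ghz):
    quad_clock = min(core_speeds_ghz)      # a quad runs at its worst core
    tri = sorted(core_speeds_ghz)[1:]      # fuse off the worst core
    tri_clock = min(tri)
    if tri_clock * 3 > quad_clock * 4:     # crude throughput comparison
        return ("tri-core", tri_clock)
    return ("quad-core", quad_clock)

print(bin_die([2.6, 2.5, 2.6, 1.8]))  # -> ('tri-core', 2.5)
print(bin_die([2.4, 2.3, 2.4, 2.3]))  # -> ('quad-core', 2.3)
```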
Posted on Reply
#100
imperialreign
DarkMatterYeah, I knew that. But it was rumored to be multi-chip, so I thought they were referring to many dies on the same PCB. I have searched a bit and it seems it's dual-core, as you said. Honestly, with the Phenom disaster still fresh in my mind, this is nothing I would be happy to confirm. The same could happen to the R700. Only time will tell.

Anyway, one thing is what they want to do, and another is what they can do. Complex architectures like Phenom come with very poor yields and a wide spread of different "workable products", and that wouldn't work very well for GPUs. They could use dual-core chips for the high end and single-core ones for the mainstream, but what about the others? How would they use defective cores? And two differently defective cores on the same die?
That's what happened with Phenom: "One of the four cores is darn slow - do we make a slow quad, or do we make a fast tri-core?"
Yeah, it's definitely a gamble any way you look at it. I'm kinda hoping that they're bridging two different technologies together with it, though - following ATI's GPU architecture with AMD's die architecture. ATI have demonstrated in the past that they can design some killer GPUs, and if AMD's technology can tie those two cores together as effectively as they've done with some of their CPUs - they'll be looking good.

But, as you've mentioned, it could go the other way, and the whole project (which looks good on paper and seems stellar in the R&D department) might go belly-up once it's actually in the hands of consumers, faced with all manner of hardware and software setups.

Perhaps that's why we've seen so few rumors about the new GPU, and perhaps why AMD/ATI are taking their time with it. But they've been put into a position where they have very little left to lose, and that can make a company willing to tread risky ground and take chances that a more stable company wouldn't even consider. Hopefully, though, they won't go the way 3dfx did when they started shooting for extreme solutions :ohwell:
Posted on Reply