Tuesday, June 17th 2008
ATI Believes GeForce GTX 200 Will be NVIDIA's Last Monolithic GPU.
The head of ATI Technologies claims that the recently introduced NVIDIA GeForce GTX 200 GPU will be the last monolithic "megachip" because such chips are simply too expensive to manufacture. The statement was made after NVIDIA executives vowed to keep producing large single-chip GPUs. The GT200 GPU measures about 600 mm², which means only about 97 dies fit on a 300 mm wafer that costs thousands of dollars. Earlier this year NVIDIA's chief scientist said that AMD is unable to develop a large monolithic graphics processor due to a lack of resources. Mr. Bergman, however, said that smaller chips are easier to adopt for mobile computers.
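For rough context, here is a minimal sketch (not from the article or X-bit Labs) of the back-of-the-envelope maths behind a dies-per-wafer figure like the one quoted, using a common approximation that subtracts an edge-loss term for partial dies at the wafer's rim. The real count also depends on die aspect ratio, scribe lines, and yield, so this only lands in the same ballpark as the quoted ~97.

```python
import math

def dies_per_wafer(wafer_diameter_mm: float, die_area_mm2: float) -> int:
    """Gross-die estimate: wafer area divided by die area,
    minus a common edge-loss correction for partial dies."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Figures from the article: ~600 mm2 die on a 300 mm wafer.
print(dies_per_wafer(300, 600))  # ~90 gross dies, same ballpark as the quoted ~97
# GT200's commonly reported die size is ~576 mm2, which nudges the estimate up:
print(dies_per_wafer(300, 576))  # ~94 gross dies
```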
Source:
X-bit Labs
116 Comments on ATI Believes GeForce GTX 200 Will be NVIDIA's Last Monolithic GPU.
If AMD/ATI can develop a sounder fabrication process, or reduce the number of dead cores, that would make this approach viable, IMO.
I'm just keeping in mind that over the last 6+ months, AMD has been making contact with some reputable companies who've helped them before, and has also taken on quite a few new personnel who are very well respected and amongst the best in their fields.
Fusion itself is, IMO, a good starting point, and a way for AMD to prove to themselves that they can do it. Integrating a GPU core like that wouldn't be resource-friendly if their fabrication process left them with a lot of dead fish in the barrel; they would be losing money just trying to design such an architecture if fabrication shot them in the foot.
Perhaps they've come up with a way to stitch two cores together so that if one is dead from fabrication it doesn't cripple the chip, and the GPU can be slapped on a lower-end card and shipped. Can't really be sure right now, as AMD keeps throwing out one surprise after another . . . perhaps this will be the one they hit the home run with?
On the other hand, the problem with the GT200 is not transistor count but die size, the fact that they have done it on 65 nm. On 55 nm the chip would probably be around 400 cm2, which is not that high really.
Another problem, when we compare the GT200's size with the performance it delivers, is that they have added those 16k caches in the shader processors which are not needed for any released game or benchmark. Applications will need to be programmed to use them. As it stands now the GT200 has almost 0.5 MB of cache with zero benefit. The 4 MB of cache in a Core 2 is pretty much half the die size; in the GT200 it's a lot less than that, but still a lot from a die-size/gaming-performance point of view. And to that you have to add the L1 caches, which are probably double the size of those on the G92, again with zero benefit. It's here and in the FP64 shaders where Nvidia has spent a lot of silicon on future-proofing the architecture, but we don't see the fruits yet.
I think that with GPUs, bigger single-core chips are the key to performance, and multi-GPU is the key to profitability once the fab process reaches a certain point. The best result is probably something in the middle: I mean not going with more than two GPUs, while keeping the chips growing according to the fab process's capabilities. As I explained above, I don't think multi-core GPUs have any advantage over bigger chips. That would open the door to both bigger chips and, as you say, multi-core chips, but again I don't see any advantage in multi-core GPUs.
And what's the difference between that and what they do today? Well, it's what Nvidia does today; ATI is not doing it with the RV670 and RV770, but they did in the past.
AMD creating this "Nvidia is a dinosaur" hype is viable.
If you have that much heat output on one single core, the cooling is expensive to manufacture. With 200 W on one core, the cooling system has to transfer the heat away ASAP, while 2x100 W cores would fare better, with the heat output being spread out.
Realise that a larger core means a far more delicate card, with the chip itself requiring more BGA solder balls; that means the card cannot take much stress before the BGA solder balls falter.
AMD is saying that if they keep doing what they are doing now, they will not need to completely redesign an architecture. It doesn't matter if they barely spend anything on R&D; in the end the consumer benefits from lower prices, and we are the consumer, remember.
AMD can decide to stack two or even three cores, provided they make the whole card function as one GPU (instead of the HD3870X2-style two cards at the software/hardware level), if the performance and price are good. Just correcting you: two on one die is what we have at the moment anyway; GPUs are effectively a collection of processors on one die. AMD is trying not to put dies together, as they know that die shrinks below 65~45 nm do not really help in terms of heat output, and they are therefore splitting the heat output. As I mentioned before, a larger die means more R&D effort and more expensive manufacturing.
Yeah I meant 400 mm2 :roll:
:slap::slap::slap::slap::slap::slap:
...that.
They have the resources to go 'smaller' if need be. ATi has less flexibility.
It's the most powerful single GPU ever; of course ATI will try and dull the shine on it.
I recall all the ATI fanboys crying foul when NV did the 7950GX2, but now it's cool to do 2 GPUs on 1 card to compete?
Wait till the GTX 280 gets a die shrink and they slap 2 on 1 card; can you say 4870X4 needed to compete?
The 7950GX2 is an invalid comparison, as it could not function on every system because it was seen at the driver level as two cards; an SLI board was needed. You can't compare the 4870X2 to a 7950; it's like comparing apples and oranges. The 4870X2, as far as the system is concerned, is only ONE card, not two, and CF is not enabled (therefore the performance problems of multi-GPU go out the window). Moreover, the way the card uses memory is just the same as the C2Ds: two cores, shared L2.
If the 4870X2 & the 4850X2 are both faster than the GTX 280 & cost a whole lot less, then I don't see what the problem is, except for people crying about the 2-GPU mess. As long as it's fast & doesn't cost a bajillion bucks, I'm game.
As for the GT200s, well, Nvidia are shaving down their profits just to get these things to sell; AMD, on the other hand, enjoy not having to reinforce their cards and put high-end air cooling on them, so they are way better off. If these 4850s, and the RV770 in general, sell well, the GT200s look like an awful flop.