Tuesday, June 17th 2008

ATI Believes GeForce GTX 200 Will be NVIDIA's Last Monolithic GPU.

The head of ATI Technologies claims that the recently introduced NVIDIA GeForce GTX 200 GPU will be the last monolithic "megachip," because such chips are simply too expensive to manufacture. The statement was made after NVIDIA executives vowed to keep producing large single-chip GPUs. The GT200 GPU measures about 600 mm², which means only about 97 dies fit on a 300 mm wafer that costs thousands of dollars. Earlier this year, NVIDIA's chief scientist said that AMD is unable to develop a large monolithic graphics processor due to a lack of resources. Mr. Bergman, however, said that smaller chips are easier to adopt for mobile computers.
Source: X-bit Labs
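As a sanity check on the ~97-die figure, here is a first-order gross-dies-per-wafer estimate (a common back-of-the-envelope formula; it ignores defects and scribe lines, and assumes a square die, since only the ~600 mm² area is reported):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Gross-die estimate: wafer area over die area, minus an
    edge-loss term for partial dies along the circumference."""
    side = math.sqrt(die_area_mm2)  # assume a square die
    gross = (math.pi * (wafer_diameter_mm / 2) ** 2) / die_area_mm2
    edge_loss = (math.pi * wafer_diameter_mm) / (math.sqrt(2.0) * side)
    return int(gross - edge_loss)

# GT200: ~600 mm^2 on a 300 mm wafer
print(dies_per_wafer(600))   # lands in the low-to-mid 90s, matching the article
```

The estimate comes out within a handful of dies of both the article's ~97 figure and the ~95 complete dies counted on the wafer photo discussed in the comments below.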

116 Comments on ATI Believes GeForce GTX 200 Will be NVIDIA's Last Monolithic GPU.

#76
DarkMatter
EastCoasthandle

gtx 200 series gpu (from what I've found so far)

A wafer from the 4800 series GPU would offer a whole lot more. However, I haven't found one yet. Anyone have a 55 nm wafer pic?
Wow, I knew that such a big die size meant fewer complete dies in theory, but seeing it in a picture is more impressive! I counted 95 complete dies there, and around 30 incomplete ones. Almost 10 of the incomplete ones have more than 90% of the die intact, but I don't know if they can use them. I guess they can cut that part; judging by the die picture, that means cutting some SPs in most of them and selling them as GTX 260. Nevertheless, that fact alone contributes to lower yields, I suppose, and the number of incomplete dies is going to be a lot lower at 55 nm.
Posted on Reply
#77
yogurt_21
MegastyThe only thing that's serious about it is how NV bet the farm on this thing. I'll be collecting that farm when I buy my 4870x2 :p
lol, nvidia didn't bet the farm on this one. if they did, we'd be seeing a commercial on television every 3 seconds, followed by famous endorsements and several small islands being purchased and named gtx280. lol

nvidia is a big company, and it would take a lot for them to "bet the farm" on a single chip. it's not like nvidia will really care if ati's is faster this time. nvidia will simply laugh when theirs outsells ati's faster card. this has been typical since the dawn of nvidia (though back then it was rare that ati got a win; the radeon was the first to even truly compete).

the gtx280 seems to be a flop; no biggie, nvidia will launch a revision which may or may not flop as well. it doesn't matter, because nvidia is already working 4-5 generations out. so if this generation is a flop, they'll simply pour more of their employees into the next gen.

ati also works several generations out, which is why it didn't matter that the r600 was a flop. they already had several others in the pipeline that they knew performed better per watt.

and i seriously have to laugh at all the fanboys who say "look at the last 2 years, nvidia can't lose." wow, was the 8800 your first gpu or what? neither nvidia nor intel nor amd nor ati nor via nor any other manufacturer can put out the best product every time. it's impossible, and history tells us as much. the ti4000 series stomped all over ati's radeon 8500, and all the nvidia fanboys went "see, nvidia can't lose"; then came the fx series, which the 9700s stomped all over, and later the 9800s widened the gap. the fact that nvidia has been ruling for the past 2 years only strengthens the argument that the ati card will be faster this time. ati and nvidia have been doing this dance since long before many on this forum knew what a graphics card was, and they'll be doing this dance long after. it's development + pressure from competition + a little luck that forms the winner, and nvidia has been missing an element, making them less likely to come out on top this time.

spec-wise, the gtx280 was the chip we all wanted it to be: double the rops, double the memory bus width, and nearly double the shaders. the trouble is that each time the gpu manufacturers double things, it takes games quite a while to catch up in coding to use the extra power. the gtx280 will only grow more powerful as time goes on, but it will likely be the last of its kind. why? because it's the way the market's going. the bigger-badder phase started when intel was pushing clock speeds and ati was pushing pixel pipelines; both required more cooling and psu than previous generations had seen. the core2duo is different, offering more power without chasing ghz (stock comparison, of course), and each generation seems to have a lower tdp than the last. gpus will similarly start doing the same thing (and have already started: rv670/g92). high-tdp gpus will go by the wayside while cheaper/quieter/lower-tdp versions replace them. the g92 did a good job of increasing performance while dropping heat and energy requirements. the gt200b will likely do the same, with nvidia's next chip being cooler than the gt200b. it's market trends: more users are going for cooler, quieter pcs than in 2000, making this quite a different battle than it used to be.
Posted on Reply
#78
PVTCaboose1337
Graphical Hacker
I agree that each company will take the lead somehow, and keep alternating. I believe that eventually, there will be a standstill when they hit physical limits of graphics processing technology, and advancements will slow, and basically most cards will be equal for a period of time.
Posted on Reply
#79
yogurt_21
DarkMatterWow, I knew that such a big die size meant fewer complete dies in theory, but seeing it in a picture is more impressive! I counted 95 complete dies there, and around 30 incomplete ones. Almost 10 of the incomplete ones have more than 90% of the die intact, but I don't know if they can use them. I guess they can cut that part; judging by the die picture, that means cutting some SPs in most of them and selling them as GTX 260. Nevertheless, that fact alone contributes to lower yields, I suppose, and the number of incomplete dies is going to be a lot lower at 55 nm.
is it just me, or would a square wafer make a lot more sense? lol. I mean, look at all the partials that wouldn't have to be that way if the dies weren't square and the wafer round.
Posted on Reply
#81
btarunr
Editor & Senior Moderator
[I.R.A]_FBiwhy is it round?
Because 'stuff' is 'planted' on it while it spins.
Posted on Reply
#82
DarkMatter
yogurt_21is it just me, or would a square wafer make a lot more sense? lol. I mean, look at all the partials that wouldn't have to be that way if the dies weren't square and the wafer round.
Yeah, I thought the same some years back when I first saw a wafer picture. I suppose wafers being round has to do with their manufacturing process, but why they can't be square is a question I've had since that first time. It was a comparison of a chip at 180 nm and 130 nm, and I have to say there were a lot more dies than ~100, so incomplete ones were far fewer in comparison. Complete dies on 130 nm were more than double those on 180 nm. Theoretically each process step (180-130-90-65-45-32...) can fit double the number of dies of the previous one, but I think it's actually a bit more because of that edge effect.
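The "double the dies per step" rule of thumb follows from squaring the linear-shrink ratio between consecutive nodes; a quick check over the node list mentioned above:

```python
# Each full node shrink scales linear dimensions by roughly 0.7x,
# so die area (and thus dies per wafer) scales by the ratio squared.
nodes_nm = [180, 130, 90, 65, 45, 32]

for old, new in zip(nodes_nm, nodes_nm[1:]):
    area_ratio = (old / new) ** 2   # same design, optically shrunk
    print(f"{old} nm -> {new} nm: ~{area_ratio:.2f}x more dies per wafer")
```

Every step comes out close to 2x (between about 1.9x and 2.1x), which is where the doubling rule comes from; the extra gain DarkMatter mentions is the reduced edge loss, since smaller dies tile the round wafer more efficiently.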
btarunrBecause 'stuff' is 'planted' on it while it spins.
It spins? Really? Or are you just kidding? I thought they were made by exposure to "light" and chemicals, pretty much how you would develop photos the old-fashioned way. I read an article about how they make the chips, and I don't remember anything about spinning, at least not while the layout was being "printed"... :confused:
Posted on Reply
#83
Unregistered
So because of the smaller 55 nm die, you'd get a lot more complete cores on a 300 mm wafer. So in theory you'd get a higher yield with a 55 nm die on a 300 mm wafer.
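Roughly how much a 55 nm optical shrink buys can be estimated from the linear-shrink ratio (the resulting GT200b area here is an estimate for illustration, not an announced figure):

```python
# Optical shrink of a 65 nm design to 55 nm: area scales by (55/65)^2.
gt200_area = 600.0                  # mm^2 at 65 nm
shrink = (55.0 / 65.0) ** 2         # ~0.72x linear ratio squared
gt200b_area = gt200_area * shrink
print(f"estimated 55 nm die: ~{gt200b_area:.0f} mm^2")
```

That's roughly a quarter of the die area back, so noticeably more complete dies per wafer even before counting the defect-related yield gains.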
Posted on Edit | Reply
#84
btarunr
Editor & Senior Moderator
DarkMatterIt spins? Really? Or are you just kidding? I thought they were made by exposure to "light" and chemicals, pretty much how you would develop photos the old-fashioned way. I read an article about how they make the chips, and I don't remember anything about spinning, at least not while the layout was being "printed"... :confused:
Why do people enjoy complicated lives? en.wikipedia.org/wiki/Semiconductor_fabrication

It's round because it aids several manufacturing processes. A Petri plate is never square; we use them for microbial cultures. It's round, and so it aids streaking, colony design, etc. I wish pizza were square, but then it becomes difficult for Domino's to make them. They come in a semi-manufactured state; the local Domino's completes the manufacture before handing them to the delivery boys.
Posted on Reply
#85
DarkMatter
tigger69So because of the smaller 55 nm die, you'd get a lot more complete cores on a 300 mm wafer. So in theory you'd get a higher yield with a 55 nm die on a 300 mm wafer.
And that's only from the geometry perspective. :)

You have to add lower operating voltages and lower transistor-to-transistor latency = higher possible clocks and lower power consumption. It all adds up to manufacturers being able to make the same chip a lot more easily/cheaply, or a faster chip for the same cost.
Posted on Reply
#86
HTC
They're planning 450 mm wafers!

Read this.
Posted on Reply
#87
Unregistered
It is cheaper (due to the mathematics of yields in producing semiconductor dies) to produce two smaller chips and place them on one PCB than to produce one larger chip, even if the sum of the transistors is equal in both cases... Q.E.D.
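v-zero's yield argument can be illustrated with the simple Poisson die-yield model, Y = exp(-A·D0). The defect density below is an assumed placeholder (real fab numbers are closely guarded), and the point is subtle: a pair of small dies is no more likely to *both* be defect-free than one big die, but since any two good small dies can be paired on a board, far less silicon is wasted per wafer:

```python
import math

def poisson_yield(area_cm2: float, d0: float) -> float:
    """Expected fraction of defect-free dies: Y = exp(-A * D0)."""
    return math.exp(-area_cm2 * d0)

D0 = 0.5                          # assumed defect density, defects/cm^2
WAFER_AREA = math.pi * 15.0 ** 2  # 300 mm wafer area in cm^2 (~707)

# One ~600 mm^2 (6 cm^2) monolithic die vs two ~300 mm^2 (3 cm^2) dies
# (gross counts here ignore edge loss, for simplicity):
big_good = (WAFER_AREA / 6.0) * poisson_yield(6.0, D0)
small_good = (WAFER_AREA / 3.0) * poisson_yield(3.0, D0)

# Good small dies can be paired arbitrarily, so GPUs per wafer:
print(f"good monolithic GPUs per wafer: {big_good:.0f}")
print(f"good dual-die GPUs per wafer:   {small_good / 2:.0f}")
```

With these assumed numbers the dual-die approach yields several times more sellable GPUs per wafer, which is the mathematics behind the "Q.E.D." above.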
Posted on Edit | Reply
#89
DarkMatter
btarunrOk. I are serious cat now. Go through these at leisure: www.youtube.com/results?search_query=semiconductor+wafer&search_type=&aq=0&oq=semiconductor+wafer
OK, I've seen some of the videos at that link, and these two explain it very well. It's very easy to see why they are round after watching the second one, and to understand how they make them in the end, especially how they make sure the silicon is pure:

www.youtube.com/watch?v=LWfCqpJzJYM&feature=related

www.youtube.com/watch?v=aWVywhzuHnQ&feature=related

EDIT: Anyway, it's not because they spin while they "plant" stuff on them, but quite the opposite from what I've understood. It's while they remove the remains, and they could be square for that purpose. I had already considered that they might spin them to get the remains off, but square wafers could spin too; it would just not be as easy. :) They have to be round because of how the silicon ingot is created, though, and I didn't know that. I like learning these kinds of things. :D
Posted on Reply
#90
razaron
oh oh, i have a brilliant comparison of nvidia to ati. nvidia is a shelby gt500 with good old muscle, and ati would be a lexus ls 460, a car that can park itself but would lose in a race with a shelby gt500. now how's that for a car comparison.

ps. btarunr, you would have made a brilliantly chavvy sentence if you had said "i is serious cat now." :D
Posted on Reply
#91
Assimilator
Who wants to bet that NV are going to skip the jump to 55nm and go straight to 45nm, a la Intel?
Posted on Reply
#92
DarkMatter
AssimilatorWho wants to bet that NV are going to skip the jump to 55nm and go straight to 45nm, a la Intel?
I thought the same, and that thought was kinda strengthened by the fact that TSMC announced they were ready for 45 nm, and by how Nvidia prefers bigger jumps, à la Intel, as you said. But I don't think they're doing that; Nvidia also likes using proven technologies, and GT200b is said to be coming really soon. Plus, 55 nm is already said to be GT200b's fab process.

But yeah, the possibility still remains. I wouldn't bet my leg on it, though.
Posted on Reply
#93
Unregistered
AssimilatorWho wants to bet that NV are going to skip the jump to 55nm and go straight to 45nm, a la Intel?
It's extremely unlikely due to the complexity of GT200; making it on 65 nm is already giving horrible yields...
Posted on Edit | Reply
#94
btarunr
Editor & Senior Moderator
razaronps. btarunr you would have made a brilliantly chavy sentence if you said "i is serious cat now." :D
No, I wouldn't



See, don't I look serious?™
DarkMatterOK, I've seen some of the videos at that link, and these two explain it very well. It's very easy to see why they are round after watching the second one, and to understand how they make them in the end, especially how they make sure the silicon is pure:

www.youtube.com/watch?v=LWfCqpJzJYM&feature=related

www.youtube.com/watch?v=aWVywhzuHnQ&feature=related

EDIT: Anyway, it's not because they spin while they "plant" stuff on them, but quite the opposite from what I've understood. It's while they remove the remains, and they could be square for that purpose. I had already considered that they might spin them to get the remains off, but square wafers could spin too; it would just not be as easy. :) They have to be round because of how the silicon ingot is created, though, and I didn't know that. I like learning these kinds of things. :D

This other one explains it all better, though, and more deeply, but it's only viable for advanced minds.

www.youtube.com/watch?v=oHg5SJYRHA0&feature=related
See, they polish the wafers while they're rotated at high speeds. Could you do that with squares? Next time, hide the rick-roll in a bundle of links; don't make it obvious. I didn't fall for that last link.
Posted on Reply
#96
candle_86
v-zeroIt is cheaper (due to the mathematics of yields in producing semiconductor dies) to produce two smaller chips and place them on one PCB than to produce one larger chip, even if the sum of the transistors is equal in both cases... Q.E.D.
but not as efficient, as both cores then have to split resources and lose a little performance compared to a single-core setup
Posted on Reply
#97
Megasty
yogurt_21lol, nvidia didn't bet the farm on this one. if they did, we'd be seeing a commercial on television every 3 seconds, followed by famous endorsements and several small islands being purchased and named gtx280. lol

nvidia is a big company, and it would take a lot for them to "bet the farm" on a single chip. it's not like nvidia will really care if ati's is faster this time. nvidia will simply laugh when theirs outsells ati's faster card. this has been typical since the dawn of nvidia (though back then it was rare that ati got a win; the radeon was the first to even truly compete).

the gtx280 seems to be a flop; no biggie, nvidia will launch a revision which may or may not flop as well. it doesn't matter, because nvidia is already working 4-5 generations out. so if this generation is a flop, they'll simply pour more of their employees into the next gen.

ati also works several generations out, which is why it didn't matter that the r600 was a flop. they already had several others in the pipeline that they knew performed better per watt.
When I said bet the farm, I was referring to this series. It might lose & lose big seeing how things are going, but NV loves being the fastest, biggest, loudest, whatever, & won't take losing lying down. I just hope they don't go crazy & make a $1200 GTX280 GX2 or some other sick bs to brag about :shadedshu
DarkMatter:roll:

HAHAHAHAHAHAHAHAH!

HAHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA!

HAHAHAHAHAHAHHAHAHAHAHHAHAHAHAHAHAHHAHAHAHAHA!

HAHA...

HA...

That was a very good one. The first one for me, in fact. Congrats, you were the first one to rickroll me.
That's why I don't ever click on youtube links from these goofy forums...NEVER :D
Posted on Reply
#98
DarkMatter
btarunrNo, I wouldn't



See, don't I look serious?™



See, they polish the wafers when they're rotated at high-speeds. Could you do that with squares? Next time, hide the rick-roll in a bundle of links, don't make it obvious, I didn't fall for that last link.
Yes, you could use the same technique except on the edges, which, BTW, I don't know why they have to polish. You could polish them with another technique; it would be more complex, but it could benefit the end price of the chips. Of course, because of the way they create the wafers, they can't be square, but I don't think polishing would be the problem.

About the link: I made it obvious because I wanted it to be obvious. That was my joke.

EDIT: BTW, is that cat photoshopped? lol
Posted on Reply
#99
DarkMatter
MegastyThat's why I don't ever click on youtube links from these goofy forums...NEVER :D
I never do; that's why it was such an achievement for bt. :ohwell:
Posted on Reply
#100
swaaye
What's wrong with the GTX 280 again? It looks like it's 30% faster than an 8800 GTX, and that seems right in line with where it should be.
Posted on Reply