Friday, December 28th 2007

NVIDIA Asks Card Makers to Reduce Manufacturing Costs of 8800 GT Cards

NVIDIA recently contacted its graphics card partners asking them to reduce the number of PCB layers used in GeForce 8800 GT-based graphics cards from ten to six in order to cut manufacturing costs and so lower the card's ASP (average selling price) in the market. The redesign would allow the NVIDIA cards to compete on price with AMD's Radeon HD 3800 series products. Although the Radeon HD 3800 series launched three weeks later than the GeForce 8800 GT, Radeon 3800 demand has started to pick up, shifting the market shares of NVIDIA and AMD from the original 90% and 10% to 70% and 30%. If the PCB layers are reduced from ten to six, graphics card makers are expected to save more than US$10 per card, which would allow the NVIDIA products to compete on price with those of AMD. Despite the cost benefits, some graphics card makers are unhappy with NVIDIA's suggestion, pointing out that the chip maker is in effect asking them to do the job of improving the price/performance ratio of its products while preserving its own profit margins. NVIDIA responded by saying that the redesign is only a suggestion, which it believes is the best solution for current market conditions. Card makers will not be forced to implement the change, the company stressed.
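The savings math is straightforward; here is a minimal sketch, where only the ~US$10-per-card figure comes from the report and the street prices are illustrative assumptions:

```python
# Back-of-envelope: effect of the claimed ~US$10-per-card PCB saving.
# Only the $10 figure comes from the report; street prices are assumed.
pcb_saving = 10.0        # claimed saving from a 10-layer -> 6-layer redesign
gt_street_price = 230.0  # assumed 8800 GT 512MB street price (US$)
hd3870_price = 220.0     # assumed Radeon HD 3870 street price (US$)

new_price = gt_street_price - pcb_saving
print(f"8800 GT could drop to ~${new_price:.0f}, "
      f"vs HD 3870 at ~${hd3870_price:.0f}")
```

On these assumed numbers the redesign roughly closes the gap to the HD 3870, which is presumably the point of the request.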
Source: DigiTimes

61 Comments on NVIDIA Asks Card Makers to Reduce Manufacturing Costs of 8800 GT Cards

#26
EastCoasthandle
FreedomEclipseDoes this mean that they will be using cheaper parts (capacitors, resistors, etc.)? It would totally suck if they also did that. Graphics cards, as well as many other high-performance products, should be made with the best resources available.

on a side note....

Today I am ditching my ATi X1800XT 512MB for a monster of an 8800GTS G92 512MB!!!! :nutkick: :toast:

My first ever NON-ATi card. I am proud.
It can be implied that way. The PCB layers could be just an example of how to reduce costs, but if AIB partners have their freedom they could use cheaper components. What's stopping them, unless it's specifically written in the contract? Good times ahead, I say :rolleyes:
Nvidia responded in saying that the redesign is only a suggestion...
This quote above is why I say that; it will cost AIB partners more R&D money to use 6 layers than to just use cheaper components, IMO. Time will tell how this will turn out.
ghost101The price is high because of demand, not due to production costs. If nvidia felt their MSRP prices were too high, they'd reduce them.
:wtf:
The price is high due to production costs; demand has nothing to do with that! They simply cannot get enough of them to market (which can explain why they are losing market share). The whole point of the article is to reduce cost! They are trying to reduce MSRP by making the product cheaper!
Despite the cost benefits, some graphics card makers are unhappy with Nvidia's suggestion, pointing out that the chip maker is in effect asking them to do the job of improving the price/performance ratio of its products, while preserving its own profit margins.
No kidding, LOL. Instead of just cutting prices, they put the burden on AIB partners to R&D 6-layer cards to be just as effective as current cards on the market! Time will tell how this will turn out!
Posted on Reply
#27
mdm-adph
newtekie1Meh, the mid-range and low end cards in both camps are complete ass anyways.
The $169 HD 3850 would like to differ. :p
Posted on Reply
#28
trog100
the 512 8800gt was never meant to be.. the real card was the 8800gts alongside a cheaper but not quite so cheap 256 8800gt with lesser performance..

nvidia commissioned a special batch of "super" 512 8800gt cards at a better price point and better performance rushed into market in limited numbers for one purpose and one purpose only.. cos the coming ati 3800 series had em worried..

the reason for the shortage was they only commissioned a relatively small amount.. just enough to do the intended job.. the trick worked remarkably well..

as i say ati are leading nvidia are reacting..

trog
Posted on Reply
#29
xfire
If it was viable for the manufacturers to reduce the number of layers to save costs, they would have done it in the first place. Obviously it's more convenient for the companies to use the present PCB.
Edit:- The 8800GT is available here but cost Rs. 18,000 (~$450) when it arrived here.
Posted on Reply
#30
newtekie1
Semi-Retired Folder
mdm-adphThe $169 HD 3850 would like to differ. :p
The HD3850 isn't a mid-range card; it is the bottom of the high-end. Mid-range is the HD2600 series.
Posted on Reply
#31
btarunr
Editor & Senior Moderator
xfireIf it was viable for the manufacturers to reduce the number of layers to save costs, they would have done it in the first place. Obviously it's more convenient for the companies to use the present PCB.
Edit:- The 8800GT is available here but cost Rs. 18,000 (~$450) when it arrived here.
And it's Rs. 10500 now :p
Posted on Reply
#32
newtekie1
Semi-Retired Folder
xfireIf it was viable for the manufacturers to reduce the number of layers to save costs, they would have done it in the first place. Obviously it's more convenient for the companies to use the present PCB.
Not true at all. The original PCB was designed for the 8800GTS; it just worked for the 8800GT also, so they used it. It was easier to reuse the same PCB to help get the cards out as quickly as possible. However, it is entirely viable to release a new revision that uses fewer layers; it is done all the time in the computer industry.
Posted on Reply
#33
xfire
@btarunr been some time since I checked prices :o
@newtekie1 If it was easy for them they would have done it. Now they again have to spend a bit of money on R&D and all; instead they can just buy/produce more of the present PCB while they are making it for the other card. Plus, by the time they sort it out, the 9x series may come. In the end it has to be looked at from all angles. It would have been easier if Nvidia reduced the prices of the chips.
Posted on Reply
#34
EastCoasthandle
newtekie1Not true at all, the original PCB was designed for the 8800GTS, it just worked for the 8800GT also, so they used it. It was easier to just use the same PCB to help get the cards out as quickly as possible. However, it is entirely viable to release a new revision that uses less layers, it is done all the time in the computer industry.
Not true; if it was, AIB partners wouldn't be complaining. This to me suggests that there was no R&D in a 6-layer PCB. You just can't go from 10 layers to 6 without major architectural changes to the 10-layer PCB design, and that takes R&D. If memory serves correctly, they went from a 12-layer PCB (7900 series) to 10 layers on the G80. Now they expect AIB partners to go from 10 layers to 6 for cost reduction purposes? That's a major undertaking, IMO. It was easier to use the 10-layer PCB because that is what works best, and I'm willing to guess there was no 6-layer PCB design in the works, not because it was quicker to get it out the door. :shadedshu

If you read the article, after AIB partners complained, Nvidia's response was that reducing PCB layers was a suggestion. SUGGESTION is the key word here! Therefore, it's not entirely viable to use 6 layers!
Posted on Reply
#35
btarunr
Editor & Senior Moderator
@xfire: With our growing economy, everything will get that cheap.
Posted on Reply
#36
xfire
btarunr@xfire: With our growing economy, everything will get that cheap.
I hope so. No Corsair, Cooler Master, Antec, or Power Safe PSUs here.
Posted on Reply
#37
btarunr
Editor & Senior Moderator
@xfire
Lamington Road, Mumbai or Nehru Place, New Delhi... for us it's better than a nude beach.

All you want is there.
Posted on Reply
#38
suraswami
Ha ha, ATI is driving the price wagon now. If ATI continues to sell their 3800 series for less money with more profit, we can see ATI repaying AMD for the losses. At least that will keep AMD alive for a while.
Posted on Reply
#39
Mad-Matt
Isn't it nVidia's responsibility to do the R&D work and produce the ideal reference board? It can't be as easy as building and selling the GPU, allowing companies to do what they want with it, and sitting back until the board makers submit boards for approval... can it? Maybe NV should be the ones lowering the price by doing a die shrink, instead of telling the other companies how they can improve their profits at the cost of quality ;)
Posted on Reply
#40
IcrushitI
allen337The PCB layers are the guts of the card; the more layers, the better the transfer of voltages. Basically it's the same with motherboards. You see the PCChips boards with a low number of PCB layers and Asus boards with a lot more PCB layers. I ran into this a few years back buying an ECS motherboard at a mom-and-pop store. He told me the ECS had like 4 PCB layers and the Asus, Gigabyte, and MSI had like 8 layers. I bought the ECS because it's all he had in stock. Lasted 8 months and died. ALLEN
Let me take a stab at this. Between each layer of the PCB you have electrical circuits, which are separated by each layer. Now imagine how close the wiring would be if you take some layers out. So yes, the layers do help keep a card cooler. Most people here have heard of electrical leakage when it comes to CPUs; well, the same thing can happen with circuits on a PCB if they're too close together. So in a way, yes, the quality can go down if you take PCB layers out. Also, the best route for electrons is a straight line; take the layers out and you have to reroute the circuitry all over the place. What a traffic jam that would create. Hope this helps.
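The routing argument above can be put in rough numbers. A back-of-envelope sketch, where the net count, board area, and signal-layer split are all hypothetical (multilayer boards typically dedicate only some layers to signals, the rest to power/ground planes):

```python
# Rough model: if the same number of nets must be routed on fewer signal
# layers, the routing surface per net shrinks, so traces get packed closer
# together (tighter spacing, worse crosstalk margins). All numbers assumed.
def routing_density(nets, board_area_cm2, signal_layers):
    """Average nets per cm^2 of available routing surface."""
    return nets / (board_area_cm2 * signal_layers)

nets = 1200   # hypothetical net count for a G92-class board
area = 200.0  # hypothetical board area in cm^2

d10 = routing_density(nets, area, signal_layers=6)  # ~6 signal layers in a 10-layer stack
d6  = routing_density(nets, area, signal_layers=4)  # ~4 signal layers in a 6-layer stack
print(f"density rises by {d6 / d10:.1f}x when signal layers drop from 6 to 4")
```

Whatever the real numbers, the direction is the same: fewer signal layers means denser routing, which is why the redesign is not a free lunch.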
Posted on Reply
#41
newtekie1
Semi-Retired Folder
EastCoasthandleNot true; if it was, AIB partners wouldn't be complaining. This to me suggests that there was no R&D in a 6-layer PCB. You just can't go from 10 layers to 6 without major architectural changes to the 10-layer PCB design, and that takes R&D. If memory serves correctly, they went from a 12-layer PCB (7900 series) to 10 layers on the G80. Now they expect AIB partners to go from 10 layers to 6 for cost reduction purposes? That's a major undertaking, IMO. It was easier to use the 10-layer PCB because that is what works best, and I'm willing to guess there was no 6-layer PCB design in the works, not because it was quicker to get it out the door. :shadedshu
There probably isn't much R&D in a 6-layer PCB. The fact that nVidia suggested it means it is viable.
Posted on Reply
#42
EastCoasthandle
newtekie1There probably isn't much R&D in a 6-layer PCB. The fact that nVidia suggested it means it is viable.
:shadedshu
Not when they responded to their AIB partners that it was a suggestion. You gotta read between the lines here.
Posted on Reply
#43
newtekie1
Semi-Retired Folder
No need to read between the lines. If nvidia is suggesting they do it, then it is viable.
Posted on Reply
#44
EastCoasthandle
newtekie1No need to read between the lines. If nvidia is suggesting they do it, then it is viable.
Since you are unable to provide any information other than your loyalty to what they say, I am going to disagree with you. I, as well as others, have posted why this isn't viable, so no need to go around in circles about it. :laugh:
Posted on Reply
#45
CrAsHnBuRnXp
btarunrHow much will that reduction be anyway? The cheapest 8800 GT 512MB in my location is ~$230, EVGA brand, reference speeds. I'm not sure that would go down to anything below $210.
I got mine (XFX) for $279.99. Still under $300 with overnight shipping. :toast:
Posted on Reply
#46
btarunr
Editor & Senior Moderator
^Is that a factory-overclocked variant? Just asking because I find the price too high for a stock-clock card.
Posted on Reply
#47
tkpenalty
MusselsThe thing is, the 8800GT uses the same PCB as the GTS. The GTS is a little longer and has more power circuitry; how much of that goes unused on the GT?

If they're only removing wasted space, how is it going to affect quality?

(I AM saying if. I/we don't know for sure.)
No no no... do you know what PCB layers are? They are literally the number of layers the PCB has; for each circuit, you could say, a form of redundancy.

EDIT: 6 layers, eh... that's not TOO bad. In fact, for a GPU, I wouldn't really complain. Anyway, in AUS, the ASUS 8800GT 512MB TOP with the nice cooler costs $366, while the Sapphire HD3870 512MB costs $333... with only such a small price difference the HD3870 isn't very desirable. What's worse for AMD is the fact that the 8800GT 256MB is ONLY $300! Now generally it does have more performance overall for everything under around, say, 1280x1024, but it does show a lot.
Posted on Reply
#48
newtekie1
Semi-Retired Folder
EastCoasthandleSince you are unable to provide any information other than your loyalty to what they say, I am going to disagree with you. I, as well as others, have posted why this isn't viable, so no need to go around in circles about it. :laugh:
And why is it not viable? I have yet to see anyone post why it isn't, other than you saying so.

What, it isn't viable because the manufacturers haven't developed it yet?

They haven't developed 32nm GPUs yet either, but I'm sure they're viable.

Of course you have no actual clue how much time has been spent on developing a 6-layer PCB for the 8800GT, but we can all just take your word for it.

Yep, you really make a compelling argument.
Posted on Reply
#49
pentastar111
10 layers to six??? WTF... Sounds to me that if you use a thinner board, it would make it flimsy due to loss of mass, or very brittle if they tried to stiffen the board to compensate for said loss of mass. :wtf: And if more layers equals better handling of voltage... WHY in the world would you want to "degrade" a product that already kicks butt :eek: ...and is still leading the market share... Why? To save a few cents per board... Isn't greed wonderful... To start making an inferior product is shameful... Hey nVidia, you guys DON'T really have to eat lobster every night for dinner, you know.. :shadedshu ..lol
Posted on Reply
#50
spud107
The board does not get any thinner, just fewer layers with tracks running through, and more filler layers.
With the unneeded stuff cut out and more room between the tracked layers, the interference may even be less.
I don't really see what the fuss is about; it's just a PCB redesign, like an 8800GT rev 2.
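The point that the board stays the same overall thickness while only the copper layer count changes can be sketched numerically; the layer roles and the 1.6 mm standard board thickness here are assumptions for illustration:

```python
# Hypothetical stack-ups: overall board thickness stays the same; a 6-layer
# board simply has fewer copper layers, with more dielectric between them.
# Layer role assignments ("sig"/"gnd"/"pwr") are illustrative assumptions.
stack_10 = ["sig", "gnd", "sig", "sig", "pwr", "gnd", "sig", "sig", "gnd", "sig"]
stack_6  = ["sig", "gnd", "sig", "sig", "pwr", "sig"]

board_thickness_mm = 1.6  # common standard PCB thickness, unchanged by layer count
for stack in (stack_10, stack_6):
    gap = board_thickness_mm / (len(stack) - 1)  # average dielectric per layer gap
    print(f"{len(stack)} layers: ~{gap:.2f} mm of dielectric between copper layers")
```

So a 6-layer board is not flimsier; on this simple model the copper layers just sit roughly twice as far apart, which is consistent with the reduced-interference point above.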
Posted on Reply