
NVIDIA Asks Card Makers to Reduce Manufacturing Costs of 8800 GT Cards

:laugh::roll:

I'm sorry, but it's funny to read everyone's comments here and see how ignorant most of you are with regard to how it all works.
It's easier to design a product that has several PCB layers, as you generally don't have to worry too much about trace layout; traces that might cross can be put on different layers, avoiding the problem.
Most manufacturers that do their own designs, be it motherboards or graphics cards, will change the reference PCB, so the traces won't be the same either.
Fewer layers doesn't always mean an inferior product; a lot of the time it's just a sign of a more mature product where the companies have figured out how to streamline their manufacturing.
As long as the company in question has come up with an electrically sound design, there's no reason why a product using fewer PCB layers should be worse than one with more layers, as long as the rest of the components are the same.
Yes, it might affect overclocking, but again, that depends on much more than the PCB layers.
Most motherboard reference designs are made with 8 layers but are generally reduced to 6 layers when possible. I very much doubt anyone is using fewer than that for motherboards these days, as they're just far too complex.
Depending on the layout of the power circuitry on these boards, it should be very possible to make a card with fewer PCB layers that overclocks just as well as the current models; it just takes a little bit of time and effort from the board makers.
However, the smaller companies are unlikely to want to spend the R&D resources to come up with such a design unless they really have to, as it's not done in five minutes.
 
Does this mean that they will be using cheaper parts (capacitors, resistors, etc.)? It would totally suck if they also did that. Graphics cards, as well as many other high-performance products, should be made with the best resources available.

On a side note...

Today I am ditching my ATi X1800XT 512MB for a monster of an 8800GTS G92 512MB!!!! :nutkick::nutkick::nutkick::nutkick::nutkick::nutkick::nutkick::nutkick::nutkick::toast::toast::toast::toast::toast:

My first ever non-ATi card. I am proud.

It can be implied that way. The PCB layers could be just one example of how to reduce costs, but if AIB partners have their freedom they could use cheaper components. What's stopping them, unless it's specifically written into the contract? Good times ahead, I say :rolleyes:
Nvidia responded by saying that the redesign is only a suggestion...

This quote above is why I say that; it will cost AIB partners more R&D money to use 6 layers than to just use cheaper components, IMO. Time will tell how this will turn out.


The price is high because of demand, not due to production costs. If Nvidia felt their MSRP prices were too high, they'd reduce them.

:wtf:
The price is high due to production costs; demand has nothing to do with that! They simply cannot get enough of them to market (which can explain why they are losing market share). The whole point of the article is to reduce cost! They are trying to reduce MSRP by making the product cheaper!

Despite the cost benefits, some graphics card makers are unhappy with Nvidia's suggestion, pointing out that the chip maker is in effect asking them to do the job of improving the price/performance ratio of its products, while preserving its own profit margins.

No kidding, LOL. Instead of just cutting prices, they put the burden on AIB partners to R&D 6-layer cards to be just as effective as the current cards on the market! Time will tell how this will turn out!
 
the 512 8800gt was never meant to be.. the real card was the 8800gts alongside a cheaper but not quite so cheap 256 8800gt with lesser performance..

nvidia commissioned a special batch of "super" 512 8800gt cards at a better price point and better performance rushed into market in limited numbers for one purpose and one purpose only.. cos the coming ati 3800 series had em worried..

the reason for the shortage was they only commissioned a relatively small amount.. just enough to do the intended job.. the trick worked remarkably well..

as i say ati are leading nvidia are reacting..

trog
 
If it were viable for the manufacturers to reduce the number of layers to save costs, they would have done it in the first place. Obviously it's more convenient for the companies to use the present PCB.
Edit: The 8800GT is available here, but it cost Rs. 18,000 (~$450) when it arrived.
 
The $169 HD 3850 would like to differ. :p

The HD3850 isn't a mid-range card; it's the bottom of the high end. Mid-range is the HD2600 series.
 
If it were viable for the manufacturers to reduce the number of layers to save costs, they would have done it in the first place. Obviously it's more convenient for the companies to use the present PCB.
Edit: The 8800GT is available here, but it cost Rs. 18,000 (~$450) when it arrived.

And it's Rs. 10500 now :p
 
If it were viable for the manufacturers to reduce the number of layers to save costs, they would have done it in the first place. Obviously it's more convenient for the companies to use the present PCB.

Not true at all. The original PCB was designed for the 8800GTS; it just worked for the 8800GT as well, so they used it. It was easier to reuse the same PCB to help get the cards out as quickly as possible. However, it is entirely viable to release a new revision that uses fewer layers; it's done all the time in the computer industry.
 
@btarunr It's been some time since I checked prices :o
@newtekie1 If it were easy for them, they would have done it. Now they have to spend money on R&D again; instead, they can just buy/produce more of the present PCB while they are already making it for the other card. Plus, by the time they sort it out, the 9x series may arrive. In the end it has to be looked at from all angles. It would have been easier if Nvidia reduced the prices of the chips.
 
Not true at all. The original PCB was designed for the 8800GTS; it just worked for the 8800GT as well, so they used it. It was easier to reuse the same PCB to help get the cards out as quickly as possible. However, it is entirely viable to release a new revision that uses fewer layers; it's done all the time in the computer industry.

Not true; if it were, AIB partners wouldn't be complaining. This suggests to me that there was no R&D into a 6-layer PCB. You just can't go from 10 layers to 6 without major architectural changes to the 10-layer PCB design, and that takes R&D. If memory serves correctly, they went from a 12-layer PCB (7900 series) to 10 layers on the G80. Now they expect AIB partners to go from 10 layers to 6 for cost-reduction purposes? That's a major undertaking, IMO. They used the 10-layer PCB because that is what works best, not because it was quicker to get it out the door, and I'm willing to guess there was no 6-layer PCB design in the works. :shadedshu

If you read the article, after AIB partners complained, Nvidia's response was that reducing PCB layers was only a suggestion. SUGGESTION is the key word here! Therefore, it's not entirely viable to use 6 layers!
 
@xfire: With our growing economy, everything is becoming that cheap.
 
@xfire: With our growing economy, everything is becoming that cheap.
I'm hoping so. No Corsair or CoolerMaster or Antec or Power Safe PSUs here.
 
@xfire
Lamington Road, Mumbai, or Nehru Place, New Delhi... for us it's better than a nude beach.

All you want is there.
 
Ha ha, ATI is driving the price wagon now. If ATI continues to sell their 3800 series for less money with more profit, we could see ATI repaying AMD for the losses. At least that will keep AMD alive for a while.
 
Isn't it nVidia's responsibility to do the R&D work and produce the ideal reference board? It can't be as easy as building and selling the GPU, letting companies do what they want with it, and sitting back until the board makers submit a board for approval... can it? Maybe nV should be the one lowering the price by doing a die shrink, instead of telling the other companies how they can improve their profits at the cost of quality ;)
 
The PCB layers are the guts of the card; the more layers, the better the transfer of voltages. Basically, it's the same with motherboards. You see PCChips boards with a low number of PCB layers and ASUS boards with a lot more. I ran into this a few years back buying an ECS motherboard at a mom-and-pop store. He told me the ECS had something like 4 PCB layers, while the ASUS, Gigabyte, and MSI boards had more like 8. I bought the ECS because it's all he had in stock. It lasted 8 months and died. ALLEN

Let me take a stab at this. Between each layer on the PCB you have electrical circuits, which are separated by each layer. Now imagine how close the wiring would be if you took some layers out. So yes, the layers do help keep a card cooler. Most people here have heard of electrical leakage when it comes to CPUs; well, the same thing can happen with circuits on a PCB if they're too close together. So in a way, yes, the quality can go down if you take PCB layers out. Also, the best route for electrons is a straight line; take layers out and you have to reroute the circuitry all over the place. What a traffic jam that would create. Hope this helps.
 
Not true; if it were, AIB partners wouldn't be complaining. This suggests to me that there was no R&D into a 6-layer PCB. You just can't go from 10 layers to 6 without major architectural changes to the 10-layer PCB design, and that takes R&D. If memory serves correctly, they went from a 12-layer PCB (7900 series) to 10 layers on the G80. Now they expect AIB partners to go from 10 layers to 6 for cost-reduction purposes? That's a major undertaking, IMO. They used the 10-layer PCB because that is what works best, not because it was quicker to get it out the door, and I'm willing to guess there was no 6-layer PCB design in the works. :shadedshu

There probably isn't much R&D in a 6-layer PCB. The fact that nVidia suggested it means it is viable.
 
There probably isn't much R&D in a 6-layer PCB. The fact that nVidia suggested it means it is viable.

:shadedshu
Not when they responded to their AIB partners that it was a suggestion. You gotta read between the lines here.
 
No need to read between the lines. If nvidia is suggesting they do it, then it is viable.
 
No need to read between the lines. If nvidia is suggesting they do it, then it is viable.

Since you are unable to provide any information other than your loyalty to what they say, I am going to disagree with you. I, as well as others, have posted why this isn't viable, so there's no need to go around in circles about it. :laugh:
 
How much will that reduction be anyway? The cheapest 8800 GT 512MB in my location is ~$230, EVGA brand, reference speeds. I'm not sure that would go down to anything below $210.

I got mine (XFX) for $279.99. Still under $300 with overnight shipping. :toast:
 
^Is that a factory-overclocked variant? Just asking because I find the price too high for a stock-clock card.
 
The thing is, the 8800GT uses the same PCB as the GTS. The GTS is a little longer and has more power circuitry - how much of that goes unused on the GT?

If they're only removing wasted space, how is it going to affect quality?

(I AM saying if. I/we don't know for sure.)

No no no... do you know what PCB layers are? They are literally the number of layers the PCB has; for each circuit, you could say they're a form of redundancy.

EDIT: 6 layers, eh... that's not TOO bad. In fact, for a GPU, I wouldn't really complain. Anyway, in AUS, the ASUS 8800GT 512MB TOP with the nice cooler costs $366, while the Sapphire HD3870 512MB costs $333... with only such a small price difference, the HD3870 isn't very desirable. What's worse for AMD is the fact that the 8800GT 256MB is ONLY $300! Now, it does generally have more performance overall at everything under around, say, 1280x1024, but the smaller memory does show a lot.
 
Since you are unable to provide any information other than your loyalty to what they say, I am going to disagree with you. I, as well as others, have posted why this isn't viable, so there's no need to go around in circles about it. :laugh:

And why is it not viable? I have yet to see anyone post why it isn't, other than you saying it isn't because you said so.

What, it isn't viable because the manufacturers haven't developed it yet?

They haven't developed 32nm GPUs yet either, but I'm sure they're viable.

Of course, you have no actual clue how much time has been spent on developing a 6-layer PCB for the 8800GT, but we can all just take your word for it.

Yep, you really make a compelling argument.
 
10 layers to six??? WTF... Sounds to me that if you use a thinner board, it would be flimsy due to the loss of mass, or very brittle if they tried to stiffen the board to compensate for said loss of mass. :wtf: And if more layers equals better handling of voltage, WHY in the world would you want to "degrade" a product that already kicks butt :eek: ...and is still leading in market share... Why? To save a few cents per board... Isn't greed wonderful... Starting to make an inferior product is shameful... Hey nVidia, you guys DON'T really have to eat lobster every night for dinner, you know.. :shadedshu ..lol
 