Saturday, June 28th 2008
NVIDIA in a Rush for 55nm Parts, has Partners 'Red-Eyed'
With the cost of manufacturing a standard G200 die reaching up to US $110, thanks to yields as low as 40 per cent, NVIDIA seems to be in a rush for a 55nm revamp of its current GPUs. While nothing revolutionary is on the cards, and with the 55nm G92b already in the making, NVIDIA plans to move its G200 graphics processor to the 55nm fab process, lifting yields up to 50 per cent. At 55nm, the G200 die effectively shrinks to around 470 mm², which implies roughly 120 dice per 300 mm wafer.
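For reference, the ~120-dice figure is consistent with the usual dies-per-wafer back-of-envelope estimate. The sketch below (Python) uses the standard edge-loss approximation; the wafer cost in it is a purely hypothetical round number, not a figure from the article, and is only there to show how cost per good die falls as yield improves.

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Common edge-loss approximation for gross dice on a round wafer."""
    radius = wafer_diameter_mm / 2.0
    return (math.pi * radius ** 2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

# 55nm G200 at ~470 mm^2 (figure from the article)
dice = gross_dies_per_wafer(470.0)
print(f"gross dice per 300 mm wafer: ~{dice:.0f}")   # ~120, matching the article

# Hypothetical wafer cost -- purely illustrative, not a sourced figure
WAFER_COST_USD = 5000.0
for y in (0.40, 0.50):
    good = dice * y
    print(f"yield {y:.0%}: ~{good:.0f} good dice, ~${WAFER_COST_USD / good:.0f} per good die")
```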
The pace at which things are moving has partners red-eyed. NVIDIA's new Unilateral Minimum Advertised Price Policy (UMAP) limits the partners' playing field and minimizes competition between them. When NVIDIA at the same time launches new cards based on existing cores at lower prices, partners get upset over diminishing earnings. Add to that, AMD's new RV770 chip is looking very tempting to some of these partners.
Source: NordicHardware
33 Comments on NVIDIA in a Rush for 55nm Parts, has Partners 'Red-Eyed'
It's the first GPU to actually make proper, 100% real-time use of its 1GB of texture RAM.
For too long, people have had components in their systems that are top of the line, high end, and in some cases overkill for the applications they are attempting to run. Yet the GPU is the thing that holds them back, with its hitching, bottoming-out RAMDAC, and instabilities.
The 4870 might be on par in terms of top-end potential frame rates, but 80 fps is not what I need to 'get my game on.' I don't need 60, 50 or even 40 frames for that matter. What I need, what we all need, is a card that lets us stay as consistent as possible. To take whatever the game throws at it, chew it up, and spit it back out while grinning the whole time.
Nvidia knows that consumers 'in the know' will pay the premium price for this aspect alone.
The problem is that the general consumer isn't aware of these issues; or rather, they notice them yet cannot define or comprehend them, and thus don't realise that the GTX 280 resolves said issues.
THAT part of it will hurt Nvidia; then again, with the amount of money and resources they have, they probably don't care.
They'll drag out the price on the 280 as long as they can, and for good reason.
Which leads us back to square one: people on a budget need to stop bitching that they can't afford high-end, non-budget-priced products.
On a lighter note, consider that there are plenty of cards already on the market (pre-4800/200) that can handle a lot of 3D applications without a problem, and they're only getting cheaper.
If anything, rejoice that your budget can accommodate a nice piece of hardware, and let those people who are fortunate enough to have lots of money enjoy the 'best of the best,' rather than crying and bitching about it.
Life.... deal with it.
But has anyone really stopped to think how much the system itself adds to this issue, more so than the VGA adapter itself?
Sure, it might take a 3870 1GB a little longer to load textures into VRAM, and sitting on a PCIe 2.0 bus decreases load times as well; but if you're trying to load up textures with a P4 sitting on the CPU throne, that in itself will drastically lengthen how long it takes for the VGA adapter to load everything up. You're dealing with a massively bottlenecked system, a bottlenecked bus, pathetic L1/L2 caches... all of that can seriously drag down minimum FPS through extended load times.
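To put the bus argument into rough numbers, here is a small back-of-envelope sketch (Python) comparing theoretical peak upload times for 1 GiB of textures over PCIe 1.x x16 versus PCIe 2.0 x16. The bandwidths are the published theoretical peaks; real transfers are slower, especially on a CPU-bound system like the P4 example above.

```python
# Theoretical peak one-way bandwidth for a x16 link (published figures):
# PCIe 1.x: 250 MB/s per lane, PCIe 2.0: 500 MB/s per lane.
LINKS = {
    "PCIe 1.x x16 (~4 GB/s)": 16 * 250e6,
    "PCIe 2.0 x16 (~8 GB/s)": 16 * 500e6,
}

texture_bytes = 1024 ** 3  # filling a 1GB frame buffer with textures

for name, bandwidth in LINKS.items():
    seconds = texture_bytes / bandwidth
    print(f"{name}: ~{seconds:.2f} s to move 1 GiB (theoretical best case)")
```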
Just in comparison, with my system I experience very little load-up stuttering, and even then the amount of time is minimal (except in the case of Crysis). Does it detract at all from the gaming experience? Not in the least, seeing as it's only an issue once a game level is loaded up... once the game itself is off and running, I don't notice any other issues until I hit that one spot where it needs to swap textures in and out, and unless that 0.5s pause is going to get one's panties in a bunch, I find it nothing to complain about at all.
Sorry mods