Bo_Fox
What I was saying is that AMD shifted their strategy towards multi-GPU as the way to create the high-end parts. In theory, their plan is to create small dies to serve the mainstream and performance price points and then combine them to create the high-end. When they presented this strategy, they said that soon they would be putting 4, 6, or 8 small dies together to create different performance levels. This is in absolute contrast to Nvidia's strategy of creating the biggest monolithic design they can and then cutting it down to create the mainstream products.
My comment was about that divergence in focus. When Nvidia starts designing a chip, they have to aim for the dual-GPU card to be on the safe side, even if they are going to create a dual-GPU card themselves, because when the project starts, 3-4 years before it reaches stores, they don't know what AMD will do. What if AMD puts 3 or 4 dies on a card, for example?
About pricing, it's really hard to say. We can make a guesstimate of how much it will cost Nvidia to create the cards, but we don't know how much they will charge; that will depend on the market, demand, performance, etc. As for production costs, once 40nm yields improve, Fermi cards will be cheaper to produce than GT200 cards (smaller die, 384-bit vs. 512-bit bus), so if needed, or if they simply want to, they can sell them at very similar prices to ATI cards* without sacrificing profits like they did with GT200.
*Reasons being:
- HD5xxx cards cost more than HD4xxx cards to produce: bigger die, faster GDDR5 memory.
- Nvidia will apparently use cheaper, slower GDDR5 memory, which will alleviate the price difference a bit.
- Nvidia will sell Fermi Tesla cards (technically the same chip) in the HPC market, and depending on how well they do there, they will be able to adjust their profit requirements in the GPU market and compete better. Profit-per-card is 10-20 times bigger in the HPC market, so in essence every Tesla card sold could offset the need to make a profit on 10-20 GeForce cards if really required.
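The production-cost point above (smaller die plus narrower bus getting cheaper once yields mature) can be sketched with the standard dies-per-wafer back-of-envelope formula. The GT200 area is the commonly cited ~576 mm² (65nm version); the Fermi area here is purely an assumed placeholder (~530 mm²), since no official figure exists in this discussion:

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Gross dies per wafer: wafer area / die area, minus the
    standard edge-loss correction term (ignores yield and scribe lines)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Die areas: GT200 is the commonly cited figure; Fermi is an assumption.
gt200_area = 576   # mm^2, GT200 on 65nm
fermi_area = 530   # mm^2, assumed for Fermi/GT300 on 40nm

for name, area in [("GT200", gt200_area), ("Fermi", fermi_area)]:
    print(f"{name}: ~{dies_per_wafer(300, area)} gross dies per 300mm wafer")
```

More candidate dies per wafer plus a cheaper 384-bit board layout is the whole basis of the "cheaper than GT200 once yields improve" claim; actual cost also depends heavily on 40nm yield, which this sketch ignores.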
One thing is sure: they will always be competitive, at least on cards that directly compete with AMD cards, and the faster ones will be forced to come down in price too or they will become worthless. That forces the slower cards to adapt again, and the ball keeps rolling.
Nvidia having faster cards doesn't hurt competition as much as people think. The GTX260 did come down in price a lot (as did the 8800GT at the time) because it competed with the HD4870; only the GTX280/285 remained expensive. And if the prerequisite for competition is that Nvidia doesn't have a faster card, then the undeniable truth is that the GTX280/285 and that performance level would never have existed in the first place.
The feeling of a lack of competition is subjective and abstract. People look at the GTX285, want it, and think there's no competition at that price point because it's expensive and doesn't make sense on a performance/price basis, and hence they think it would have been better if Nvidia hadn't outperformed AMD. But if the GTX285 had never existed (or had performed like an HD4870), that performance level would never have existed in the first place, and the HD4870/GTX260 (or a GTX280 with the same performance as the HD4870) would probably have cost much more than they did; they would both have been priced as premium cards instead of "second in charge" cards.

What I'm trying to say is that the prerequisite for competition is that Nvidia releases a card with similar performance to the HD5870; how many cards they have above that level is irrelevant. The ideal thing is not for Nvidia to fail to outperform the HD5870 now; the ideal situation would have been for the HD5870 to already sit at the performance point where the GTX380 specs suggest it will land.
Sorry for the long response.
-- The HD 4870/4890 had "fast" GDDR5 memory at the time those cards launched. GDDR5 was a brand-new technology, so it was not cheap; it was expensive enough for Nvidia to deliberately avoid it for the time being.
-- Even though Nvidia will be using slower GDDR5 memory than the 5870, the GT300 will use 12 GDDR5 chips, more than the 8 GDDR5 chips found on the 5870. Even though Nvidia's 384-bit bus is slightly cheaper than the 512-bit one on the GTX 285, 12 GDDR5 chips at 4.2 GHz effective are hardly any cheaper than the 8 GDDR5 chips at 4.8 GHz effective found on 5870 cards. Heck, those new GDDR5 chips on the GT300 would probably be rated at something like 4.6-4.8 GHz anyway.
-- The GTX 280/285 did not "remain expensive." Nvidia designed those chips with a $500+ price point in mind; the reality is that Nvidia ended up having to sell them for much less than $500 from day one.
Now, Nvidia is refraining from making any more of those GT200b (55nm) chips, even if it pisses off the third-party partners (EVGA, BFG, MSI, XFX, etc.). Nvidia knows that it's no longer worth selling those monster chips at a loss, especially now that ATI has released its next-generation GPU (selling at a loss would only backfire further if Nvidia tried harder). It's better to just let prices rise a little in time for Fermi's release, so that the high-end Fermi chip will sell well at $550+.
Just my 2 cents.. to add to this discussion blah blah!