I don't think I'm following you to a tee, but I will still try to respond. So if I don't get my answer correct, please clarify what you said.
So what you're telling me is that ATI/AMD doesn't differentiate their double-die cards from a single die. And you believe they made that statement just to lure Nvidia into believing that ATI had Nvidia's Fermi on the run?
And you are saying that for Nvidia we will see a GTX 380 (or variants of it) as their top card, and ATI's dual-GPU 5970 as theirs.
So you don't think that Nvidia will respond with a dual-GPU card themselves? Something like a GTX 395?
If that is what you are saying... I could see why Nvidia would use that tactic against ATI: making ATI think Nvidia is worried. Also, if they used just one large die instead of two, Nvidia might be able to keep the price down.
But with that said, these new-gen cards are really powerful, and IMO I still can't see it happening (a dual-GPU card being out-powered by a single GPU). If you remember, just recently when the 5800 series was about to release, the specs looked like it should kill the 4870X2 and also the GTX 295. When the card eventually released it came close, but it wasn't as great as everyone thought it would be.
That same thing seems to happen every time any video card comes out, going back from the GTX 200s to the 4800 series to the 9000-series GTXs. The only one that happened to be right on the mark, IMO, was the 8800 series by Nvidia. The whole point is that every new card looks like a killer on paper, but there are a lot of things contributing to real performance: it doesn't depend only on hardware, because drivers matter just as much.
Now I won't go any further with that because, again, it might just be me reading it wrong, or maybe you were just typing really fast. Either way, I'm not following you completely and want you to explain yourself better so I don't get it wrong.
I will admit that some of my thoughts come from Red Team brand loyalty. But to be honest, I am always for any company that takes the technology to the next level, because of course that keeps pushing everything forward!
Afterthought: you seem to be someone who knows a great deal about Nvidia cards and the company. What is your opinion on the prices they should bring these cards out at? And since, from what I've heard, Nvidia seems to be in financial trouble, what do you think would happen to them if they priced them too high?
What I was saying is that AMD shifted their strategy towards multi-GPU as the way to create their high-end parts. In theory, their plan is to create small dies to serve the mainstream and performance price points, and then combine them to create the high end. When they presented this strategy, they said that soon they would be putting 4, 6, or 8 small dies together to create different performance levels. This is in absolute contrast to Nvidia's strategy of creating the biggest monolithic design they can and then cutting it down to create the mainstream products.
My comment was about that divergence in focus. When Nvidia starts designing a chip, they have to aim at AMD's dual-GPU card to be on the safe side, even if they are going to create a dual-GPU card of their own, because when the project starts, 3-4 years before it reaches stores, they don't know what AMD will do. What if AMD puts 3 or 4 dies on a card, for example?
About pricing, it's really hard to say. We can make a guesstimate of how much it will cost Nvidia to create the cards, but we don't know how much they will charge; that will depend on the market, demand, performance, etc. As for production costs, once 40 nm yields improve, the cards will be cheaper to produce than GT200 cards (smaller die, 384-bit vs 512-bit bus), so if needed, or if they simply want to, they can sell them at prices very similar to ATI's cards* without sacrificing profits like they did with GT200.
*Reasons being:
- HD5xxx cards cost more than HD4xxx cards to produce: bigger die, and the fastest GDDR5 memory available.
- Nvidia will apparently use cheaper, slower GDDR5 memory, which will alleviate the price difference a bit.
- Nvidia will sell Fermi Tesla cards (technically the same chip) in the HPC market, and depending on how well they do there, they will be able to adjust their profit requirements in the GPU market and compete better. Profits per card are 10-20 times bigger in the HPC market, so in essence every Tesla card sold could offset the need to make a profit on 10-20 GeForce cards, if really required.
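The cross-subsidy arithmetic in that last point can be sketched quickly. The dollar figures below are made-up assumptions purely for illustration; only the 10-20x profit ratio comes from the post above.

```python
# Hypothetical illustration of the Tesla/GeForce cross-subsidy argument.
# The margin figures are assumptions, not real Nvidia numbers.

def geforce_cards_offset(tesla_profit, geforce_profit, tesla_cards_sold):
    """How many GeForce cards' worth of profit a batch of Tesla sales covers."""
    return tesla_cards_sold * tesla_profit / geforce_profit

# Assume $50 profit per GeForce card and 15x that per Tesla card
# (the middle of the 10-20x range mentioned above).
print(geforce_cards_offset(tesla_profit=750,
                           geforce_profit=50,
                           tesla_cards_sold=1))  # one Tesla ~ 15 GeForce cards
```

So under those assumed margins, each Tesla sale buys Nvidia room to break even on roughly 15 GeForce cards and still hold overall profits steady.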
One thing is sure: they will always be competitive, at least on cards that directly compete with AMD cards, and the faster ones will be forced to come down in price too or they will become worthless. That forces the slower cards to adapt again, and the ball keeps rolling.
Nvidia having faster cards doesn't hurt competition as much as people think. The GTX 260 did come down in price a lot (as did the 8800 GT in its time) because it competed with the HD 4870; only the GTX 280/285 remained expensive. And if the prerequisite for competition is that Nvidia doesn't have a faster card, then the undeniable truth is that the GTX 280/285 and that performance level would never have existed in the first place.
The feeling of a lack of competition is subjective and abstract. People look at the GTX 285, want it, and think there's no competition at that price point because it's expensive and doesn't make sense on a performance/price basis, and so they conclude it would have been better if Nvidia hadn't outperformed AMD. But if the GTX 285 had never existed (or had performed like an HD 4870), that performance level would never have existed in the first place, and the HD 4870/GTX 260 would probably have cost much more than they did; both would have been priced as premium cards instead of "second in charge" cards. What I'm trying to say is that the prerequisite for competition is that Nvidia releases a card with similar performance to the HD 5870; how many cards they have above that level is irrelevant. The ideal situation is not that Nvidia fails to outperform the HD 5870 now; the ideal situation would have been if the HD 5870 had already landed at the performance point where the GTX 380 specs suggest it will.
Sorry for the long response.