Well,
I did mention it in passing, but most people are fixated on gaming rather than the industry as a whole. Which is strange, considering the larger picture dictates which products, how many, and on what timeframe we see future Radeon-branded cards.
Exactly, this thread went off the rails within the first page. Honestly, I hadn't dug through all these posts; I skipped to the end by about the second page. But yes, the issue is that neither wants to be stuck with inventory, and having other professional SKUs adds to the profit margins and helps recoup the investment.
You seem to have forgotten the Quadro line entirely. The M6000 is currently in the channel, and it seems it will be joined by further Maxwell 2 variants when this year's SIGGRAPH rolls around.
You stay up on that side of the Green team; it's hard to forget something I've never read about. It appears to have been brushed over in its mid-March announcement, overshadowed by the Titan X, so I suppose I see how I missed it. The Anandtech article was one of a handful that mentioned it in any detail. Come on... the SIGGRAPH bit was from an article yesterday, big news on the lower end, not about these big chips. Sidetrack much?
If history is any indicator, the Nvidia card that drops to a lower price will be the 980 Ti. Nvidia still has to release a full-die GM200 GTX-numbered card (as it did with the 780 Ti / Titan Black). I would assume Nvidia's product release cadence (with minimal urgency now that the Fury X has arrived) would be to ready it for the high-volume fourth-quarter holiday season, since both companies have played their architectural hand for the foreseeable future.
So you'd say they run the 980 Ti MSRP down into the bargain bin and slot a full-fledged GM200 in above it as a new GTX number. Then there are surely GM200 dies that fall below even the current 980 Ti's cut-down spec; do we see how they might need to use those?
The enthusiast sector, whether single card or multi-GPU, has never been a significant proportion of the buying public. Tech sites tend to give a distorted view of uptake. Go to a mainstream site and uptake looks better than in any random gathering of people; go to a specialist enthusiast/OC/benchmark/hard-mod site and you'd swear entry level is (at minimum) a single top-tier card with multiple displays and/or 4K. Sales of ultra-enthusiast (say $600+) cards might be on the order of tens of thousands, maybe six figures if you're lucky, against a total market of fifty million cards in a year.
Exactly, a lot of this engineering progression hinges on understanding how the market is expanding. Not being at the right point on the curve to offer enough of the right product, or offering too much of the wrong one, can stick you with costly inventory (see Tahiti). I will say Nvidia is good at reading the tea leaves and getting the mix right.
While we might be seeing some enthusiasts from just a year or so ago starting to feel the "drum-beat" of 4K, I see it becoming more of a reality with the next node shrink. There are perhaps gamers who would try 3x 1440p for Eyefinity/Surround if the price were right; that's probably a better hook for selling these top-end cards.
What I was postulating is that mainstream gamers need to see a path for transitioning away from 1080p. Right now it seems panel manufacturers aren't (or can't) move higher-resolution panels and the accompanying tech down in price, so the whole thing stagnates. Sure, the GPU side can offer the hardware (what it can within 28nm); however, folks aren't going to buy into that if they can't see a monitor upgrade as obtainable in the near term. I really thought panel manufacturers would've run with Adaptive Sync to help push a new market, but it (or they) don't seem to be generating that momentum. A mainstream gamer (at 1080p) should be able, right now, to pick up a card in the $280-400 range and then, in the relatively near future, see a decent if generic 1440p Adaptive Sync panel for $350-400. From where I'm watching, I'm not sensing a move in that direction.