The GK104 has been a massive success for Nvidia, and as you said yourself it was purely focused on gaming, and it has done, and still does, its job brilliantly. No one gives a toss about compute benchmarks they never run. Can't say I've missed out on anything running my little GTX 670; in fact it does me proud every day.
At least give them credit where it is due.
And yet GK110-powered cards like the K20 and K20X have been on the market for ages; the chip that powers Titan is already over a year old.
At least give them credit where it is due.
Can't argue with that, but you make your bed and you lie in it.
Way to contradict yourself, bud. You argue that compute performance is worthless, then bring up compute cards as your prime examples. Umm...kay.
I will give Nvidia credit where it's due: they have done well to milk their obscenely overpriced range of cards for this long, and a fully functioning, un-gimped GK110 still has a lot of potential as a GeForce card against the R290X, if Nvidia are willing to release it soon and at a comparable price. If they decide not to, Nvidia can prepare to delay Maxwell, or to scrap their costly GK110 salvage parts and/or sell them at give-away prices.
There are some things you seem to be forgetting:
These cards are being sold mostly to gamers, so no one cares how good they are at GPU compute tasks. You're making the same argument people made to defend Fermi. AMD should have learned from nVidia's mistake: GPU compute doesn't sell desktop GPUs.
Of course, if the 7970 had been faster, nVidia wouldn't have been able to use GK104 to compete, but it wasn't faster. I might as well say that if the GTX 580 had been a little bit faster, nVidia wouldn't have needed to release GK104 at all. You can live in a land of ifs all you want, but it doesn't help you make a point.
The fact is, no one but nVidia knows how ready GK110 was. We certainly know the fab was capable of producing GK110 when GK104 launched, and we know the architecture was ready. So it really comes down to yields, and since nVidia knew what they had to compete with, they went with GK104 because it had much better yields. But I bet GK110 would have been possible in place of the GK104 cards we got, had nVidia needed to use it.
And no, AMD cards don't compete mainly against their own previous generation; they compete against nVidia. And why should we cut AMD slack just because they mismanaged their company and now have no R&D funds?
I guess you missed the part where I said I liked the flagship Fermi parts and would prefer a fully functioning (if overheating) Fermi equivalent with un-gimped compute over a cut-down, gimped, self-throttling Kepler built to skew benchmarks.
I never supported AMD's horrible management that cut some 30% of its engineering force. I was merely pointing out that AMD at its best days cannot remotely compare on paper, in R&D budget or any other financial stat, with Nvidia even in their worst recent days, seeing as AMD are competing (or at least attempting to compete) against both Intel AND Nvidia. It's a miracle that AMD are actually managing to do it in the costly top-of-the-line GPU market, instead of abandoning discrete GPUs entirely and becoming another APU/SoC-only company chasing Intel's most sought-after market (which may be a reality for AMD sooner rather than later). By all means feel free to show me info that says otherwise, but if you're going to somehow argue against them even attempting to compete, don't bother. Enjoy your brownie points from the green-favouring zealots on here and move along; I'm not going to even attempt to prove you (or them) wrong in this respect.
...so to sum it up:
1) they claim it's cheaper
2) they ignore noise, temp and power consumption
3) then they say to throw a water block on it, ignoring the price of a water-cooling kit (+$150 for the block and +$400 for the whole setup)
4) they're selective/inconsistent in their comparisons: using the 290X's Uber mode for performance, but its silent mode for noise and temps.
5) they overhype with words such as "destroy", "kill" and "massacre" when benchmarks show the cards are fairly evenly matched.
6) more selective/inconsistent comparisons: they compare price against the Titan when that favours them, but against the 780 when other situations (performance) fit them better.
1) people claim it's cheaper because it is, get over it.
2) that's because 3rd party designs are already on the way, which are never going to happen for the Titan.
3) see above post. Even so, an R290X plus watercooling, or any other 3rd-party cooling kit you're going to grasp at straws over, comes out cheaper, regardless of whether it's going up against a Titan or a 780.
4) again, see point 2. It will take MSI/Gigabyte 5 minutes to drop the silent double/triple-fan coolers they already have on their 7000/700-series cards onto this R290X, which will make your point moot. Reference-cooler reviews sometimes don't mean shit (go see the GTX 770 stock-cooler reviews with the Titan cooler: that version is barely sold anywhere and is therefore pointless).
5) that's because it's true. The R290X wins against whatever you want to compare it to. Is it cheaper? Yep. Does it perform better? Yup. At lower res (a 290X disadvantage, since there's little use for the huge 512-bit bus/ROPs)? Yep. At higher res (a huge 290X disadvantage vs the Titan and its 2GB of extra VRAM)? Still yes. You cannot spin it in Nvidia's favour in any way, other than the fact that Nvidia did enjoy the early lead and lower power consumption/thermals. And this is all excluding the fact that the R290X is on very early drivers, which WILL get better performance from AMD (not so with Nvidia, as they have had a 9-month head start already), and that it's throttling on that shitty stock cooler, which means a big advantage once those Twin Frozr/Windforce-like designs drop.
Yes, quite some massacre. Always looks good when you leave out the benches that don't look so good. Let me guess, you left out the Skyrim bench by accident? I bet George Armstrong Custer is ruing the fact that he couldn't "rt click>save as" the Lakota Sioux he wanted to fight.
Seeing as how you posted so many benches, I assume you were going for the completeness motif, so here's the TESV bench and the CFX/SLI 7680x1440 results:
http://img.techpowerup.org/131025/tomshw.jpg
Can you even read what you're posting? The R290X won 6/8 of the single-card benchmarks you posted (WITH A FREAKING 2GB VRAM deficiency, no less), so as a last desperate attempt you have to drag in CrossFire support of a one-day-old, self-throttling card against 9+-month-old Nvidia cards, in your pathetic attempt to grasp at your green-coloured straws, when that comparison relies ENTIRELY on months of stable post-release driver support?
And DIRECTX 9?
Thanks for the good laugh, ya crazy Nvidia zealot, but you invalidated your opinion the minute you brought up a shitty, old & horribly ported DX9 game in what is now AMD's 4th gen DX11 flagship card review.
Please leave and take your fail with you, and while you're at it, bring in the old DX7/DX8 titles with Quake III, HL1 and Unreal Tournament in the mix for good measure, because if Nvidia aren't winning, you gotta keep digging for those prehistoric benchmarks nobody gives a flying shit about!
Please feel free to flame me with your predictable "AMD fanboy" comments though, despite the fact that all my current PCs run Intel/Nvidia GPUs -- I can always use a good laugh. Nvidia zealots are getting too predictable these days.
P.S. 4K benchmarks DO matter because they show exactly how future-proof the GPU is. How many of you were screaming "1080p benches are worthless" 10 years ago when we were still rolling on our 1280x1024 CRTs? 4K is on its way to being relevant over the next 2 years, and the 7970 was AMD's flagship for nearly 2 years, so 4K benches sure as hell matter, if only to show progress in future GPUs.