EVGA, I think.
Heck, he could work just for himself, for all I know.
The whole "Titan = GTX 680 @ 1800 MHz" claim is what gets me. I don't really care about the rest.
Now I remember: I phoned EVGA a year ago, and they said something like he's their in-house overclocker.
To be precise, it was just: Titan < 680 SLI @ 1260 MHz.
______________________________________________________________________________________
If not posted already by someone else, this sums it up:
http://www.tomshardware.com/reviews/geforce-gtx-titan-performance-review,3442-13.html
"...by Chris Angelini
I gave you a handful of conclusions in Tuesday's story, and promised today’s data would back me up. Between then and now, I’ve run and re-run a bunch of data. How well do my first impressions carry through? Here’s what I said:
1) Pay the same $1,000 for a GeForce GTX 690 if you only want one dual-slot card and your case accommodates the long board. It remains the fastest graphics solution we’ve ever tested, so there's no real reason not to favor it over Titan.
We can stand by this one. Although it’s technically true that the GeForce GTX 690’s 2 GB per GPU potentially limits performance at high detail settings and resolutions, our tests at 5760x1200 didn’t turn up any troublesome numbers. Far Cry 3 was the one title that felt choppy—and that was the case even as far down as 2560x1600. I don’t think on-board memory is the issue.
2) The Titan isn’t worth $600 more than a Radeon HD 7970 GHz Edition. Two of AMD’s cards are going to be faster and cost less. Of course, they’re also distractingly loud when you apply a demanding load. Make sure you have room for two dual-slot cards with one vacant space between them. Typically, I frown on such inelegance, but more speed for $200 less could be worth the trade-off in a roomy case.
This proved to be a little controversial. If you judge solely on performance per dollar, two Radeon HD 7970 GHz Edition boards absolutely cost less and go faster than a GeForce GTX Titan. But there’s the case against poor acoustics. There’s also a discussion to be had about micro-stuttering. Our single-GPU frame latency numbers show that AMD has already made inroads into minimizing frame latency in some games, and that other titles remain problematic. But we can’t compare multi-GPU configs using the same tools. Fortunately, we have something coming soon that’ll address micro-stuttering more definitively. In the meantime, those Radeon cards are compelling, so long as you’re able to cope with their noise.
Given a number of driver updates, one 7970 GHz Edition is quicker than GeForce GTX 680. As long as Nvidia sells the 680 for more than AMD’s flagship, the Tahiti-based boards are going to continue turning heads.
3) Buy a GeForce GTX Titan when you want the fastest gaming experience possible from a mini-ITX machine like Falcon Northwest’s Tiki or iBuyPower’s Revolt. A 690 isn’t practical due to its length, power requirements, and axial-flow fan.
This is unequivocal. There’s no way to get anything faster than a GeForce GTX Titan into a Tiki, Revolt, Bolt, and so on. Why would you spend $1,000 on a card that tends to be slower than the GeForce GTX 690? This is why.
4) Buy a GeForce GTX Titan if you have a trio of 1920x1080/2560x1440/2560x1600 screens and fully intend to use two or three cards in SLI. In the most demanding titles, two GK110s scale much more linearly than four GK104s (dual GeForce GTX 690s). Three Titan cards are just Ludicrous Gibs!
Gaming at 5760x1200 is sort of the jumping-off point where one GeForce GTX 690 starts to look questionable. Of course, then you’re talking about $2,000 worth of graphics hardware to either go four-way GK104s or two-way GK110s. To me, the choice is easy: a pair of GeForce GTX Titans is more elegant, lower-power, and set up to accept a third card down the road, should you hit the lottery.
With all of that said, the benchmarks also reveal that OpenCL support isn’t fully baked yet in the GeForce GTX Titan driver. A number of issues in synthetics and real-world apps make it clear that bugs still need to be stomped out, and that’s never a pleasant revelation about such a pricey piece of kit.
At least we know that GK110 does have the compute chops GK104 lacks. Developers who would have loved a Tesla K20X but couldn’t afford its almost-$8,000 price tag may consider $1,000 for a Titan true value. If, on the other hand, you’re a bitcoin miner—well, AMD’s GCN architecture still has the lock on hashing performance.
At the end of the day, we maintain that a $1,000 GeForce GTX Titan is for two very specific gamers, both of whom we explicitly called out on Tuesday. Everyone else will still consider this a very fast, very well-built piece of hardware. However, it runs into serious competition in Nvidia’s stack, and from rapidly improving Tahiti-based cards that got beaten up on pricing early on in their life."
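The price math behind point 2 is easy to check yourself. A quick sketch using only the figures stated in the quote (Titan at $1,000, and "$600 more than a Radeon HD 7970 GHz Edition"):

```python
# Figures as stated in the quoted article:
titan_price = 1000                      # GeForce GTX Titan
hd7970_ghz_price = titan_price - 600    # "$600 more than a 7970 GHz Edition" => ~$400/card

crossfire_price = 2 * hd7970_ghz_price  # two 7970 GHz Editions in CrossFire
savings = titan_price - crossfire_price

print(crossfire_price)  # 800
print(savings)          # 200 -> the article's "more speed for $200 less"
```

So two of AMD's cards come in around $800 total, which is where the "$200 less" trade-off in the quote comes from.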