Dude, you're sadder than me
Probably!
This does have a greater purpose though. I'm a system builder, and generally have to tailor my component fit-out advice based on specific need - for gamers, usually a core of games/game engines - and I find that a lot of reviews tend to stick with a limited number of releases (BF3, Metro 2033, DiRT3, AvP for example), so going further afield nets a larger variety.
The information (in spreadsheet form) also highlights which benchmarks offer consistency, and what kind of range is covered. Consistent outliers favouring one brand or another tend to be readily apparent.
But as you obviously visit many sites for reviews (as do I), you'll notice scary differences between review sites.
Partly due to bias (or inconsistent benchmark settings), recycling old benchmarks and/or testing games that aren't at the same patched/revision status, misreporting the game i.q. used, forced CP/third-party utility settings which may, or may not, be applied in game, and whether the bench is run with normal background processes concurrently or not.
I tend to stick to the few I know from experience that are not biased one way or another, or that use good test structures.
Likewise. The ones I put the most faith in are those that quantify all settings used and the revision/patch status of the bench/game being used. I will include all benchmarks (within reason) for an overview.
So at 2560x1600 res, 24 wins for the 7970, 28 wins for the 680. Considering a lot of those games are TWIMTBP (sponsored), it's not too bad a result.
Much like auto racing, it's "run what you brung". You could argue that a lot of games featured are Nvidia friendly or TWIMTBP - but that also says to me that Nvidia have an eye for sponsoring/supporting gaming titles that gamers want to play. It stands to reason that a benchmark suite should reflect current gaming trends and game popularity, so I certainly wouldn't begrudge the widespread use of BF3, DiRT3, TESV:Skyrim or Batman:AC...although the continued use of Metro 2033 (OK from a torture test angle) and Far Cry 2 I find debatable...does anyone actually play these, and if so, how many would replay them?
And I'm guessing that's ref clocks?
The GTX 680 is stock in every case. The HD 7970 is stock in most cases (a minority of reviews used factory-overclocked cards for comparison; Maximum PC, for instance, used the XFX Black Edition 7970).
As gaming f.p.s. was only a part of the info I was culling (along with power usage, heat, acoustics, overclocking headroom, overclock-to-power-draw delta etc.), I figured that a handful of slightly OC'ed 7970s wouldn't impact the overall dataset too heavily.
In order for AMD to stay in the game they'll need another price drop to equalize that price/performance, because as of this moment you still get more bang for your buck with a 680 (regardless of how little that is).
That kind of depends what you have to pay for each respective card. Prices seem to fluctuate wildly depending upon the market.
As for AMD cutting prices...that is a double-edged sword. It might gain some favourable comments at the conclusion of a few reviews, but I'm guessing if you're in the market for an enthusiast-level card (or two), pricing isn't the be-all and end-all.
From a PR and public perception standpoint: AMD have just had a hefty price reduction...they are also giving away a three-game pack...add another price cut and it starts looking like desperation...meanwhile, Nvidia's latest and greatest (GTX 690) is being compared to a work of art and/or supercar. Add in the fact that all this stems from ONE GPU (GK104) that traces its origin to a general laughingstock (GF100), giving a near-complete swing in performance, die area, and most importantly, brand perception, and you can see that the momentum favours Nvidia regardless of AMD's reaction - short of rolling out their own quantum leap in GPU tech. A much harder job when the baseline you are comparing with isn't a bad level of performance in its own right.
To a degree, pricing becomes secondary (especially if GK104 is supply-constrained), since the thing AMD are losing is not marketshare, it's mindshare.
Buying a performance AMD card already has one caveat built in against it for a lot of people* - it sorely doesn't need two.
*Resale. If you're updating cards regularly, resale value tends to play a significant part in the upgrade cycle. AMD's cards have historically lost value faster than Nvidia's. You now have the situation where one of AMD's biggest virtues - Bitcoin mining - also becomes a force that drives down the resale market, since many are wary of picking up a card which may have spent its life at near 24/7 100% GPU usage.