Since when does the heat output from the two-GPU 6990 warrant the same "bitching" as the single-GPU 480? Especially when the 6990 OC still runs 2 degrees cooler than the 480.
Since the number of GPUs doesn't matter — only how many slots the card occupies, and the end results. It's a single card, it runs hot, and it consumes more power than the 480. Yet, magically, nobody cares this time around. I wonder how much of that has to do with it being an ATI card?
Wow, even with an overclocked 980X and a GTX 580? Would you be referring to Metro 2033? As far as I can see, other games get well above the 30 fps average that's generally considered playable. Although in multiplayer it's nice to get 40-50 fps, just so the gameplay is absolutely smooth. But I guess most of us would rather sacrifice the detail level a little than our bank account.
Yeah, Metro is one of them. 30fps is not playable for me in most titles. In most I can get away with 45fps or better, and in Crysis I can even play at around 35 (it gets choppy below that), but for the rest I need 60fps or better to see a smooth picture.
And I don't sacrifice detail level, period. I play maxed out (with the exception of AA — I only like a little, since too much softens the image for me), or I don't play it at all. If that means saving up for a year to get what I want, so be it.
This. The 480 was a single-GPU card -- there was no excuse for the heat it produced, just poor engineering rushed to market.
Again, the number of GPUs is irrelevant. If it plugs into a single slot, it's a single card to the end user, and subject to the same expectations as any other single-card solution.