Yeah, if you read what you posted, [H] say they're using "real gaming and recorded the highest value in each"... not an average of what it took to complete that section! Sure, a 7970 might peak for a millisecond; is that what they mean by the "highest" value? Well, something must have happened between those old tests and the newer ones at [H]OCP...
While [H] now doesn't tell us the games used, we can hopefully figure it's the 5 [H] used in that new review, which are different from the earlier 5. [H] dropped Batman and Witcher (using 11% more watts than the 7970 GHz), which has moved the data against the GHz Edition. Also, in most of the titles the 7970 GHz provides more FPS versus a 680, so we'd logically anticipate more power usage. Even in Sleeping Dogs, [H] had to use the lower 1920x resolution to get more FPS.
Going back to an average of what a card requires to complete the run-throughs of each game, then taking those five games, adding them together, and dividing by 5 is more real-world any way you slice it.
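To make the difference concrete, here's a minimal sketch (Python, with made-up wattage numbers and game names, not actual [H] or TPU data) of how a momentary peak reading differs from a run average, and how averaging five per-game averages works:

```python
# Hypothetical per-second power samples (W) for one run-through of a single game.
samples = [215, 230, 228, 310, 225, 240, 222]  # one brief 310 W spike

peak_reading = max(samples)                # what a "highest value" method reports
run_average = sum(samples) / len(samples)  # what averaging the whole run reports

print(f"Peak: {peak_reading} W, run average: {run_average:.0f} W")

# Averaging the per-game averages across five titles, then dividing by 5,
# gives the "real world" figure described above (numbers are hypothetical).
per_game_average = {
    "Game A": 231,
    "Game B": 248,
    "Game C": 224,
    "Game D": 255,
    "Game E": 237,
}
five_game_average = sum(per_game_average.values()) / len(per_game_average)
print(f"Five-game average: {five_game_average:.0f} W")
```

The single spike drags the peak number well above what the card actually pulls over the run, which is the whole objection to a "highest value" methodology.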
Why persist in spreading misinformation? This is from a previous thread about the GTX780:
From W1zzard's own GTX Titan review, which you can find on the TPU website:
That's only one game, Crysis 2, on a specific run-through. Sure, it looks good on that one data point, but it hardly tells the whole story when various titles each have their own average power usage over a long period of play. Sure, if all you play is "Crysis 2 at 1920x1200, Extreme profile, representing a typical gaming power draw. Highest single reading during the test," and you limit your play to that one small run-through each time, then you can abide by that one point of data.