Thanks for taking another stab at it. I do think I understood already. You, me and everybody else know how we, the people, play our games ... we put everything to ultra and resolution to native, sometimes we curse and swear because of the fps dips, then adjust the settings only as much as is needed.
In other words, you are not understanding me right, so let me put it the other way ... using a high resolution when benching CPUs in games is OK (it applies) when the game is CPU-hungry enough that you still get a meaningful value range for your graph (for example, if you don't like differences that are fractions of a frame). If you want a wider value range for your graph in those couple of games (that's when it doesn't apply), you lessen the burden on the GPU and get a less compressed graph. That's fine, because you are trying to analyze relative CPU performance executing that particular game code. It's a game-by-game call IMHO, to get a more readable graph.
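To illustrate what I mean by a "compressed graph", here's a minimal sketch. The frame rate you actually measure is capped by whichever of the CPU or GPU is slower, so at a GPU-bound setting two very different CPUs can read identically. Every number and name below is made up purely for illustration:

```python
# Minimal sketch of the "compressed graph" effect; all figures are hypothetical.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The frame rate you observe is capped by the slower component."""
    return min(cpu_fps, gpu_fps)

cpus = {"CPU A": 190.0, "CPU B": 140.0}              # hypothetical CPU-limited fps
gpu_limits = {"720p low": 400.0, "4K ultra": 60.0}   # hypothetical GPU-limited fps

for setting, gpu_fps in gpu_limits.items():
    results = {name: effective_fps(cpu_fps, gpu_fps) for name, cpu_fps in cpus.items()}
    spread = max(results.values()) - min(results.values())
    print(f"{setting}: {results} (spread = {spread:.0f} fps)")

# 720p low: the CPUs are the bottleneck, so the 50 fps gap shows up on the graph.
# 4K ultra: the GPU caps both at 60 fps and the graph tells you nothing about the CPUs.
```

At 720p the 50 fps gap is visible; at 4K ultra both bars sit at 60 fps and the graph is "compressed" into telling you nothing.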
Now, the argument that we should always and exclusively test CPUs in games at UHD resolution is reasonable only to the extent that lowering the resolution somehow reduces the amount of work the CPU has to do inside a frame. Does it? A valid question, because games usually adjust their level-of-detail system at higher resolutions, but LOD is all about geometry, not draw calls. Tough one.
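On the LOD point, here's a rough sketch of how a screen-coverage-based LOD pick could work in a toy engine. The function names, thresholds and numbers are all invented for illustration, not taken from any real game: at a higher resolution the same object covers more pixels, so the engine may pick a denser mesh, but it's still one object and roughly the same draw-call work on the CPU side.

```python
import math

def projected_pixel_height(object_radius_m: float, distance_m: float,
                           vertical_res: int, fov_deg: float = 60.0) -> float:
    """Rough on-screen height of an object in pixels for a given vertical resolution."""
    angular_size = 2.0 * math.atan(object_radius_m / distance_m)
    return vertical_res * angular_size / math.radians(fov_deg)

def pick_lod(pixel_height: float) -> str:
    """Toy threshold table: the bigger the object is on screen, the denser the mesh."""
    if pixel_height > 200:
        return "LOD0 (full detail)"
    if pixel_height > 50:
        return "LOD1"
    return "LOD2 (lowest detail)"

for res in (1080, 2160):
    px = projected_pixel_height(object_radius_m=1.0, distance_m=60.0, vertical_res=res)
    print(f"{res}p: ~{px:.0f} px tall -> {pick_lod(px)}")

# At 2160p the engine picks a denser mesh (more GPU work), but it is still one object
# and one draw call either way, so the CPU-side submission cost barely changes.
```

In this toy model the 4K run draws heavier geometry, which mostly lands on the GPU, while the number of objects the CPU has to submit per frame stays the same.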
I simply disagree with your assertion. I don't want a more readable graph that comes from an unrealistic testing environment. I want to know what the results are at the resolution and settings most people play at, not an 'exaggerated to show results' situation using high-end cards at below 1080p and low settings. Show me what it's like on ultra with AA at 1080p/2560x1440/4K (no AA at 4K is fine with me since it isn't needed)!!!! That's where we all play!