Literally four games out of 10 showed improvements that fell outside the margin of error/run variance (~1%).
Who said runtime? Run variance.
Just because there are positive results doesn't mean there have to be negative results either. If I ran a test 5 times and saw 100/100/100/100/101, and then 100/100/100/101/101, are we really calling that 1 FPS an improvement, or does it fall within typical run variance? The difference is so small. There is also more run variance in games without canned benchmarks (not sure offhand if these are those). We can run a test 5 times and get 5 different results. Just because one test popped an FPS higher doesn't necessarily mean it showed an improvement... but that one result could then bump an average up by a couple/few tenths.
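Just to put rough numbers on that, here's a quick sketch of the averaging arithmetic using the hypothetical 100/100/100/100/101 runs above (made-up numbers, not actual benchmark data):

```python
# Hypothetical FPS results from five runs of the same test,
# before and after the change; these are the made-up numbers
# from above, not real benchmark data.
baseline = [100, 100, 100, 100, 101]
after    = [100, 100, 100, 101, 101]

def mean(runs):
    return sum(runs) / len(runs)

print(mean(baseline))                # 100.2
print(mean(after))                   # 100.4
print(mean(after) - mean(baseline))  # ~0.2 FPS, a couple of tenths,
                                     # well inside typical run variance
```

That 0.2 FPS bump in the average comes entirely from one run landing 1 FPS higher.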
All I'm saying is that with a result like this, a 1% difference can be attributed to margin of error/run variance in the testing. Four games showed improvement (with two not even reaching 2%), six did not. Also, didn't The Witcher 3 show a negative result?
You've just cherry-picked the positive results to formulate an average? Can't say I math like that...
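For anyone wondering what I mean by cherry-picking the average, here's a toy example with made-up per-game deltas (roughly the same shape as the results being discussed, not the actual figures):

```python
# Made-up percentage deltas for 10 hypothetical games: four gains
# (two of them small), five flat, one slightly negative. These are
# NOT the actual review numbers, just an illustration of the shape.
deltas = [3.0, 2.5, 1.5, 1.2, 0.0, 0.0, 0.0, 0.0, 0.0, -0.5]

winners_only = [d for d in deltas if d > 0]

avg_all    = sum(deltas) / len(deltas)              # average over all 10 games
avg_cherry = sum(winners_only) / len(winners_only)  # average over just the "improved" games

print(f"all 10 games:       {avg_all:.2f}%")    # ~0.77%
print(f"only the 4 winners: {avg_cherry:.2f}%") # ~2.05%
```

Averaging only the games that happened to land higher roughly triples the apparent gain in this made-up case, which is why the average across everything tested is the only number worth quoting.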
Anyway, thanks for the crack at it... I'll wait for W1z.