I agree that the lower resolution provides a better side-by-side comparison, assuming raw power in an X-game scenario is actually useful. The problem I see with it is that it's not indicative of current or future performance for actual users. You can get an idea of how a specific engine is handled by each CPU design, but for one-off games I don't see this as particularly useful in the long run.
That said, I think a more in-depth analysis of something like Unreal Engine and Unity at a low resolution may provide a useful estimate of future game performance, whereas if you actually care about raw horsepower, there are plenty of synthetic and computational benchmarks that provide more useful long-term comparisons.
I would, however, appreciate more in-depth analysis of framerate lows and latency consistency across the board.
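For anyone wondering what I mean by "lows" and "consistency", here's a rough sketch, assuming you have a per-frame frametime log in milliseconds from whatever capture tool you use (the function name and thresholds are just illustrative, not any reviewer's actual method):

```python
# Rough sketch: average FPS, 1% low FPS and frametime consistency from a
# list of per-frame times in milliseconds. Purely illustrative.

def analyse_frametimes(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000.0 / (sum(frametimes_ms) / n)

    # "1% low": average FPS over the slowest 1% of frames
    worst = sorted(frametimes_ms, reverse=True)[:max(1, n // 100)]
    one_percent_low_fps = 1000.0 / (sum(worst) / len(worst))

    # Consistency: how much each frame deviates from the previous one,
    # which is what you actually feel as stutter
    deltas = [abs(b - a) for a, b in zip(frametimes_ms, frametimes_ms[1:])]
    avg_delta_ms = sum(deltas) / len(deltas)

    return avg_fps, one_percent_low_fps, avg_delta_ms

# Example: ~60 FPS (16.7 ms frames) with a couple of 50 ms hitches mixed in
times = [16.7] * 500 + [50.0, 16.7, 50.0] + [16.7] * 500
print(analyse_frametimes(times))
```

Two CPUs can post the same average FPS while one of them has far worse 1% lows and frame-to-frame deltas, which is exactly the difference you notice while playing.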
Well, it certainly is indicative of current performance, as that's how the CPU is performing right now in a particular game. Extrapolating to future games is iffier, though, due to changes in game engines and Windows versions, including the DirectX version. But then again, Sandy Bridge had strong gaming performance and that's still true five years after I bought my 2700K, so perhaps it is indicative?
In the end, comparing the framerate performance of different CPUs while the framerate is capped by the graphics card is idiotic beyond belief: if every CPU hits the same GPU-imposed ceiling, the numbers tell you nothing about the CPUs themselves. I mean seriously, how hard can this be to understand? I can't even believe that we're having this discussion!
Perhaps the biggest reason to have these low-res tests is so that a user can pick the fastest CPU and know what maximum framerates it can achieve. They will then be safe in the knowledge that it will present the least bottleneck when they upgrade to a faster graphics card down the road, since CPUs aren't upgraded that often. I've had a slow CPU bottleneck a fast new card and it sucks, I can tell you.
By all means have the high-res tests, as that's the real-world scenario, just don't cut out the low-res ones. I remember UT2003 actually had a benchmark mode where the graphics card was taken out of the loop altogether, simulating an infinitely fast card. If I remember correctly, it did this by terminating draw calls before they were sent to the graphics card. All you saw was a static picture while the benchmark ran.
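Conceptually it works something like the sketch below: the game simulation and the CPU side of rendering run every frame, but nothing is actually submitted to the GPU, so the measured frame time is pure CPU work. This is just an illustration of the idea, not how UT2003 actually implemented it, and update_game / build_draw_calls are hypothetical stand-ins for the engine's per-frame work:

```python
# Illustrative sketch of a "CPU-only" benchmark loop: the per-frame CPU work
# runs as normal, but the draw calls are dropped instead of being sent to the
# graphics card, which behaves like an infinitely fast GPU.
import time

def run_cpu_benchmark(update_game, build_draw_calls, frames=1000):
    start = time.perf_counter()
    for _ in range(frames):
        update_game()        # physics, AI, game logic
        build_draw_calls()   # CPU side of rendering...
        # ...but nothing is submitted to the GPU here, so only
        # CPU time ends up in the measurement
    elapsed = time.perf_counter() - start
    return frames / elapsed  # CPU-limited frames per second

# Example with dummy per-frame work:
print(run_cpu_benchmark(lambda: None, lambda: None))
```

Running at a very low resolution gets you most of the way to the same answer, which is why reviewers use it as the practical stand-in.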
@Vayra86 You've been making a valiant effort explaining in simple, clear language why we need low-res tests, but at 35 votes to 14 right now, and with the kind of comments being posted all wanting to cut out the low-res ones, it's clearly falling on deaf ears, so it might not be worth wasting any more time on it, to protect your sanity.

I'm gonna get flamed by them, aren't I? lol
@newtekie1 A GTX 1080 can certainly bottleneck at 1080p, as I discovered when playing CoD: Infinite Warfare a while back. It was still running pretty fast, mind you, and still perfectly smoothly, but definitely rather slower than at the lowish resolution I compared it with.