Has anyone done a serious, controlled study on whether framerates above, say, 200fps really make a difference? I've seen a few videos where esports titles like CS:GO were group-tested at 60, 144 and 240Hz and observed in slow motion - and yes, 60Hz was measurably worse, but the difference between 144 and 240Hz was small enough to chalk up to margin of error and human variance.
LTT's one ft. Shroud was perhaps the most controlled and in-depth one I've seen, and if an esports world champion barely sees any improvement between 144 and 240Hz, what hope do the rest of us have of capitalising on higher framerates?! Clearly 144Hz is already at the point of diminishing returns for Shroud, so perhaps the sweet spot is merely 100 or 120fps? It's probably no coincidence that framerates above the server tickrate seem to add nothing of value.
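To put rough numbers on the diminishing-returns point, here's a quick back-of-the-envelope sketch - plain Python, nothing authoritative, just the refresh rates mentioned above plus 500Hz for comparison:

```python
# Rough illustration of diminishing returns: frame interval shrinks
# hyperbolically with refresh rate, so each step up buys less time.
rates_hz = [60, 144, 240, 500]

for lower, higher in zip(rates_hz, rates_hz[1:]):
    saving_ms = 1000 / lower - 1000 / higher
    print(f"{lower:>3} -> {higher:>3} Hz: frame interval shrinks by {saving_ms:.1f} ms")

# Output:
#  60 -> 144 Hz: frame interval shrinks by 9.7 ms
# 144 -> 240 Hz: frame interval shrinks by 2.8 ms
# 240 -> 500 Hz: frame interval shrinks by 2.2 ms
```

The 60-to-144 jump buys more time than every subsequent jump combined, which matches what the blind tests seem to show.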
I'm curious if there's any real benefit whatsoever to running north of 200fps. Going from 200fps to 500fps cuts the frame interval from 5ms to 2ms - a peak improvement of 3ms and a mean improvement of 1.5ms - and that sits alongside the following other sources of latency:
- 8.3ms server tick interval
- 10-25ms median ping/jitter to the game server via your ISP
- 2ms input sampling latency
- 2-5ms pixel response time
- 80-120ms human reflex time
or
20-50ms limitations of human premeditated timing accuracy (i.e. how closely can you stop a stopwatch at exactly 10.00 seconds, rather than 9.98 or 10.03 seconds, even though you know EXACTLY when to click?)
Whilst it's true that some of that is hidden behind client prediction, it's still roughly 42ms of unavoidable latency before human reaction time even enters the picture. Throwing several thousand dollars at hardware to support 500+ FPS seems to me like a fool's errand just to gain a mean improvement of 1.5ms in the best-case scenario; worst case you've spent all that for 1.5ms out of 160ms.
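For concreteness, here's the budget I'm working from, as a sketch only - the figures are the estimates listed above, with midpoints taken for each range (my assumption, not measured data), and the mean render wait modelled as half a frame interval:

```python
# Back-of-the-envelope latency budget from the estimates above, taking the
# midpoint of each range (assumption), to see what share of the total a
# 200fps -> 500fps upgrade actually addresses.
budget_ms = {
    "server tick interval": 8.3,
    "ping/jitter to server": 17.5,  # midpoint of 10-25ms
    "input sampling": 2.0,
    "pixel response": 3.5,          # midpoint of 2-5ms
    "human reflex": 100.0,          # midpoint of 80-120ms
}
total_ms = sum(budget_ms.values())

# Mean render wait is half a frame interval, so 200fps -> 500fps trims the
# mean from 2.5ms to 1.0ms: the 1.5ms figure quoted above.
mean_saving_ms = 0.5 * (1000 / 200 - 1000 / 500)

print(f"total budget: {total_ms:.1f} ms")
print(f"fps saving  : {mean_saving_ms:.1f} ms ({100 * mean_saving_ms / total_ms:.1f}% of the total)")

# Output:
# total budget: 131.3 ms
# fps saving  : 1.5 ms (1.1% of the total)
```

Swap in your own ping and reflex numbers if you like; every combination I've tried still leaves the fps saving at a low single-digit percentage of the total.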