Tick 1: 15 ms
the 60 Hz user displays this 1.6 ms after it happens
the 500 Hz user gets it 1 ms after it happens
<snip>
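To make the arithmetic behind those numbers explicit, here's a minimal sketch (my own illustration, not code from the post, assuming refreshes land exactly every 1000/Hz milliseconds starting at t = 0 and ignoring every other source of latency):

```python
import math

def display_delay_ms(t_event_ms: float, refresh_hz: float) -> float:
    """Delay until the next refresh after an event arrives (napkin math only)."""
    frame_ms = 1000.0 / refresh_hz
    next_refresh_ms = math.ceil(t_event_ms / frame_ms) * frame_ms
    return next_refresh_ms - t_event_ms

# Tick arrives 15 ms into the frame timeline:
print(f"{display_delay_ms(15.0, 60):.2f} ms")   # ~1.67 ms: next 60 Hz refresh is at 16.67 ms
print(f"{display_delay_ms(15.0, 500):.2f} ms")  # 1.00 ms: next 500 Hz refresh is at 16 ms
```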
Now yes, this is napkin math, but the point is that external sources of delay keep the monitor timing from ever lining up perfectly, so you can end up with very large delays before the image even gets sent to the monitor to display.
In reality these numbers would be all over the place: ping varies from second to second, and every other player in the game adds their own delays and their own ping to the server, so information arrives chaotically at erratic times.
A 60 Hz user vs. a user running 1000 FPS with V-Sync off could see a ~15 ms delay before things even display.
A 60 Hz, 60 FPS user who happens to get 1 ms of network latency at the wrong time can suddenly be 15 ms behind everyone else.
This is napkin math that ignores network latency, input latency and a dozen other variables, but not accounting for at least this much is just ignorance.
If the human eye can see a single frame's difference at frame rates in the hundreds, and monitor technology can show a 15 ms difference quite easily... how many frames are in that 15 ms? How many chances to get updated visual information vs. the competition?
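For concreteness, here's a rough count of frame opportunities inside that 15 ms window at a few frame rates (my own numbers, same napkin-math assumptions as above):

```python
# How many frame opportunities fit inside a 15 ms gap at different frame rates?
# (Napkin math only: assumes perfectly even frame times and nothing else.)
for fps in (60, 120, 500, 1000):
    frame_ms = 1000.0 / fps
    print(f"{fps:>4} FPS: {15.0 / frame_ms:.1f} frames in 15 ms")

# Output:
#   60 FPS: 0.9 frames in 15 ms
#  120 FPS: 1.8 frames in 15 ms
#  500 FPS: 7.5 frames in 15 ms
# 1000 FPS: 15.0 frames in 15 ms
```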
The reality is a bit more complex than that.
A network game's end-to-end latency will be something like this:
1) Device input lag (with OS overhead). ~15 ms (if we assume USB HID)
2) Client side tick (before sending to server). If we assume 120 Hz: 1000/120 + 50%* = 12.5 ms
3) Network transfer: Variable based on distance and condition. Let's assume 10 ms (average) for this example.
4) Server tick: 60 Hz seems fairly common, so I assume that. 1000/60 + 50%* = 25 ms
5) Network transfer - same as 3): 10 ms
6) Client side tick (yes, again): 12.5 ms
7a) Rendering at 60 FPS: 1000/60 + 50%* = 25 ms
7b) Rendering at 120 FPS: 1000/120 + 50%* = 12.5 ms
7c) Rendering at 500 FPS: 1000/500 + 50%* = 3 ms
8) Frame pacing: unpredictable
9) Monitor input lag: I assume 5 ms
Total:
Scenario a (60 FPS): 115 ms
Scenario b (120 FPS): 102.5 ms
Scenario c (500 FPS): 93 ms
You decide whether this difference is significant or not. And keep in mind I assumed a low network latency here. I'm also ignoring the monitor's pixel response time, looking only at its input lag, and assuming V-Sync is not used.
*) 50% is added for average variation whenever two things are not in sync. If a packet/event comes in 0.0002 ms too late, it needs to wait for the next tick.
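Here's that breakdown replayed as a small sketch (my own code, just restating the arithmetic above; the +50% factor is the average wait from the footnote, and frame pacing is left out because it's unpredictable):

```python
def tick_cost_ms(hz: float) -> float:
    """Average cost of waiting on an unsynchronized tick/refresh at `hz`."""
    return (1000.0 / hz) * 1.5  # period + 50% average variation

def total_latency_ms(render_fps: float,
                     input_lag_ms: float = 15.0,       # USB HID assumption
                     client_hz: float = 120.0,
                     server_hz: float = 60.0,
                     one_way_network_ms: float = 10.0,
                     monitor_lag_ms: float = 5.0) -> float:
    return (input_lag_ms
            + tick_cost_ms(client_hz)        # client tick before sending
            + one_way_network_ms             # client -> server
            + tick_cost_ms(server_hz)        # server tick
            + one_way_network_ms             # server -> client
            + tick_cost_ms(client_hz)        # client tick again
            + tick_cost_ms(render_fps)       # rendering
            + monitor_lag_ms)                # monitor input lag

for fps in (60, 120, 500):
    print(f"{fps:>3} FPS: {total_latency_ms(fps):.1f} ms")
# 60 FPS: 115.0 ms, 120 FPS: 102.5 ms, 500 FPS: 93.0 ms
```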
So look at the impact if some of the other factors changed, like:
* Increasing the client and/or server side tick rate, e.g. to 200 Hz
* Choosing a game server that's closer
* Using a faster input device, either a device with a special protocol or PS/2.
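As a rough feel for what each change buys, plugged into the same breakdown (hypothetical numbers; the ~5 ms figure for a faster input device is purely an assumption):

```python
# Rough deltas against the 120 FPS / 102.5 ms scenario above (napkin math only):
# server tick 60 Hz -> 200 Hz:  1000/60*1.5 - 1000/200*1.5 = 25.0 - 7.5 = 17.5 ms saved
# closer server, 10 ms -> 5 ms each way:       2*10 - 2*5              = 10.0 ms saved
# faster input device, ~15 ms -> ~5 ms (assumed figure): 15 - 5        = 10.0 ms saved
print(102.5 - 17.5, 102.5 - 10.0, 102.5 - 10.0)  # 85.0 92.5 92.5
```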