The official technical specification sheet doesn't mention the response time of the INNOCN 27C1U. The panel also doesn't use Overdrive technology to make pixel transitions faster, so this obviously isn't a monitor worth considering for hardcore gaming. The Amazon listing for this monitor mentions AMD FreeSync support, but I didn't find a way to activate any kind of adaptive synchronization technology in its settings, nor did my NVIDIA RTX 2080 SUPER graphics card detect it as G-SYNC-compatible.
With that being said, I didn't notice any hints of excessive ghosting when playing games like Rocket League, Overwatch, and Warzone. The panel behaved as well as expected considering its IPS technology and 60 Hz refresh rate.
To measure the input lag of a monitor, I'm using the NVIDIA LDAT v2, which I've covered extensively in my NVIDIA Reflex review.
The Latency Display Analysis Tool (LDAT) is a small device that straps onto a monitor; it measures brightness changes and provides us with end-to-end system latency data, the so-called mouse-to-photon latency. Using it for reliable and repeatable monitor input lag testing was made possible by NVIDIA's inclusion of the flash indicator feature in some Reflex-supported games. The flash indicator is essentially a white box displayed at the left edge of the screen the moment a mouse click registers on screen. I simply place the LDAT sensor on the part of the screen where the flash indicator will appear and click the mouse button on the sensor. The sensor then detects the sudden change in brightness and calculates how much time passed between the mouse button click and the on-screen flash: that's the aforementioned end-to-end system latency. This method doesn't let me isolate the input lag of the monitor as the only measured value (neither did the high-speed camera method), but as long as the rest of my system remains unchanged, the tested monitors can be compared to each other.
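To make the measurement principle concrete, here's a minimal sketch of the detection logic. The function name, the sampled-brightness representation, and the threshold value are my own illustrative assumptions; the actual LDAT is a closed NVIDIA tool, and this is not its code.

```python
# Illustrative sketch of mouse-to-photon latency detection, assuming we have
# a click timestamp and a stream of timestamped brightness samples from a
# sensor pointed at the flash indicator area. Not NVIDIA's actual code.

def mouse_to_photon_ms(click_ms, samples, threshold=0.5):
    """samples: list of (timestamp_ms, brightness 0.0-1.0) pairs.
    Returns the delay between the click and the first sample in which
    the flash indicator pushes brightness past the threshold."""
    for timestamp_ms, brightness in samples:
        if timestamp_ms >= click_ms and brightness >= threshold:
            return timestamp_ms - click_ms
    return None  # flash never detected within the capture window


# Example: click at t=100 ms, flash indicator lights up at t=125.4 ms
samples = [(100 + i * 0.1, 0.0 if 100 + i * 0.1 < 125.4 else 1.0)
           for i in range(400)]
print(mouse_to_photon_ms(100, samples))  # ~25.4 ms
```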
Another excellent characteristic of the LDAT method is the Auto Fire feature of the LDAT software. Auto Fire lets me select a number of "shots" (test iterations), as well as a "shot delay" (the delay between mouse clicks). All I have to do is run the desired game, align the LDAT sensor with the flash indicator area, and press the "mouse" button on the LDAT. The sensor and accompanying software take care of everything else. Less than a minute later, I have my test results: the minimum, average, and maximum measured end-to-end system latency, as well as the standard deviation.
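For reference, reducing a batch of Auto Fire shots to those four summary numbers is straightforward; a quick sketch (my own, not the LDAT software's code) could look like this:

```python
import statistics

def summarize_shots(latencies_ms):
    """Boil a batch of Auto Fire measurements down to the four
    numbers the LDAT software reports."""
    return {
        "min": min(latencies_ms),
        "avg": statistics.mean(latencies_ms),
        "max": max(latencies_ms),
        "stdev": statistics.stdev(latencies_ms),
    }

# Example with a handful of made-up shots
print(summarize_shots([15.9, 16.4, 17.1, 16.8, 17.3]))
```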
The game I'm using for monitor input lag testing is Overwatch, for several reasons. It has an integrated 400 FPS framerate limit, which my test system has no trouble hitting and maintaining at low settings at any given resolution. By keeping the framerate at 400 FPS, I'm maintaining a low and constant GPU render latency of around 2 ms. Overwatch also has an easily accessible practice range, which allows me to do all of my tests in a controlled environment. Finally, its game engine detects inputs even while a gun is reloading, meaning I don't have to worry about character selection or magazine size: once I trigger the Auto Fire feature, the test will run for 100 iterations regardless of what's happening in the game. My tests are conducted with NVIDIA Reflex turned off (at 400 FPS, it wouldn't make any difference to system latency anyway), with adaptive refresh rate technologies (G-SYNC, FreeSync, VRR) deactivated, and with V-Sync off.
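As a quick sanity check on that number (my own arithmetic, not part of the LDAT procedure), the frame time at the 400 FPS cap sits far below the 60 Hz refresh interval, which is why the render latency contribution stays small and constant:

```python
# Frame time at the capped framerate vs. the monitor's refresh interval
fps_cap = 400
refresh_hz = 60

print(1000 / fps_cap)     # 2.5 ms per rendered frame at 400 FPS
print(1000 / refresh_hz)  # ~16.7 ms per refresh at 60 Hz
```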
| Monitor Input Lag Test System | |
| --- | --- |
| CPU | Intel Core i9-9900K |
| GPU | Palit GeForce RTX 2080 Super GP |
| Motherboard | ASUS ROG Maximus XI Formula |
| RAM | ADATA Spectrix D60G DDR4 32 GB |
| SSD | Kioxia Exceria 500 GB NVMe |
In the end, we get the so-called button-to-pixel lag: the time that passes between you performing an action with your mouse and said action first registering on screen. Anything below 16 ms, which equals roughly one frame of lag at 60 Hz, can be considered gaming-grade; such a monitor is suitable for even the most demanding gamers and esports professionals. If the input lag falls between 16 and 32 ms, or one to two frames of lag at 60 Hz, the monitor is suitable for almost everyone but the most hardcore gamers, especially those playing first-person shooters at a professional level. Finally, if a monitor's input lag is higher than 32 ms, which is over two frames of lag at 60 Hz, even casual gamers should be able to notice it. Will they be bothered by it? Not necessarily, but I can't recommend a screen like that for serious gaming.
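Those thresholds are easy to express as a small helper; the buckets and their wording below are just my codification of the guidelines above, not any industry standard:

```python
def classify_input_lag(lag_ms, refresh_hz=60):
    """Bucket a monitor by button-to-pixel lag, measured in frames
    at the given refresh rate (one frame at 60 Hz is ~16.7 ms)."""
    frames = lag_ms / (1000 / refresh_hz)
    if frames < 1:
        return "gaming-grade: fine even for esports professionals"
    if frames <= 2:
        return "fine for everyone but hardcore competitive FPS players"
    return "noticeable even to casual gamers"

for lag in (12, 25, 40):
    print(f"{lag} ms -> {classify_input_lag(lag)}")
```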
Here's how the INNOCN 27C1U holds up in terms of input lag.
After doing 100 iterations of the LDAT-powered input lag test, the INNOCN 27C1U showed an average input lag of 16.7 milliseconds, which means it's responsive enough for everyday work and casual gaming.