The INNOCN 40C1R sports a 144 Hz refresh rate IPS ADS panel that supports both NVIDIA's and AMD's adaptive synchronization technologies. The adaptive synchronization range is 48–144 Hz, so that's the framerate range your PC should be able to sustain at 3440x1440 resolution to experience buttery smooth, tear-free gameplay. The monitor is AMD FreeSync Premium certified, which means it supports Low Framerate Compensation (LFC) technology. If a game runs at fewer frames per second than the bottom of the FreeSync operating range (48 FPS in this case), LFC displays each frame multiple times to stay above that lower limit and maintain the full fluidity of the action. This "multiplication of frames" is completely invisible to the human eye, so the bottom limit of the FreeSync range effectively becomes a non-issue. Of course, for the best-possible gaming experience, a high framerate remains something you should strive for.
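The repetition logic behind LFC can be sketched in a few lines. This is a minimal illustration of the idea, not AMD's actual driver implementation; `lfc_multiplier` is a hypothetical helper, and the range constants mirror the 48–144 Hz window of this monitor.

```python
def lfc_multiplier(fps, vrr_min=48.0, vrr_max=144.0):
    """Return the smallest integer frame multiplier that lifts a
    below-range framerate back into the VRR window, as LFC does."""
    if fps <= 0:
        raise ValueError("framerate must be positive")
    if fps >= vrr_min:
        return 1  # already inside the FreeSync range, no compensation needed
    m = 2
    while fps * m < vrr_min:
        m += 1
    # the resulting refresh rate must also stay at or below the panel maximum
    return m if fps * m <= vrr_max else None

# e.g. a game dipping to 30 FPS: each frame is shown twice,
# so the panel refreshes at 60 Hz, inside the 48-144 Hz window
print(lfc_multiplier(30))  # 2
```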
Response Time and Overdrive
The response time of the INNOCN 40C1R isn't officially specified, which is usually a red flag. After testing the 40C1R for over two months, I don't think INNOCN has anything to hide because the monitor performs as well as or better than any other competing model equipped with an IPS-type panel. The panel uses overdrive technology to speed up pixel transitions, and you will find the option under Game Settings > Response Time in the OSD. Overdrive has a total of four settings: Off, Normal, Fast, and Ultrafast.
I extensively tested all of them using the so-called pursuit camera method developed by the good people of Blur Busters, namely Mark D. Rejhon. The idea of the procedure is to use a standard DSLR camera to capture motion blur exactly as your eyes see it. That's achieved by mounting the camera on a smooth slider, setting the camera exposure to four times the duration of a single refresh cycle, and loading the Blur Busters Ghosting Test Pattern with the Pursuit Camera Sync Track. The camera then has to be slid sideways at the same speed as the on-screen motion. The sync track is there to tell you if you're moving the camera too fast or too slow, or if it shakes too much. The procedure takes some practice and getting used to, but it yields great results and lets us examine the differences between various overdrive settings at various monitor refresh rates.
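The exposure rule is simple arithmetic: four refresh cycles at a given refresh rate. A quick sketch, where `pursuit_exposure_s` is my own helper, not part of any Blur Busters tooling:

```python
def pursuit_exposure_s(refresh_hz, frames=4):
    """Camera exposure for the pursuit camera method: four times
    the duration of one monitor refresh cycle, in seconds."""
    return frames / refresh_hz

# at 144 Hz one refresh lasts ~6.94 ms, so the exposure is ~27.8 ms
print(round(pursuit_exposure_s(144) * 1000, 1))  # 27.8
# at 60 Hz the same rule gives a ~66.7 ms exposure
print(round(pursuit_exposure_s(60) * 1000, 1))  # 66.7
```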
I made a series of photos at 60, 100, 120, and 144 Hz with all available overdrive settings. Let's look at the results and figure out what the ideal overdrive setting would be.
As you can see by examining the motion blur samples I've taken, you have to really focus to notice a significant difference between various overdrive settings. With that being said, I'd give a slight advantage to the Fast setting, as it seems to perform the best out of the four. But again, you really have to concentrate hard to differentiate it from the Ultrafast setting.
Input Lag
To measure the input lag of a monitor, I'm using the NVIDIA LDAT v2, which I've covered extensively in my NVIDIA Reflex review.
The LDAT (Latency Display Analysis Tool) is a small device that is strapped onto a monitor, measures brightness changes, and provides us with end-to-end system latency data—the so-called mouse to photon latency. Using it for reliable and repeatable monitor input lag testing was made possible by NVIDIA's inclusion of the flash indicator feature in some Reflex-supported games. The flash indicator is essentially a white box displayed at the left edge of the screen at the moment a mouse click "arrives" on screen. I simply place the LDAT sensor on the part of the screen where the flash indicator will appear and click the mouse button on the sensor itself. The sensor will then detect the sudden brightness change and calculate how much time it took between the mouse button click and flash appearing on screen—that's the aforementioned end-to-end system latency. While this method doesn't let me isolate the input lag of the monitor as the only measured value (the high-speed camera method didn't do that either), if the rest of my system remains unchanged, the tested monitors can be compared to each other.
Another excellent characteristic of the LDAT method is the Auto Fire feature of the LDAT software. The Auto Fire feature allows me to select the number of "shots" (test iterations) and "shot delay" (a delay between mouse clicks). All I have to do is run the desired game, align the LDAT sensor to the flash indicator area, and press the "mouse" button on the LDAT. The sensor and accompanying software take care of everything else. Less than a minute later, I have my test results—the minimum, average, and maximum measured end-to-end system latency and standard deviation.
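The summary the LDAT software produces boils down to basic descriptive statistics over the recorded samples. A sketch of that reduction, with illustrative numbers rather than actual LDAT measurements, and `summarize_latency` as a hypothetical helper:

```python
import statistics

def summarize_latency(samples_ms):
    """Reduce a run of end-to-end latency samples (in milliseconds) to
    the same summary the LDAT software reports: min, average, max,
    and standard deviation."""
    return {
        "min": min(samples_ms),
        "avg": statistics.mean(samples_ms),
        "max": max(samples_ms),
        "stdev": statistics.stdev(samples_ms),
    }

# illustrative numbers only, not actual measurements
run = [9.8, 10.1, 10.4, 10.0, 10.7]
print(summarize_latency(run))
```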
The game I'm using for monitor input lag testing is Overwatch for several reasons. It has an integrated 400 FPS framerate limit, which my test system has no trouble hitting and maintaining at low settings at any given resolution. By keeping the framerate at 400 FPS, I'm maintaining a low and constant GPU render latency (around 2 ms). Overwatch also has an easily accessible practice range, which allows me to do all of my tests in a controlled environment. Finally, its game engine detects inputs even when a gun is reloading, so I don't have to worry about character selection or ammo magazine size—once I trigger the Auto Fire feature, the test runs for 100 iterations regardless of what's happening in the game. My tests are conducted with the NVIDIA Reflex technology turned off (at 400 FPS, it wouldn't make any difference to system latency anyway), with adaptive refresh rate technologies (G-Sync, FreeSync, VRR) deactivated, and with V-Sync off.
Monitor Input Lag Test System
CPU: Intel Core i9-9900K
GPU: Palit GeForce RTX 2080 Super GP
Motherboard: ASUS ROG Maximus XI Formula
RAM: ADATA Spectrix D60G DDR4 32 GB
SSD: Kioxia Exceria 500 GB NVMe
In the end, we get the so-called button-to-pixel lag—the time that passes between an action with your mouse and said action first registering on the screen. Anything below 16 ms (roughly one frame of lag at 60 Hz) can be considered gaming-grade, and such a monitor is suitable for even the most demanding gamers and esports professionals. If the input lag falls between 16–32 ms (between one and two frames of lag at 60 Hz), the monitor is suitable for almost everyone but the most hardcore gamers, especially if they're playing first-person shooters at a professional level. Finally, if a monitor's input lag is higher than 32 ms (over two frames of lag at 60 Hz), even casual gamers should be able to notice it. Will they be bothered by it? Not necessarily, but I can't recommend a screen like that for serious gaming.
Here's how the INNOCN 40C1R holds up in terms of input lag.
After doing 100 iterations of the LDAT-powered input lag test, the INNOCN 40C1R shows an average input lag of 10.2 milliseconds, making it a viable choice even for hardcore gaming.
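To put that result into the frame-based terms used above, the conversion is straightforward arithmetic. The `frames_of_lag` helper is mine, added purely for illustration:

```python
def frames_of_lag(lag_ms, refresh_hz=60):
    """Express an input lag measurement as frames at a given refresh rate."""
    frame_time_ms = 1000 / refresh_hz  # ~16.7 ms per frame at 60 Hz
    return lag_ms / frame_time_ms

# the 10.2 ms measured on the 40C1R is well under one 60 Hz frame
print(round(frames_of_lag(10.2), 2))  # 0.61
```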