The Gigabyte M32U sports a 144 Hz SS IPS panel that supports adaptive synchronization from both AMD and NVIDIA graphics cards. The adaptive synchronization range is 48–144 Hz, so that's the framerate range your PC should be able to achieve at 4K resolution to experience buttery smooth, screen tear-free gameplay. The monitor is AMD FreeSync Premium Pro certified, which means it integrates HDR at the hardware level and supports Low Framerate Compensation (LFC). If the game runs at fewer frames per second than the bottom of the FreeSync operating range (48 FPS in this case), LFC displays each frame multiple times to stay above that lower limit and maintain the full fluidity of the action. Of course, this "multiplication of frames" is completely invisible to the human eye. Thanks to this approach, the lower limit of the required framerate becomes largely irrelevant, although a high framerate naturally remains something to strive for if you want the best possible gaming experience.
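The frame-multiplication idea behind LFC can be sketched in a few lines of Python. This is an illustrative model, not Gigabyte's or AMD's actual algorithm (the real logic lives in the display firmware and driver); the function name and the simple "repeat until in range" rule are my own assumptions.

```python
def lfc_refresh_rate(fps, vrr_min=48, vrr_max=144):
    """Illustrative LFC model: when the framerate drops below the VRR
    floor, repeat each frame enough times that the effective refresh
    rate lands back inside the 48-144 Hz window."""
    if fps >= vrr_min:
        return min(fps, vrr_max), 1  # inside the range: no multiplication
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier, multiplier

# At 30 FPS each frame is shown twice, for an effective 60 Hz refresh
rate, mult = lfc_refresh_rate(30)
```

So a dip to 30 FPS still drives the panel at an in-range 60 Hz, which is why the frame repetition goes unnoticed.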
Response Time and Overdrive
The Gigabyte M32U has a specified 1 ms MPRT response time. The panel uses overdrive technology to make the pixel transitions faster, and you will find the option under "Overdrive" in the OSD (Main Menu > Gaming > Overdrive). The setting has a total of five options: Off, Smart OD, Picture Quality, Balance, and Speed. The "Smart OD" option is an interesting addition to the usual overdrive settings. It promises to change the overdrive level in relation to the framerate for the lowest possible blur regardless of which framerate we're currently playing at.
I extensively tested all of them by using the so-called pursuit camera method developed by the good people of Blur Busters, namely Mark D. Rejhon. The idea of the procedure is to use a standard DSLR camera to capture motion blur exactly as your eyes see it. That's achieved by mounting the camera on a smooth slider, setting the camera exposure to four times the length of one refresh cycle, and loading the Ghosting Test Pattern with the Pursuit Camera Sync Track. The camera then has to be slid sideways at the same speed as the on-screen motion. The sync track is there to tell you if you're moving the camera too fast or too slow, or if it shakes too much. The procedure takes some practice and getting used to, but yields great results and lets us examine the differences between various overdrive settings at various monitor refresh rates.
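The exposure math behind the method is simple enough to sketch. A minimal Python example, assuming the "four refresh cycles" rule described above (the function name is my own; Blur Busters publishes the full procedure):

```python
def pursuit_exposure_s(refresh_hz, cycles=4):
    """Pursuit camera exposure time in seconds: four refresh cycles,
    so the photo integrates blur the same way a tracking eye does."""
    return cycles / refresh_hz

# At 144 Hz one refresh cycle lasts ~6.94 ms, so the exposure is ~27.8 ms;
# at 60 Hz it stretches to ~66.7 ms
exposure_144 = pursuit_exposure_s(144)
exposure_60 = pursuit_exposure_s(60)
```

In practice you pick the camera shutter speed closest to this value for each tested refresh rate.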
I made a series of photos at 60, 100, 120, and 144 Hz with all five available Response Time settings. Let's look at the results and figure out what the ideal overdrive setting would be:
The photos tell the whole story—setting Overdrive to "Picture Quality" results in the clearest reproduction of moving objects. The Smart OD option seemed to keep the overdrive at Balance at higher refresh rates, and between Balance and Speed at refresh rates below 100 Hz, which isn't ideal. I advise against using Smart OD, as you're better off manually setting the overdrive to the aforementioned Picture Quality option. Balance is another valid setting, which can look even slightly sharper at refresh rates/framerates above 120 Hz/FPS, but it doesn't behave as well as Picture Quality below those values.
Moving Picture Response Time (MPRT)
In the OSD, the Gigabyte M32U offers the MPRT toggle, hidden under the "Aim Stabilizer Sync" name. If you turn it on, the backlight will start strobing to achieve a "1 millisecond-like" response time at the expense of picture brightness and other strobing-related issues, such as flickering and strobe crosstalk. The Aim Stabilizer Sync technology can be used together with adaptive synchronization, which wasn't the case with older/lesser MPRT implementations.
The "1 ms MPRT" response time is not to be confused with a 1 ms GtG response time: the commonly used GtG value tells us how much time it takes for a pixel to change between two colors, while MPRT, also known as display persistence, represents how long a pixel remains continuously visible. It's important to know that MPRT isn't a blur reduction technology, but a measurement, which can be lowered by backlight strobing. Here's a comparison of moving object sharpness with the Aim Stabilizer Sync toggle off and on.
Activating Aim Stabilizer Sync results in the sharpest-possible moving visuals the Gigabyte M32U has to offer, but not without a certain amount of inverse ghosting. It isn't as pronounced as my pursuit photo would lead you to believe, but definitely present. Aim Stabilizer Sync also locks the picture brightness to around 103 nits, which is far lower than what I'd deem acceptable for combined daytime and nighttime usage. In other words, keep the Aim Stabilizer Sync toggle off; while moving objects will be slightly blurrier, you'll have a significantly wider brightness range at your disposal.
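To make the persistence numbers tangible: eye-tracked motion blur is roughly persistence (MPRT) multiplied by the on-screen motion speed, which is why lowering MPRT via strobing sharpens moving objects. A minimal sketch of that relationship, with hypothetical naming and example values of my own choosing:

```python
def motion_blur_px(persistence_ms, speed_px_per_s):
    """Approximate width of eye-tracked motion blur in pixels:
    persistence (MPRT) times on-screen motion speed."""
    return persistence_ms / 1000.0 * speed_px_per_s

# Sample-and-hold at 144 Hz holds each frame for ~6.94 ms, so an object
# moving at 1000 px/s smears across roughly 7 pixels; a strobed ~1 ms
# MPRT cuts that smear to roughly 1 pixel
blur_sample_and_hold = motion_blur_px(6.94, 1000)
blur_strobed = motion_blur_px(1.0, 1000)
```

This is also why strobing trades brightness for clarity: the backlight is dark for most of each refresh cycle.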
Input Lag
To measure the input lag of a monitor, I've switched from using the high-speed camera method to a new tool: NVIDIA LDAT v2, which I've covered extensively in my NVIDIA Reflex review.
The latency display analysis tool (LDAT) is a small device that is strapped onto a monitor to measure brightness changes and provide end-to-end system latency data—the so-called mouse-to-photon latency. Using it for reliable and repeatable monitor input lag testing was made possible by NVIDIA's inclusion of the flash indicator feature in some Reflex-supported games. The flash indicator is essentially a white box displayed at the left edge of the screen at the moment a mouse click reaches the screen. I simply place the LDAT sensor on the part of the screen where the flash indicator will appear and click the mouse button on the sensor itself. The sensor then detects the sudden change in brightness and calculates how much time passed between the mouse button click and the flash on the screen, which is the aforementioned end-to-end system latency. While this method doesn't let me isolate the input lag of the monitor as the only measured value (the high-speed camera method didn't do that either), as long as the rest of my system remains unchanged, the tested monitors can be compared to each other.
One other excellent characteristic of the LDAT method is the Auto Fire feature of the LDAT software. The Auto Fire feature allows me to select a number of "shots" (test iterations) as well as the "shot delay" (a delay between mouse clicks). All I have to do is run the desired game, align the LDAT sensor to the flash indicator area, and press the "mouse" button on the LDAT. The sensor and accompanying software take care of everything else. Less than a minute later, I have my test results—the minimum, average, and maximum measured end-to-end system latency and standard deviation.
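The statistics the LDAT software reports are standard aggregates, easy to reproduce if you ever log raw samples yourself. A small sketch with hypothetical sample data (this mimics the shape of the LDAT report, not its actual implementation):

```python
from statistics import mean, stdev

def summarize_latency(samples_ms):
    """Aggregate raw end-to-end latency samples (in ms) into the
    min / average / max / standard deviation figures an LDAT-style
    report presents."""
    return {
        "min": min(samples_ms),
        "avg": mean(samples_ms),
        "max": max(samples_ms),
        "stdev": stdev(samples_ms),
    }

# Hypothetical run of three measured clicks
report = summarize_latency([9.8, 10.9, 12.0])
```

A low standard deviation across the 100 iterations is what makes the average figure trustworthy.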
The game I'm using for monitor input lag testing is Overwatch, for several reasons. It has an integrated 400 FPS framerate limit, which my test system has no trouble hitting and maintaining at low settings at any given resolution. By keeping the framerate at 400 FPS, I'm maintaining a low and constant GPU render latency (around 2 ms). Overwatch also has an easily accessible practice range, which allows me to do all of my tests in a controlled environment. Finally, its game engine detects inputs even when a gun is reloading, meaning I don't have to worry about character selection or their ammo magazine size—once I trigger the Auto Fire feature, the test will run in 100 iterations regardless of what's happening in the game. My tests are conducted with the NVIDIA Reflex technology turned off (at 400 FPS, it wouldn't make any difference to system latency anyway), with adaptive refresh rate technologies (G-Sync, FreeSync, VRR) deactivated and V-Sync off.
Monitor Input Lag Test System
CPU: Intel Core i9-9900K
GPU: Palit GeForce RTX 2080 Super GP
Motherboard: ASUS ROG Maximus XI Formula
RAM: ADATA Spectrix D60G DDR4 32 GB
SSD: Kioxia Exceria 500 GB NVMe
In the end, we get the so-called button-to-pixel lag—the time between you performing an action with your mouse and said action first registering on the screen. Anything below 16 ms, which equals one frame of lag at 60 Hz, can be considered gaming-grade, and such a monitor is suitable for even the most demanding gamers and esports professionals. Input lag between 16 and 32 ms (1–2 frames of lag at 60 Hz) means the monitor is suitable for almost everyone but the most hardcore gamers, especially if they're playing first-person shooters at a professional level. Finally, if a monitor's input lag is higher than 32 ms (over 2 frames of lag at 60 Hz), even casual gamers should be able to notice it. Will they be bothered by it? Not necessarily, but I can't recommend a screen like that for serious gaming.
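The frame-based tiers above are just arithmetic on the 60 Hz frame time (1000 ms / 60 ≈ 16.7 ms). A quick sketch, with function names and thresholds of my own based on the classification described above:

```python
def frames_of_lag(lag_ms, refresh_hz=60):
    """Express input lag as a number of refresh cycles at a given rate."""
    return lag_ms / (1000.0 / refresh_hz)

def lag_class(lag_ms):
    """Rough tiers from the text: thresholds at 1 and 2 frames @ 60 Hz."""
    if lag_ms < 16:
        return "gaming-grade"
    if lag_ms <= 32:
        return "fine for almost everyone"
    return "noticeable even to casual gamers"
```

Note that the same lag in milliseconds costs proportionally more frames at higher refresh rates, which is why competitive players chase every millisecond.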
Here's how the Gigabyte M32U holds up in terms of input lag:
After doing 100 iterations of the LDAT-powered input lag test, the Gigabyte M32U showed an average input lag of only 10.9 milliseconds, which means it's responsive enough even for the most demanding gamers.
Updating the monitor to the latest available firmware (version F06 at the time of writing) is essential, because firmware version F06 brings a significant reduction in input lag. My sample of the M32U originally had the F02 firmware installed, and its measured average input lag was 17.1 ms. The monitor firmware can be updated through the OSD Sidekick Windows app, which can be quite an ordeal, but you shouldn't skip it.