Gigabyte M34WQ Monitor Review - Gaming Meets Productivity


Gaming Performance

The Gigabyte M34WQ sports a 144 Hz IPS panel that supports the adaptive synchronization technologies of both AMD and NVIDIA graphics cards. The adaptive synchronization range is 48–144 Hz, so that's the framerate range your PC should be able to hit at 3440x1440 to experience buttery smooth, tear-free gameplay. The monitor is AMD FreeSync Premium certified, which means it supports Low Framerate Compensation (LFC). If a game runs at fewer frames per second than the bottom of the FreeSync operating range, 48 FPS in this case, LFC displays each frame multiple times to keep the panel's refresh rate above the lower limit and maintain the full fluidity of the action. This "multiplication of frames" is completely invisible to the human eye, which makes the lower limit of the FreeSync range effectively a non-issue. Of course, for the best possible gaming experience, a high framerate remains something to strive for.
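To illustrate the frame-multiplication idea, here is a minimal sketch of LFC-style logic, assuming a 48–144 Hz VRR window like the M34WQ's. The actual LFC implementation lives in the GPU driver; this only shows the principle.

```python
# Hypothetical sketch of Low Framerate Compensation (LFC) logic.
# Assumes a 48-144 Hz VRR window; real LFC runs in the GPU driver.

VRR_MIN, VRR_MAX = 48, 144

def lfc_refresh(fps: float) -> tuple[int, float]:
    """Return (frame multiplier, effective panel refresh rate).

    When fps falls below the VRR floor, each frame is shown multiple
    times so the panel's refresh rate stays inside the VRR window.
    """
    multiplier = 1
    while fps * multiplier < VRR_MIN:
        multiplier += 1
    # Clamp to the panel's maximum refresh rate.
    return multiplier, min(fps * multiplier, VRR_MAX)

print(lfc_refresh(30))  # 30 FPS -> each frame shown twice, panel at 60 Hz
print(lfc_refresh(20))  # 20 FPS -> each frame shown three times, panel at 60 Hz
```

At 30 FPS the driver doubles every frame so the panel refreshes at 60 Hz, comfortably inside the VRR window, which is why framerates below 48 FPS still look tear-free.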

Response Time and Overdrive

The Gigabyte M34WQ has a specified 1 ms MPRT response time. The panel uses overdrive technology to speed up pixel transitions; you will find the option under "Overdrive" in the OSD (Main Menu > Gaming > Overdrive). Overdrive has five settings: Off, Smart OD, Picture Quality, Balance, and Speed. "Smart OD" is an interesting addition to the usual overdrive settings: it promises to adjust the overdrive level to the current framerate for the lowest possible blur, regardless of the framerate you're playing at.



I extensively tested all of them using the so-called pursuit camera method developed by the good people of Blur Busters, namely Mark D. Rejhon. The idea of the procedure is to use a standard DSLR camera to capture motion blur exactly as your eyes see it. That's achieved by mounting the camera on a smooth slider, setting the camera exposure to the length of four monitor refresh cycles, and loading Blur Busters' Ghosting Test Pattern with the Pursuit Camera Sync Track. The camera then has to be slid sideways at the same speed as the on-screen motion. The sync track tells you if you're moving the camera too fast or too slow, or if it shakes too much. The procedure takes some practice and getting used to, but yields great results and lets us examine the differences between various overdrive settings at various refresh rates.
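The exposure setting follows directly from the refresh rate. A quick sketch of the arithmetic, with the four-cycle exposure described above:

```python
# Exposure arithmetic for the pursuit-camera method: the camera exposure
# is set to the length of four monitor refresh cycles.

def pursuit_exposure_ms(refresh_hz: float, cycles: int = 4) -> float:
    """Exposure time in milliseconds for a given refresh rate."""
    return cycles * 1000.0 / refresh_hz

for hz in (60, 100, 120, 144):
    print(f"{hz} Hz -> {pursuit_exposure_ms(hz):.2f} ms exposure")
```

At 144 Hz one refresh cycle lasts about 6.94 ms, so the exposure works out to roughly 27.8 ms; at 60 Hz it is about 66.7 ms.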

I made a series of photos at 60, 100, 120, and 144 Hz with all five available overdrive settings. Let's look at the results and figure out what the ideal overdrive setting would be. Click on the image to examine it in high resolution.



The photos tell the whole story—setting overdrive to "Picture Quality" results in the clearest reproduction of moving objects. The Smart OD option seemed to keep overdrive at Balance or Speed depending on the current refresh rate of the panel. I advise against using the Smart OD option as you're better off just manually setting overdrive to the aforementioned Picture Quality option.

Moving Picture Response Time (MPRT)

The MPRT toggle is found in the OSD under the name "Aim Stabilizer Sync." If you turn it on, the backlight starts strobing to achieve a "1 millisecond-like" response time at the expense of picture brightness, and with other strobing-related issues, such as flickering and strobe crosstalk. Unlike older or lesser MPRT implementations, Aim Stabilizer Sync can be used together with adaptive synchronization.

The "1 ms MPRT" response time is not to be confused with 1 ms GtG response time, as the commonly used GtG value tells us how much time it takes for a pixel to change between two colors, while MPRT, also known as display persistence, represents how long a pixel is continuously visible. It's important to know that MPRT isn't a blur reduction technology, but a measurement that can be lowered by backlight strobing. Here's a comparison of moving object sharpness with Aim Stabilizer Sync off and on. In the comparison, I'm using the Picture Quality setting for overdrive.
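The distinction between the two numbers can be made concrete with a rough model. On a sample-and-hold panel without strobing, a pixel stays lit for the whole refresh cycle, so persistence roughly equals the frame time; strobing shortens the visible pulse rather than speeding up pixel transitions. The duty-cycle figure below is a hypothetical value chosen for illustration, not a Gigabyte specification:

```python
# Rough model of MPRT (display persistence) vs. GtG. Sample-and-hold
# persistence is approximately the frame time; backlight strobing scales
# it down by the duty cycle of the strobe pulse.

def mprt_ms(refresh_hz: float, strobe_duty: float = 1.0) -> float:
    """Approximate persistence: frame time scaled by backlight duty cycle.

    strobe_duty = 1.0 means no strobing (backlight always on); the 0.144
    duty cycle below is a made-up value that yields a '1 ms MPRT' figure.
    """
    return (1000.0 / refresh_hz) * strobe_duty

print(mprt_ms(144))         # ~6.94 ms persistence without strobing
print(mprt_ms(144, 0.144))  # ~1.00 ms with a ~14% strobe duty cycle
```

This is why a panel can advertise "1 ms MPRT" while its GtG transitions are considerably slower: the two figures measure different things.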



Activating Aim Stabilizer Sync lowers the maximum brightness of the panel to 62 cd/m², rendering it unusable for most users and in most scenarios. While moving images do look even sharper compared to just using overdrive, the heavily restricted brightness means Aim Stabilizer Sync is best kept deactivated.

Input Lag


To measure the input lag of a monitor, I'm using the NVIDIA LDAT v2, which I've covered extensively in my NVIDIA Reflex review.

The Latency Display Analysis Tool (LDAT) is a small device that is strapped onto a monitor; it measures brightness changes and provides end-to-end system latency data—the so-called mouse to photon latency. Using it for reliable and repeatable monitor input lag testing was made possible by NVIDIA's inclusion of the flash indicator feature in some Reflex-supported games. The flash indicator is essentially a white box displayed at the left edge of the screen right when a mouse click "arrives" on screen. I simply place the LDAT sensor on the part of the screen where the flash indicator will appear and click the mouse button on the sensor itself. The sensor will then detect the sudden change in brightness and calculate how much time passed between the mouse button click and flash appearing on the screen—that's the aforementioned end-to-end system latency. While this method doesn't let me isolate the input lag of the monitor as the only measured value—the high-speed camera method didn't either—if the rest of my system remains unchanged, tested monitors can be compared to each other.

One other excellent characteristic of the LDAT method is the Auto Fire feature of the LDAT software. The Auto Fire feature allows me to select a number of "shots" (test iterations), as well as the "shot delay" (a delay between mouse clicks). All I have to do is run the desired game, align the LDAT sensor to the flash indicator area, and press the "mouse" button on the LDAT. The sensor and accompanying software take care of everything else. Less than a minute later, I have my test results—the minimum, average, and maximum measured end-to-end system latency and standard deviation.
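The summary the LDAT software reports after an Auto Fire run can be sketched in a few lines. The sample values below are made up for illustration and are not measurements from this review:

```python
# Sketch of the per-run summary the LDAT software produces: minimum,
# average, maximum, and standard deviation over the measured end-to-end
# latencies. Sample values are hypothetical.
import statistics

def summarize(samples_ms: list[float]) -> dict[str, float]:
    return {
        "min": min(samples_ms),
        "avg": statistics.mean(samples_ms),
        "max": max(samples_ms),
        "stdev": statistics.stdev(samples_ms),
    }

samples = [9.8, 10.1, 10.4, 10.6, 11.1]  # hypothetical latencies in ms
print(summarize(samples))
```

In practice the run collects 100 samples rather than five, which is what makes the average and standard deviation figures trustworthy.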

The game I'm using for monitor input lag testing is Overwatch, for several reasons. It has an integrated 400 FPS framerate limit, which my test system has no trouble hitting and maintaining at low settings at any given resolution. By keeping the framerate at 400 FPS, I'm maintaining a low and constant GPU render latency of around 2 ms. Overwatch also has an easily accessible practice range, which allows me to do all of my tests in a controlled environment. Finally, its game engine detects inputs even when a gun is reloading, meaning I don't have to worry about character selection or ammo magazine size—once I trigger the Auto Fire feature, the test will run for 100 iterations regardless of what's happening in-game. My tests are conducted with the NVIDIA Reflex technology turned off (at 400 FPS, it wouldn't make any difference to system latency anyway), with adaptive refresh rate technologies (G-Sync, FreeSync, VRR) deactivated and V-Sync off.

Monitor Input Lag Test System
CPU: Intel Core i9-9900K
GPU: Palit GeForce RTX 2080 Super GP
Motherboard: ASUS ROG Maximus XI Formula
RAM: ADATA Spectrix D60G DDR4 32 GB
SSD: Kioxia Exceria 500 GB NVMe

In the end, we get the so-called button-to-pixel lag—the time that passes between the action with your mouse and said action first registering on screen. Anything below 16 ms, which is one frame of lag at 60 Hz, can be considered gaming-grade, and such a monitor is suitable for even the most demanding gamers and esports professionals. If the input lag falls between 16 and 32 ms (one to two frames of lag at 60 Hz), the monitor is suitable for almost everyone but the most hardcore gamers, especially those playing first-person shooters at a professional level. Finally, if a monitor's input lag is higher than 32 ms, over two frames of lag at 60 Hz, even casual gamers should notice it. Will they be bothered by it? Not necessarily, but I can't recommend a screen like that for serious gaming.
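The rule of thumb above can be expressed as a small helper. The thresholds are the review's own, in frames of lag at 60 Hz (one frame is about 16.7 ms; the text rounds to 16 ms):

```python
# The review's button-to-pixel lag rule of thumb as a classifier.
# Thresholds are frames of lag at 60 Hz, rounded as in the text.

def rate_input_lag(lag_ms: float) -> str:
    if lag_ms < 16:
        return "gaming-grade (under 1 frame at 60 Hz)"
    if lag_ms <= 32:
        return "fine for all but hardcore esports play (1-2 frames)"
    return "noticeable even to casual gamers (over 2 frames)"

print(rate_input_lag(10.4))  # classifies the M34WQ's measured average
```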

Here's how the Gigabyte M34WQ holds up in terms of input lag.



After doing 100 iterations of the LDAT-powered input lag test, the Gigabyte M34WQ shows an average input lag of 10.4 milliseconds, making it a viable choice for even hardcore gaming.