
Cooler Master GM34-CWQ ARGB Review - Quantum Dots For the Win


Gaming Performance

The Cooler Master GM34-CWQ ARGB sports a 144 Hz VA panel that supports adaptive synchronization with both AMD and NVIDIA graphics cards. The adaptive synchronization range is 48–144 Hz, so that's the framerate range your PC should be able to achieve at 3440x1440 to experience buttery smooth, tear-free gameplay. The monitor is AMD FreeSync Premium certified, so it supports Low Framerate Compensation (LFC). If a game runs at fewer frames per second than the bottom limit of the FreeSync operating range (48 FPS in this case), LFC displays each frame multiple times to stay above that lower limit and maintain the full fluidity of the action. This "multiplication of frames" is completely invisible to the human eye. Thanks to this approach, the bottom limit of the FreeSync range becomes effectively irrelevant, though a high framerate obviously remains something to strive for if you want the best possible gaming experience.
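To illustrate the principle, here's a minimal sketch of the LFC frame-multiplication logic in Python. It's not AMD's actual algorithm, just an illustration of how an integer multiplier pushes a low framerate back into the 48–144 Hz window:

# Minimal sketch of the Low Framerate Compensation (LFC) idea: when the
# framerate drops below the VRR window, each frame is shown an integer
# number of times so the effective refresh rate lands back inside the
# 48-144 Hz range. An illustration, not AMD's actual algorithm.

VRR_MIN = 48   # lower bound of the FreeSync range (Hz)
VRR_MAX = 144  # upper bound of the FreeSync range (Hz)

def lfc_multiplier(fps: float) -> int:
    """Return how many times each frame is displayed."""
    if fps >= VRR_MIN:
        return 1  # inside the VRR window, no compensation needed
    multiplier = 1
    while fps * (multiplier + 1) <= VRR_MAX:
        multiplier += 1
        if fps * multiplier >= VRR_MIN:
            break
    return multiplier

for fps in (30, 40, 50, 100):
    m = lfc_multiplier(fps)
    print(f"{fps} FPS -> each frame shown {m}x -> panel runs at {fps * m} Hz")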

Response Time and Overdrive

The response time of the Cooler Master GM34-CWQ ARGB is specified as 0.5 ms MPRT, without the more common GtG value being stated. MPRT mode can be activated in the OSD, where it's called Motion Clearness. We'll examine it in greater detail in a moment.

The panel uses overdrive technology to make pixel transitions faster; you'll find the option under Setup Menu > Over Drive in the OSD. Overdrive has a total of five settings: Off, Normal, Advanced, Ultra Fast, and Dynamic. The latter is the most interesting one, and also the default. In Dynamic mode, the monitor actively switches between the Normal, Advanced, and Ultra Fast settings based on the current framerate. This isn't the first monitor I've tested with such a feature, but I've never run into an implementation that beats simply selecting a single overdrive setting manually, so it will be interesting to see if the Dynamic mode on the Cooler Master GM34-CWQ ARGB fares any better.



I extensively tested all available overdrive modes using the pursuit camera method developed by the good people of Blur Busters, namely Mark D. Rejhon. The idea of the procedure is to use a standard DSLR camera to capture motion blur exactly as your eyes see it, which is achieved by mounting the camera on a smooth slider, setting the camera exposure to four times the monitor's refresh cycle, and loading the Blur Busters Ghosting Test Pattern with the Pursuit Camera Sync Track. The camera then has to be slid sideways at the same speed as the on-screen motion. The sync track is there to tell you if you're moving the camera too fast or too slow, or if it shakes too much. The procedure takes some practice, but yields great results and lets us examine the differences between the various overdrive settings at various refresh rates.
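As a sanity check for the camera setup, the required exposure is simply four refresh periods. A quick helper (my own illustration, not part of the Blur Busters toolkit) makes the numbers concrete:

# Exposure time for the pursuit camera method: four refresh cycles.
# This helper is my own illustration, not Blur Busters tooling.
def pursuit_exposure_ms(refresh_hz: float, cycles: int = 4) -> float:
    return cycles * 1000.0 / refresh_hz

for hz in (60, 100, 120, 144):
    print(f"{hz} Hz -> exposure of ~{pursuit_exposure_ms(hz):.1f} ms")
# 60 Hz needs ~66.7 ms, 144 Hz needs ~27.8 ms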

I took a series of photos at 60, 100, 120, and 144 Hz with all available overdrive settings. Let's look at the results and figure out the ideal overdrive setting.


The photo samples tell the whole story. The best overdrive setting is Advanced, unless your framerate constantly hovers around 60 FPS when gaming, in which case you'll be happier with Normal. The Dynamic setting, which tries to adjust the overdrive in relation to the current framerate, should be avoided. It chooses the Normal overdrive setting at lower framerates and correctly switches to Advanced past 100 FPS, but it also opts for Ultra Fast at 144 Hz, which results in a ton of overshoot and makes moving objects look awful. Even scrolling through a webpage or a document looks bad with Dynamic selected, because the desktop refresh rate is locked to 144 Hz, so the system enforces the horrendous Ultra Fast setting. In a future firmware update, Cooler Master should either make the Dynamic setting stick to Advanced past 100 Hz/FPS or make Advanced the default.
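For clarity, the Dynamic mode's observed behavior, and the fix I'm proposing, can be expressed in a few lines of Python. The thresholds are inferred from my pursuit-camera photos, not taken from any Cooler Master documentation:

def dynamic_overdrive(fps: float) -> str:
    # Reconstruction of the observed Dynamic behavior on the GM34-CWQ ARGB
    if fps >= 144:
        return "Ultra Fast"  # observed at 144 Hz; causes heavy overshoot
    if fps > 100:
        return "Advanced"
    return "Normal"

def fixed_dynamic_overdrive(fps: float) -> str:
    # The suggested firmware fix: never go past Advanced
    return "Advanced" if fps > 100 else "Normal"

for fps in (60, 110, 144):
    print(fps, dynamic_overdrive(fps), "->", fixed_dynamic_overdrive(fps))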

Moving Picture Response Time (MPRT)

In the OSD, the MPRT toggle is hidden under the "Motion Clearness" name. If you turn it on, the backlight starts strobing to achieve a "0.5 millisecond-like" response time at the expense of picture brightness, along with other strobing-related issues such as flickering and strobe crosstalk. The Motion Clearness technology can be used together with adaptive synchronization at refresh rates of up to 144 Hz, which isn't the case with older or less capable MPRT implementations.

"0.5 ms MPRT" response time is not to be confused with 0.5 ms GtG response time, as the commonly used GtG value tells us how much time it takes for a pixel to change between two colors, while MPRT, also known as display persistence, represents how long a pixel is continuously visible. It's important to know that MPRT isn't a blur reduction technology, but a measurement that can be lowered by backlight strobing. Here's a comparison of moving object sharpness with Motion Clearness set to Low, Medium, and High.



Activating Motion Clearness lowers the brightness of the panel and locks the brightness control entirely. When set to Low, the panel brightness drops to 183 cd/m². Setting it to Medium lowers the brightness to a mere 94 cd/m², and setting it to High results in only 47 cd/m² of actual screen brightness. In other words, with Motion Clearness set to Medium or High, the monitor is more or less unusable. To be honest, I wouldn't bother using it at Low either, because 183 cd/m² isn't particularly bright, and there's still some apparent strobe crosstalk present.

Input Lag


To measure the input lag of a monitor, I'm using the NVIDIA LDAT v2, which I've covered extensively in my NVIDIA Reflex review.

The LDAT (Latency Display Analysis Tool) is a small device that is strapped onto a monitor, measures brightness changes, and provides us with end-to-end system latency data—the so-called mouse-to-photon latency. Using it for reliable and repeatable monitor input lag testing was made possible by NVIDIA's inclusion of the flash indicator feature in some Reflex-supported games. The flash indicator is essentially a white box displayed at the left edge of the screen the moment a mouse click "arrives" on the screen. I simply place the LDAT sensor on the part of the screen where the flash indicator will appear and click the mouse button on the sensor itself. The sensor then detects the sudden change in brightness and calculates how much time passed between the mouse button click and the flash appearing on the screen—that's the aforementioned end-to-end system latency. While this method doesn't let me isolate the input lag of the monitor as the only measured value (the high-speed camera method didn't do that either), as long as the rest of my system remains unchanged, the tested monitors can be compared to each other.

Another excellent characteristic of the LDAT method is the Auto Fire feature of the LDAT software. It allows me to select a number of "shots" (test iterations), as well as the "shot delay" (a delay between mouse clicks). All I have to do is run the desired game, align the LDAT sensor with the flash indicator area, and press the "mouse" button on the LDAT. The sensor and accompanying software take care of everything else. Less than a minute later, I have my test results—the minimum, average, and maximum measured end-to-end system latency, as well as the standard deviation.
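The reported statistics are the standard ones; computing them from a batch of latency samples boils down to this (a sketch of the math, not NVIDIA's actual code):

import statistics

def summarize(latencies_ms: list[float]) -> dict[str, float]:
    # Min, average, max, and standard deviation of a test run
    return {
        "min": min(latencies_ms),
        "avg": statistics.mean(latencies_ms),
        "max": max(latencies_ms),
        "stdev": statistics.stdev(latencies_ms),
    }

# Hypothetical samples, not actual measurements from this review:
print(summarize([9.8, 10.7, 11.2, 10.4, 12.1]))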

The game I'm using for monitor input lag testing is Overwatch, for several reasons. It has an integrated 400 FPS framerate limit, which my test system has no trouble hitting and maintaining at low settings at any given resolution. By keeping the framerate at 400 FPS, I'm maintaining a low and constant GPU render latency (around 2 ms). Overwatch also has an easily accessible practice range, which allows me to do all of my tests in a controlled environment. Finally, its game engine detects inputs even when a gun is reloading, so I don't have to worry about character selection or magazine size—once I trigger the Auto Fire feature, the test runs for 100 iterations regardless of what's happening in the game. My tests are conducted with NVIDIA Reflex turned off (at 400 FPS, it wouldn't make any difference to system latency anyway), with adaptive refresh rate technologies (G-Sync, FreeSync, VRR) deactivated, and with V-Sync off.

Monitor Input Lag Test System
CPU: Intel Core i9-9900K
GPU: Palit GeForce RTX 2080 Super GP
Motherboard: ASUS ROG Maximus XI Formula
RAM: ADATA Spectrix D60G DDR4 32 GB
SSD: Kioxia Exceria 500 GB NVMe

In the end, we get the so-called button-to-pixel lag—the time that passes between an action with the mouse and said action first registering on screen. Anything below 16 ms (roughly one frame of lag at 60 Hz) can be considered gaming-grade; such a monitor is suitable even for the most demanding gamers and esports professionals. If the input lag falls between 16 and 32 ms (one to two frames of lag at 60 Hz), the monitor is suitable for almost everyone but the most hardcore gamers, especially those playing first-person shooters at a professional level. Finally, if a monitor's input lag is higher than 32 ms (over two frames of lag at 60 Hz), even casual gamers should be able to notice it. Will they be bothered by it? Not necessarily, but I can't recommend a screen like that for serious gaming.
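Expressed as a quick classifier, with one frame at 60 Hz being 1000 / 60 ≈ 16.7 ms (the tiers are this review's convention, not an industry standard):

FRAME_60HZ_MS = 1000 / 60  # one frame of lag at 60 Hz, ~16.7 ms

def rate_input_lag(lag_ms: float) -> str:
    # Tiers follow this review's convention, not an industry standard
    frames = lag_ms / FRAME_60HZ_MS
    if frames < 1:
        return "gaming-grade, fine even for esports professionals"
    if frames < 2:
        return "fine for almost everyone but hardcore FPS players"
    return "noticeable even to casual gamers"

print(rate_input_lag(10.7))  # the GM34-CWQ ARGB's measured average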

Here's how the Cooler Master GM34-CWQ ARGB holds up in terms of input lag.



After doing 100 iterations of the LDAT-powered input lag test, the Cooler Master GM34-CWQ ARGB shows an average input lag of 10.7 milliseconds, making it a viable choice even for hardcore gaming.