Several weeks of hardcore gaming on the AOC AGON AG271QG confirmed what I already knew - if you have a high-end graphics card (a GTX 1080 in my case), your gaming experience will be completely transformed for the better by a 27" IPS G-Sync monitor with a refresh rate of up to 165 Hz. Assuming you can reach 100 or more FPS in a game of your choice, you'll have a hard time believing it can run with such buttery smoothness, and it won't take you long to realize that all these years you spent with a 60 Hz monitor, you were doing it wrong. The effective G-Sync range is 35-165 Hz (or 35-165 FPS, if you want to look at it that way), which means perfect synchronization of the in-game framerate and the monitor's refresh rate is pretty much ensured regardless of the hardware you're running. Let's not forget the improved mouse accuracy, either - yet another aspect bound to stun anyone who switches from a 60 Hz to a 144/165 Hz monitor.
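To put that range in perspective, here's a minimal Python sketch of how a G-Sync monitor with a 35-165 Hz window responds to different framerates. This is just an illustration of the publicly documented behavior, not NVIDIA's actual logic:

```python
# Illustrative sketch (not NVIDIA code): how a 35-165 Hz G-Sync window
# maps an in-game framerate to the panel's refresh behavior.

GSYNC_MIN_HZ = 35
GSYNC_MAX_HZ = 165

def refresh_behavior(fps: float) -> str:
    """Return a rough description of what the panel does at a given FPS."""
    if fps > GSYNC_MAX_HZ:
        return f"capped at {GSYNC_MAX_HZ} Hz (or tearing, if V-Sync is off)"
    if fps >= GSYNC_MIN_HZ:
        frame_time_ms = 1000.0 / fps
        return f"refresh locked to {fps:.0f} Hz ({frame_time_ms:.2f} ms per frame)"
    # Below the window, the G-Sync module typically repeats frames to stay in range.
    return f"below {GSYNC_MIN_HZ} FPS: frames are doubled to stay inside the window"

for fps in (30, 60, 100, 144, 165, 200):
    print(f"{fps:>3} FPS -> {refresh_behavior(fps)}")
```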
Response Time & Overdrive
According to AOC's specifications, the AGON AG271QG has a 4 ms GtG response time. The panel uses overdrive to speed up pixel transitions; you'll find the option under "Overdrive" in the OSD, and it has a grand total of six settings - Off, Weak, Light, Normal, Medium, and Strong.
I extensively tested all of them using the pursuit camera method developed by Mark D. Rejhon of Blur Busters. The idea of the procedure is to use a standard DSLR camera to capture motion blur exactly as your eyes would see it. That's achieved by mounting the camera on a smooth slider, setting the exposure to four times the length of the monitor's refresh cycle, and loading the Ghosting Test Pattern with the Pursuit Camera Sync Track. The camera is then slid sideways at the same speed as the on-screen motion; the sync track tells you whether you're moving the camera too fast or too slow, or if it shakes too much. The procedure takes some practice and getting used to, but it yields great results and lets us examine the differences between the various overdrive settings at various refresh rates.
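As a quick illustration of the math involved, here's a small Python sketch that computes the required camera exposure for each refresh rate I tested (exposure = four refresh cycles, per the method described above):

```python
# The exposure math behind the pursuit camera method: the shutter stays
# open for four refresh cycles, so the photo integrates motion blur
# across exactly four frames, much like an eye tracking on-screen motion.

REFRESH_RATES_HZ = [60, 100, 120, 144, 165]
CYCLES_PER_EXPOSURE = 4

for hz in REFRESH_RATES_HZ:
    cycle_ms = 1000.0 / hz                  # length of one refresh cycle
    exposure_ms = CYCLES_PER_EXPOSURE * cycle_ms
    print(f"{hz:>3} Hz: refresh cycle {cycle_ms:5.2f} ms -> exposure {exposure_ms:5.2f} ms"
          f" (~1/{round(1000 / exposure_ms)} s shutter)")
```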
I took a series of photos at refresh rates of 60, 100, 120, 144, and 165 Hz, at every available overdrive setting. Let's take a look at the results and figure out the ideal overdrive setting (click on the picture to see it in full resolution):
A couple of things become clear after examining the pursuit camera photos. First and foremost, you shouldn't bother setting Overdrive to either "Off" or "Strong" - there's too much ghosting with it turned off, and significant overshoot is easy to spot when it's set to "Strong". Among the remaining options, it comes down to "Light" or "Medium". I'd give a slight edge to "Medium" simply because it results in the sharpest moving visuals at higher refresh rates.
Input Lag
Since my previous monitor review, we've done a bit of shopping to substantially improve our input-lag testing procedure. We now consider it refined enough to offer exact numbers instead of just a rough categorization of a monitor's input-lag performance. However, to interpret those numbers properly, you need to be familiar with my testing methodology.
I start by connecting a modified gaming mouse - the Logitech G9x - to my PC. The mouse has a blue LED wired directly to its primary button, and the LED lights up the instant that button is pressed. The USB sample rate is set to 1,000 Hz via the Logitech Gaming Software. I then mount a Nikon 1 J5, a mirrorless camera capable of recording video at 1,200 FPS, in front of the monitor. After that, I run Counter-Strike: Global Offensive and load a custom map (Map_Flood, made by a member of the Blur Busters community) consisting of nothing but a huge white square suspended in a black void. The camera is set up so that it records the entire screen.
Every video setting in CS:GO is either set to its lowest possible value or turned off, and the console command "fps_max 0" is used to disable the built-in FPS limiter and get as many frames per second as possible. The purpose of this is to remove the input lag caused by the game engine from the equation. My system is equipped with an overclocked Core i7-6700K and a GTX 1080 Ti to make sure it has no trouble hitting 2,000 FPS in that scenario. Vertical Sync and G-Sync are also turned off because we don't want anything delaying the drawing of frames - the goal is to have the first frame reach the screen as fast as the monitor itself allows, rather than being delayed by various syncing methods. You're probably wondering how much additional input lag G-Sync introduces when enabled, which is undoubtedly how every owner of a G-Sync monitor will use it - that's why you're buying it in the first place. I extensively tested two different 165 Hz G-Sync monitors with G-Sync on and off and found that it introduces an additional 2 ms of input lag on average.
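To illustrate why uncapping the framerate takes the engine out of the equation, here's a rough back-of-the-envelope sketch in Python. The upper bound is an approximation, not a full engine latency model:

```python
# Rough illustration: at 2,000 FPS a new frame is generated every 0.5 ms,
# so the game engine can add at most ~0.5 ms of waiting between the click
# and the next rendered frame - negligible next to the monitor's own lag.

def worst_case_engine_wait_ms(fps: float) -> float:
    """Upper bound on how long a click can wait for the next engine frame."""
    return 1000.0 / fps

for fps in (60, 144, 300, 2000):
    print(f"{fps:>4} FPS -> up to {worst_case_engine_wait_ms(fps):.2f} ms of engine-side wait")
```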
The test is conducted by starting the video recording and pressing the left mouse button, which is bound to the in-game command "Strafe Left"; the LED lights up and an in-game movement occurs. I repeat this twenty times and then open the recorded videos in QuickTime, which conveniently lets you step through a video frame by frame. I find the frame where the LED first turns on and then carefully look for the frame where the first glimpse of on-screen movement can be seen. The number of frames between those two events is multiplied by 0.8333 because I'm recording at 1,200 FPS (1 frame = 0.8333 ms). To get the final result, I subtract 5 ms, the average click latency of the Logitech G9x (it measures between 4 and 6 ms). A couple of other factors slightly influence the score, such as the LED reaction time (1 ms or less), camera lag (1 ms), and USB polling rate (1 ms), but those aren't constant, so I don't subtract them from the final result. That's also one of the reasons I take as many as twenty measurements - the impact of those error margins is reduced with each new sample.
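For clarity, here's the same calculation as a small Python snippet; the frame counts in the loop are hypothetical examples, not my actual measurements:

```python
# A direct transcription of the calculation described above: count video
# frames between the LED lighting up and the first on-screen movement,
# convert at 1,200 FPS (one frame = 0.8333 ms), then subtract the mouse's
# average 5 ms click latency.

VIDEO_FRAME_MS = 1000.0 / 1200.0   # ~0.8333 ms per captured frame
MOUSE_CLICK_LATENCY_MS = 5.0       # average for the Logitech G9x (4-6 ms)

def button_to_pixel_ms(frames_counted: int) -> float:
    return frames_counted * VIDEO_FRAME_MS - MOUSE_CLICK_LATENCY_MS

# Hypothetical frame counts from a few recordings:
for frames in (13, 15, 17):
    print(f"{frames} frames -> {button_to_pixel_ms(frames):.2f} ms button-to-pixel lag")
```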
In the end, we get the so-called button-to-pixel lag value - the time from the moment you perform an action with your mouse to the moment it first registers on the screen. Anything below 16 ms (one frame of lag at 60 Hz) can be considered gaming-grade, and such a monitor is suitable even for the most demanding gamers and eSports professionals. If input lag falls between 16 and 32 ms (1-2 frames of lag at 60 Hz), the monitor is suitable for almost everyone but the most hardcore gamers, especially those playing first-person shooters at a professional level. Finally, if a monitor's input lag is higher than 32 ms (over 2 frames of lag at 60 Hz), even casual gamers should be able to notice it. Will they be bothered by it? Not necessarily, but I can't recommend a screen like that for serious gaming.
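Expressed as code, the categorization looks like this (a simple sketch using the rounded thresholds from above):

```python
# The same categorization as code; the thresholds are one and two frames
# of lag at 60 Hz (16.67 ms, rounded to 16 and 32 ms in the text above).

def rate_input_lag(lag_ms: float) -> str:
    if lag_ms < 16:
        return "gaming-grade: fine even for eSports professionals"
    if lag_ms <= 32:
        return "fine for almost everyone but hardcore FPS players"
    return "noticeable even to casual gamers; not for serious gaming"

for lag in (5.83, 10, 24, 40):
    print(f"{lag:>5} ms -> {rate_input_lag(lag)}")
```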
Here's how the AOC AGON AG271QG holds up in that regard:
As we can see by looking at the numbers, the AGON AG271QG offers superb gaming performance, with the minimum measured input lag being as low as 5.83 ms and the maximum almost never going over 10 ms.
ULMB
ULMB (Ultra Low Motion Blur) is a technology enabled by NVIDIA's G-Sync module, but one that cannot be combined with G-Sync itself, as it works only at a fixed refresh rate - it's either G-Sync or ULMB. It tries to eliminate motion blur by having the backlight act as a strobe, although, if not implemented properly, it can introduce visible flicker. It also comes with a few other caveats. To use ULMB, you first have to disable G-Sync and lower the refresh rate to 120 Hz; only then can it be turned on in the OSD. Once you do, you'll notice that the brightness of the screen takes a big hit, and that's not something you can change or fix.
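The brightness hit is inherent to strobing: the backlight is only on for a fraction of each refresh cycle, so perceived brightness roughly scales with that duty cycle. Here's a quick Python sketch - the pulse width and steady-backlight brightness are made-up illustrative values, not measurements of this monitor:

```python
# Why strobing costs brightness: perceived brightness scales (roughly)
# with the fraction of each refresh cycle the backlight is actually on.
# Both values below are hypothetical, for illustration only.

REFRESH_HZ = 120                 # ULMB on this monitor runs at a fixed 120 Hz
CYCLE_MS = 1000.0 / REFRESH_HZ   # ~8.33 ms per refresh cycle
PULSE_MS = 2.0                   # hypothetical strobe pulse length

duty_cycle = PULSE_MS / CYCLE_MS
full_brightness_nits = 350       # hypothetical steady-backlight brightness

print(f"duty cycle: {duty_cycle:.0%}")
print(f"perceived brightness: ~{full_brightness_nits * duty_cycle:.0f} nits")
```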
In exact numbers: if you stick with the factory settings, the maximum brightness you can achieve with ULMB turned on is 126 nits. That's fine for nighttime gaming, but during the day, you'll want a brighter picture. If you set the monitor up for calibration and use an ICC profile, the maximum ULMB brightness drops to 78 nits - way too low for anything. For comparison, a calibrated Acer Predator XB271HU offers 139 nits of brightness with ULMB activated.
Even though there's no significant strobe crosstalk visible with ULMB on, the motion blur reduction isn't perfect either, as can be seen from these pursuit camera photos (click on the picture to see it in full resolution):
Overall, I suggest you simply forget about ULMB and stick with G-Sync. That way, you get a much brighter, more vivid picture, more smoothness, and a higher refresh rate, without a significant increase in perceived motion blur. With that in mind, I can't think of a single reason to use ULMB on this monitor.