At the heart of NVIDIA's Reflex technology lies Reflex Low Latency Mode. In games that support it, you'll usually find it in the video settings, labeled "NVIDIA Reflex" or "NVIDIA Reflex Low Latency." Reflex Low Latency Mode is what most users mean when they say "NVIDIA Reflex," even though Reflex itself is in fact a wider collection of latency-reducing technologies. NVIDIA Reflex Low Latency Mode (NVIDIA Reflex, for the sake of simplicity) comes with three modes of operation: Off, On, or On + Boost.
When on, it behaves like a driver-driven framerate limiter. It lowers your framerate by a tiny, barely noticeable amount while at the same time trimming overall system latency by up to 50%, especially if you find yourself in a GPU-bound scenario. This high level of communication between the game engine and the graphics driver is achieved through game integration, which is why NVIDIA provides game developers with the Reflex SDK. To paraphrase NVIDIA, in GPU-bound cases, the Reflex SDK enables the CPU to start submitting rendering work to the GPU just before the GPU finishes the prior frame, significantly reducing and often eliminating the render queue. This allows for better response times. Reflex also reduces back pressure on the CPU, enabling games to sample mouse input at the last possible moment. While similar to the Ultra Low Latency mode in the driver, it's superior because CPU render submission is controlled directly from within the game engine.
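To make the render-queue effect concrete, here's a toy Python simulation of my own (not NVIDIA's implementation) of a pipeline where the CPU submits frames and the GPU renders them. When the GPU is the bottleneck, submitted frames pile up waiting for the GPU, and input-to-display latency balloons far beyond a single frame time; when the CPU is the bottleneck, latency stays at roughly one GPU frame time.

```python
# Toy model: CPU submits a frame every cpu_frame_ms; the GPU takes
# gpu_frame_ms to render each one. The queue here is unbounded to
# exaggerate the effect (real drivers cap it at a few frames).

def simulate(cpu_frame_ms, gpu_frame_ms, frames=100):
    """Average submit-to-display latency (ms) over a run of frames."""
    gpu_free_at = 0.0          # time the GPU finishes its current frame
    latencies = []
    for i in range(frames):
        submit_time = i * cpu_frame_ms          # CPU submits frame i
        start = max(submit_time, gpu_free_at)   # frame waits if GPU is busy
        finish = start + gpu_frame_ms           # frame reaches the display
        gpu_free_at = finish
        latencies.append(finish - submit_time)  # queue delay + render time
    return sum(latencies) / len(latencies)

# CPU-bound: the GPU keeps up, so latency is one GPU frame time
print(simulate(cpu_frame_ms=10.0, gpu_frame_ms=5.0))   # 5.0 ms
# GPU-bound: the queue grows and average latency balloons
print(simulate(cpu_frame_ms=5.0, gpu_frame_ms=10.0))
```

This is exactly the backlog Reflex attacks: by delaying CPU submission until the GPU is almost ready, frames never sit in a queue in the first place.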
The other option in the video settings of every Reflex-supported game is to set NVIDIA Reflex to On + Boost. In this mode, the Reflex SDK boosts GPU clocks and allows frames to be submitted to the display slightly sooner in some heavily CPU-bound cases. This is similar to the "Prefer Maximum Performance" mode available in the NVIDIA Control Panel and comes at the expense of a slightly increased power draw.
In case you're wondering how all of this differs from a good old in-game framerate limiter, you're thinking in the right direction. A framerate limiter indeed does a similar job to NVIDIA Reflex in that, when set properly, it prevents GPU-bound scenarios, which reduces render queues and keeps system latency from skyrocketing. That's the thing, though: a traditional framerate limiter has to be adjusted manually, and that's a tedious process with many pitfalls because it's very hard to come up with an optimal value. Your average framerate and framerate stability can vary between two levels of a game. They can even oscillate within the same level (open world vs. close quarters, for example). NVIDIA Reflex completely removes any user interaction from this complex equation. You turn it on and that's that: the game engine talks to the graphics driver, and together they continuously provide an optimal balance of framerate, GPU load, and overall system latency.
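To illustrate why picking that one value is hard, here's a small sketch built on a common community heuristic (my assumption for illustration, not NVIDIA guidance): cap a few percent below the lowest framerate you sustain, so the GPU never saturates. The catch is that the "right" cap shifts as soon as the scene does.

```python
# Hypothetical helper: derive a static FPS cap from observed framerates.
def pick_cap(observed_fps, headroom=0.95):
    """Cap slightly below the worst sustained framerate."""
    return int(min(observed_fps) * headroom)

open_world = [142, 150, 138, 145]    # made-up per-scene average FPS
close_quarters = [230, 241, 225]

print(pick_cap(open_world))      # 131: safe in heavy scenes, but wastes
                                 # framerate in light ones
print(pick_cap(close_quarters))  # 213: fine indoors, yet fails to prevent
                                 # GPU saturation out in the open world
```

No single static number serves both scenes well, which is the gap a dynamic, engine-integrated approach like Reflex closes.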
Here's the current list of games with NVIDIA Reflex integration and support. As you can see, all of them offer the Reflex Low Latency Mode and Low Latency Boost features (On + Boost). Fortnite, Mordhau, Valorant, and Warface also offer in-game latency statistics, and Fortnite, Ghostrunner, Mordhau, Overwatch, Rainbow Six: Siege, and Warface come with an integrated flash indicator. The flash indicator is an excellent addition for us reviewers; it's a bright white box that flashes on the left edge of the screen whenever a mouse button click finds its way from your mouse to your screen.
NVIDIA Reflex and LDAT v2 Benchmarks
I used two games to test NVIDIA Reflex: Overwatch and Rainbow Six: Siege. Both offer the aforementioned flash indicator, and both are multiplayer first-person shooters, where system latency arguably matters most. That makes them an excellent choice for in-depth Reflex testing.
The importance of the flash indicator cannot be stressed enough. Without it, LDAT-based tests are extremely delicate and prone to operator error. The goal is to place the sensor on a part of the monitor where a luminance change will happen first. Muzzle flash is an obvious choice. However, adjusting the sensor to capture muzzle flash can be tricky because muzzle flash blooms, so you can generate a ton of bad data simply because you didn't place the sensor directly on the center of the muzzle flash. Some weapons in certain games have sway, sometimes the muzzle flash can be delayed compared to when it was actually triggered, and there are character animations to think about; it's a proper mess. Having a flash indicator on the left edge of the screen removes any guesswork from the process. You simply place the LDAT sensor on the part of the screen where the flash indicator will appear and start firing. That part of the screen flashes as soon as the mouse click "arrives" at the screen, and the LDAT sensor measures the corresponding latency. As I mentioned earlier in the review, the LDAT v2 sensor even removes the need for an external mouse; it's equipped with its own mouse button. Here's how all of that looks in practice.
Using the LDAT v2 sensor, its Auto Fire feature, and two games with an integrated flash indicator allowed me to do 100 iterations of each test in a short amount of time. In case you jumped straight to the benchmarks without reading my introduction first, allow me to reiterate: all my tests measure the so-called mouse-to-photon latency, or overall system latency—the time it takes for a mouse click to register on the screen.
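For context, the four statistics reported in the tables below (average, minimum, maximum, standard deviation) can be derived from a run of latency samples as follows. The sample values here are made up for illustration, not actual LDAT data, and whether LDAT uses population or sample standard deviation is my assumption.

```python
import statistics

def summarize(samples_ms):
    """Reduce a run of click-to-photon latency samples (ms) to the
    statistics shown in the benchmark tables."""
    return {
        "average": round(statistics.mean(samples_ms), 1),
        "minimum": min(samples_ms),
        "maximum": max(samples_ms),
        "std_dev": round(statistics.pstdev(samples_ms), 1),  # population SD
    }

run = [13.1, 12.4, 14.9, 11.8, 13.6, 15.2, 12.9, 13.3]  # hypothetical run
print(summarize(run))
# {'average': 13.4, 'minimum': 11.8, 'maximum': 15.2, 'std_dev': 1.1}
```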
NVIDIA Reflex Test System

| Component | Hardware |
|:---|:---|
| Monitor | Aorus FI27Q-X (27", 2560x1440, 240 Hz) |
| CPU | Intel Core i9-9900K |
| GPU | Palit GeForce RTX 2080 Super GP |
| Motherboard | ASUS ROG Maximus XI Formula |
| RAM | ADATA Spectrix D60G DDR4 32 GB |
| SSD | Kioxia Exceria 500 GB NVMe |
Overwatch
Let's start by taking a look at LDAT measurements in Overwatch. I ran 100 iterations of every single test at three different graphics detail settings: Low, High, and Epic. I didn't change a single video setting from the preset value other than resolution scale, which I kept at 100% for the Low and High detail settings and pushed to 200% for the Epic setting to recreate a worst-case GPU-bound scenario and examine how NVIDIA Reflex deals with it. All tests were run at a 2560x1440 screen resolution.
Overwatch End-to-End System Latency Comparison

| | Average Latency | Minimum Latency | Maximum Latency | Standard Deviation | GPU Utilization | CPU Utilization | Framerate | GPU Power Draw |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| Low Settings (100% Resolution Scale) | | | | | | | | |
| Reflex OFF | 13.3 ms | 8.0 ms | 18.3 ms | 2.5 | 88% | 24% | 400 FPS (capped) | 200 W |
| Reflex ON | 13.3 ms | 7.7 ms | 19.7 ms | 2.6 | 87% | 27% | 400 FPS (capped) | 201 W |
| Reflex ON + Boost | 13.3 ms | 6.7 ms | 19.1 ms | 2.7 | 86% | 24% | 400 FPS (capped) | 201 W |
| High Settings (100% Resolution Scale) | | | | | | | | |
| Reflex OFF | 25.7 ms | 19.2 ms | 32.8 ms | 3.0 | 94% | 22% | 256 FPS | 210 W |
| Reflex ON | 16.8 ms | 10.6 ms | 22.6 ms | 2.9 | 93% | 22% | 256 FPS | 210 W |
| Reflex ON + Boost | 16.7 ms | 11.1 ms | 28.7 ms | 3.0 | 94% | 22% | 255 FPS | 209 W |
| Epic Settings (200% Resolution Scale) | | | | | | | | |
| Reflex OFF | 82.3 ms | 70.7 ms | 96.3 ms | 5.3 | 95% | 9% | 61 FPS | 223 W |
| Reflex ON | 35.0 ms | 24.1 ms | 45.7 ms | 5.2 | 97% | 13% | 63 FPS | 226 W |
| Reflex ON + Boost | 35.4 ms | 22.6 ms | 48.2 ms | 5.4 | 97% | 12% | 61 FPS | 225 W |
Analyzing these numbers makes a couple of things very apparent. First and foremost, looking at the results of Overwatch tested at the Low settings, it's clear that if you don't find yourself in a GPU-bound scenario, NVIDIA Reflex Low Latency Mode serves no purpose. Your graphics card won't have any issues handling incoming frames without placing them in a render queue, which results in a snappy, low-latency gaming environment. If you've ever felt that lowering visual settings not only increases the framerate but also makes the game somehow feel more responsive, these numbers explain what's happening "behind the scenes." With this in mind, NVIDIA Reflex obviously isn't intended for those who run a high-end graphics card and play at lower-than-4K resolutions. Upgrading your system with a more powerful graphics card is a quick way to lower end-to-end system latency, but also the most expensive one.
Luckily, as the rest of the table clearly shows, there's a much cheaper way to cut down on latency. After bumping the Overwatch video settings to High, where my system was no longer hitting the imposed framerate cap of 400 FPS and the graphics card had a lot more to deal with, NVIDIA Reflex Low Latency Mode started working its magic. The average end-to-end system latency was cut from 25.7 ms to 16.8 ms, with an 8.6 ms reduction in minimum system latency and a whopping 10.2 ms reduction in maximum system latency. This came at absolutely no expense in terms of framerate or GPU power draw; both remained unchanged. Switching NVIDIA Reflex to On + Boost made no noteworthy difference compared to "just" using Reflex Low Latency Mode. This was to be expected, as my CPU was nowhere close to becoming a bottleneck (its utilization never went above 25%).
Switching the Overwatch video settings to Epic and pushing the render scale to 200% meant the game was effectively rendering at 5120x2880 (so-called "5K") resolution and then downscaling to 2560x1440, the native resolution of the monitor I used for testing. These settings caused the framerate to plummet to around 60 FPS and the average system latency to climb to 82.3 ms. The game felt extremely sluggish and unresponsive, that is, until I activated NVIDIA Reflex Low Latency Mode. The latency was immediately cut to 35 ms, a reduction of over 57%. The numbers remain equally impressive when we take a look at the measured minimum and maximum latency, where massive reductions are again very apparent. All of this happens with no measurable framerate drop or jump in GPU power draw.
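As a quick sanity check of the arithmetic in this paragraph: a 200% resolution scale doubles each axis (quadrupling the pixel count), and the quoted latency reduction works out from the table's own numbers.

```python
# 200% resolution scale: each axis is doubled, so 4x the pixels are rendered
native = (2560, 1440)
scale = 2.0  # 200% resolution scale
rendered = (int(native[0] * scale), int(native[1] * scale))
print(rendered)  # (5120, 2880), i.e. "5K"

# Reflex OFF vs. ON at Epic settings, from the table above
reduction = (82.3 - 35.0) / 82.3 * 100
print(f"{reduction:.1f}% latency reduction")  # 57.5%, i.e. "over 57%"
```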
These are some fantastic results, but the numbers don't tell the whole story. You might ask yourself whether the difference between 13, 35, and 82 milliseconds is even noticeable to an average human, and it most definitely is. NVIDIA quotes some studies to support its claims, but I can't verify those, so there's no point in discussing them. What I can tell you from firsthand experience is that activating Reflex in a GPU-bound scenario instantly makes the game feel significantly more responsive. Mouse movement feels snappier and more direct, which translates into a vastly improved overall experience.
Rainbow Six: Siege
Let's move on to Rainbow Six: Siege, another game equipped with the Reflex flash indicator, which makes it a good choice for system latency testing. Let's examine the numbers.
Rainbow Six: Siege End-to-End System Latency Comparison

| | Average Latency | Minimum Latency | Maximum Latency | Standard Deviation | GPU Utilization | CPU Utilization | Framerate | GPU Power Draw |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| Low Settings (100% Resolution Scale) | | | | | | | | |
| Reflex OFF | 15.8 ms | 10.0 ms | 21.3 ms | 3.0 | 96% | 44% | 268 FPS | 206 W |
| Reflex ON | 13.5 ms | 8.7 ms | 18.0 ms | 2.5 | 92% | 43% | 260 FPS | 204 W |
| Reflex ON + Boost | 14.3 ms | 7.9 ms | 18.5 ms | 2.6 | 93% | 38% | 262 FPS | 211 W |
| High Settings (100% Resolution Scale) | | | | | | | | |
| Reflex OFF | 17.9 ms | 11.8 ms | 24.8 ms | 3.9 | 95% | 37% | 214 FPS | 213 W |
| Reflex ON | 15.7 ms | 8.2 ms | 20.1 ms | 2.7 | 91% | 38% | 210 FPS | 208 W |
| Reflex ON + Boost | 16.3 ms | 11.6 ms | 21.9 ms | 2.6 | 92% | 37% | 210 FPS | 209 W |
| Ultra Settings (100% Resolution Scale) | | | | | | | | |
| Reflex OFF | 22.4 ms | 15.9 ms | 28.8 ms | 3.0 | 97% | 54% | 166 FPS | 216 W |
| Reflex ON | 17.5 ms | 11.5 ms | 23.0 ms | 3.0 | 93% | 53% | 159 FPS | 210 W |
| Reflex ON + Boost | 17.8 ms | 11.8 ms | 24.7 ms | 3.3 | 93% | 39% | 158 FPS | 209 W |
These results essentially confirm everything mentioned in my Overwatch analysis. Rainbow Six: Siege isn't able to stress my system enough for NVIDIA Reflex to show its full potential, although end-to-end system latency improvements can still be spotted, even at the Low and High settings. As the load on the GPU increases, so does the effectiveness of the NVIDIA Reflex technology. We again see no noteworthy framerate drops or jumps in power consumption.
CPU Bound Testing
Since the Reflex Low Latency + Boost mode didn't seem to do anything in a straightforward GPU-bound scenario, I wanted to emulate a CPU-bound scenario, which is what Reflex On + Boost mode is primarily designed for. For that purpose, I used my motherboard's UEFI BIOS settings to trim my Core i9-9900K CPU down to two cores. I turned off Hyper-Threading, disabled Turbo, and set the frequency of those two cores to 3.5 GHz. For this series of tests, I measured the total system power consumption by connecting the test system to the Brennenstuhl PM 231 E power meter.
As expected, Reflex Low Latency + Boost mode starts to make sense in a CPU-bound scenario. Even though it's not as effective as Reflex proved to be in a GPU-bound scenario, a 20% improvement in end-to-end system latency can still be noticed, with only a minor bump in total power consumption. I'll happily take that any day!
NVIDIA Reflex vs. FPS Limiter
Finally, I wanted to compare the NVIDIA Reflex technology to a traditional framerate limiter, which anyone can use to take some of the strain off the GPU. I covered the various pitfalls of using a framerate limiter earlier in the article, so I won't repeat them here. Let's go straight to the numbers.
NVIDIA Reflex vs. Manual Framerate Limiter

| | Average Latency | Minimum Latency | Maximum Latency | Standard Deviation | Framerate |
|:---|:---|:---|:---|:---|:---|
| Overwatch Epic Settings, 200% Resolution Scale | | | | | |
| Reflex OFF | 82.3 ms | 70.7 ms | 96.3 ms | 5.3 | 61 FPS |
| Reflex ON | 35.0 ms | 24.1 ms | 45.7 ms | 5.2 | 63 FPS |
| Reflex ON + Boost | 35.4 ms | 22.6 ms | 48.2 ms | 5.4 | 61 FPS |
| Manual Framerate Limiter (60 FPS) | 37.4 ms | 24.3 ms | 51.5 ms | 6.3 | 60 FPS (capped) |
The results show that while a manual framerate limiter is very effective in reducing overall system latency, it still isn't as effective as NVIDIA Reflex Low Latency Mode (or Reflex Low Latency + Boost). Additionally, it requires a lot of tinkering and user interaction, while NVIDIA Reflex is something you simply turn on and forget about: the game engine and the graphics driver talk to each other and continuously optimize your experience.