Wednesday, September 2nd 2020

NVIDIA Reflex Feature Detailed, Vastly Reduce Input Latency, Measure End-to-End System Latency

NVIDIA Reflex is a new innovation designed to minimize input latency in competitive e-sports games. When it arrives later this month with patches for popular e-sports titles such as Fortnite, Apex Legends, and Valorant, along with a GeForce driver update, the feature could improve input latency even without any specialized hardware. Input latency is the time it takes for a user input, such as a mouse click, to be reflected as output on the screen (in an online shooter, the time between your mouse click registering as a gunshot and that gunshot appearing on-screen). The feature is compatible with any NVIDIA GeForce GPU from the GTX 900 series onward.

NVIDIA briefly detailed how this works. On the software side, the NVIDIA driver cooperates with a compatible game engine to optimize the game's 3D rendering pipeline. This is accomplished by dynamically shrinking the render queue, so fewer frames are queued up for the GPU to render. NVIDIA claims the technology can also keep the GPU perfectly in sync with the CPU (a 1:1 render queue), reducing "back-pressure" on the GPU and letting the game sample mouse input at the last possible moment. NVIDIA is releasing Reflex to gamers through GeForce driver updates, and to game developers as the Reflex SDK, which lets them integrate the technology with their game engines, provide an in-game toggle, and expose in-game performance metrics.
Speaking of metrics, NVIDIA has also implemented Reflex in hardware as part of its new 360 Hz G-SYNC IPS gaming display standard. Popular display manufacturers such as Acer, ASUS, MSI, GIGABYTE, and ViewSonic are developing new monitors that carry the G-SYNC 360 logo. These monitors feature G-SYNC hardware as well as a hardware-side implementation of Reflex, called Reflex Latency Analyzer, which lets you precisely measure input latency and further optimize the software side. It's important to note that these new monitors are not a requirement for Reflex; anyone with a compatible graphics card and updated drivers can use it in compatible games.
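The render-queue mechanism described above can be illustrated with a toy latency model. This is not the Reflex SDK (whose actual API is not shown here); it is a hypothetical back-of-the-envelope sketch assuming each queued frame adds one frame-time of delay before a sampled input reaches the display:

```python
def simulated_input_latency_ms(frame_time_ms: float, queued_frames: int) -> float:
    """Toy model of click-to-photon latency versus render-queue depth.

    Input sampled at the start of a CPU frame must wait for every frame
    already sitting in the render queue ("back-pressure") before its own
    frame is rendered and scanned out to the panel.
    """
    back_pressure = queued_frames * frame_time_ms  # frames ahead in the queue
    render = frame_time_ms                         # this frame's own render time
    scanout = frame_time_ms                        # one refresh to reach the panel (simplified)
    return back_pressure + render + scanout

# At 250 FPS (4 ms frames), draining a 3-deep queue to 0 (the 1:1 case)
# cuts the modeled latency from 20 ms to 8 ms.
deep_queue = simulated_input_latency_ms(4.0, 3)
just_in_time = simulated_input_latency_ms(4.0, 0)
```

In this simplified model, the latency win comes entirely from removing queued frames, which matches the article's description of why a 1:1 render queue lets the game sample the mouse at the last possible moment.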

G-SYNC 360 Hz IPS gaming displays include a 2-port USB hub built into the monitor. You connect this hub to your PC with an included USB cable and plug your gaming mouse into one of the monitor's two downstream USB ports. This can't be just any mouse, but an NVIDIA-certified one; ASUS, Razer, and Logitech are developing such mice. With the mouse plugged in, you launch the Reflex Latency Analyzer utility from the monitor's OSD settings and run the game with the Reflex metrics toggle enabled.

The way this works is: each time you click the mouse, the click is registered by the monitor's USB hub, which then measures the time it takes for the "output" gun-flash pixels to appear on the screen. You can train the utility on where the gun-flash pixels appear. This way, you get extremely accurate measurements of not just input latency, but end-to-end system latency, something that in the past required high-speed cameras and manual math. Input latencies, coupled with end-to-end latency data, can be viewed on the Performance Metrics screen of the GeForce Experience overlay when spawned in a compatible game.
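The measurement described above (timestamp the click, then watch a trained screen region for the gun flash) can be sketched in a few lines. This is a hypothetical illustration of the click-to-photon principle, not the monitor's firmware; the function name, frame representation, and brightness threshold are all assumptions made for the example:

```python
def click_to_photon_ms(click_time_ms, frames, region, threshold=200):
    """Return end-to-end latency: time from a mouse click until the trained
    screen region first lights up.

    frames:  list of (timestamp_ms, 2D grid of pixel brightness 0-255),
             in chronological order.
    region:  (x0, y0, x1, y1) box trained on the gun-flash location.
    Returns None if the flash never appears in the supplied frames.
    """
    x0, y0, x1, y1 = region
    for t, pixels in frames:
        if t < click_time_ms:
            continue  # frame predates the click; ignore it
        box = [pixels[y][x] for y in range(y0, y1) for x in range(x0, x1)]
        if max(box) >= threshold:
            return t - click_time_ms  # first frame where the flash is visible
    return None
```

The hardware version does this at the monitor's refresh granularity with the click timestamped right at the USB hub, which is what removes the need for a high-speed camera.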

15 Comments on NVIDIA Reflex Feature Detailed, Vastly Reduce Input Latency, Measure End-to-End System Latency

#1
Legacy-ZA
I am really excited about this feature; I really need that near-instant connected feel or I get annoyed by the delays. I can feel any input lag with immediate effect, whether it is the mouse, the keyboard, monitor, audio, or whatever. I can't wait to test this out and see if it vastly improves the user experience. I always cry about DPC latency. :)

The RTX3070 seems to be the one to get for me, depending on the availability of course. I think this is the first time in a while that nVidia has offered significant advancements to justify an upgrade, good job nVidia. That being said, I am also keen to see what the AMD camp has to offer. :D

Exciting times.
Posted on Reply
#2
ExcuseMeWtf
So it needs a bunch of new, certainly expensive hardware and explicit support in game engine. Yeah, sounded too good to be true.
Posted on Reply
#3
Ruyki
Everyone with a GTX 900 series or later should be able to use this as long as the game is compatible.
The certified monitor and mouse is only needed if you want to be able to measure the latency.
Posted on Reply
#4
TheoneandonlyMrK
ExcuseMeWtf: So it needs a bunch of new, certainly expensive hardware and explicit support in game engine. Yeah, sounded too good to be true.
Yeah, not good; their competition has anti-lag that just works, on everything, without fanfare.
Still waiting on reviews. I trust Nvidia's spiel about as much as Boris on Brexit.
Posted on Reply
#5
R00kie
ExcuseMeWtf: So it needs a bunch of new, certainly expensive hardware and explicit support in game engine. Yeah, sounded too good to be true.
No it doesn't; that's only needed if you want to measure the latency. You can use your normal peripherals, as long as the game engine supports the feature.
Posted on Reply
#7
Peka
I feel like over the last decade or two, input latency has taken a back seat when it's actually a pretty important performance metric. There have been games that put me off because of poor input latency (though I do realise I'm a minority here). I'm just glad there seems to be more attention on the subject now.
Posted on Reply
#8
John Naylor
theoneandonlymrk: Yeah not good, their competition has anti-lag that just works, on everything without fanfare.
Still waiting on reviews, I trust Nvidia's spiel about as much as Boris on Brexit.
By that metric, it would appear that Boris is right then since G-Sync has always had inherently lower lag times.

www.tftcentral.co.uk/articles/variable_refresh.htm

"... by removing the traditional scaler [that AMD uses] it does seem that all hardware G-sync module screens have basically no input lag. We have yet to test a G-sync screen that showed any meaningful lag, which is a great positive when it comes to gaming. "

"As there is no hardware G-sync module added to the screen [with Freesync] , a normal scaler chip is used and this can in some cases result in additional input lag. You will still find plenty of FreeSync screens with low lag, but you will need to check third party tests such as our reviews to be sure. It's not as simple as with G-sync screens where the presence of that hardware module basically guarantees there will be no real input lag. With FreeSync screens we are more reliant on the manufacturer focusing in reducing lag than on G-sync screens. "

"We would hope that more manufacturers of Adaptive-sync/FreeSync screens invest in developing solid VRR implementations to ensure certification under the G-sync Compatible scheme, which will give consumers more faith in the performance of their models. They should also focus on ensuring that lag is low and additional features like blur reduction backlights are considered and included where possible. "

"We expect many of the cutting edge gaming screens to appear with traditional G-sync module inclusion before FreeSync alternatives are available, including the latest and greatest high refresh rates. That is part of the market where the G-sync module seems to have a firm grasp right now. Usage of the G-sync v2 module also seems to be a requirement so far for delivering the top-end HDR experience in the monitor market, with all current FALD Backlight models featuring this chip "
Posted on Reply
#9
R0H1T
G-Sync is also more expensive due to the dedicated hardware it employs; at that premium price you'd expect it to work better, no?
Posted on Reply
#10
TheoneandonlyMrK
John Naylor: By that metric, it would appear that Boris is right then since G-Sync has always had inherently lower lag times.

www.tftcentral.co.uk/articles/variable_refresh.htm

"... by removing the traditional scaler [that AMD uses] it does seem that all hardware G-sync module screens have basically no input lag. We have yet to test a G-sync screen that showed any meaningful lag, which is a great positive when it comes to gaming. "

"As there is no hardware G-sync module added to the screen [with Freesync] , a normal scaler chip is used and this can in some cases result in additional input lag. You will still find plenty of FreeSync screens with low lag, but you will need to check third party tests such as our reviews to be sure. It's not as simple as with G-sync screens where the presence of that hardware module basically guarantees there will be no real input lag. With FreeSync screens we are more reliant on the manufacturer focusing in reducing lag than on G-sync screens. "

"We would hope that more manufacturers of Adaptive-sync/FreeSync screens invest in developing solid VRR implementations to ensure certification under the G-sync Compatible scheme, which will give consumers more faith in the performance of their models. They should also focus on ensuring that lag is low and additional features like blur reduction backlights are considered and included where possible. "

"We expect many of the cutting edge gaming screens to appear with traditional G-sync module inclusion before FreeSync alternatives are available, including the latest and greatest high refresh rates. That is part of the market where the G-sync module seems to have a firm grasp right now. Usage of the G-sync v2 module also seems to be a requirement so far for delivering the top-end HDR experience in the monitor market, with all current FALD Backlight models featuring this chip "
So I was talking about Nvidia Reflex; I didn't use half as many words as you and didn't mention G-Sync. I'll strike it off as a misquote.
Because all of that's irrelevant to my opinion.
Posted on Reply
#11
Cheeseball
Not a Potato
theoneandonlymrk: Yeah not good, their competition has anti-lag that just works, on everything without fanfare.
Still waiting on reviews, I trust Nvidia's spiel about as much as Boris on Brexit.
It's only good between 60 and 100 FPS, though. Above that, it's best to disable RAL (Radeon Anti-Lag). So it's a good advantage for 5500 XT and 5600 XT users.
Posted on Reply
#12
TheoneandonlyMrK
Cheeseball: It's only good between 60 and 100 FPS, though. Above that, it's best to disable RAL. So it's a good advantage for 5500 XT and 5600 XT users.
And 4K 60 users who are definitely not going beyond 100fps.
Posted on Reply
#13
Cheeseball
Not a Potato
theoneandonlymrk: And 4K 60 users who are definitely not going beyond 100fps.
I don't have a 4K monitor yet, but at that resolution I would not use RAL since you'll have low minimum FPS anyway. This may work with some older competitive titles like CS:GO or Quake Live or a MOBA/RTS.
Posted on Reply
#14
TheoneandonlyMrK
Cheeseball: I don't have a 4K monitor yet, but at that resolution I would not use RAL since you'll have low minimum FPS anyway. This may work with some older competitive titles like CS:GO or Quake Live or a MOBA/RTS.
It's not important how I or you use it; I am waiting on reviews before debating at length, though I was not calling the feature bad anyway. Like many new features, I will read a few reviews, get a feel for it, and decide if I would use it.
Posted on Reply
#15
Cheeseball
Not a Potato
theoneandonlymrk: It's not important how I or you use it; I am waiting on reviews before debating at length, though I was not calling the feature bad anyway. Like many new features, I will read a few reviews, get a feel for it, and decide if I would use it.
:confused:

I used to use RAL for PUBG and VALORANT and, to be honest, it really doesn't make any noticeable difference when your frame rate is beyond 100 FPS. I would probably say the same for this Reflex thing, as it seems to be the second generation of their own "just in time" frame scheduling that they implemented last year (I never used this when I had my previous 2080 Super).

EDIT: Now that I think about it, this might be good for Flight Simulator 2020 or other heavy games. Reduced input lag in simulations does sound quite nice.
Posted on Reply