Hardware architecture tells only half the story of what's new with GeForce Ampere. Here, we quickly run you through the noteworthy new features introduced with the GeForce RTX 30 series. Some of these will work even on the RTX 20 series through driver updates. When NVIDIA introduced the Turing-based RTX 20 series, the hot new features were RTX real-time raytracing and AI acceleration. NVIDIA builds on these with its 2nd-generation RTX architecture for vast generational performance improvements. NVIDIA is so confident in the performance uplift that it has set its sights on 8K gameplay. This was probably important as new-generation game consoles, such as the PlayStation 5 and Xbox Series X, formally advertise 4K gaming; high-end gaming PCs can't be seen playing in the same league.
8K is four times the pixels of 4K, or sixteen times Full HD, which is no small ask. Yet NVIDIA believes it can take a crack at 8K by combining the new 8K DLSS feature with new display outputs that support 8K 60 Hz by leveraging DSC 1.2a. 8K DLSS renders the game at 1440p and uses a special 9X Super Sampling algorithm to reconstruct detail at 8K. NVIDIA demonstrated that this looks better than rendering the game at a lower resolution and stretching its output to 8K with simple bilinear upscaling. The company also listed a large selection of games in which playable frame rates at 8K were obtained.
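The pixel arithmetic behind these claims is easy to verify. A minimal sketch (resolution dimensions are the standard ones; the 9X figure is NVIDIA's):

```python
# Pixel-count arithmetic behind the 8K DLSS claims above.

def pixels(width, height):
    return width * height

full_hd = pixels(1920, 1080)   #  2,073,600 pixels
qhd     = pixels(2560, 1440)   #  3,686,400 pixels (DLSS internal render)
uhd_4k  = pixels(3840, 2160)   #  8,294,400 pixels
uhd_8k  = pixels(7680, 4320)   # 33,177,600 pixels

print(uhd_8k // uhd_4k)    # 4  -> 8K is four times the pixels of 4K
print(uhd_8k // full_hd)   # 16 -> ... and sixteen times Full HD
print(uhd_8k // qhd)       # 9  -> DLSS reconstructs 9 pixels per rendered pixel
```

The last ratio is where the "9X Super Sampling" name comes from: only one in nine output pixels is conventionally rendered.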
With competitive e-sports gaming that has millions riding on it, it's not enough that network latency is reduced and frame rates are increased. NVIDIA discovered that whole-system latency plays an equally important role as network latency in affecting gamers' competitiveness. NVIDIA defines system latency as the time between you physically clicking a button on your mouse and the click registering in-game as an action. In the heat of gameplay, system latency can mean the difference between scoring a hit on an opponent and letting them get away. NVIDIA Reflex is a feature that works to minimize system latency.
On the software side, the NVIDIA driver cooperates with a compatible game engine to optimize the game's 3D rendering pipeline. This is accomplished by dynamically shrinking the render queue so fewer frames are queued up for the GPU to render. NVIDIA claims the technology can also keep the GPU perfectly in sync with the CPU, reducing "back-pressure" on the GPU by letting the game sample mouse input at the last possible moment. NVIDIA is releasing Reflex to gamers as GeForce driver updates, and to game developers as the Reflex SDK, which lets them integrate the technology into their engines with a toggle and expose in-game latency metrics.
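Why shrinking the render queue helps can be shown with a simplified latency model (this is illustrative arithmetic, not the Reflex SDK): input sampled when a frame is submitted has to wait behind every frame already queued before it reaches the screen.

```python
# Simplified model: each frame already in the render queue adds roughly one
# frame-time of delay before a newly submitted frame (and the input it
# sampled) can be rendered. Queue depths here are hypothetical examples.

def queued_latency_ms(frame_time_ms, queue_depth):
    # queue_depth frames ahead of us, plus our own frame's render time
    return frame_time_ms * (queue_depth + 1)

frame_time = 1000 / 144  # ~6.94 ms per frame at 144 FPS

print(round(queued_latency_ms(frame_time, 3), 1))  # 27.8 ms with a deep queue
print(round(queued_latency_ms(frame_time, 1), 1))  # 13.9 ms with a shallow queue
```

In this toy model, cutting the queue from three frames to one halves the render-side contribution to system latency, which is the intuition behind Reflex's just-in-time input sampling.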
Although NVIDIA Reflex works with any monitor, the company also introduced a new display standard targeted at competitive e-sports gamers, which it dubs the NVIDIA G-SYNC 360 e-Sports Display. This is a display feature-set logo that certifies a monitor as featuring an IPS dual-driver panel with a 360 Hz refresh rate, at least 240 Hz ultra-low motion blur (ULMB), the new G-SYNC e-sports mode, and hardware support for the NVIDIA Reflex Latency Analyzer feature. On these displays, you'll find a 2-port USB hub integrated into the monitor. You plug this hub into your PC via an included USB cable and plug your gaming mouse into one of the two downstream USB ports on the monitor. This can be any mouse, but an NVIDIA-certified mouse (from ASUS, Razer, or Logitech) offers additional features.
With the mouse plugged in, you launch the Reflex Latency Analyzer utility from the monitor's OSD settings and run the game with the Reflex metrics toggle enabled. Each time you click the mouse, the click is registered by the monitor's USB hub, which then measures the time it takes for the "output" gun-flash pixels to appear on screen. You can train the utility to look for where the gun muzzle flash appears. This way, you get extremely accurate measurements of not just input latency, but also end-to-end system latency; in the past, such measurements required high-speed cameras and manual math. Input latencies, coupled with end-to-end latency data, can be viewed on the Performance Metrics screen of the GeForce Experience overlay when spawned in a compatible game.
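Conceptually, the analyzer is timestamping a click-to-photon chain. A hedged sketch of how such a measurement decomposes (the timestamp names and sample values are illustrative, not NVIDIA's telemetry format):

```python
# Click-to-photon decomposition: given a timestamp for the mouse click,
# one for when the game submits the resulting frame, and one for when the
# muzzle-flash pixels change on screen, the latency breakdown follows
# directly. Values are in microseconds; the sample numbers are made up.

def latency_breakdown(t_click_us, t_submit_us, t_photon_us):
    return {
        "input_to_render_ms": (t_submit_us - t_click_us) / 1000,
        "render_to_photon_ms": (t_photon_us - t_submit_us) / 1000,
        "end_to_end_ms": (t_photon_us - t_click_us) / 1000,
    }

sample = latency_breakdown(t_click_us=0, t_submit_us=9_500, t_photon_us=31_000)
print(sample["end_to_end_ms"])  # 31.0
```

Because the click timestamp and the pixel-change detection both happen in the monitor's own hardware, no external high-speed camera is needed to anchor either end of the chain.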
Storage has traditionally been the slowest component in the PC, and it also happens to carry the highest overhead (associated with I/O, data compression, and in some cases encryption). With the introduction of NVMe, SSD sequential transfer rates are on a meteoric rise, and so is storage I/O overhead. NVIDIA predicts that for a 7 GB/s PCIe Gen 4 NVMe SSD streaming compressed data to the GPU, this overhead could have a tangible impact on CPU performance, saturating as many as 24 logical processors. NVIDIA RTX IO aims to fix this by leveraging the Microsoft DirectStorage API with NVIDIA-specific optimizations on top. RTX IO enables compressed data transfers between your SSD and GPU memory with minimal CPU involvement; the compressed stream is decompressed using the GPU's compute capability. RTX IO requires game-level support, but since most next-gen console games have some degree of readiness for DirectStorage, RTX IO support shouldn't be far behind.
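Some back-of-envelope arithmetic shows how a 24-thread figure can arise. Only the 7 GB/s SSD rate and the 24-thread claim come from NVIDIA; the compression ratio and per-thread decompression rate below are assumptions for illustration.

```python
# Rough sizing of CPU-side decompression load. The 2:1 compression ratio and
# ~0.6 GB/s per-thread decompression throughput are assumed, not NVIDIA's
# figures; they merely show how the claimed thread count could come about.

ssd_rate_gbps = 7.0          # PCIe Gen 4 NVMe sequential read, compressed data
compression_ratio = 2.0      # assumed typical game-asset compression
per_thread_gbps = 0.6        # assumed CPU decompression throughput per thread

uncompressed_gbps = ssd_rate_gbps * compression_ratio   # 14 GB/s to produce
threads_needed = uncompressed_gbps / per_thread_gbps

print(round(threads_needed))  # 23
```

Under these assumptions, keeping pace with the drive consumes on the order of two dozen threads, which is exactly the work RTX IO offloads to the GPU's shaders.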
When NVIDIA introduced Ansel a few years ago, it spawned a new class of still art using in-game assets. But what if you could make 3D animated movies using in-game assets? This concept is called machinima and already has a small but growing community of artists. NVIDIA wants to democratize and grow this ecosystem with the new Omniverse Machinima software. When used with a supported game, the software lets you make detailed 3D movies using all available game assets. Think of making your own Star Trek fan fiction using Star Trek Online assets.
When you think of RTX, it's often AAA games that come to mind rather than competitive e-sports titles, since RTX inflicts a performance cost and e-sports titles are designed to favor performance over eye candy. This is about to change with Fortnite going RTX-on. Fortnite uses almost the entire RTX feature set, including raytraced reflections, shadows, ambient occlusion, and global illumination. The game also implements DLSS, letting it render at a lower resolution and use AI super-sampling to restore detail. Epic claims that DLSS at 4K looks better than even native 4K rendering.