ASUS has given their GeForce RTX 2080 STRIX OC a large overclock out of the box over the Founders Edition; unfortunately, memory has been left at default. Thanks to its overclock, the card is 3% faster than the RTX 2080 Founders Edition in real-life 4K gaming, which is not that much, to be honest. With those performance levels, the card exceeds the performance of the GTX 1080 Ti by 12%, making it ideal for 1440p gaming, or 4K if you are willing to sacrifice some detail settings to achieve 60 FPS. Compared to the RTX 2080 Ti, the 2080 STRIX is around 25% behind. Against the Radeon RX Vega 64, which is the fastest graphics card AMD has on offer, the performance uplift is 50%.
ASUS has upped the power inputs of their card to 2x 8-pin (the Founders Edition uses 8+6). The power limit in the BIOS has been increased as well, to 245 W. As on all other Turing cards, the power limiter still caps maximum boost clocks, but those clocks are a bit higher here, which is reflected in the performance results.
Power draw is slightly increased on average in gaming; with the performance increase considered, the ASUS STRIX is actually a tiny bit more power efficient than the NVIDIA Founders Edition. Peak power consumption values are higher, by up to 50 W in Furmark, which is as expected since the board power limit has been raised to make use of the 8+8 pin power input capability.
Gaming noise levels with the stock BIOS are 1 dBA higher than with the Founders Edition, despite the much bigger cooler. Temperatures, on the other hand, are much lower, which of course won't do anything for you except give you a smaller number to look at. The good thing is that ASUS included a dual-BIOS feature on their card, with the second BIOS enabling a "quiet" mode. In this setting, the fan curve is much less aggressive and actually perfect for this card. Noise levels now reach only 31 dBA, which is quieter than any GTX 1080 Ti custom design you can find. As expected from the new fan curve, temperatures go up, but only to 75°C, which is perfectly fine for this card and still far off from the thermal cutoff point at which Boost will start reducing clocks. Modern graphics cards are designed for much higher temperatures; anything below 90°C is child's play for them. Technically, the lifetime of electronics increases the lower the temperature they run at, but the difference is negligible in this case, and the card will be obsolete long before it fails either way.
Unlike on the RTX 2080 Ti STRIX, both BIOSes have idle fan stop enabled, so you can enjoy a perfectly noise-free experience during idle, Internet browsing, productivity, and light gaming. I see no reason not to use the "quiet" mode BIOS unless you really are obsessed with temperatures and don't care about noise.
Overclocking the ASUS RTX 2080 STRIX is more complicated than on previous generations, as on all Turing cards, and we only gained an additional 6% in performance, which isn't a lot. Overclocking potential for both GPU and memory seems lower than what we've been seeing on other Turing cards, but I doubt ASUS is to blame for that; our sample simply isn't a big winner in the silicon lottery.
NVIDIA GeForce RTX doesn't just give you more performance in existing games. It introduces RT cores, which accelerate ray tracing, a rendering technique that can deliver realism that's impossible with today's rasterization rendering. Unlike in the past, NVIDIA's new technology is designed to work with multiple APIs from multiple vendors (Microsoft DXR, NVIDIA OptiX, Vulkan ray tracing), which will make it much easier for developers to get behind ray tracing. At this time, not a single game has RTX support, but the number of titles that will support it is growing by the day. We had the chance to check out a few demos and were impressed by the promise of ray tracing in games. I mentioned it before, but just to make sure: RTX will not turn games into fully ray-traced experiences.
Rather, the existing rendering technologies will be used to generate most of the frame, with ray tracing adding specific effects, like lighting, reflections, or shadows for specific game objects that are tagged as "RTX" by the developer. It is up to the game developers what effect to choose and implement; they may go with one or several, as long as they stay within the available performance budget of the RTX engine. NVIDIA clarified to us that games will not just have RTX "on"/"off", but rather, you'll be able to choose between several presets; for example, RTX "low", "medium", and "high". Also, unlike Gameworks, developers have full control over what and how they implement. RTX "only" accelerates ray generation, traversal, and hit calculation, which are the fundamentals, and the most complicated operations to develop; everything else is up to the developer, so I wouldn't be surprised if we see a large number of new rendering techniques developed over time as studios get more familiar with the technology.
The second big novelty of Turing is acceleration for artificial intelligence. While it was at first thought that it wouldn't do much for gamers, the company devised a clever new anti-aliasing algorithm called DLSS (Deep Learning Super-Sampling), which utilizes Turing's artificial intelligence engine. DLSS is designed to achieve quality similar to temporal anti-aliasing and to solve some of its shortcomings, while coming with a much smaller performance hit at the same time. We tested several tech demos for this feature and had difficulty telling the difference between TAA and DLSS in most scenes. The difference only became obvious in cases where TAA fails; for example, when it estimates motion vectors incorrectly. Under the hood, DLSS renders the scene at a lower resolution (typically around 50% of the pixels; for 4K, that's 2880x1620) and feeds the frame to the tensor cores, which use a predefined deep neural network to enhance that image. For each DLSS game, NVIDIA receives early builds from game developers and trains that neural network to recognize common forms and shapes of the models, textures, and terrain, to build a "ground truth" database that is distributed through Game Ready driver updates. On the other hand, this means that gamers and developers are dependent on NVIDIA to train that network and provide the data with the driver for new games. Apparently, an auto-update mechanism exists that downloads new neural networks from NVIDIA without the need for a reboot or an update to the graphics card driver itself.
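To make the resolution figures above concrete, here is a minimal sketch of the math. The 0.75 per-axis scale factor is inferred from the 4K example in the text (3840x2160 rendered internally at 2880x1620, which works out to roughly half the pixel count); the helper name is ours, not an NVIDIA API.

```python
# Sketch of the DLSS internal-render-resolution math described above.
# axis_scale=0.75 is inferred from the 4K example (2880x1620);
# dlss_internal_resolution is a hypothetical helper, not a real API.

def dlss_internal_resolution(out_w: int, out_h: int, axis_scale: float = 0.75):
    """Return the lower internal render resolution for a given output size."""
    return int(out_w * axis_scale), int(out_h * axis_scale)

w, h = dlss_internal_resolution(3840, 2160)
pixel_fraction = (w * h) / (3840 * 2160)
print(w, h, pixel_fraction)  # 2880 1620 0.5625
```

So the internal frame has 56.25% of the output pixels, which matches the "roughly 50%" figure, and the tensor cores upscale the result back to the full output resolution.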
The ASUS RTX 2080 STRIX OC is currently listed online for $870, another $70 price increase over the Founders Edition, which itself is $100 more expensive than the $700-MSRP reference designs we are hoping for (and which may never get released). The price increase over the Founders Edition is a bit much, especially when you consider that similarly positioned cards from other manufacturers are available for $830. The performance increase due to the out-of-the-box overclock certainly can't justify the premium alone (3% of $800 is $24). Of course, you're also paying for the bigger cooler, but the card is still too expensive. If you are looking for the quietest RTX 2080, your options are limited, though, and the ASUS RTX 2080 STRIX OC should make it onto your list.
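The back-of-the-envelope arithmetic above can be written out explicitly. All figures come from the review itself; the variable names are just labels.

```python
# Price/performance check for the numbers quoted in the conclusion.
strix_price = 870    # ASUS RTX 2080 STRIX OC street price
fe_price = 800       # RTX 2080 Founders Edition price
perf_gain = 0.03     # STRIX is ~3% faster than the FE at 4K

premium = strix_price - fe_price        # extra cost of the STRIX over the FE
perf_value = fe_price * perf_gain       # dollar value of the 3% performance gain
print(premium, perf_value)  # 70 24.0
```

The $70 premium buys only about $24 worth of extra performance, which is why the cooler and dual-BIOS feature have to carry the rest of the value argument.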