The ASUS GeForce RTX 2080 Ti STRIX OC is the company's flagship RTX 2080 Ti. It comes with a decent overclock out of the box; unfortunately, the memory has been left at its default clock. Thanks to its overclock, in real life, at 4K, the card is 3% faster than the RTX 2080 Ti Founders Edition, which is not that much, to be honest. With those performance levels, the card is the perfect choice for a 4K 60 FPS rig with games running at the highest settings. Compared to the RTX 2080 FE, the performance increase is 31%. AMD's latest and greatest, the RX Vega 64, is only half as fast as the RTX 2080 Ti.
So it looks like the performance increase is too small to really justify this card. What makes the difference, though, is the huge cooler, which has plenty of cooling potential. With the stock BIOS, which runs a relatively aggressive fan curve, the temperature under load is only 65°C, which is definitely impressive. At just 37°C, idle temperature is extremely low, too, but the card lacks the idle fan-stop feature, at least with the default BIOS.
ASUS was kind enough to include a dual-BIOS feature with their card, which is always handy when it comes to recovering from a bad BIOS flash. The second BIOS is configured to run the card in "quiet mode", which enables the highly sought-after fan-stop-in-idle feature we all love so much, a feature I had presumed was standard by now, which has me wondering why ASUS didn't enable it by default this time around. With the "quiet" BIOS active, gaming noise levels of the STRIX OC are truly impressive: only 31 dBA for a card with such massive performance is unbelievable. Yes, temperatures are higher at 75°C, up by 10°C over the default BIOS, but that number is nothing to worry about. Modern graphics cards are designed for much higher temperatures; anything below 90°C is child's play for them. Technically, the lifetime of electronics increases the lower the temperature they run at, but the difference is negligible in this case, and the card will be obsolete long before it fails either way. With the default BIOS, gaming noise levels are comparable to the Founders Edition, which, again, wouldn't impress me enough to shell out the extra money for the STRIX OC. The good news is that the "quiet" BIOS uses the same clock profile as the default BIOS, so you're not losing out on performance.
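For readers wondering what fan-stop actually means at the firmware level, here is a minimal sketch of a fan curve with idle fan-stop and hysteresis. All thresholds and duty cycles below are made-up illustrative values, not ASUS's actual firmware settings:

```python
# Minimal sketch of a fan curve with idle fan-stop, similar in spirit to
# what the "quiet" BIOS does. All thresholds and duty cycles here are
# illustrative guesses, not values from ASUS's firmware.

FAN_STOP_BELOW_C = 55   # fans fully off under this temperature (hypothetical)
FAN_RESTART_AT_C = 60   # hysteresis so fans don't rapidly cycle on and off

def fan_duty(temp_c: float, fans_running: bool) -> tuple[int, bool]:
    """Return (duty cycle in percent, new fans_running state)."""
    if fans_running:
        if temp_c < FAN_STOP_BELOW_C:
            return 0, False                 # idle: stop the fans entirely
    else:
        if temp_c < FAN_RESTART_AT_C:
            return 0, False                 # stay off until the restart point
    # Simple linear ramp from 30% at 60°C to 100% at 90°C (illustrative)
    duty = 30 + (temp_c - 60) * (70 / 30)
    return int(max(30, min(100, duty))), True

# Example: card heats up from idle to load and cools back down
state = True
for t in [37, 50, 62, 75, 85, 70, 52]:
    duty, state = fan_duty(t, state)
    print(f"{t:>3}°C -> fan {duty:>3}% ({'on' if state else 'off'})")
```

The hysteresis gap between the stop and restart temperatures is what keeps the fans from audibly toggling when the card hovers right around the threshold.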
Just like all other Turing cards, the ASUS RTX 2080 Ti STRIX sits at its power limit all the time during gaming. This means the highest boost clocks are never reached during regular gameplay, which is in stark contrast to Pascal, where custom designs were almost always running at peak boost clocks. ASUS also didn't upgrade the card's power input capabilities (which MSI did, at the cost of higher power draw). To me, this looks like the more balanced approach: it ensures lower power draw and thus less heat output, so the heatsink won't have to work as hard.
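To illustrate why a power-limited card never sustains its peak boost clock, here's a toy model of a boost algorithm walking the clock down until estimated board power fits under the limit. The numbers and the power model are invented for illustration; NVIDIA's actual GPU Boost algorithm is proprietary and far more sophisticated:

```python
# Toy illustration of power-limited boost: the clock drops one bin at a
# time until the estimated board power fits under the limit. All numbers
# below are invented; this is not NVIDIA's actual GPU Boost algorithm.

POWER_LIMIT_W = 260          # hypothetical board power limit
PEAK_BOOST_MHZ = 1950        # hypothetical maximum boost bin
CLOCK_STEP_MHZ = 15          # Turing adjusts clocks in ~15 MHz bins

def estimated_power(clock_mhz: float, load: float) -> float:
    """Crude power model: power grows superlinearly with clock (voltage
    rises with frequency), scaled by how heavy the current workload is."""
    return load * 260 * (clock_mhz / 1800) ** 2.5

def boosted_clock(load: float) -> int:
    clock = PEAK_BOOST_MHZ
    while clock > 0 and estimated_power(clock, load) > POWER_LIMIT_W:
        clock -= CLOCK_STEP_MHZ   # drop one boost bin and re-check
    return clock

for load in (0.6, 0.9, 1.0):   # light, heavy, and worst-case gaming load
    print(f"load {load:.1f} -> {boosted_clock(load)} MHz")
```

In this toy model, only the light workload ever sees the peak boost bin; heavier loads settle lower, which matches the behavior we observe on Turing.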
As on all Turing cards, overclocking the ASUS RTX 2080 Ti STRIX is more complicated than it used to be; still, we gained an additional 10.9% in performance. Overclocking potential is similar to that of the other Turing cards we tested today, maybe slightly better, but the differences are small, with most cards reaching around 2100 MHz on the GPU and between 1950 and 2050 MHz on the memory.
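To put the memory overclock into perspective, a quick bandwidth calculation. The 352-bit bus and 14 Gbps stock speed are the RTX 2080 Ti's official specifications; GDDR6 transfers 8 bits per pin per memory clock:

```python
# Quick bandwidth math for the memory overclock. The bus width (352-bit)
# and stock speed (1750 MHz, 14 Gbps effective) are the RTX 2080 Ti's
# official specs; GDDR6 transfers 8 bits per clock per pin.

BUS_WIDTH_BITS = 352

def bandwidth_gbs(mem_clock_mhz: float) -> float:
    effective_gbps = mem_clock_mhz * 8 / 1000       # per-pin data rate
    return effective_gbps * BUS_WIDTH_BITS / 8      # GB/s across the bus

for clock in (1750, 1950, 2050):                    # stock, low OC, high OC
    print(f"{clock} MHz -> {bandwidth_gbs(clock):.0f} GB/s")
# 1750 MHz -> 616 GB/s (stock); 2050 MHz -> 722 GB/s, a gain of about 17%
```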
NVIDIA GeForce RTX doesn't just give you more performance in existing games. It introduces RT cores, which accelerate ray tracing, a rendering technique that can deliver realism that's impossible with today's rasterization rendering. Unlike in the past, NVIDIA's new technology is designed to work with various APIs from multiple vendors (Microsoft DXR, NVIDIA OptiX, Vulkan RT), which will make it much easier for developers to get behind ray tracing. At this time, not a single game has RTX support, but the number of titles that will support it is growing by the day. We had the chance to check out a few demos and were impressed by the promise of ray tracing in games. I mentioned it before, but just to make sure: RTX will not turn games into fully ray-traced experiences.
Rather, existing rendering technologies will be used to generate most of the frame, with ray tracing adding specific effects, like lighting, reflections, or shadows, for specific game objects tagged as "RTX" by the developer. It is up to the game developers which effects to choose and implement; they may go with one or several as long as they stay within the available performance budget of the RTX engine. NVIDIA clarified to us that games will not just have RTX "on"/"off"; rather, you'll be able to choose between several presets, for example, RTX "low", "medium", and "high". Also, unlike GameWorks, developers have full control over what they implement and how. RTX "only" accelerates ray generation, traversal, and hit calculation, which are the fundamental, and most complicated, operations to get right; everything else is up to the developer, so I wouldn't be surprised if we see a large number of new rendering techniques developed over time as studios get more familiar with the technology.
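To make those three stages concrete, here's a minimal CPU-side sketch: ray generation, traversal (a naive loop standing in for the bounding volume hierarchy the hardware walks), and hit calculation using the well-known Möller–Trumbore ray/triangle test. The scene and the ray are made up for illustration:

```python
# Minimal CPU sketch of the three stages RT cores accelerate: ray
# generation, traversal (here a naive loop over all triangles instead of
# a real BVH), and hit calculation (Möller–Trumbore ray/triangle test).
# Purely illustrative; the hardware traverses an acceleration structure.

def ray_triangle_hit(orig, dirn, v0, v1, v2, eps=1e-7):
    """Möller–Trumbore: return distance t to the hit, or None on a miss."""
    def sub(a, b): return [a[i] - b[i] for i in range(3)]
    def dot(a, b): return sum(a[i] * b[i] for i in range(3))
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0]]
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(dirn, e2)
    det = dot(e1, p)
    if abs(det) < eps:
        return None                      # ray is parallel to the triangle
    t_vec = sub(orig, v0)
    u = dot(t_vec, p) / det
    if u < 0 or u > 1:
        return None                      # outside the triangle (u axis)
    q = cross(t_vec, e1)
    v = dot(dirn, q) / det
    if v < 0 or u + v > 1:
        return None                      # outside the triangle (v axis)
    t = dot(e2, q) / det
    return t if t > eps else None        # hit only if in front of the ray

# "Ray generation": one reflection ray per tagged pixel (here, just one).
origin, direction = [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]

# "Traversal": test against every triangle (hardware uses a BVH instead).
scene = [([-1, -1, 5], [1, -1, 5], [0, 1, 5]),
         ([-1, -1, 3], [1, -1, 3], [0, 1, 3])]
hits = [t for tri in scene if (t := ray_triangle_hit(origin, direction, *tri))]
print(f"nearest hit at t = {min(hits):.1f}")   # -> 3.0 (the closer triangle)
```

Doing this per pixel, per bounce, across millions of triangles is exactly the workload that makes dedicated hardware worthwhile.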
The second big novelty of Turing is acceleration for artificial intelligence. While it was at first thought that this wouldn't do much for gamers, the company devised a clever new anti-aliasing algorithm called DLSS (Deep Learning Super-Sampling), which utilizes Turing's artificial intelligence engine. DLSS is designed to achieve quality similar to temporal anti-aliasing (TAA) while solving some of its shortcomings, and it comes with a much smaller performance hit at the same time. We tested several tech demos for this feature and had difficulty telling the difference between TAA and DLSS in most scenes. The difference only became obvious in cases where TAA fails, for example, when it estimates motion vectors incorrectly. Under the hood, DLSS renders the scene at a lower resolution (typically 50%, so 2880x1620 for 4K) and feeds the frame to the tensor cores, which use a predefined deep neural network to enhance that image. For each DLSS game, NVIDIA receives early builds from the game's developers and trains that neural network to recognize common forms and shapes of the models, textures, and terrain to build a "ground truth" database that is distributed through Game Ready driver updates. On the other hand, this means gamers and developers are dependent on NVIDIA to train that network and ship the data with the driver for new games. Apparently, an auto-update mechanism exists that downloads new neural networks from NVIDIA without the need for a reboot or an update to the graphics card driver itself.
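The render-resolution arithmetic behind that 2880x1620 example works out as follows; note that a 0.75 scale per axis is roughly half the pixel count, and the exact factor is NVIDIA's choice per title:

```python
# The render-resolution arithmetic behind DLSS as described above: render
# at a reduced resolution, then let the network reconstruct full 4K.
# 2880x1620 corresponds to a 0.75 scale per axis, roughly half the
# pixels; the exact factor is NVIDIA's choice and may vary per title.

TARGET = (3840, 2160)        # 4K output resolution
LINEAR_SCALE = 0.75          # per-axis scale implied by the 2880x1620 example

render = (int(TARGET[0] * LINEAR_SCALE), int(TARGET[1] * LINEAR_SCALE))
pixel_ratio = (render[0] * render[1]) / (TARGET[0] * TARGET[1])

print(f"render at {render[0]}x{render[1]} "
      f"({pixel_ratio:.0%} of the 4K pixel count), "
      f"then upscale to {TARGET[0]}x{TARGET[1]} on the tensor cores")
# -> render at 2880x1620 (56% of the 4K pixel count), then upscale ...
```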
With a price of $1,300, the ASUS RTX 2080 Ti STRIX OC is the most expensive RTX 2080 Ti card at this time. This $100 premium over the Founders Edition can only be justified by the card's low-noise potential. Since reference designs were promised to retail at $999, the ASUS is 30% more expensive, just for a custom design! I could see myself spending that money, though, because noise matters a lot to me, which means I'd switch the card to the "quiet" BIOS and never look back.