MSI GeForce RTX 2080 Ti Duke 11 GB Review

Value and Conclusion

  • The MSI GeForce RTX 2080 Ti Duke is listed online at $1200.

Pros:
  • Fastest graphics card; 4K 60 Hz is second nature, 4K 120 Hz possible with lower settings
  • Quiet in gaming, noiseless in idle
  • Same price as Founders Edition
  • Decent overclocking potential
  • Deep-learning feature set
  • DLSS is an effective new AA method
  • Highly energy efficient
  • HDMI 2.0b, DisplayPort 1.4, 8K support
  • Backplate and anti-sagging brace included

Cons:
  • High overall pricing
  • Only 2% faster than Founders Edition
  • No Windows 7 support for RTX; requires Windows 10 Fall 2018 Update
  • Bogged down by power limits
  • Memory not overclocked
  • High non-gaming power consumption (fixable, says NVIDIA)
Our exhaustive coverage of the NVIDIA GeForce RTX 20-series "Turing" debut also includes the following reviews:
NVIDIA GeForce RTX 2080 Ti Founders Edition 11 GB | NVIDIA GeForce RTX 2080 Founders Edition 8 GB | ASUS GeForce RTX 2080 Ti STRIX OC 11 GB | ASUS GeForce RTX 2080 STRIX OC 8 GB | Palit GeForce RTX 2080 Gaming Pro OC 8 GB | MSI GeForce RTX 2080 Gaming X Trio 8 GB | MSI GeForce RTX 2080 Ti Gaming X Trio 11 GB | NVIDIA RTX and Turing Architecture Deep-dive

The MSI GeForce RTX 2080 Ti Duke is the second-strongest RTX 2080 Ti card in MSI's custom-design arsenal. It doesn't come with nearly as massive a cooler as the RTX 2080 Ti Gaming X Trio, nor as big a factory overclock, but it makes up for that by retailing at Founders Edition pricing. The card's rated boost clock is 30 MHz higher than the Founders Edition's, which translates into a 63 MHz higher average clock when monitoring actual Boost frequencies across our whole test suite. In FPS numbers, this results in 2% higher performance than the RTX 2080 Ti FE, which is not a lot. Compared to the RTX 2080, the performance uplift is 32%; AMD's flagship, the Radeon RX Vega 64, delivers about half the performance of the RTX 2080 Ti. With those numbers, the card is an excellent choice for 4K 60 FPS gaming at the highest details.

NVIDIA made only small changes to its Boost 4.0 algorithm compared to what we saw with Pascal. For example, instead of dropping all the way to the base clock once the card reaches its temperature target, there is now a grace zone in which clocks fall gradually towards the base clock, which is only reached when a second, higher temperature cut-off point is hit. Temperatures of the MSI RTX 2080 Ti Duke are good, with only 71°C under load and 72°C after manual overclocking.
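
To make the grace-zone idea concrete, here is a minimal Python sketch of how such a clock/temperature curve could look; the temperature thresholds and clock values are placeholders picked for illustration, not NVIDIA's actual Boost 4.0 parameters.

```python
def boost_clock(temp_c, boost_mhz=1665, base_mhz=1350,
                temp_target=84, temp_cutoff=88):
    """Illustrative clock/temperature curve with a 'grace zone'.

    Below the temperature target the GPU may run at full boost;
    between the target and the cut-off the clock ramps down gradually;
    at or above the cut-off it sits at the base clock.
    All numbers here are made up for illustration.
    """
    if temp_c < temp_target:
        return boost_mhz
    if temp_c >= temp_cutoff:
        return base_mhz
    # Linear interpolation inside the grace zone
    fraction = (temp_c - temp_target) / (temp_cutoff - temp_target)
    return boost_mhz - fraction * (boost_mhz - base_mhz)


for t in (80, 84, 86, 88, 92):
    print(t, "°C ->", round(boost_clock(t)), "MHz")
```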

However, just like every other Turing card we tested today, the MSI Duke will sit at its power limit almost all the time during gaming. This means that the highest boost clocks are never reached during regular gameplay, which is in stark contrast to Pascal, where custom designs were almost always running at their peak boost clocks. Unlike on the Gaming X Trio, MSI has chosen to stick with the 8+8 power delivery configuration, which seems to help keep power draw low. Yes, the RTX 2080 Ti Duke does consume more power than the Founders Edition and can't fully make up for that with higher performance, but the difference isn't big enough to be concerning.
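
You can observe this power-limit behavior yourself with NVIDIA's NVML bindings for Python (the pynvml package); the short monitoring loop below is a generic sketch, not part of our test methodology, and assumes a single NVIDIA GPU with a recent driver. On a power-limited card, the reported draw hovers right at the enforced limit while clocks fluctuate underneath it.

```python
# Minimal monitoring loop using NVIDIA's NVML bindings (pip install pynvml).
# Run a game or benchmark alongside it to see power draw pinned at the limit.
import time
import pynvml

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(gpu) / 1000  # mW -> W
    for _ in range(30):
        power_w = pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000
        clock_mhz = pynvml.nvmlDeviceGetClockInfo(gpu, pynvml.NVML_CLOCK_GRAPHICS)
        temp_c = pynvml.nvmlDeviceGetTemperature(gpu, pynvml.NVML_TEMPERATURE_GPU)
        print(f"{power_w:6.1f} W / {limit_w:.0f} W limit | "
              f"{clock_mhz} MHz | {temp_c} °C")
        time.sleep(1)
finally:
    pynvml.nvmlShutdown()
```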

Rather, it seems MSI's design strikes a great balance between power and cooling even though it uses a much weaker thermal solution than the Gaming X Trio. As mentioned before, temperatures are very good, and so is fan noise. The card runs at 34 dBA under load, which is great, even more so when you take into account the incredible performance the RTX 2080 Ti offers. Non-gaming fan noise is perfect since the card comes with the idle-fan-off feature, which switches the fans off completely at idle and even during light gaming.

Overclocking the MSI RTX 2080 Ti Duke is a more involved process, as it is on all Turing cards, and netted us an additional 9.4% in performance. Overclocking potential is similar to that of the other Turing cards we tested today; GPU OC is a bit higher than average and memory a bit lower, but the silicon lottery plays the bigger role here.

NVIDIA GeForce RTX doesn't just give you more performance in existing games. It introduces RT cores, which accelerate ray tracing, a rendering technique that can deliver realism impossible with today's rasterization rendering. Unlike in the past, NVIDIA's new technology is designed to work with various APIs from multiple vendors (Microsoft DXR, NVIDIA OptiX, Vulkan ray tracing), which will make it much easier for developers to get behind ray tracing. At this time, not a single game has RTX support, but the number of titles that will support it is growing by the day. We had the chance to check out a few demos and were impressed by the promise of ray tracing in games. I mentioned it before, but just to make sure: RTX will not turn games into fully ray-traced experiences.

Rather, existing rendering technologies will be used to generate most of the frame, with ray tracing adding specific effects, such as lighting, reflections, or shadows for game objects the developer tags as "RTX". It is up to the game developers which effects to choose and implement; they may go with one or several as long as they stay within the available performance budget of the RTX engine. NVIDIA clarified to us that games will not just have RTX "on"/"off"; rather, you'll be able to choose between several presets, for example RTX "low", "medium", and "high". Also, unlike GameWorks, developers have full control over what they implement and how. RTX "only" accelerates ray generation, traversal, and hit calculation, which are the fundamental and most complicated operations to implement; everything else is up to the developer, so I wouldn't be surprised if we see a large number of new rendering techniques developed over time as studios get more familiar with the technology.
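
To make the "ray generation, traversal, and hit calculation" terminology a bit more tangible, here is a toy Python ray-sphere hit test. Real games trace rays against a bounding volume hierarchy of triangles, and the RT cores do that work in hardware, so treat this purely as an illustration of the math involved, not NVIDIA's implementation.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance to the closest hit of a ray with a sphere, or None.

    This is the 'hit calculation' step in miniature; RT cores perform the
    equivalent ray/triangle and ray/box tests in hardware while traversing
    a bounding volume hierarchy (BVH).
    """
    # Vector from sphere center to ray origin
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    discriminant = b * b - 4 * a * c
    if discriminant < 0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(discriminant)) / (2 * a)
    return t if t > 0 else None          # hit must be in front of the origin

# Ray generation: one ray per pixel, fired from the camera into the scene
hit = ray_sphere_hit(origin=(0, 0, 0), direction=(0, 0, 1),
                     center=(0, 0, 5), radius=1.0)
print("hit distance:", hit)  # -> 4.0
```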

The second big novelty of Turing is acceleration for artificial intelligence. While it was at first thought that this wouldn't do much for gamers, the company devised a clever new anti-aliasing algorithm called DLSS (Deep Learning Super-Sampling), which utilizes Turing's artificial intelligence engine. DLSS is designed to achieve quality similar to temporal anti-aliasing while solving some of its shortcomings, and it comes with a much smaller performance hit at the same time. We tested several tech demos for this feature and had difficulty telling the difference between TAA and DLSS in most scenes. The difference only became obvious in cases where TAA fails; for example, when it estimates motion vectors incorrectly. Under the hood, DLSS renders the scene at a lower resolution (typically 50%, so 2880x1620 for 4K) and feeds the frame to the tensor cores, which use a predefined deep neural network to enhance that image. For each DLSS game, NVIDIA receives early builds from the game developers and trains that neural network to recognize common forms and shapes of the models, textures, and terrain, building a "ground truth" database that is distributed through Game Ready driver updates. On the other hand, this means gamers and developers are dependent on NVIDIA to train that network and provide the data for new games. Apparently, an auto-update mechanism exists that downloads new neural networks from NVIDIA without the need for a reboot or an update to the graphics card driver itself.
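
NVIDIA's DLSS network is proprietary, but the basic idea of learned upscaling can be sketched in a few lines of PyTorch. The toy model below is untrained and its layer sizes are arbitrary, so it only illustrates the data flow (lower-resolution frame in, higher-resolution frame out), not NVIDIA's actual network or training process.

```python
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    """Toy super-resolution network: not DLSS, just the same general idea of
    mapping a low-resolution frame to a higher-resolution one with a learned
    model. Layer sizes are arbitrary placeholders."""
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 3 * scale * scale, kernel_size=3, padding=1),
            nn.PixelShuffle(scale),   # rearranges channels into a larger image
        )

    def forward(self, low_res_frame):
        return self.body(low_res_frame)

model = TinyUpscaler(scale=2)
low_res = torch.rand(1, 3, 540, 960)   # a quarter-resolution frame (960x540)
high_res = model(low_res)
print(high_res.shape)                  # torch.Size([1, 3, 1080, 1920])
```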

MSI is pricing the RTX 2080 Ti Duke at the same price as the Founders Edition: $1200. This makes it a great option for anyone looking to buy an RTX 2080 Ti without paying an additional custom-design premium on top of an already steep price. It's also attractive to people who have three slots of space available for their graphics card and want to put them to use instead of opting for the Founders Edition. For me, the RTX 2080 Ti Duke is the hidden champion in MSI's lineup as it improves significantly over the Founders Edition, yet does so without a price increase.
Editor's Choice