ZOTAC GeForce RTX 3090 Trinity Review

Value and Conclusion

  • The Zotac GeForce RTX 3090 Trinity will retail at the NVIDIA MSRP of $1,499.

Pros:
  • 10% faster than RTX 3080
  • 60 FPS 4K gaming a reality now
  • Idle fan stop
  • Fantastic memory overclocking potential
  • 24 GB VRAM
  • Adjustable RGB lighting
  • No price increase over reference
  • Second-generation hardware-accelerated raytracing
  • Support for HDMI 2.1, AV1 decode
  • DLSS improved
  • PCI-Express 4.0
  • SLI support
  • New GeForce Features: 8K, Reflex, Broadcast, G-SYNC 360, and RTX IO
  • 8 nanometer production process

Cons:
  • Very high price
  • Held back by board power limit
  • High heat output
  • Overclocking more complicated due to power limit
  • No manual power limit adjustment allowed
  • SLI useless without implicit multi-GPU
NVIDIA announced the GeForce RTX 3090 at the beginning of this month, and we are finally allowed to show benchmarks. If this whole "Ampere" thing is new to you, definitely check out our GeForce RTX 3080 Founders Edition review and NVIDIA Ampere GPU Architecture article for some background information. We have four GeForce RTX 3090 reviews for you today: ASUS RTX 3090 STRIX OC, Gigabyte RTX 3090 Eagle OC, MSI RTX 3090 Gaming X Trio, and ZOTAC RTX 3090 Trinity.

Zotac's GeForce RTX 3090 Trinity comes at the NVIDIA MSRP of $1,499—there's no cost increase, but also no factory overclock. When averaged over our whole benchmarking suite at 4K resolution, the RTX 3090 Trinity beats the RTX 3080 by 10%, which is not nearly as much as expected. The RTX 3080 has 8704 shaders and the RTX 3090 has 10496, which is 20% more, so where's the rest of the performance? In order to keep power/heat/noise at reasonable levels, the power limit has been set only marginally higher than on the RTX 3080. The RTX 3080 Founders Edition came with a 320 W board power limit, and the Zotac RTX 3090 Trinity barely raises that cap to 350 W—there's your 10%. It looks like Ampere performance mostly scales with the power limit, and not clocks or shaders.
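To illustrate the scaling argument, here is a back-of-the-envelope sketch using the shader counts and power limits quoted above. The assumption that performance scales roughly linearly with board power near the limit is a simplification for illustration, not a measured relationship.

```python
# Back-of-the-envelope check of the power-limit argument above.

shaders_3080, shaders_3090 = 8704, 10496
power_3080, power_3090 = 320, 350                 # board power limits in watts

shader_uplift = shaders_3090 / shaders_3080 - 1   # ~20.6% more shaders
power_uplift = power_3090 / power_3080 - 1        # ~9.4% more board power

print(f"Shader advantage:     {shader_uplift:.1%}")   # what you'd naively expect
print(f"Power-limit headroom: {power_uplift:.1%}")    # what the card actually gets

# The ~10% 4K performance delta measured in this review tracks the power
# limit far more closely than the shader count.
```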

Still, the performance offered by the RTX 3090 is impressive; the Trinity is 45% faster than the RTX 2080 Ti and 72% faster than the RTX 2080 Super. AMD's Radeon RX 5700 XT is less than half as fast—the RTX 3090's performance uplift over it is 217%! AMD's Big Navi had better be a success. With those performance numbers, the RTX 3090 is definitely suited for 4K resolution gaming. Many games will run at over 90 FPS. With the highest details at 4K, nearly all run at over 60 FPS—only Control falls slightly short, and DLSS will easily push it past that mark.
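If "217% faster" and "less than half as fast" sound hard to reconcile, the quick conversion below, a minimal sketch using only the uplift figure quoted above, shows they describe the same gap.

```python
# Converting "X% faster" into "what fraction as fast", using the 217%
# uplift of the RTX 3090 over the RX 5700 XT quoted above.

uplift = 2.17                        # RTX 3090 is 217% faster than the RX 5700 XT
relative_5700xt = 1 / (1 + uplift)   # 5700 XT performance as a fraction of the 3090

print(f"{relative_5700xt:.0%}")      # ~32% -> indeed "less than half as fast"
```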

With the RTX 3090, NVIDIA is introducing "playable 8K", which rests on several pillars. In order to connect an 8K display, you previously had to use multiple cables. Now, you can just use a single HDMI 2.1 cable. At a higher resolution, VRAM usage goes up, but the RTX 3090 has you covered, offering 24 GB of memory—more than twice that of the 10 GB RTX 3080. Last but not least, on the software side, NVIDIA added the capability to capture 8K gameplay with ShadowPlay. In order to improve framerates (remember, 8K pushes 16x as many pixels as Full HD), NVIDIA created DLSS 8K, which renders the game at a native 1440p and upscales the output 3x in each dimension using machine learning. All of these technologies are still in their infancy—game support is limited and displays are expensive; we'll look into this in more detail in the future.
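The pixel math behind those claims is easy to check. The sketch below uses the standard display resolutions; the 3x-per-axis upscale is the DLSS 8K mode described above.

```python
# Pixel counts behind the "playable 8K" discussion above.

full_hd = (1920, 1080)
qhd_1440p = (2560, 1440)   # native render resolution for DLSS 8K
uhd_8k = (7680, 4320)

def pixels(res):
    return res[0] * res[1]

# 8K pushes 16x as many pixels as Full HD
print(pixels(uhd_8k) / pixels(full_hd))        # 16.0

# 1440p scaled 3x in each dimension lands exactly on 8K
print(qhd_1440p[0] * 3, qhd_1440p[1] * 3)      # 7680 4320
```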

We've seen Zotac's Trinity design before, in our RTX 3080 review. It looks good thanks to the clean color theme dominated by black with silver highlights. The RTX 3090 Trinity is basically the same card, just with additional memory chips installed on the back of the PCB and a slightly more powerful VRM. Zotac's triple-slot, triple-fan thermal solution works well to keep temperatures and noise at sane levels, even with the 350 W heat output of the card. At 76°C under heavy load and 37 dBA, the cooler is reasonably cool and quiet. The other, more premium RTX 3090 variants we've tested today improve on both noise and temperatures and add a factory overclock, but they also cost more. Just like all other GeForce RTX 30-series cards, the Trinity comes with the idle-fan-stop feature, which completely turns off the fans during idle, light productivity, browsing, and video playback—the perfect noise-free experience.

24 GB of VRAM is definitely future-proof, but I doubt you will ever really need that much memory. Sure, more is always better, but unless you are using professional applications, you'll have a hard time finding a noteworthy performance difference between 10 GB and 24 GB. Games won't be an issue because you'll run out of shading power long before you run out of VRAM—just like older cards today, which can't handle 4K no matter how much VRAM they have. Next-generation consoles also don't have that much memory, so it's hard to imagine you'll miss out on any meaningful gaming experiences with less than 24 GB of VRAM. NVIDIA demonstrated several use cases in their reviewer's guide: OctaneRender, DaVinci Resolve, and Blender can certainly benefit from more memory, as can GPU compute applications, but these are very niche use cases. I'm not aware of any creators who were stuck and couldn't create because they ran out of VRAM. On the other hand, the RTX 3090 could definitely turn out to be a good alternative to Quadro or Tesla, unless you need double-precision math (you don't).

The GeForce RTX 3090 is the only graphics card in the Ampere family that features an NVLink interface for SLI. Implicit multi-GPU (the classic SLI you know) is not available—only explicit multi-GPU is supported. Explicit multi-GPU requires game developers to invest their own time and money into adding support for the technology, which simply isn't going to happen given the tiny market for the feature. Only a handful of games and benchmarks support explicit multi-GPU. Even to use this mode, you'll need to buy a new-generation NVLink bridge separately; the NVLink bridge from your RTX 20-series cards won't physically fit. And with NVIDIA reportedly ending development of SLI profiles for new games in 2021, even for GPUs that support implicit multi-GPU, we can safely conclude that the age of multi-GPU gaming is over. Buying a pair of RTX 3090 cards for multi-GPU would cater to a very tiny niche, mostly professionals.

Overclocking the Zotac Trinity was definitely held back by the card's power limit. Unlike the other cards tested today, Zotac doesn't allow any manual power limit adjustment—you have 350 W available, and that's it. Just like on Turing, NVIDIA's Boost algorithm complicates overclocking because you can no longer dial in specific frequencies. On the RTX 3090, the effect is amplified because the gap between the clocks sustainable under the power limit in typical games and the much higher boost clocks reached in lighter loads is bigger than on other cards. Still, we managed a 3.3% real-life performance gain from overclocking.

Pricing of the RTX 3090 is just way too high, a tough pill to swallow. At a starting price of $1,500, it is more than twice as expensive as the RTX 3080, but not nearly twice as fast. The performance uplift is actually surprisingly small, mostly because of the power limit we already talked about. Sure, an additional 14 GB of GDDR6X memory doesn't come free, but it just doesn't feel right to ask for that much money. On the other hand, the card is significantly better than the RTX 2080 Ti in every regard, and that card sold for well over $1,000, too. NVIDIA emphasizes that the RTX 3090 is a Titan replacement—the Titan RTX launched at $2,500, so $1,500 must be a steal for the new RTX 3090. Part of the disappointment with the price is due to the RTX 3080 being so impressive at such disruptive pricing. If the RTX 3080 were $1,000, the $1,500 wouldn't feel as crazy—I would say $1,000 is a fair price for the RTX 3090. Either way, Turing showed us that people are willing to pay up to have the best, and I have no doubt all RTX 3090 cards will sell out within a day, just as the RTX 3080 did.
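For a rough sense of value, the sketch below compares launch list prices against the ~10% 4K performance delta measured in this review; the $699 figure is the RTX 3080's launch MSRP, and the simple performance-per-dollar ratio is only a ballpark illustration, not a full cost model.

```python
# Rough price/performance comparison behind the conclusion above.

price_3080, price_3090 = 699, 1499       # USD, launch MSRP
perf_3080, perf_3090 = 1.00, 1.10        # 4K performance relative to the RTX 3080

price_ratio = price_3090 / price_3080    # ~2.14x the money
perf_ratio = perf_3090 / perf_3080       # ~1.10x the performance

print(f"Price: {price_ratio:.2f}x, performance: {perf_ratio:.2f}x")
print(f"Performance per dollar vs. RTX 3080: {perf_ratio / price_ratio:.2f}x")  # ~0.51x
```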