Thursday, November 3rd 2022
AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090
AMD on Thursday announced the Radeon RX 7900 XTX and RX 7900 XT RDNA3 graphics cards. With these, the company claims to have repeated its feat of a 50+ percent performance/Watt gain over the previous generation, which propelled the RX 6000-series to competitiveness with NVIDIA's fastest RTX 30-series SKUs. AMD's performance claims for the Radeon RX 7900 XTX put the card anywhere between 50% and 70% faster than the company's current flagship, the RX 6950 XT, when tested at 4K UHD resolution. Digging through these claims, and piecing together relevant information from the Endnotes, harukaze5719 was able to draw an extrapolated performance comparison between the RX 7900 XTX, the real-world tested RTX 4090, and the previous-generation flagships RTX 3090 Ti and RX 6950 XT.
The graphs put the Radeon RX 7900 XTX menacingly close to the GeForce RTX 4090. In Watch_Dogs Legion, the RTX 4090 is 6.4% faster than the RX 7900 XTX. Cyberpunk 2077 and Metro Exodus see the two cards evenly matched, with a delta under 1%. The RTX 4090 is 4.4% faster in Call of Duty: Modern Warfare II (2022). Accounting for the pinch of salt usually associated with launch-date first-party performance claims, the RX 7900 XTX would end up within 5-10% of the RTX 4090, but pricing changes everything. The RTX 4090 is a $1,599 (MSRP) card, whereas the RX 7900 XTX is $999. Assuming the upcoming RTX 4080 (16 GB) is around 10% slower than the RTX 4090, the main clash this generation will be between the RTX 4080 and the RX 7900 XTX. Even here, AMD gets ahead on pricing, as the RTX 4080 was announced with an MSRP of $1,199 (about 20% pricier than the RX 7900 XTX). With the FSR 3.0 Fluid Motion Frames announcement, AMD also blunted NVIDIA's DLSS 3 Frame Generation performance advantage.
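For readers who want to reproduce the arithmetic, here is a minimal sketch in Python. It assumes only the per-game deltas and MSRPs quoted above, an arbitrary normalization of the RX 7900 XTX to 100, and a plain average across the four titles; none of these are measured figures.

```python
# Hedged sketch: extrapolate relative performance and perf/$ from the
# first-party deltas quoted in the article. RX 7900 XTX normalized to
# 100 as an arbitrary baseline, not a measured figure.

# RTX 4090 lead over RX 7900 XTX per game (from the article's graphs)
lead_4090_over_7900xtx = {
    "Watch_Dogs Legion": 0.064,
    "Cyberpunk 2077": 0.01,        # "delta under 1%"
    "Metro Exodus": 0.01,
    "CoD: Modern Warfare II": 0.044,
}

msrp = {"RTX 4090": 1599, "RTX 4080 16 GB": 1199, "RX 7900 XTX": 999}

# Average lead of the RTX 4090 across the quoted titles
avg_lead = sum(lead_4090_over_7900xtx.values()) / len(lead_4090_over_7900xtx)
perf = {"RX 7900 XTX": 100.0, "RTX 4090": 100.0 * (1 + avg_lead)}
# Article's assumption: the RTX 4080 lands ~10% behind the RTX 4090
perf["RTX 4080 16 GB"] = perf["RTX 4090"] * 0.90

for card in perf:
    print(f"{card}: perf {perf[card]:.1f}, "
          f"perf per $1000 of MSRP {perf[card] / msrp[card] * 1000:.1f}")
```

On these numbers the RTX 4090 averages out about 3% ahead, while the RX 7900 XTX delivers roughly 1.5x the performance per dollar, which is the pricing argument the article is making.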
Source:
harukaze5719 (Twitter)
164 Comments on AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090
I expect it to beat the RTX 4090 in performance per dollar, though. Just for the record, AMD has never offered better drivers than Nvidia. That's not saying Nvidia is perfect, though.
Lmao that would be a neat trick, considering it only has 59% of the cores of a 4090. The 7900 XTX will *slaughter* the 4080 in raster, and might even come close in RT, and it costs $200 less.
However, it's fair to say that, objectively, Nvidia's drivers are the worst on Linux.
Nvidia's Linux drivers have been rock solid for over a decade, even more solid than their Windows drivers, and have consistently offered the highest level of API compliance.
What you are reciting is typical forum nonsense coming from people who don't use AMD's "open source" Linux drivers to any real extent, fueled by ideology because people think one is completely free and open and the other is proprietary and evil, when the reality is both are partially open. The truth is the "open" Mesa/Gallium drivers are bloated, heavily abstracted drivers, full of workarounds, and a complete mess.
Things like how a game looks and feels are incredibly subjective and yeah, you should absolutely see it with your own eyes and feel the controls to form an opinion that's actually worth something.
So I ask for a reason: people who have zero experience with it and choose to be negative about it, I put those opinions in one pile; but if they have constructive thoughts to share, I'll listen. People who bought a 4090 obviously run the risk of exhibiting confirmation bias, but their opinion on it would still carry far more merit, given that they speak from experience. Optimal, to me, would be unbiased people understanding how it works and then checking it out for themselves and giving a subjective assessment of it.
We will see but I expect there will be a paddock of room between Nvidia's 4090 and 4080 which both the 7900XT and 7900XTX will sit comfortably in.
They are right: the RTX 3080 10 GB is a competitor to both the RX 6800 XT 16 GB and the RX 6900 XT 16 GB as far as the performance chart shows us, since the performance deltas are low.
And this is without knowing about, or doing a deeper analysis of, the Nvidia shenanigans around lowered texture quality due to insufficient VRAM in some games under certain maxed-out settings.
But personally, I love it, just not for gaming. For rendering, using OptiX (which uses both the RT cores and HW-accelerated denoising to accelerate 3D rendering), the improvements are massive. The 7900 XTX performs lower than the 3090 Ti in RT, and the 4090 is massively faster, so a 4070/4080 will be very good, but overpriced.
The only reason to actually buy a 4090 at the moment is for CUDA application support where there's a very genuine potential for it to be cost-effective over the 3090 and/or Quadro RTX6000/8000 cards. That's only if your income depends on GPU performance, and even in a company where we have people that need those cards, we don't buy many of them because they're really hard to justify compared to just farming the work out to a group of lesser cards. The caveats are literally "something that requires a large contiguous VRAM allocation" and "is needed ASAP for a deadline or submittal". Niche within niche within the 3D rendering industry. I don't know how niche that is but it's definitely not a mainstream scenario IME.
I think this was pretty promising. Also, the one huge advantage that AMD has over Nvidia here is that while Nvidia's AD102 die is 608 mm², the graphics portion of the 7900XT/X is only ~300 mm². Besides being a solid cost advantage, it also means that AMD can release something with a much bigger GPU portion of the SoC, and stack the chiplets to take the cache from 96 MB to 192 MB.
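To put a rough number on that cost angle, the sketch below uses the standard dies-per-wafer approximation for a 300 mm wafer to compare candidate die counts at the two die sizes quoted above. It deliberately ignores defect yield and the 7900 XTX's additional 6nm MCD chiplets, so treat it as illustrative only.

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Standard dies-per-wafer approximation (ignores defects/yield):
    pi*r^2/A for the gross area, minus an edge-loss correction term."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

for name, area in [("AD102 (~608 mm^2)", 608), ("Navi 31 GCD (~300 mm^2)", 300)]:
    print(f"{name}: ~{dies_per_wafer(area)} candidate dies per 300 mm wafer")
```

Roughly 197 GCD candidates versus 89 AD102 candidates per wafer, i.e. more than twice as many, and the smaller GCD sits on the expensive 5nm node while the cache/memory chiplets use cheaper 6nm wafers.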
And yes, the RT performance of the 7900XTX is known (per AMD's claims) to be ~1.8x that of the 6950XT, which would place it around the 3090/Ti.
It's just math: +50% RT performance per CU and +20% more CUs, and the two compound multiplicatively: 1.5 × 1.2 = 1.8x.
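Spelled out in a couple of lines of Python; the only inputs are the comment's +50% per-CU figure and the two cards' known CU counts (96 vs 80):

```python
per_cu_gain = 1.50        # AMD's claimed ~+50% RT throughput per CU
cu_count_gain = 96 / 80   # 96 CUs (7900 XTX) vs 80 CUs (6950 XT) = 1.2x
print(f"Projected RT uplift: {per_cu_gain * cu_count_gain:.2f}x")  # ~1.80x
```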
What I'm interested in, way more than who eventually takes the crown (I couldn't care less), is that we get solid performance gains at low power (<250 W) in the sub-$600 segment, like a 7700XT.
Anyway, it's good to see that AMD is keeping up the pressure on Nvidia.
It seems RDNA3's SPs having double the FP32 TFLOPs/clock by being dual-issue is yielding a smaller performance uplift than desired (of course, AMD will tell you that it's early days, and that with driver updates and newer games optimized for RDNA3's architecture it will get better...)
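For context on that dual-issue point, here is a rough sketch of the theoretical FP32 numbers; the shader counts and approximate boost clocks are the commonly cited specs, and the formula is the usual shaders × 2 (FMA) × issue width × clock.

```python
def fp32_tflops(shaders: int, boost_ghz: float, issue_width: int = 1) -> float:
    """Theoretical FP32 = shaders x 2 (FMA) x issue width x clock."""
    return shaders * 2 * issue_width * boost_ghz / 1000

# RX 6950 XT: 5120 SPs, ~2.31 GHz boost, single-issue FP32
print(f"RX 6950 XT: ~{fp32_tflops(5120, 2.31):.1f} TFLOPS")
# RX 7900 XTX: 6144 SPs, ~2.5 GHz boost, dual-issue FP32 on RDNA3
print(f"RX 7900 XTX: ~{fp32_tflops(6144, 2.50, issue_width=2):.1f} TFLOPS")
```

That's roughly 23.7 vs 61.4 TFLOPS, a ~2.6x jump on paper against a claimed ~1.5-1.7x game uplift, which is exactly the gap being pointed at here.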
In any case, even if the 6nm Navi33 can hit the same 3 GHz clocks as the 5nm models, and the reference model has a 2.85 GHz boost for example, it likely won't be more than 1.5X the 6600XT in FHD. So it won't be able to match 6900XT FHD performance, and at QHD the RX 6800XT will be much faster (at 4K even the RX 6800 will be faster too).
The RX 6800 is $479 and the 6800XT is $535 right now on Newegg, and both are 16 GB cards. I would advise anyone looking at ≤$499 cards to buy during the Black Friday/Cyber Monday offers; on 4K raster performance/$, those will likely be a wash or better versus the Q1 2023 releases (full Navi32 at $649, for example), comparing the upcoming RDNA3 models' SRPs against RDNA2 Black Friday deals.
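One quick way to frame that advice is the breakeven uplift a hypothetical $649 Navi32 card would need over a $479 RX 6800 just to match its performance/$; the prices are the ones quoted above, and the Navi32 figure is the commenter's example, not an announced SRP.

```python
rx6800_price = 479   # Newegg price quoted in the comment
navi32_price = 649   # commenter's hypothetical example SRP, not announced
breakeven = navi32_price / rx6800_price
print(f"Navi32 must be {breakeven:.2f}x an RX 6800 to match its perf/$")
# ~1.35x; anything less and the Black Friday RDNA2 deal wins on perf/$
```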