Thursday, November 3rd 2022
AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090
AMD on Thursday launched the Radeon RX 7900 XTX and RX 7900 XT RDNA3 graphics cards. With these, the company claims to have repeated its feat of a 50+ percent performance/Watt gain over the previous generation, which propelled the RX 6000-series to competitiveness with NVIDIA's fastest RTX 30-series SKUs. AMD's performance claims for the Radeon RX 7900 XTX put the card at anywhere between 50% and 70% faster than the company's current flagship, the RX 6950 XT, when tested at 4K UHD resolution. Digging through these claims, and piecing together relevant information from the Endnotes, HXL was able to draw an extrapolated performance comparison between the RX 7900 XTX, the real-world tested RTX 4090, and previous-generation flagships RTX 3090 Ti and RX 6950 XT.
The graphs put the Radeon RX 7900 XTX menacingly close to the GeForce RTX 4090. In Watch_Dogs: Legion, the RTX 4090 is 6.4% faster than the RX 7900 XTX. Cyberpunk 2077 and Metro Exodus see the two cards evenly matched, with a delta under 1%. The RTX 4090 is 4.4% faster in Call of Duty: Modern Warfare II (2022). Accounting for the pinch of salt usually associated with launch-date first-party performance claims, the RX 7900 XTX would end up within 5-10% of the RTX 4090, but pricing changes everything. The RTX 4090 is a $1,599 (MSRP) card, whereas the RX 7900 XTX is $999. Assuming the upcoming RTX 4080 (16 GB) is around 10% slower than the RTX 4090, the main clash for this generation will be between the RTX 4080 and the RX 7900 XTX. Even here, AMD gets ahead on pricing, as the RTX 4080 was announced with an MSRP of $1,199 (exactly 20% pricier than the RX 7900 XTX). With the FSR 3.0 Fluid Motion announcement, AMD also blunted NVIDIA's DLSS 3 Frame Generation performance advantage.
Source:
harukaze5719 (Twitter)
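The pricing argument above can be put in rough numbers. The sketch below uses the article's MSRPs and normalizes performance to the RX 7900 XTX; the 4090's relative score is an assumed midpoint of the extrapolated 5-10% gap, and the 4080's an assumed ~10% below the 4090, so these are illustrative figures, not benchmark results:

```python
# Illustrative price/performance sketch using the article's figures.
# Performance is normalized to the RX 7900 XTX (= 100); the 4090 and 4080
# scores are assumptions based on the article's extrapolated gaps.
cards = {
    "RX 7900 XTX": (999,  100.0),
    "RTX 4090":    (1599, 107.5),   # assumed midpoint of the 5-10% gap
    "RTX 4080":    (1199, 96.75),   # assumed ~10% slower than the 4090
}

for name, (price, perf) in cards.items():
    value = perf / price * 100  # performance points per $100
    print(f"{name}: {value:.2f} perf points per $100")
```

Under these assumptions the RX 7900 XTX delivers roughly 10.0 points per $100, versus about 8.1 for the RTX 4080 and 6.7 for the RTX 4090, which is the article's value argument in numeric form.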
164 Comments on AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090
As for "I can grab a random leak on the internet, that shows that 4090 is only 100/73 => 37% faster than 4080", oh well.
My original post (the one you replied to) was about what potential performance an AD103-based RTX 4080 model could achieve if NVIDIA decided to change the specs (my proposal was the full die and +4% higher clocks than the current config). For that proposed RTX 4080 config, I quoted that the 4090 should be +39% faster based on specs, but in reality the difference would be only +25% on TPU's 5800X testbed with the current game selection, because in that particular setup the 4090 falls around 10% short of its true potential.
+25% means 4090 = 125% and full-AD103 4080 = 100% (or 4090 = 100% and full-AD103 4080 = 80%; it's the same thing).
The Time Spy results that I quoted as an indication, if valid, show that even in synthetic tests the difference between the 4090 (100) and the current, slower 4080 config (73) is much smaller than what you claim.
If TPU doesn't change its testbed, the average difference in games will be even smaller (slightly different, around 74-75%).
No point arguing; reviews will come in a few weeks anyway, and we will see whose assumption proves true.
If the 4080 is a 40% cut-down of the 4090, then the 4090 is ~166% of the 4080.
As for "what is faster" and magical shaders that do much better in 4080 (with slower mem and what not) than in 4090, we'll see soon enough.
Isn't it far more important how much performance you get per Dollar and how it compares vs. the competition and its predecessor?
I find it funny how the typical complaint over the years has been the opposite; too little difference between the three highest tiers. Quite often, a 60/60Ti model has been "too close" to the 70/80 (before the 90 models) models, and sometimes the 70 model has been very close to the 80 model (e.g. GTX 970).
These days the 90 model is a much bigger step up than the old Titan models used to be. But they haven't done that by making the mid-range models worse, so what's the problem then?
I said the 4090 is +40% faster than the 4080 (the base of comparison is the 4080 in this case: if 4080 = 100%, then 4090 = 100 + 40 = 140%).
It's very basic math/logic stuff, really; I don't know why it confuses you...
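The base-of-comparison point in the exchange above can be sketched numerically (the scores are illustrative stand-ins, not benchmark results):

```python
# Relative-performance arithmetic depends on which card is the base.
gpu_4090 = 140.0  # illustrative score
gpu_4080 = 100.0  # illustrative score

# With the 4080 as the base: the 4090 is +40% faster.
faster_pct = (gpu_4090 / gpu_4080 - 1) * 100   # 40.0

# With the 4090 as the base: the 4080 reaches ~71.4% of it,
# i.e. it is only ~28.6% slower -- not 40%.
slower_pct = (1 - gpu_4080 / gpu_4090) * 100   # ~28.57

print(f"4090 vs 4080: +{faster_pct:.1f}% faster")
print(f"4080 vs 4090: -{slower_pct:.1f}% slower")
```

The asymmetry (+40% one way, -28.6% the other) is exactly why the two commenters keep talking past each other: both figures describe the same pair of scores.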
In order to evaluate the GPU you should see the GPU score only.
$300 for a card that gets 100 FPS on average in a game (it does not matter which game, etc.). That will be our starting point.
Then a new-generation card releases, the same-tier card costs $420, and you get 150 FPS.
Another generation: $540 for 200 FPS. And another: $660 for 250 FPS. The performance per dollar is better every generation, but it is still a mid-range card, the same tier you paid $300 for merely four years ago. The other aspect is that the four-year-old game has had two new releases, and each one normally halves the FPS of a graphics card. This means you don't get 250 FPS with your $660 card, which is a mid-range card nonetheless. Don't get me wrong, you still have plenty of FPS, but the problem is you paid $660 for a card that gets around 125 FPS in a game, compared to $300 for 100 FPS in a game four years ago.
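The hypothetical progression above can be tabulated in a short sketch (all numbers are the commenter's illustrative figures, not real pricing data):

```python
# Commenter's hypothetical same-tier card across four generations:
# (price in USD, average FPS in the original game).
gens = [(300, 100), (420, 150), (540, 200), (660, 250)]

for price, fps in gens:
    print(f"${price}: {fps} FPS -> {fps / price:.3f} FPS per dollar")
```

FPS per dollar does rise each generation (from ~0.333 to ~0.379), which is the commenter's point: "performance per dollar improves" can be true even while the absolute price of the same tier more than doubles.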
Check the Far Cry franchise (from Far Cry 4 to 6) and the 980 vs. 1080 vs. 2080, with MSRPs of $550 (dropped to $500 within 6 months), $600, and $799 (dropped to $700 a year later) respectively. This is just to illustrate the problem.
That is exactly what NVIDIA has been doing for years. Now you get a mid-range card like the 4070 for how much today? Advertised as a 4080, to be exact. You can still say the performance per dollar is good, but is it worth paying that much for the card?
I didn't notice myself either way but, IF IT'S TRUE, then that would explain why the XTX isn't called a 7950. Could AMD be trolling us and NVIDIA, and planning to launch THE REAL N31 chip at a later date?
That would also mean higher prices for the lower cards, though, and that isn't a good prospect to look forward to...
And RDNA2 6900XT and 6950XT are both Navi21 chips.
On the contrary, I do believe that AMD will introduce a bigger, more expensive die down the road into 2023, but I don't know what it might be called.
Like I said, I didn't notice whether they specifically referred to the two 7900 cards as using N31 chips or not, but it would make SOME sense, I think.
Such a stunt would MOST CERTAINLY catch nVidia with their pants down ... That's the most likely scenario, i agree.