Thursday, November 3rd 2022

AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090

AMD on Thursday announced the Radeon RX 7900 XTX and RX 7900 XT RDNA3 graphics cards. With these, the company claims to have repeated its feat of a 50+ percent performance-per-Watt gain over the previous generation, the gain that propelled the RX 6000 series to competitiveness with NVIDIA's fastest RTX 30-series SKUs. AMD's performance claims for the Radeon RX 7900 XTX put the card anywhere from 50% to 70% faster than the company's current flagship, the RX 6950 XT, when tested at 4K UHD resolution. Digging through these claims and piecing together relevant information from the endnotes, harukaze5719 was able to draw an extrapolated performance comparison between the RX 7900 XTX, the real-world-tested RTX 4090, and the previous-generation flagships RTX 3090 Ti and RX 6950 XT.

The graphs put the Radeon RX 7900 XTX menacingly close to the GeForce RTX 4090. In Watch_Dogs Legion, the RTX 4090 is 6.4% faster than the RX 7900 XTX. Cyberpunk 2077 and Metro Exodus see the two cards evenly matched, with a delta under 1%. The RTX 4090 is 4.4% faster in Call of Duty: Modern Warfare II (2022). Accounting for the pinch of salt usually associated with launch-date first-party performance claims, the RX 7900 XTX would end up within 5-10% of the RTX 4090, but pricing changes everything. The RTX 4090 is a $1,599 (MSRP) card, whereas the RX 7900 XTX is $999. Assuming the upcoming RTX 4080 (16 GB) is around 10% slower than the RTX 4090, the main clash this generation will be between the RTX 4080 and the RX 7900 XTX. Even here, AMD gets ahead on pricing, as the RTX 4080 was announced with an MSRP of $1,199 (exactly 20% pricier than the RX 7900 XTX). With the FSR 3.0 Fluid Motion announcement, AMD also blunted NVIDIA's DLSS 3 Frame Generation performance advantage.
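The extrapolation itself is simple arithmetic: scale a measured RX 6950 XT result by AMD's claimed multiplier, then compare against measured RTX 4090 numbers. A minimal sketch in Python, with hypothetical FPS figures standing in for the actual test data:

```python
# Sketch of the extrapolation method (hypothetical FPS figures,
# not the measured data behind the charts).
def extrapolate(baseline_fps: float, claimed_uplift: float) -> float:
    """Scale a measured baseline by a first-party performance multiplier."""
    return baseline_fps * claimed_uplift

rx6950xt_fps = 60.0                              # assumed 4K baseline
rx7900xtx_fps = extrapolate(rx6950xt_fps, 1.7)   # AMD's "up to 1.7x" claim
rtx4090_fps = 109.0                              # assumed measured result

lead = rtx4090_fps / rx7900xtx_fps - 1
print(f"RTX 4090 lead: {lead:.1%}")                         # ~6.9%
print(f"RX 7900 XTX perf/$: {rx7900xtx_fps / 999:.4f}")     # at $999 MSRP
print(f"RTX 4090 perf/$:    {rtx4090_fps / 1599:.4f}")      # at $1,599 MSRP
```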
Source: harukaze5719 (Twitter)

164 Comments on AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090

#101
efikkan
DeeJay1001: At worst 10% worse performance than a 4090.
Do we have enough data to conclude that? Have you considered if the titles shown were cherry-picked?

I expect it to beat RTX 4090 in performance per dollar though.
DeeJay1001: no shitty nvidia drivers
Just for the record, AMD have never offered better drivers than Nvidia. That's not saying Nvidia is perfect though.
Posted on Reply
#102
hpr484
Way to steal the exact graphs and charts from Linus’s video with estimated performance projections. Couldn’t even bother to modify the colors to make it look like your own charts or give LTT a shoutout?
Outback Bronze: Hopefully needs a lot less power to get those numbers too.

Who bought a launch price 4090?
The reference 7900 XT has a power draw of 300 W on 2x 8-pin connectors; the 7900 XTX draws 355 W, also on 2x 8-pin.
Chrispy_: I mean, for $999 if it's even only 75% the performance of a 4090 it's way better performance/$...
As the trend seems to go generation to generation, I'd expect the top-of-the-line AMD card to compete with NVIDIA's second best, but at mid-range pricing; that is, the 6900 XT competing with 3080 performance at 3070 (Ti) pricing. I expect the 7900 XTX to beat the 4080, with the 7900 XT about on par with the 4080.
Posted on Reply
#103
Koth87
"Assuming the upcoming RTX 4080 (16 GB) is around 10% slower than the RTX 4090"

Lmao that would be a neat trick, considering it only has 59% of the cores of a 4090. The 7900 XTX will *slaughter* the 4080 in raster, and might even come close in RT, and it costs $200 less.
Posted on Reply
#104
Sound_Card
efikkan: Just for the record, AMD have never offered better drivers than Nvidia. That's not saying Nvidia is perfect though
I beg to differ, don't fall for the memes.
However, it's objective to say that Nvidia's drivers are worse on Linux.
Posted on Reply
#105
efikkan
Sound_Card: I beg to differ, don't fall for the memes.
However, it's objective to say that Nvidia's drivers are worse on Linux.
That is blatantly false.
Nvidia's Linux drivers have been rock solid for over a decade, even more solid than their Windows drivers, and have consistently offered the highest level of API compliance.
What you are reciting is typical forum nonsense coming from people who don't use AMD's "open source" Linux drivers to any real extent, fueled by ideology because people think one is completely free and open and the other is proprietary and evil, when the reality is both are partially open. The truth is the "open" Mesa/Gallium drivers are bloated and abstracted drivers, full of workarounds and are a complete mess.
Posted on Reply
#106
s1nn3r86
Lol what? The 4080 is way more than 10% behind the 4090. That card is so gimped compared to the 90 that it will be closer to 20-30% slower. The XTX will eat it alive in raster. I don't think the XTX will be too far behind on ray tracing, but that's definitely in the 4080's favour. If you're like me and don't care about ray tracing, though, the XTX is by far the better card. The XT will probably be closer to the 4080 while being $300 cheaper. All Nvidia needs to do is lower the price to $1,000, though, and the XT will be irrelevant because of the ray-tracing advantage.
Posted on Reply
#107
AnotherReader
efikkan: That is blatantly false.
Nvidia's Linux drivers have been rock solid for over a decade, even more solid than their Windows drivers, and have consistently offered the highest level of API compliance.
What you are reciting is typical forum nonsense coming from people who don't use AMD's "open source" Linux drivers to any real extent, fueled by ideology because people think one is completely free and open and the other is proprietary and evil, when the reality is both are partially open. The truth is the "open" Mesa/Gallium drivers are bloated and abstracted drivers, full of workarounds and are a complete mess.
I can't speak for anyone else, but my GTX 670 has given me many headaches on Linux, whereas my Vega 64 has always worked fine.
Posted on Reply
#108
Tsukiyomi91
MarsM4N: If Moore's Law Is Dead's sources are right, the only paper launch here will be the RTX4080. 20-40% less day 1 stock & fewer resupplies than the RTX4090, LOL. :oops: Assuming it's not getting "unlaunched" (like the RTX4080 12GB) to save face from getting slapped around by cheaper AMD cards, lol.

Assuming it's right.
Posted on Reply
#109
wolf
Performance Enthusiast
TheinsanegamerN: Does one need to jump into a volcano to verify if it is indeed hot?
I'm curious whether you don't know that this is a logical fallacy, or whether you're being purposely intellectually dishonest.

Things like how a game looks and feels are incredibly subjective and yeah, you should absolutely see it with your own eyes and feel the controls to form an opinion that's actually worth something.

So I ask for a reason: people who have zero experience with it and choose to be negative about it, I put those opinions in one pile, but if they have constructive thoughts to share, I'll listen. People who bought a 4090 obviously run the risk of exhibiting confirmation bias, but their opinion would still carry far more merit given they leverage experience. Optimal, to me, would be unbiased people understanding how it works and then checking it out for themselves and giving a subjective assessment of it.
Posted on Reply
#110
HD64G
That pricing and power tuning for the RDNA3 GPUs was a great decision by AMD. They learned from the Zen4 launch and the criticism of pushing wattage unreasonably high to get the crown. They should have a "Rage" mode for their CPUs, but the stock power limits should be lower than 170 W, where the efficiency curve would show the CPUs as all-round great versus the very power-hungry ones from Intel. So, they decided to limit the reference GPUs to reasonable power limits, losing by ~10% to the mega-GPU that the 4090 is, and winning in all other metrics (efficiency, value for money, size). RT will remain a gimmick for years, though with the upscaling and frame-generation techs it will become normality sooner for some people. Also, the RTX 4080 and lower (including the RTX 30 series) are DOA at the prices AMD went for with their new GPUs. Moreover, we don't know the performance of the RDNA3 GPUs at 1080p and 1440p; if they perform similarly to RDNA2, they will win over NVIDIA again. Finally, the AIBs will go for close to 3 GHz clocks, which will battle hard against the 4090.
Posted on Reply
#111
mama
hpr484: Way to steal the exact graphs and charts from Linus’s video with estimated performance projections. Couldn’t even bother to modify the colors to make it look like your own charts or give LTT a shoutout?

The reference 7900 XT has a power draw of 300 W on 2x 8-pin connectors; the 7900 XTX draws 355 W, also on 2x 8-pin.

As the trend seems to go generation to generation, I’d expect the top-of-the-line AMD card to compete with NVIDIA's second best, but at mid-range pricing; that is, the 6900 XT competing with 3080 performance at 3070 (Ti) pricing. I expect the 7900 XTX to beat the 4080, with the 7900 XT about on par with the 4080.
6900 XT competing with a 3080? What drugs are you on? The 6800 XT is the natural competitor. Your theory is bonkers.

We will see but I expect there will be a paddock of room between Nvidia's 4090 and 4080 which both the 7900XT and 7900XTX will sit comfortably in.
Posted on Reply
#112
ARF
mama: What drugs are you on?
That's very rude :oops: :(

They are right; the RTX 3080 10 GB is a competitor to both the RX 6800 XT 16 GB and the RX 6900 XT 16 GB as far as the performance charts show, since the performance deltas are small.

And this is without a deeper analysis of the Nvidia shenanigans around lowered texture quality due to insufficient VRAM in some games under certain maxed-out settings.
Posted on Reply
#113
Xajel
fancucker: Honestly the fact they didn't compare it directly to the 4090 shows you it's beneath it. And the aggressive pricing tells the story of the bad ray tracing performance. Pretty much another Nvidia win across the board this generation. Sorry AMD.
Honestly, RT is not everything. While most people like the visuals, IRL most don't use it; you can see for yourself in the NV reddit comments.
Personally I love it, but not for gaming; for rendering, using OptiX (which uses both RTX and HW-accelerated denoising to accelerate 3D rendering), the improvements are massive. The 7900 XTX performs below the 3090 Ti in RT, and the 4090 is massively faster, so a 4070/4080 would be very good here, but overpriced.
Posted on Reply
#114
ACE76
luches: AMD must deliver for the sake of the buyers and the market. But these are very bold claims considering much lower TFLOPS, frequency, etc. Reviews can't come soon enough!
AMD hasn't lied or fudged a single benchmark at a product launch in the last 5+ years; every number they published has been independently verified.
HD64G: That pricing and power tuning for the RDNA3 GPUs was a great decision by AMD. They learned from the Zen4 launch and the criticism of pushing wattage unreasonably high to get the crown. They should have a "Rage" mode for their CPUs, but the stock power limits should be lower than 170 W, where the efficiency curve would show the CPUs as all-round great versus the very power-hungry ones from Intel. So, they decided to limit the reference GPUs to reasonable power limits, losing by ~10% to the mega-GPU that the 4090 is, and winning in all other metrics (efficiency, value for money, size). RT will remain a gimmick for years, though with the upscaling and frame-generation techs it will become normality sooner for some people. Also, the RTX 4080 and lower (including the RTX 30 series) are DOA at the prices AMD went for with their new GPUs. Moreover, we don't know the performance of the RDNA3 GPUs at 1080p and 1440p; if they perform similarly to RDNA2, they will win over NVIDIA again. Finally, the AIBs will go for close to 3 GHz clocks, which will battle hard against the 4090.
Nobody is buying these cards for 1080p gaming. That resolution is useless with current gen hardware and doesn't even belong in benchmarks anymore. The only resolution that matters going forward is 4k.
wolf: I'm curious whether you don't know that this is a logical fallacy, or whether you're being purposely intellectually dishonest.

Things like how a game looks and feels are incredibly subjective and yeah, you should absolutely see it with your own eyes and feel the controls to form an opinion that's actually worth something.

So I ask for a reason: people who have zero experience with it and choose to be negative about it, I put those opinions in one pile, but if they have constructive thoughts to share, I'll listen. People who bought a 4090 obviously run the risk of exhibiting confirmation bias, but their opinion would still carry far more merit given they leverage experience. Optimal, to me, would be unbiased people understanding how it works and then checking it out for themselves and giving a subjective assessment of it.
I have a 4090 and a 6900 XT. I also have a 10 GB 3080. My daily driver until the 4090 released was the 6900 XT, which is considerably better than the 3080. I only had a 3090 Ti for a short while, and I honestly couldn't tell the difference in performance between it and the 6900 XT. The 4090 is obviously better at the moment, but I'll be getting a 7900 XTX too. I have an LG C2 42 as my display and pretty much only game at 4K.
Posted on Reply
#115
Chrispy_
DeeJay1001: If you buy a 4090 at this point you're a fool.
Honestly, buying a 4090 prior to December 5th, when the review embargo on these lifts, was foolish anyway. For the first couple of months, the 4090 was always going to be scalped, overpriced, hard to find in stock, etc., and you were buying blind, without any idea whether it would be the best solution this generation. Impatience and zealotry are the only virtues by which 4090s have sold so far.

The only reason to actually buy a 4090 at the moment is for CUDA application support where there's a very genuine potential for it to be cost-effective over the 3090 and/or Quadro RTX6000/8000 cards. That's only if your income depends on GPU performance, and even in a company where we have people that need those cards, we don't buy many of them because they're really hard to justify compared to just farming the work out to a group of lesser cards. The caveats are literally "something that requires a large contiguous VRAM allocation" and "is needed ASAP for a deadline or submittal". Niche within niche within the 3D rendering industry. I don't know how niche that is but it's definitely not a mainstream scenario IME.
Posted on Reply
#116
HD64G
ACE76Nobody is buying these cards for 1080p gaming. That resolution is useless with current gen hardware and doesn't even belong in benchmarks anymore. The only resolution that matters going forward is 4k.
Since you started the "nobody" talk, let me deliver my suggestion: nobody should pay this much for a gaming device, only for professional reasons. And 1440p is a great resolution for everyone. 4K will not become mainstream even in 10 years, since most people (>80%) aren't willing, and won't be, to spend that much on the monitor-and-GPU combo needed.
Posted on Reply
#117
ACE76
Chrispy_: Honestly, buying a 4090 prior to December 5th, when the review embargo on these lifts, was foolish anyway. For the first couple of months, the 4090 was always going to be scalped, overpriced, hard to find in stock, etc., and you were buying blind, without any idea whether it would be the best solution this generation. Impatience and zealotry are the only virtues by which 4090s have sold so far.

The only reason to actually buy a 4090 at the moment is for CUDA application support where there's a very genuine potential for it to be cost-effective over the 3090 and/or Quadro RTX6000/8000 cards. That's only if your income depends on GPU performance, and even in a company where we have people that need those cards, we don't buy many of them because they're really hard to justify compared to just farming the work out to a group of lesser cards. The caveats are literally "something that requires a large contiguous VRAM allocation" and "is needed ASAP for a deadline or submittal". Niche within niche within the 3D rendering industry. I don't know how niche that is but it's definitely not a mainstream scenario IME.
You could sell a used 4090 for more than MSRP. I have one, and while it's a great card, I doubt I'll keep it long-term; too much power draw, and the power-cable issues are scary. I'm waiting to see third-party reviews of the 7900 XTX, but if it's in the 4090's ballpark, I see zero reason to keep the 4090.
Posted on Reply
#118
medi01
medi01: OK, got it.

My humble perspective: the 7900 XTX/XT, if the claimed figures are true, is an amazing, wipes-the-floor-with-the-competition product that actually is NOT a direct competitor to the 4090, as:

1) It is quite a bit smaller (300 mm² of 5 nm and 6×37 mm² of 6 nm, totaling around 522 mm², quite a bit down from the ~600 mm² of the 4090)
2) Is about 2 times cheaper (the "$1,600" 4090 is 2,300 Euro here; cough, even that 1.6k price point is BS, it should cost around 1,900)
3) Is way cheaper to produce (600mm2 monolith vs 300mm2 + 6 small thingies at N6)
4) is fed by only two 8-pin connectors


Its direct competitors are the 4080 16 GB and the 4080 12 GB (RIP :D)
(This product has literally KILLED one of its two competitors)

So, with all that in mind, it would make sense to present the 'vs 4080' angle in the presentation.
Except there is no 4080 out yet.
Hence the AMD rep essentially confirming that the 7900 XTX is a 4080 wipe-the-floor-er; it wasn't necessarily meant as a 4090 competitor (heck, why would it be, at half the price).

Posted on Reply
#119
doc7000
djuice: What we can extrapolate is that the 7900 XTX would have around 50-60% more performance than the 6900 XT in pure rasterization games, so without RT and all that FSR/DLSS bullshit. So look at some benchmarks of the 6900 XT and take a guess. While its RT performance might not be up to par with the 4090, since it's about a generation late in comparison, it's still up to 50% greater than before, which probably puts it in the ballpark of 3090/3090 Ti RT performance, still far below that of the 4090.
Then take into account the price: it would probably be far superior to the 4080 in most games barring RT performance, and it's also cheaper, $999 vs $1,199, so it's definitely the better choice IMO. The 7900 XTX might not be targeting the 4090, especially at its price point, but rather the 4080.
I would add that it is a 50% improvement per compute unit, and the 7900 XT has 4 more compute units than the 6900 XT while the 7900 XTX has 16 more, so ray-tracing gains may be higher than suggested.

I think this looks pretty promising. Also, the one huge advantage AMD has over Nvidia here is that while Nvidia's AD102 die is 608 mm², the graphics portion of the 7900 XT/X is only 300 mm²; besides being a solid cost advantage, it also means AMD can release something with a much bigger GPU portion of the SoC and stack the chiplets to take the cache from 96 MB to 192 MB.
Posted on Reply
#120
Zach_01
doc7000: I would add that it is a 50% improvement per compute unit, and the 7900 XT has 4 more compute units than the 6900 XT while the 7900 XTX has 16 more, so ray-tracing gains may be higher than suggested.

I think this looks pretty promising. Also, the one huge advantage AMD has over Nvidia here is that while Nvidia's AD102 die is 608 mm², the graphics portion of the 7900 XT/X is only 300 mm²; besides being a solid cost advantage, it also means AMD can release something with a much bigger GPU portion of the SoC and stack the chiplets to take the cache from 96 MB to 192 MB.
Exactly, and that (7950 XTX?) will probably arrive around the time of the 4090 Ti.
And yes, the RT performance of the 7900 XTX is claimed (by AMD) to be ~1.8x that of the 6950 XT, which would place it around the 3090/Ti.
It's just math: +50% per CU and +20% more CUs, so 1.5 × 1.2 = 1.8x.
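A minimal sketch of that estimate in Python, assuming AMD's claimed ~1.5x per-CU RT uplift and the announced CU counts (80 on the 6950 XT, 96 on the 7900 XTX):

```python
# Back-of-the-envelope RT extrapolation from AMD's claims (illustrative only).
per_cu_gain = 1.5                      # AMD's claimed ~50% RT uplift per CU
cu_6950xt, cu_7900xtx = 80, 96         # announced compute-unit counts

cu_scaling = cu_7900xtx / cu_6950xt    # 1.2x from the extra CUs
total_uplift = per_cu_gain * cu_scaling
print(f"Estimated RT uplift over the 6950 XT: {total_uplift:.1f}x")  # 1.8x
```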

What I'm interested in, way more than who eventually takes the crown (I couldn't care less), is whether we can get solid performance gains at low power (<250 W) in the sub-$600 segment, from something like a 7700 XT.
Posted on Reply
#121
wolf
Performance Enthusiast
Chrispy_: Impatience and zealotry are the only virtues by which 4090s have sold so far.
Only a Sith deals in absolutes. I wouldn't be so sure those are the only reasons, but I'll agree they're significant factors in 4090 sales.
Posted on Reply
#122
watzupken
CallandorWoT: why the fuck is the heat not leaving my case out the port side??? all heat is being dumped in case with this design... oh shit... my poor cpu... fuck... imo heat should leave out the port side and the top... not just forced one way or the other...

Because if you pick up any GPU and look at the cooler fin orientation, it's not going to allow hot air to escape from the rear vents anyway. The only GPU models that vent hot air out the rear are the blower type.

Anyway, it's good to see that AMD is keeping up the pressure on Nvidia.
Posted on Reply
#123
wolf
Performance Enthusiast
ACE76: I have an LG C2 42 as my display and pretty much only game at 4K.
Same, my 3080 continues to impress me two years in, but the desire for more performance can never truly be quenched. I'll be looking closely at the 7900 XTX after release for sure, hoping to see it shake up the market a bit and force more compelling prices from Nvidia around that price point too.
Posted on Reply
#125
ModEl4
If the RX 7900 XTX is -10% from the RTX 4090, all Nvidia has to do is upgrade the RTX 4080 to the full die (336 TMUs/tensor cores, up from 304) and raise clocks from 2505 MHz to 2610 MHz (the RTX 3080 12 GB's clocks) and the TDP to 350 W; it would then be just 10% slower in classic raster but faster in ray tracing versus the RX 7900 XTX, which will probably be enough based on Nvidia's brand awareness.
It seems RDNA3's SPs, with double the FP32 TFLOPS per clock from being dual-issue, are yielding a smaller performance uplift than desired (of course AMD will tell you it's early days and that with driver updates and newer games optimized for the RDNA3 architecture it will get better...).
In any case, even if 6 nm Navi 33 can hit the same 3 GHz clocks as the 5 nm models and the reference model boosts to 2.85 GHz for example, it likely won't be more than 1.5x the 6600 XT in FHD, so it won't be able to match the 6900 XT's FHD performance; in QHD the RX 6800 XT will be much faster (and in 4K even the RX 6800 will be faster too).
The RX 6800 is $479 and the 6800 XT $535 right now on Newegg, and both are 16 GB cards. I would advise anyone looking for ≤$499 cards to buy during the Black Friday/Cyber Monday offers; in 4K raster performance/$, those will likely be a wash or better versus the Q1 2023 releases (full Navi 32 at $649, for example), comparing upcoming RDNA3 models' SRPs against RDNA2 Black Friday deals.
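For what it's worth, a quick sketch of the theoretical gain that full-die-plus-clock-bump would buy, assuming raster throughput scales linearly with unit count and clock (real-world scaling would be somewhat lower):

```python
# Upper-bound scaling for the hypothetical full-die RTX 4080 (illustrative;
# unit counts and clocks are the ones cited in the post above).
tmus_stock, tmus_full = 304, 336       # TMUs: stock 4080 vs. hypothetical full die
mhz_stock, mhz_full = 2505, 2610       # boost clocks in MHz

uplift = (tmus_full / tmus_stock) * (mhz_full / mhz_stock)
print(f"Theoretical uplift: {uplift - 1:.1%}")   # ~15.2%
```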
Posted on Reply