Thursday, November 3rd 2022

AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090

AMD on Thursday launched the Radeon RX 7900 XTX and RX 7900 XT RDNA3 graphics cards. With these, the company claims to have repeated its feat of a 50+ percent performance-per-Watt gain over the previous generation, which propelled the RX 6000 series to competitiveness with NVIDIA's fastest RTX 30-series SKUs. AMD's performance claims for the Radeon RX 7900 XTX put the card anywhere from 50% to 70% faster than the company's current flagship, the RX 6950 XT, when tested at 4K UHD resolution. Digging through these claims, and piecing together relevant information from the Endnotes, HXL was able to draw an extrapolated performance comparison between the RX 7900 XTX, the real-world tested RTX 4090, and the previous-generation flagships RTX 3090 Ti and RX 6950 XT.

The graphs put the Radeon RX 7900 XTX menacingly close to the GeForce RTX 4090. In Watch_Dogs Legion, the RTX 4090 is 6.4% faster than the RX 7900 XTX. Cyberpunk 2077 and Metro Exodus see the two cards evenly matched, with a delta under 1%. The RTX 4090 is 4.4% faster in Call of Duty: Modern Warfare II (2022). Accounting for the pinch of salt usually associated with launch-date first-party performance claims, the RX 7900 XTX would end up within 5-10% of the RTX 4090, but pricing changes everything. The RTX 4090 is a $1,599 (MSRP) card, whereas the RX 7900 XTX is $999. Assuming the upcoming RTX 4080 (16 GB) is around 10% slower than the RTX 4090, the main clash this generation will be between the RTX 4080 and the RX 7900 XTX. Even here, AMD gets ahead on pricing, as the RTX 4080 was announced with an MSRP of $1,199 (exactly 20% pricier than the RX 7900 XTX). With the FSR 3.0 Fluid Motion announcement, AMD also blunted NVIDIA's DLSS 3 Frame Generation performance advantage.
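For the curious, the extrapolation method boils down to chaining AMD's claimed per-title gains over the RX 6950 XT with independently measured numbers for the RX 6950 XT and RTX 4090. A minimal sketch in Python, with every figure below a made-up placeholder rather than actual review data:

```python
# Chain first-party claims with measured review data (illustrative numbers only).
claimed_gain_over_6950xt = 1.50  # AMD endnote claim: 7900 XTX = 1.5x the 6950 XT
fps_6950xt_measured = 60.0       # placeholder: reviewed 6950 XT FPS at 4K
fps_4090_measured = 100.0        # placeholder: reviewed RTX 4090 FPS at 4K

# Estimate the 7900 XTX's performance from the claimed uplift
fps_7900xtx_est = fps_6950xt_measured * claimed_gain_over_6950xt  # 90.0 FPS

# Resulting RTX 4090 lead over the extrapolated 7900 XTX
delta = fps_4090_measured / fps_7900xtx_est - 1.0
print(f"Estimated RTX 4090 lead: {delta:+.1%}")  # +11.1% in this made-up case
```

The obvious caveat is that the result is only as trustworthy as the first-party claim feeding it.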
Source: harukaze5719 (Twitter)

164 Comments on AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090

#126
Chrispy_
wolf: Same, my 3080 continues to impress me 2 years in, but the desire for more performance can never be truly quenched. I'll be looking closely at the 7900XTX after release for sure, and hoping to see it shake up the market a bit and hopefully force more compelling prices from Nvidia around that price point too.
I'll also be looking at the XTX pretty closely. The number of titles I play that actually have meaningful raytracing is just two, and it's not as if RDNA3 can't raytrace; it's just not going to be 4090-tier (or possibly even 4080-tier) when it comes to full lighting/shadow/occlusion raytracing. If you stick to raytraced reflections only, the AMD hardware is pretty competitive.

My interest will be in getting an XTX and tuning it to see if I can run it at 200-250W without losing too much performance. If the 7900XTX can't manage that, perhaps the 7800XT will. My HTPC is currently rocking a 6700 10GB which sips about 125W under full load after an undervolt, and I really don't think my case or slim furniture can handle anything more than 200-250W. The irony is that it sits right underneath the only 4K display in the house, so it needs the most GPU power.
#127
efikkan
ModEl4: It seems RDNA3's SPs having double the FP32 Tflop/clock by being dual issue is yielding less than desired performance uplift (of course AMD will tell you that it's early days and in the future with driver updates and with newer games optimized more for RDNA3's architecture it will get better...)
And there we have it again: claims of better performance with better drivers and games.
It's the same old claim that AMD (or Intel) will catch up to Nvidia with better software over time, but it never happens. If driver overhead were holding back performance, we would see progressively growing overhead with the higher-tier cards, holding them back to the point where high-end cards become almost pointless. The performance figures we've seen so far do not indicate this, and when reviews arrive, we can probably discredit that claim completely.

And no, (PC) games are not optimized for specific GPU architectures. Games are written using DirectX or Vulkan these days, and neither API is tailored to a specific GPU architecture. Games may have some exclusive features requiring specific API extensions, but these don't skew the benchmark results.

BTW, did anyone catch the review embargo?
ModEl4: RX 6800 is $479 and the 6800XT $535 right now on Newegg, and both are 16GB cards. I would advise anyone looking for ≤$499 cards to buy during the Black Friday/Cyber Monday offers; it will likely be a wash or better vs the Q1 2023 releases (full Navi32 at $649, for example) in terms of 4K raster performance/$ (RDNA3 upcoming models' SRPs vs RDNA2 Black Friday deals).
I agree, I expect some good deals from both makers, so people had better set some price notifications to grab the best deals.
#128
HD64G
ModEl4: If RX 7900XTX is -10% from RTX 4090, all Nvidia has to do is upgrade RTX 4080 to the full die (336 TMUs/TCs from 304), upgrade clocks from 2505MHz to 2610MHz (the RTX 3080 12GB clocks) and TDP to 350W, and be just 10% slower in classic raster but faster in raytracing vs the RX 7900XTX; that will probably be enough based on Nvidia's brand awareness.
It seems RDNA3's SPs having double the FP32 Tflop/clock by being dual issue is yielding less than desired performance uplift (of course AMD will tell you that it's early days and in the future with driver updates and with newer games optimized more for RDNA3's architecture it will get better...)
In any case, even if 6nm Navi33 can hit the same 3GHz clocks as the 5nm models and the reference model has, for example, a 2.85GHz boost, it likely won't be more than 1.5X vs the 6600XT in FHD, so it won't be able to match 6900XT FHD performance, and in QHD the RX 6800XT will be much faster (in 4K even the RX 6800 will be faster too).
RX 6800 is $479 and the 6800XT $535 right now on Newegg, and both are 16GB cards. I would advise anyone looking for ≤$499 cards to buy during the Black Friday/Cyber Monday offers; it will likely be a wash or better vs the Q1 2023 releases (full Navi32 at $649, for example) in terms of 4K raster performance/$ (RDNA3 upcoming models' SRPs vs RDNA2 Black Friday deals).
Maybe you underestimate the gap between the 4090 and 4080. It is close to 40%. Nothing can make the 4080 get close to even the 7900XT. This time nVidia went all-out to keep the crown with their halo GPU, and the rest will get annihilated in performance, power draw and value for money alike (saved only in RT, and only in the games that fully utilise it). My 5c.
#129
optichippo
djuice: What we can extrapolate is that the 7900XTX would have around 50-60% more performance than the 6900XT in pure rasterization, so without RT and all that FSR/DLSS bullshit. So look at some benchmarks of the 6900XT and take a guess. While its RT performance might not be up to par with the 4090, since it's about a generation behind in comparison, it's still up to 50% greater than before, which probably puts it in the ballpark of the 3090/3090 Ti's RT performance, which is still far below that of the 4090.
Then take the price into account: it would probably be far superior to the 4080 in most games barring RT performance, and it's also cheaper, $999 vs $1199, so it's definitely the better choice IMO. The 7900XTX might not be targeting the 4090, especially at its price point, but the 4080 instead.
This right here is what I've suspected and been telling friends and coworkers. I believe the 7900XTX is going to be more on par with the 4080. Now if it does happen to come somewhat closer to the 4090, then Nvidia is going to be in a bunch of trouble due to pricing and DisplayPort 2.1. I still have zero clue as to why Nvidia skipped out on including 2.1, which in itself is a selling point.
Zach_01: BTW the 8K reference is not really 8K (7680x4320 = 33M pixels) but a widescreen 8K (7680x2160 = 16.5M pixels)
Yes, you are correct! 8K is not 4K x2, that is not how it works, and I see how someone can be tricked into thinking this, which is what AMD is clearly doing here. 8K is 4K x4.
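To spell out the pixel arithmetic behind that correction:

\[
\begin{aligned}
\text{4K UHD:}\quad & 3840 \times 2160 = 8{,}294{,}400 \text{ px} \\
\text{8K UHD:}\quad & 7680 \times 4320 = 33{,}177{,}600 \text{ px} = 4 \times \text{4K} \\
\text{AMD's ``8K'' (32:9):}\quad & 7680 \times 2160 = 16{,}588{,}800 \text{ px} = 2 \times \text{4K}
\end{aligned}
\]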
#130
ModEl4
HD64G: Maybe you underestimate the gap between the 4090 and 4080. It is close to 40%. Nothing can make the 4080 get close to even the 7900XT. This time nVidia went all-out to keep the crown with their halo GPU, and the rest will get annihilated in performance, power draw and value for money alike (saved only in RT, and only in the games that fully utilise it). My 5c.
The theoretical difference (I don't mean the FP32 diff: 512/336) between the 4090 and full AD103 is close to +40%, I agree; let's say +39% as an example.
My assumption is that on the 5800X TPU testbed with the current game selection, the RTX 4090 realizes around -10% from its potential.
For example:
RTX 4090 theoretical 4K 139%
RTX 4090 realized 4K 125% (-10%)
Full AD103 with the quoted clocks 100%
I may be wrong; we will see what actual performance the current RTX 4080 config (304 TMUs/TCs vs 336, and -4% clocks vs my proposed AD103 specs) achieves in relation to the RTX 4090.
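Chaining ModEl4's own numbers makes the reasoning explicit: a +39% theoretical gap, derated by the assumed -10% realization, leaves

\[
1.39 \times 0.90 \approx 1.25,
\]

i.e. the realized RTX 4090 sits about 25% above a hypothetical full AD103 at 100%, matching the 139%/125%/100% figures above.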
#131
ARF
Gica: I'm waiting to see the AMD flagship that sells for $1000 and offers the performance of the 4090. It would be a pleasant surprise, but it's not Lisa Su's style.
I don't know why so many people do not believe that AMD can undercut quite substantially on pricing. After all, the chiplets were made exactly to cut costs.
AMD has a ~300 sq. mm die vs NVIDIA's 2x larger die. Of course AMD's product costs around 60-70% of NVIDIA's.
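The economics are easy to sketch with the standard dies-per-wafer approximation and a Poisson yield model. Everything below (wafer price, defect density, die areas) is an illustrative assumption rather than a known figure, but it shows why a ~300 mm² die can be much more than 2x cheaper than a ~600 mm² one:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> float:
    """Gross dies per wafer, with the usual correction for edge loss."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2) / die_area_mm2 - (math.pi * wafer_diameter_mm) / math.sqrt(2 * die_area_mm2)

def yield_rate(die_area_mm2: float, defects_per_cm2: float = 0.1) -> float:
    """Poisson yield model: fraction of dies with zero defects."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100)

wafer_cost = 15000.0  # assumed 5 nm wafer price (USD), purely illustrative
for name, area in [("~300 mm^2 chiplet GCD", 300.0), ("~600 mm^2 monolithic", 600.0)]:
    good_dies = dies_per_wafer(area) * yield_rate(area)
    print(f"{name}: ~{good_dies:.0f} good dies/wafer, ~${wafer_cost / good_dies:.0f} per die")
```

With these made-up inputs the big die comes out at roughly 3x the cost per good die, and that's before counting that the 7900 XTX's memory-cache dies sit on cheaper 6 nm silicon.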
#132
thegnome
Impressive if it's really like that, given the 95W lower TDP and the much, much lower price. If this continues down the stack, Lovelace will be the biggest joke Nvidia has made in a long while...
#133
ratirt
I really can't say the prices are great for AMD, since the 7900XT is going to be $899. That is a lot. NV's pricing is just a kick in the teeth for consumers, and that is not the end of the story, since I suppose there will be a 4090 Ti. Anyway, I will wait for the reviews. RT performance is still neither a deal breaker nor a winning deal; we are getting there, but we are not there yet.
I only hope the AMD GPUs are as fast as advertised.
thegnome: Impressive if it's really like that, given the 95W lower TDP and the much, much lower price. If this continues down the stack, Lovelace will be the biggest joke Nvidia has made in a long while...
Some reviewers claim it is already a big joke and a cash grab.
#134
robert3892
AMD has already stated that the 7900 XTX is a video card to challenge the RTX 4080 and not the RTX 4090.
#135
Calenhad
ModEl4: If RX 7900XTX is -10% from RTX 4090, all Nvidia has to do is upgrade RTX 4080 to the full die (336 TMUs/TCs from 304), upgrade clocks from 2505MHz to 2610MHz (the RTX 3080 12GB clocks) and TDP to 350W, and be just 10% slower in classic raster but faster in raytracing vs the RX 7900XTX; that will probably be enough based on Nvidia's brand awareness.
Sooo, basically all they have to do is make a new GPU. Would this be a 4080 Super or Ti? Because I promise you that unless the 4080 launch is next summer, they are already manufactured.
optichippo: Yes, you are correct! 8K is not 4K x2, that is not how it works, and I see how someone can be tricked into thinking this, which is what AMD is clearly doing here. 8K is 4K x4.
It is either 8K half-height or 4K ultrawide. Pick your poison! :P
#136
robert3892
NVIDIA does have the capability to make an RTX 4080 Super or RTX 4080 Ti.
#137
HD64G
robert3892: AMD has already stated that the 7900 XTX is a video card to challenge the RTX 4080 and not the RTX 4090.
They said the same for the 6900XT vs the 3080, but it ended up matching the 3090 @1440P & 4K.
#138
medi01
ModEl4: The theoretical difference (I don't mean the FP32 diff: 512/336) between the 4090 and full AD103 is close to +40%
No.
The 4080 being 60% of the 4090 means that if the 4080 is 100%, the 4090 is 166%.
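The flip is just reciprocal arithmetic: a card at 60% of another means

\[
\frac{1}{0.60} \approx 1.67,
\]

so a 40% deficit viewed from above becomes a roughly 66-67% lead viewed from below.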
#140
AnotherReader
ARF: nvidia will have serious problems with the so-called RTX 4080. It's not worth even $700, they want >$1200 :kookoo:
More NVIDIA GeForce RTX 4080 Custom Models Listed Online, Preliminary Prices Close To RTX 4090 (wccftech.com)
NVIDIA GeForce RTX 4080 3DMark TimeSpy scores have been leaked as well - VideoCardz.com
NVIDIA GeForce RTX 4080 Graphics Card Geekbench 5 Benchmark Leaks Out, Up To 15% Faster Than RTX 3090 Ti (wccftech.com)
These aren't relevant benchmarks from a gamer's perspective; they only show compute performance. However, they still show the massive gap between the 4080 (really a 4070) and the 4090.
#141
Nopa
Zach_01: Exactly, and that (7950XTX..?) will probably arrive around the time of the 4090 Ti.
And yes, the RT performance of the 7900XTX is known (by AMD's claims) to be ~1.8x over the 6950XT, which places it around the 3090/Ti.
It's just math: +50% RT per CU and +20% more CUs:
1.5 x 1.2 = 1.8x
7950/7970 XTX >|~|< 4090 Ti > 4090 ~/a little > 7900 XTX > 4080 16GB >|~|< 7900 XT > "Unlaunched" 4080 12GB.

It will be very interesting to see how it all plays out once they have all been released.
#142
Vayra86
Sound_Card: I beg to differ, don't fall for the memes.
However, it's objective to say that Nvidia's drivers are at their worst on Linux.
It shows you've been retired for a while now, because this is absolute nonsense.
efikkan: That is blatantly false.
Nvidia's Linux drivers have been rock solid for over a decade, even more solid than their Windows drivers, and have consistently offered the highest level of API compliance.
What you are reciting is typical forum nonsense coming from people who don't use AMD's "open source" Linux drivers to any real extent, fueled by ideology because people think one is completely free and open and the other is proprietary and evil, when the reality is both are partially open. The truth is the "open" Mesa/Gallium drivers are bloated, heavily abstracted drivers, full of workarounds, and a complete mess.
... and this is the truth.
HD64G: Maybe you underestimate the gap between the 4090 and 4080. It is close to 40%. Nothing can make the 4080 get close to even the 7900XT. This time nVidia went all-out to keep the crown with their halo GPU, and the rest will get annihilated in performance, power draw and value for money alike (saved only in RT, and only in the games that fully utilise it). My 5c.
That is exactly why it appears they cancelled that 4080. Initially I thought they had to reposition because of their OWN marketing (after all, how vague is such a large gap between two 4080 cards, with a different VRAM capacity to boot; these just aren't two similar cards in any way), but with the 7900XTX performance estimates out the door, we can easily defend the idea that the 4080 12G turd was pulled back in because it would have meant AMD had a much better story at the high end all the way through. After all, if they drop the number to an x70, AMD's 'faster' cards are no longer all competing with (and in many cases performing above the level of) 4080s. It's a better marketing reality.

It is also highly likely the 4080 16G will get repositioned - in MSRP. $200 or even $300 extra just for somewhat better RT perf is steep. Too steep - and that's giving Nvidia the benefit of the doubt that the 4080 won't get eclipsed by AMD's 7900XT (yes, XT). I honestly think the 4080 is going to be only situationally equal, and overall lower in raster perf, and even the 7900XT will be highly competitive, seeing the tiny gap between the XTX and it.
#143
shovenose
HD64G: Since you started the "nobody" talk, let me deliver my suggestion: Nobody should pay so much for a gaming device, only for professional reasons. And 1440P is a great res for everyone. 4K will not become mainstream even in 10 years, since most people (>80%) aren't, and will not be, willing to spend so much for the monitor and GPU combo needed.
Well, I for one find my new 4K monitor fantastic compared to 1080P. That said, I wonder how much of a difference it would have been vs 1440P. I just don't think your statement is valid; just because you can't justify spending so much on gaming doesn't mean nobody can. I'm not made of money, I drive a 13-year-old Ford Escape and live in a rented room, but I am saving up for a powerful GPU to match my monitor. Why? Because I love when games are pretty. Especially MSFS2020, which will be amazing once I get a card that can run 4K smoothly.
#144
Vayra86
shovenose: Well, I for one find my new 4K monitor fantastic compared to 1080P. That said, I wonder how much of a difference it would have been vs 1440P. I just don't think your statement is valid; just because you can't justify spending so much on gaming doesn't mean nobody can. I'm not made of money, I drive a 13-year-old Ford Escape and live in a rented room, but I am saving up for a powerful GPU to match my monitor. Why? Because I love when games are pretty. Especially MSFS2020, which will be amazing once I get a card that can run 4K smoothly.
It's what you settle for in the end; we all make our choices. But there are also just laws of physics and ergonomics: 4K is not required in any way to get high graphical fidelity. I run a 3440x1440 (close enough, right...;)) panel, and it's really the maximum height that's comfortable to view; the width is already 'a thing' and only a curve makes it a good fit - 4K has 400 extra pixels in width and 720 in height.

4K struggles with efficiency because you're basically wasting performance on pixels you'll never notice at the supposed ideal view distance. You'll make a choice between a perf sacrifice for no uptick in graphical fidelity, versus sitting closer to see it all and killing your neck/back. At longer view distances, you can make do with a lower res for the exact same experience. Another issue is scaling: 4K requires it, or small text is just unreadable.

It's a thing to consider ;) Not much more; the fact remains 4K is becoming more mainstream, so there's simply more on offer, specifically also OLED. But the above is where the statement '1440p is enough' really comes from. It's a sweet spot, especially for a regular desktop setting. Couch gaming follows a different ruleset, really. But do consider the advantages too: I can still play comfortably at 3440x1440 on a GTX 1080... (!) 4K is going to absolutely murder this card though. Jumping on 4K means tying yourself to a higher expense to stay current on GPU, or sacrificing more FPS for the wanted IQ.
ratirt: Some reviewers claim it is already a big joke and a cash grab.
Nvidia has every opportunity to tweak the lineup, and the better half of it isn't even out... They always run the risk of misfires because they release first.
#145
ARF
Vayra86: 4K is becoming more mainstream
Only the PC environment is lagging behind reality; 4K 100% dominates the markets in which it is allowed to develop.
Also, AMD already markets an "8K" (i.e. 4K ultrawide) experience with the Radeon RX 7900 series GFX.

[Attached: a local store's display offers, with the quantity of each in brackets.]
#146
ModEl4
Calenhad: Sooo, basically all they have to do is make a new GPU. Would this be a 4080 Super or Ti? Because I promise you that unless the 4080 launch is next summer, they are already manufactured.
All GPCs are active in the RTX 4080; they just disabled some SMs. All they have to do is re-enable them on the AD103 dies that can be fully utilized, and the rest can be used in future cut-down AD103-based products (and also increase the clocks for the full AD103 parts).
And anyway my point wasn't what Nvidia will do but what it could achieve based on AD103 potential...
medi01: No.
The 4080 being 60% of the 4090 means that if the 4080 is 100%, the 4090 is 166%.
According to the leak, even an OC'd cut-down RTX 4080 (304 TCs enabled vs the 336 TCs of my higher-clocked full AD103 config...) appears to be only 20% slower than the RTX 4090 in the 3DMark Time Spy Performance preset, and 27% slower in the Extreme 4K preset...
You do your math, I will do mine!
For example, the theoretical shading-performance delta alone is useless for extracting the performance difference between two models; it's much more complex than that...
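For what it's worth, on ModEl4's own quoted specs the theoretical headroom of a full AD103 over the announced 4080 works out to

\[
\frac{336}{304} \times \frac{2610}{2505} \approx 1.105 \times 1.042 \approx 1.15,
\]

i.e. roughly 15% more raw throughput from the extra units and the clock bump, before any of the scaling losses both posters are arguing about.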

#148
Vayra86
ARF: Only the PC environment is lagging behind reality; 4K 100% dominates the markets in which it is allowed to develop.
Also, AMD already markets an "8K" (i.e. 4K ultrawide) experience with the Radeon RX 7900 series GFX.
[Attached: a local store's display offers, with the quantity of each in brackets.]

ARF: Ok.
NVIDIA GeForce RTX 4080 16 GB Specs | TechPowerUp GPU Database
Context, man, you might need to look that word up.

These posts make no sense whatsoever. The 4080 obviously isn't in the correct place in that chart, and 'local store offers' say just about jack shit about where 4K is for gaming. It's marketing; you can find IoT devices like a fridge with '4K support'.

Resolution was, is, and will always be highly variable, especially now with FSR/DLSS. There is also a resolution for every use case; it's not true that the only way is up. Enough is enough.
#149
ratirt
Vayra86: Nvidia has every opportunity to tweak the lineup, and the better half of it isn't even out... They always run the risk of misfires because they release first.
I don't think that was a misfire, since that would suggest something unpredictable they were trying to tackle. Nvidia's actions were intentional, both the pricing and releasing the 4090 as the first card, and it's obvious why. From what they have said so far about the pricing, all the graphics cards are a joke. Then there's removing the 4080 12GB from the release, because that was literally a flying circus.
#150
Sound_Card
Vayra86: It shows you've been retired for a while now, because this is absolute nonsense.
Even if what I say is untrue (which I kinda doubt, considering I follow the Linux community), doesn't it suck to have stigmas?