Thursday, November 3rd 2022

AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090

AMD on Thursday announced the Radeon RX 7900 XTX and RX 7900 XT RDNA3 graphics cards. With these, the company claims to have repeated its feat of a 50+ percent performance/Watt gain over the previous generation, which propelled the RX 6000-series to competitiveness with NVIDIA's fastest RTX 30-series SKUs. AMD's performance claims for the Radeon RX 7900 XTX put the card anywhere between 50% and 70% faster than the company's current flagship, the RX 6950 XT, when tested at 4K UHD resolution. Digging through these claims and piecing together relevant information from the endnotes, HXL was able to draw an extrapolated performance comparison between the RX 7900 XTX, the real-world tested RTX 4090, and the previous-generation flagships RTX 3090 Ti and RX 6950 XT.

The graphs put the Radeon RX 7900 XTX menacingly close to the GeForce RTX 4090. In Watch_Dogs Legion, the RTX 4090 is 6.4% faster than the RX 7900 XTX. Cyberpunk 2077 and Metro Exodus see the two cards evenly matched, with a delta under 1%. The RTX 4090 is 4.4% faster in Call of Duty: Modern Warfare II (2022). Accounting for the pinch of salt usually associated with launch-date first-party performance claims, the RX 7900 XTX would end up within 5-10% of the RTX 4090, but pricing changes everything. The RTX 4090 is a $1,599 (MSRP) card, whereas the RX 7900 XTX is $999. Assuming the upcoming RTX 4080 (16 GB) is around 10% slower than the RTX 4090, the main clash this generation will be between the RTX 4080 and the RX 7900 XTX. Even here, AMD gets ahead on pricing, as the RTX 4080 was announced with an MSRP of $1,199 (about 20% pricier than the RX 7900 XTX). With the FSR 3.0 Fluid Motion Frames announcement, AMD also blunted NVIDIA's DLSS 3 Frame Generation performance advantage.
Source: harukaze5719 (Twitter)
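To put the claims and prices above into one picture, here is a minimal Python sketch; the per-game deltas are AMD's extrapolated first-party figures as quoted in this article, and the "under 1%" cases are treated as a flat 1% upper bound, so this is illustrative rather than measured.

```python
# Rough value comparison built only from the figures quoted in the article above.
# These are first-party, extrapolated claims, not independent benchmark results.
msrp = {"RX 7900 XTX": 999, "RTX 4090": 1599, "RTX 4080 16 GB": 1199}

# RTX 4090 lead over the RX 7900 XTX in the extrapolated 4K charts, in percent.
lead_4090 = {
    "Watch_Dogs Legion": 6.4,
    "Cyberpunk 2077": 1.0,            # "delta under 1%" -> treated as 1% upper bound
    "Metro Exodus": 1.0,              # "delta under 1%" -> treated as 1% upper bound
    "CoD: Modern Warfare II (2022)": 4.4,
}

avg_lead = sum(lead_4090.values()) / len(lead_4090)
print(f"Average RTX 4090 lead across the four titles: {avg_lead:.1f}%")

# Normalize raster performance to the RX 7900 XTX and compare performance per dollar.
relative_perf = {"RX 7900 XTX": 1.0, "RTX 4090": 1.0 + avg_lead / 100}
for card, perf in relative_perf.items():
    print(f"{card}: {perf / msrp[card] * 1000:.2f} relative performance per $1,000")
```

Even if the real-world gap lands at the 5-10% this article allows for after discounting first-party claims, the $600 MSRP difference dominates the performance-per-dollar math.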

164 Comments on AMD Radeon RX 7900 XTX Performance Claims Extrapolated, Performs Within Striking Distance of RTX 4090

#51
mama
The 8K stuff was just dumb. But if the card gets rasterization near the 4090, then for that price it is compelling. Raytracing looks like a bust on AMD at the moment, if that's important to you.
Posted on Reply
#52
tabascosauz
CallandorWoTfair enough. if i had a smaller atx case i would be worried though. if you have giant case like mine, not so worried.
I push 300 W at stock, with flow-through over the backplate as well and no front intakes. It was never an issue at 19 L. The concern is still more for RAM OC than anything, and so far I haven't seen any DDR5 that is temperature-sensitive to the level of 8 Gb B-die.

I like the 7900XT reference design. The I/O plate looks like it has a good number of attachment points to a rigid-looking cooler, unlike most TUF cards (saggy garbage). Preventing sag starts at the I/O plate.

And the USB-C port is a nice bonus, both as a data port and for KVM on Gigabyte monitors. To this day I still don't understand why Nvidia got rid of it. So what if there weren't many VR users? I'd much rather have one than an extra DP or HDMI.
Posted on Reply
#53
TheLostSwede
News Editor
kongaThe Metro Exodus "4K High" numbers here are highly suspect. In TPU's own review, they found the 4090 to be 2.5x the 6950 XT. So why is the performance so low here?
He based it on numbers from Igor's Lab.
CallandorWoToddly enough my new NZXT monitor actually does support type c. might give that a go, be weird having my monitor and gpu hooked up type c to type c.

does anyone know if freesync is still supported over type c connection?
It's not USB-C though, it's DP over a USB-C cable (DP Alt Mode), so everything your card and display can do over DP, they can do over USB-C.
So yes, Freesync/G-Sync works over USB-C cables.
Posted on Reply
#54
HTC
Right now, nVidia has A TON of 3000 series inventory to get rid of due to the crypto crash, which is why they priced the 4090 as they did. Don't kid yourselves: nVidia's plan was to price the upcoming 4090 Ti AT LEAST in the $2,000 range.

By pricing the 7900 cards aggressively like this, they make it A LOT MORE difficult for nVidia to get rid of those cards, which is why stuff like this is already happening.

AMD is in a position to make it VERY HARD for nVidia to get rid of that inventory: if they make their 6000 series cards EVEN CHEAPER than they are now, and NOT BY A LITTLE, they MAY be able to clear out THEIR OWN inventory of 6000 series cards, which they too have (though nowhere near the extent of nVidia's).

Should AMD opt to go this route, it won't be cheap for them and would MOST CERTAINLY greatly reduce their profits, BUT it would also substantially increase their market share in the process WHILE forcing nVidia to resort to SERIOUS price cuts themselves in order to get rid of their own current inventories. This would cost nVidia A LOT MORE than it would AMD (my guess would be around 10 times more).

An approach like this would be MUCH MORE EFFECTIVE if the 7900 series were to match (or beat) the 4090 at least in raster, but that appears not to be the case and, as expected, they still trail in RT.

Speaking of RT: does it seem like they closed the gap a bit, or do you guys think it has increased further?
Posted on Reply
#55
Chomiq
HTCSpeaking of RT: does it seem like they closed the gap a bit, or do you guys think it has increased further?
RT still comes with a massive performance cost. If I had to pick between >100 fps without RT at 1440p or <80 fps with RT gimmicks, I'd pick >100 fps any day.
Posted on Reply
#56
vmarv
Outback BronzeHopefully needs a lot less power to get those numbers too.
Who bought a launch price 4090?
First of all, a lot of people who play simulations in VR and have already spent 4-5 thousand bucks on the hardware and software needed for their hobby.
If you play rFactor 2 in VR or with a triple screen and money isn't a problem, why not? You can always sell your old 3090 and recover some money.
Posted on Reply
#57
wolf
Better Than Native
MarsM4NIf Moore's Law Is Dead's
Lost me and probably the majority of the TPU user base right there.
HTCSpeaking of RT: does it seem like they closed the gap a bit, or do you guys think it has increased further?
From what I gather right now, it looks like the gap is about the same, perhaps a little wider, doesn't seem narrower.

It should still perform pretty acceptably for a lot of people on these cards, but I see the trend of "if RT matters to you, buy Nvidia" continuing.

Seems like a pretty fun split of iT's a GiMmIcK vs. why would I spend $900+ on a GPU that sucks at it?

Given Ampere already did RT acceptably to many people, myself included, RDNA3 should allow a lot more AMD users to try it out without utterly crumbling FPS.
Posted on Reply
#58
TheinsanegamerN
wolfI don't see them doing any better than DLSS 3 imo; given it needs Reflex to be on too to roughly match the native performance latency, I fear it'll be objectively worse, as FSR currently is overall.

Also I'm curious, have you personally played on a 4090 with DLSS 3 to actually feel/see it, or going off review/analysis numbers?
Does one need to jump into a volcano to verify if it is indeed hot?
Posted on Reply
#59
phill
I'm so looking forward to the reviews... If it's in striking distance and it's near 50% of the price... why would anyone consider a 4090 with the problems they "apparently" have with the power plugs and such?

I think it's amazing how AMD have bounced back with their GPU performance. Of course, I'll wait to see TPU's and W1zzard's review of it...
Posted on Reply
#60
MarsM4N
wolfLost me and proably the majority of the TPU user base right there.
And the award for objective criticism goes to, ... :respect: Just by looking at Nvidia's actions, I think he's pretty much spot on. No?

Nvidia is sitting on stockpiles of 30xx cards; that's why they are dropping 40xx cards in homeopathic doses, pricing them like gold, labeling 4070's as 4080's, and so on. So it would totally fit the picture that they also create an artificial shortage of 4080 supplies to keep 30xx prices up (paper launch). Just common sense, it's a pattern. What's there to disagree with?
Posted on Reply
#61
Bomby569
Way too vague a presentation. With Ryzen, Lisa and her team clearly mentioned the competition in the presentation and compared against it, with slides objectively comparing to the i9-12900K. It's weird there was no mention of the competition this time, no comparisons with Nvidia. I wonder why? I'm afraid of those very vague 1.5x and 1.7x figures against only their own last-gen cards.

Let's wait and see.
Posted on Reply
#62
medi01
tracker1If the RTX performance exceeds 3080…
Per the shared figures, it beats the 3090 Ti in a number of games.
Bomby569no comparisons with nvidia. I wonder why?
What has been shown hints at the 7900 XTX being about 15% slower than the 4090 in raster.

How can you spin that in a presentation?

Even though it's an amazing product, if what was shown is true, there is still that built-in "meh, but it's not as halo as that €2300+ halo from that other manufacturer".
Posted on Reply
#63
Bomby569
medi01What has been shown hints at the 7900 XTX being about 15% slower than the 4090 in raster.

How can you spin that in a presentation?

Even though it's an amazing product, if what was shown is true, there is still that built-in "meh, but it's not as halo as that €2300+ halo from that other manufacturer".
You lost me; I just said they didn't directly compare to Nvidia like they did with the 12900K, with bar charts.
Posted on Reply
#65
HTC
wolfFrom what I gather right now, it looks like the gap is about the same, perhaps a little wider, doesn't seem narrower.
I was referring to percentages, but I guess we'll find out when reviews go live.
wolfIt should still perform pretty acceptably for a lot of people on these cards, but I see the trend of "if RT matters to you, buy Nvidia" continuing.
Considering I'm still using an RX 480 AND use Linux, it's safe to say it doesn't matter to me personally ...
Posted on Reply
#66
zo0lykas
gasolinaTPU test ......
Look at the system specs.
Posted on Reply
#67
Godrilla
For me, the 8K Samsung Neo G9 ultrawide with DP 2.1 was the most impressive thing from the whole show.

www.techpowerup.com/img/PXyv3fznZUC75Ucj.jpg
and the list of vendors coming with DP 2.1 monitors in 2023
www.techpowerup.com/img/bKcGqqrE17E9vQNt.jpg
I guess time will tell if gamers want crazy rasterization with decent RT performance (7000 series), good RT performance with less rasterization performance (4080 16 GB), or both with the best RT performance (4090). Unfortunately, with the performance crown I don't see the 4090 moving down in price, but the 4080 at $1,200 might drop when Ampere supplies dry up.
I was lucky enough to sell my 3090 at $800 last week, before it fell even more in value.
Still a shame that the 4000 series lacks DP 2.1, although HDMI 2.1 is still plenty good on my 48-inch CX OLED for now.

Question for the people complaining about the pricing: are you going to be buying a DP 2.1 monitor for $2k?
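A rough back-of-the-envelope sketch of why DP 2.1 matters for a panel like this: the 120 Hz refresh and 10-bit RGB below are assumptions for illustration (not the monitor's actual spec), and the estimate ignores blanking intervals, encoding overhead, and DSC, so treat it as an uncompressed ballpark only.

```python
# Uncompressed bandwidth ballpark for a 7680x2160 ultrawide panel.
# Assumed: 120 Hz refresh and 10-bit RGB (30 bits per pixel); blanking and DSC ignored.
h, v, refresh_hz, bits_per_pixel = 7680, 2160, 120, 30

raw_gbps = h * v * refresh_hz * bits_per_pixel / 1e9
print(f"Uncompressed video stream: ~{raw_gbps:.0f} Gbit/s")

# Nominal maximum link rates, before encoding overhead.
links = {"DP 1.4a (HBR3)": 32.4, "HDMI 2.1 (FRL)": 48.0, "DP 2.1 (UHBR20)": 80.0}
for name, gbps in links.items():
    verdict = "fits uncompressed" if gbps >= raw_gbps else "needs DSC or lower settings"
    print(f"{name}: {gbps} Gbit/s -> {verdict}")
```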
Posted on Reply
#68
HD64G
The RTX 4090 and RX 7900 XTX will both be bottlenecked by CPU limits and will thus have the same FPS in many games. So, on average they will have a performance difference of <10% even at 4K, and it might be the other way around at 1080p and 1440p, since nVidia's software scheduler has a much bigger overhead than AMD's hardware one. So, for 60% of the money one will get >90% of the raster performance, 20-25% less power draw, and probably a highly OC-able GPU with the latest DP protocol for ultra-high-Hz and -res monitors. Also, the RTX 4080 is DOA at its $1,200, even if the price drops to $800, since it will be clearly slower (at all resolutions) than the $900 RX 7900 XT in everything apart from some RT-heavy games.
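The CPU-limit argument above can be expressed with a deliberately simplistic model: the delivered frame rate is roughly the minimum of the CPU-limited and GPU-limited frame rates. The numbers below are invented purely for illustration, not benchmarks of either card.

```python
# Simplistic bottleneck model: delivered fps ~= min(CPU-limited fps, GPU-limited fps).
# All figures are invented for illustration only.
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu_limit = 160.0  # hypothetical CPU-bound ceiling in a given game/engine

for card, gpu_fps in {"Faster GPU": 210.0, "Slower GPU": 185.0}.items():
    print(f"{card}: GPU-limited {gpu_fps:.0f} fps -> delivered {delivered_fps(cpu_limit, gpu_fps):.0f} fps")

# Both deliver ~160 fps: once the CPU is the limit, the raw GPU gap stops showing up.
```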

To sum it up, this market situation reminds me of the battle between the GTX 480 and the HD 5870. nVidia made a huge die to keep the performance crown but lost on all other metrics, and also in market share and financially, since those dies cost much more to make.
Posted on Reply
#69
ToTTenTranz
GicaI'm waiting for the tests. I don't see how you can compare now.
If AMD announced $999 for the flagship, it surely has some weaknesses (RT, productivity, who knows). As for the processors, while they were on top with the 5000 series, they raised the prices sky high.
It could just be that AMD is simply pricing their hardware for a market heading into a recession, where demand for high-end graphics cards is at an all-time low because people are going out more and playing fewer videogames at home, and the second-hand market is still getting flooded with sub-$1,000 3090s and lower GPUs that deliver a lot more performance than any game really demands, even at 4K.

When Nvidia announced their RTX 40 pricing, many people accused them of living in an echo chamber. It's not really fair to doubt AMD's new hardware solely because they didn't price it in an echo chamber.
Posted on Reply
#70
wolf
Better Than Native
MarsM4NAnd the award for objective criticism goes to
Maybe some things he says are right, but all I hear when he talks is nails on a chalkboard. Glad you enjoy it :peace:
Posted on Reply
#71
AnotherReader
Bomby569You lost me; I just said they didn't directly compare to Nvidia like they did with the 12900K, with bar charts.
It's not difficult to understand. If they had the lower performing 4080 16 GB to compare with, they would have done so, but the 4090 is too fast.
Posted on Reply
#72
pavle
Gah, to have the reviews already; can't say either way. Should be enough to be very near the fake_frames_wonder(TM) (=4090) though.
Posted on Reply
#73
nguyen
phillI'm so looking forward to the reviews... If it's in striking distance and it's near 50% of the price... why would anyone consider a 4090 with the problems they "apparently" have with the power plugs and such?

I think it's amazing how AMD have bounced back with their GPU performance. Of course, I'll wait to see TPU's and W1zzard's review of it...
I think people are going out and buying the 4090 after watching the slides; the 4090 will remain the gaming king for the foreseeable future. Like 3090 vs 3080, there were people who never considered the 3080 10GB despite it being only 15% slower and costing less than half as much as the 3090 :)
Posted on Reply
#74
Soul_
Bomby569don't you mean widescreen 4k?
When the term 'k' was introduced, the industry moved away from 'p'. The difference is that 'k' is the count of columns, whereas 'p' is the count of rows. So 7680x2160 is an 8k ultrawide. This is similar to the 5k ultrawides from Samsung today, which are still 1440p.
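A minimal sketch of that naming convention, assuming the usual marketing rounding of the horizontal count to the nearest thousand (my approximation, not a formal standard):

```python
# "k" class ~ horizontal pixel count rounded to thousands; "p" class = vertical rows.
def resolution_labels(width: int, height: int) -> tuple[str, str]:
    return f"{round(width / 1000)}k", f"{height}p"

for w, h in [(3840, 2160), (7680, 2160), (5120, 1440)]:
    k, p = resolution_labels(w, h)
    print(f"{w}x{h}: ~{k} wide, {p} tall")

# 7680x2160 comes out as ~8k wide but still 2160p tall, i.e. an "8k ultrawide",
# just as 5120x1440 is a "5k ultrawide" that is still 1440p.
```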
Posted on Reply
#75
skates
Icon CharlieInteresting.... The problem I have is that I heard a great deal of market speak that made me want to puke.

If they deliver what they have stated at this price, then these will give Ngreedia a headache, as why should a normal gamer buy a $1,600 video card when you can get one that might be 10%+ lower in performance for 600+ DOLLARS LESS.

Cautiously Optimistic.
My local Microcenter finally has a 4090 in stock. They've been out of stock since minutes after launch. It's a Zotac and they want $1,699 for it. So the AMD would be $700 less. That $700 in savings could be set aside to help pay for a new monitor that can run DP 2.1. A Zotac, for $1,699...
Outback BronzeHopefully needs a lot less power to get those numbers too.

Who bought a launch price 4090?
A bunch of Discord bros in the 4090 channel who buy multiple cards so they can thrash them on Port Royal and post screens for the meme reward points. That's who. And one reason why I'm staying away from the 4090 is that those cards will just be returned when they get hold of an FE model.
HTCI was referring to percentages, but I guess we'll find out when reviews go live.


Considering I'm still using an RX 480 AND use Linux, it's safe to say it doesn't matter to me personally ...
Same here, I don't care about RT at all. I'd rather invest in a monitor that can support DP 2.1, and the card too...
Posted on Reply