
AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

These cards aren't for everyone. They're for high-end 4K high-refresh or 1440p extreme-high-refresh users. If you don't care about playing AAA titles maxed out at either of those settings, then the budget cards are for you.
The problem is that the budget cards will cost $500-600 too.
 
I am interested in the review from here. Honestly considering the RX 7900 XT this round!
 
Let me take a guess here, the 7900 XTX will be:

5-10% faster than the 4080 in rasterization
25% slower than the 4080 in ray tracing

$1000 seems like a fair price
 
Let me take a guess here, the 7900 XTX will be:

5-10% faster than the 4080 in rasterization
25% slower than the 4080 in ray tracing

$1000 seems like a fair price

Nobody can help you; you can believe whatever you want. AMD showed performance at least 50 percent faster than the 6950 XT today. That is not 10 percent faster than the 4080, the card that is less than 60 percent of the 4090; it could be 30 percent faster than it.
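If anyone wants to sanity-check that reasoning, here's a rough back-of-the-envelope sketch. Only the +50%/+70% uplift comes from AMD's slides; the 4090 and 4080 positions below are assumptions in the ballpark of typical 4K raster reviews, not measured numbers:

```python
# Back-of-the-envelope 4K raster comparison. Baseline figures are
# assumptions; only the +50%/+70% uplift comes from AMD's own slides.
rx_6950_xt = 100                     # index the 6950 XT at 100
rtx_4090   = rx_6950_xt * 1.75       # assume the 4090 is ~75% faster at 4K raster
rtx_4080   = rtx_4090 * 0.75         # assume the 4080 lands ~25% below the 4090

for label, uplift in [("low", 1.50), ("high", 1.70)]:
    xtx = rx_6950_xt * uplift        # AMD's claimed +50% to +70% over the 6950 XT
    print(f"7900 XTX ({label} claim): "
          f"{(xtx / rtx_4080 - 1) * 100:+.0f}% vs 4080, "
          f"{(xtx / rtx_4090 - 1) * 100:+.0f}% vs 4090")
# low claim : roughly +14% vs 4080, -14% vs 4090
# high claim: roughly +30% vs 4080,  -3% vs 4090
```

Under those assumed baselines, the top of AMD's claimed range would indeed put the XTX around 30% ahead of a 4080, which is the point being made above.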

There is no Ada Lovelace competitor released yet (i.e. a 4080). AMD can't run benchmarks against a card they don't have, and running them against a 4090 makes no sense, since graphs of the 4090 beating their cards in most games help nobody. The most logical comparison is against their own previous gen.
Besides, these reveals are meant to create FOMO. Their thinking is "how do I make these cards look their best without running into legal issues?", not so much "how do they stack up against all the other options?". Even Nvidia, when they unveiled the 40 series, showed some meaningless graphs where the 4090 was 4x faster, lol.

Exactly, there is no point in showing comparisons against a card that costs 60 percent more. They need to wait for the RTX 4080 to release before they can put out marketing materials with comparisons.

For what, and going off what, though? Every function/feature for the 4090 was available at full launch. AMD came half-cocked like always, e.g. FSR 3.0's ETA is sometime in 2023 (WHAT?!); the AI cores(?) aren't dev/app-controlled like Tensor cores (just GPU-engine controlled, so half AI?); and the nerfed titles in favour of AMD's future promoted games showed hardly anything performance-wise (btw, why no vids, TPU?). Too many unknowns for RDNA3, and that shows insecurity. NVIDIA, and Intel, can smell fear.

AMD didn't show anything solid. Just a PowerPoint show, with tiny gaming vids with no real performance metrics, nothing lengthy anyway.
They showed charts claiming 50-70 percent faster performance. What more do you want? "Didn't show anything solid"? Um, what?

I think the pricing right up against the 4080 16GB will make this another 3080 Ti vs 6900 situation, because at the end of the day NVIDIA comes with the full package of features (insane RT numbers, etc.) and is obviously better in neural networking prowess.

The only thing that irks me about my Strix 4090 (birthday gift from the wife and kids) is the lack of DP 2.1.
The 4080 is the most cut-down '80 card released in the last decade. The 3080 was the least cut-down. I think you'll be shocked at how bad the 4080 really is compared to your 3080-set expectations.
 
320-bit wide memory bus (two of the MCDs are disabled). The two disabled MCDs are not "missing"
@btarunr Is this correct? I've calculated it several times and I only get one disabled MCD...
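For what it's worth, the arithmetic is simple if you assume each Navi 31 MCD carries a single 64-bit memory controller and the full die has six of them (which is how the chiplet layout has been described); a quick sketch:

```python
# How many MCDs a 320-bit bus implies, assuming each Navi 31 MCD
# provides one 64-bit memory controller and the full die has six.
BITS_PER_MCD = 64
TOTAL_MCDS   = 6                                   # 6 * 64 = 384-bit on the 7900 XTX

bus_width_7900_xt = 320
active_mcds   = bus_width_7900_xt // BITS_PER_MCD  # 320 / 64 = 5
disabled_mcds = TOTAL_MCDS - active_mcds           # 6 - 5 = 1

print(active_mcds, disabled_mcds)                  # 5 active, 1 disabled
```

Under those assumptions a 320-bit bus means only one MCD is disabled, not two.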
 
Looks promising!!

But nope, I'm not paying a dime over $800 for my next GPU upgrade. I still consider $500 a lot of money for a high-performance GPU, and already giving in to spending $800 for my next upgrade has left me a little displeased. I seriously thought post-COVID we'd be closer to home by now at $800 for some top-performing ~2022 newer-gen cards... or are we still waiting for a double dose + booster vaccine for the GPU pandemic?

So no 7800** release for the close of 2022? If correct, that's a little disappointing. Or should I keep my hopes up for the end-2022 upgrade plan?
 
Looks promising!!

But nope, I'm not paying a dime over $800 for my next GPU upgrade. I still consider $500 a lot of money for a high-performance GPU, and already giving in to spending $800 for my next upgrade has left me a little displeased. I seriously thought post-COVID we'd be closer to home by now at $800 for some top-performing ~2022 newer-gen cards... or are we still waiting for a double dose + booster vaccine for the GPU pandemic?

So no 7800** release for the close of 2022? If correct, that's a little disappointing. Or should I keep my hopes up for the end-2022 upgrade plan?

Might as well grab a 6800 XT for $550 now if you don't care about RT, upscaling, video encoding, etc...
 
Still a bit pricey, but hey, better than Nvidia's pricing. Looking forward to your reviews.
 
Knew the presentation would be full of sidekicks, lol. :laugh: $999/$899 and 355 W/300 W sounds really good for the top dogs.
Somehow I have a feeling they will shred the RTX 4080 (16GB), especially at that price. Wouldn't be surprised if it gets "unlaunched" too, LMAO.


What's with the three copper coloured "blades" in the heatsink?

In some pictures the three fins are red, in others more copper-ish. Maybe it's a distinguishing feature between the XT and XTX.
But one thing is for sure: the three coloured fins stand for "RDNA3". ;)
 
Following the pricing/performance scheme it looks like 7700 will be a very good, affordable choice for my medium-sized needs. Looking forward to it in a few months.
 
Reviews will be exciting for sure! Can't wait to see how these clock. Not that I'll partake but I like to live vicariously through others hehe.
 
So far, Intel and AMD have put the new DP 2.1 on their GPUs, while NoVideo puts an old DP 1.4 on its newest GPU while charging a premium.
 
Let me take a guess here, the 7900 XTX will be:

5-10% faster than the 4080 in rasterization
25% slower than the 4080 in ray tracing

$1000 seems like a fair price
Let me take a guess here. You've been wearing green for too long, clouding your math skills and vision.
But I take the charts below with a bag of salt too. Still, salt included, let's say they end up 20% below what's shown here. At $999, Nvidia is toast... or molten toast, depending on your power delivery.

So they lose 25% in RT; even including that AND a 10-20% lower overall raster perf, they're massively competitive. It's just a much better deal throughout, and the performance is there regardless. Also, as others mentioned... DP 1.4 on the 'most expensive card' is unforgivable.


The only thing you got right in your post here is making the more sensible comparison with the 4080; but that card is so far below the 4090 that it just doesn't even compete with the 7900 XTX, and barely touches the $899 XT. Nvidia had better keep swinging those Ampere cards, because Ada right now is DOA from top to bottom, in my view. They'll need an aggressive price strategy to keep it afloat, not the current MSRPs.

[Attached charts: projected relative performance]
 
Now let's see the actual performance numbers.
 
Let me take a guess here. You've been wearing green for too long, clouding your math skills and vision.
But I take the charts below with a bag of salt too. Still, salt included, let's say they end up 20% below what's shown here. At $999, Nvidia is toast... or molten toast, depending on your power delivery.

So they lose 25% in RT; even including that AND a 10-20% lower overall raster perf, they're massively competitive. It's just a much better deal throughout, and the performance is there regardless. Also, as others mentioned... DP 1.4 on the 'most expensive card' is unforgivable.


From the TPU review: 4K (3840×2160) ray tracing charts for Resident Evil Village, Metro Exodus and Watch Dogs: Legion.
 
The clues are all there: the transistor count is much lower compared to Nvidia, they're still playing catch-up in technologies like upscaling and ray tracing, and the price says it all.
In a game like Cyberpunk with ray tracing and upscaling enabled, I expect AMD to lose big time; Nvidia is already making a cut-down RTX 4090 to slightly beat the 7900 XTX.
But does it matter? Most people won't buy these absurdly huge, power-hungry cards; what matters is what they do in the low-end and mainstream segments.
 
So, basically, AMD just did a switcheroo with the naming scheme here, in my opinion, huh?

The actual 7900 XT = now the 7900 XTX

The actual 7800 XT = now the 7900 XT

"It would look weird and it would expose us for charging $899 just for our 7800 XT but, hey, let's do an NVIDIA but... a tad less... and our fans will defend us... Yeah!... the Board!" :laugh:

Not at all. The 6900 XT has a 320-bit bus width like the standard 7900 XT. The 7900 XTX has a bigger bus width and A LOT more cores than the previous gen. They could have used a bigger naming number just for this, totally legit.
 
What is the correct number of shader units? 6144, according to the official AMD site, or 12288?
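Both figures describe the same silicon; it depends on whether the dual-issue FP32 ALUs are counted once or twice. A quick sketch of the two counting conventions (the per-CU numbers are the commonly reported RDNA3 ones, so treat them as assumptions):

```python
# Two ways of counting Navi 31 shader units. The per-CU figures are the
# commonly reported RDNA3 numbers; treat them as assumptions.
compute_units = 96      # full Navi 31 as used in the 7900 XTX
sps_per_cu    = 64      # stream processors per CU, same counting as RDNA2
dual_issue    = 2       # RDNA3 SIMDs can dual-issue FP32 instructions

official  = compute_units * sps_per_cu   # 96 * 64  = 6144  (AMD's spec page)
effective = official * dual_issue        # 6144 * 2 = 12288 ("effective" FP32 ALUs)

print(official, effective)               # 6144 12288
```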
 
It looks like AMD is following Nvidia with the top-down release. It makes sense as RDNA2 still has plenty of horsepower in the middle and low segments. It's a shame I don't care about flagship products, although $999 sounds a lot better than Nvidia's $1600. It really makes me wonder where the 7700 and 7600 series will land in specs, performance and price.
 
It's pretty obvious. The numbers they provided were 50-70% faster than their current gen. That makes the 7900 XTX around 0-20% slower (depending on the game) than the 4090 in pure rasterization. I expect they would have compared it to the 4080, if it were out.

This was first and foremost a marketing presentation, and at this time there's just nothing from Nvidia in the $900-1000 price range that makes for a good performance showcase of their GPUs.
But they could have still compared it to the previous-gen Nvidia cards. Last gen they compared the 6800 against the 2080 Ti. Seems like they're hiding the performance. I've never seen them be so secretive about GPU performance since the days of Polaris, when they ran them in CrossFire against the GTX 1080.

If they used the 3090 Ti, people would ask why not use the 4090. Given the 4090 has a 60% price premium, they're not really targeting the same market, and there is no 4080 16GB to compare against at a closer price point, so they just compared to their old flagship.
Come on. They used a 3090 last gen to compare against their flagship 6900 XT, but they couldn't do it now? Everything about their 'up to' FPS figures looked shady as fuck.
 
But they could have still compared it to the previous-gen Nvidia cards. Last gen they compared the 6800 against the 2080 Ti. Seems like they're hiding the performance. I've never seen them be so secretive about GPU performance since the days of Polaris, when they ran them in CrossFire against the GTX 1080.


Come on. They used the 2080 Ti for comparison two years ago, but they couldn't do it now? Everything about their 'up to' FPS figures looked shady as fuck.
To be honest, I prefer straight-up specs and performance numbers instead of comparisons with the competition during a product launch. Comparisons easily end up as dirty shit-talk about the competition, like Intel demonstrated a couple of years ago. It's just disingenuous. Talking about your own product shows more confidence. Product launches are nothing more than teasers anyway. As someone whose job involves holding presentations, I think AMD did a superb job here.

 
To be honest, I prefer straight-up specs and performance numbers instead of comparisons with the competition during a product launch. Comparisons easily end up as dirty shit-talk about the competition, like Intel demonstrated a couple of years ago. It's just disingenuous. Talking about your own product shows more confidence.
And the way they did it now isn't? It looks like they're hiding the actual performance against Nvidia's flagship, which probably means the 4090 could be way faster than anyone seems to realize. And the way they started doing those product advertisements for the DisplayPort cable and later for the monitor... Like, Jesus Christ, did none of your alarm bells go off?
 
For what, and going off what, though? Every function/feature for the 4090 was available at full launch. AMD came half-cocked like always, e.g. FSR 3.0's ETA is sometime in 2023 (WHAT?!); the AI cores(?) aren't dev/app-controlled like Tensor cores (just GPU-engine controlled, so half AI?); and the nerfed titles in favour of AMD's future promoted games showed hardly anything performance-wise (btw, why no vids, TPU?). Too many unknowns for RDNA3, and that shows insecurity. NVIDIA, and Intel, can smell fear.

AMD didn't show anything solid. Just a PowerPoint show, with tiny gaming vids with no real performance metrics, nothing lengthy anyway.
We don't need that. Nvidia can keep its fake frames and other disposable features, like its self-destruct function.

AMD just delivered what most of us wanted: a card with a lot of performance at a reasonable price. Have a nice day.
 
better in neural networking prowess*.
And also at opening Chakras*

*Your prowess in Chakra opening would need to be as terrible as your neural networking prowess for you not to chuckle at both of these statements.

It won't. AMD's own performance claims relative to the 6950 XT put the 7900 XTX at roughly 10% below the 4090.
The 6950 XT beats last gen's best from NV in a number of titles.
The 7900 XTX is very likely to beat the 4090 in those, despite being a way smaller chip with a way more modest power budget.
 
And also at opening Chakras*

*Your prowess in Chakra opening would need to be as terrible as your neural networking prowess for you not to chuckle at both of these statements.


The 6950 XT beats last gen's best from NV in a number of titles.
The 7900 XTX is very likely to beat the 4090 in those, despite being a way smaller chip with a way more modest power budget.
If that were the case, they'd have shown the cherry-picked benchmarks with their sponsored games. But they didn't.
The games where the 6950 XT beat the previous-gen Nvidia flagships have all been AMD-sponsored titles.
 