Thursday, November 3rd 2022

AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

AMD today announced the Radeon RX 7900 XTX and Radeon RX 7900 XT gaming graphics cards, debuting its next-generation RDNA3 graphics architecture. The two new cards come in at $999 and $899, squarely targeting the high-end premium price point just under $1,000.
Both cards will be available on December 13th, and that date covers not only the AMD reference design, which is sold through AMD.com, but also custom-design variants from the many board partners. AIBs are expected to announce their products in the coming weeks.

The RX 7900 XTX is priced at USD $999 and the RX 7900 XT at $899, a surprisingly small difference of only $100 for a performance gap that will certainly be larger, probably in the 20% range. Both the Radeon RX 7900 XTX and RX 7900 XT use the PCI-Express 4.0 interface; Gen 5 is not supported with this generation. The RX 7900 XTX has a typical board power of 355 W, about 95 W less than that of the GeForce RTX 4090. The reference-design RX 7900 XTX uses conventional 8-pin PCIe power connectors, as will custom-design cards when they come out. AMD's board partners will create units with three 8-pin power connectors for higher out-of-the-box performance and better OC potential. The decision not to use the 16-pin power connector that NVIDIA uses was made "well over a year ago", mostly because of cost, complexity, and the fact that these Radeons don't require that much power anyway.

The reference RX 7900-series board design has the same card height as the RX 6950 XT, but is 1 cm longer, at 28.7 cm, and is strictly 2.5 slots thick. There are some white illuminated elements, which are controllable using the same software as on the Radeon RX 6000 series. Both cards feature two DisplayPort 2.1 outputs, one HDMI 2.1a port and one USB-C port.
This is AMD's first attempt at a gaming GPU made of chiplets (multiple logic dies on a multi-chip module). The company has built MCM GPUs in the past, but those have essentially been the GPU die surrounded by HBM stacks. The new "Navi 31" GPU at the heart of the RX 7900 XTX and RX 7900 XT features seven chiplets—a central large graphics compute die (GCD), surrounded by six memory control-cache dies (MCDs). The GCD is built on the TSMC 5 nm EUV silicon fabrication process—the same one on which AMD builds its "Zen 4" CCDs—while the MCDs are each fabricated on the TSMC 6 nm process.

The GCD contains the GPU's main graphics rendering machinery, including the front-end, the RDNA3 compute units, the Ray Accelerators, the display controllers, the media engine, and the render backends. The GCD physically features 96 RDNA3 unified compute units (CUs), for 6,144 stream processors. All 96 of these are enabled on the RX 7900 XTX; the RX 7900 XT has 84 of the 96 compute units enabled, which works out to 5,376 stream processors. The new RDNA3 compute unit introduces dual-issue stream processors, which essentially double per-clock throughput generation over generation. This is a VLIW-like approach, and AMD does not double the rated shader count because of it: the full GPU is specified at 6,144 stream processors (96 CUs x 64 shaders per CU, not 128 shaders per CU).
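The stream-processor arithmetic is easy to sanity-check; here is a minimal sketch using AMD's 64-SPs-per-CU counting convention described above:

```python
# RDNA3 rated shader counts: AMD counts 64 stream processors per CU,
# even though dual-issue lets each one retire two FP32 ops per clock.
SP_PER_CU = 64

def stream_processors(active_cus: int) -> int:
    return active_cus * SP_PER_CU

print(stream_processors(96))  # RX 7900 XTX: 6144
print(stream_processors(84))  # RX 7900 XT:  5376
```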

Each of the six MCDs contains a 64-bit wide GDDR6 memory interface and 16 MB of Infinity Cache memory. Together, the six MCDs add up to the GPU's 384-bit wide memory interface and 96 MB of total Infinity Cache. The GCD addresses the 384-bit memory interface as one contiguous block, not as 6x 64-bit. Most modern GPUs of the past decade have had multiple on-die memory controllers making up a larger memory interface; "Navi 31" moves these onto separate chiplets. This approach reduces the size of the main GCD tile, which helps with yields. The Radeon RX 7900 XTX is configured with 24 GB of GDDR6 memory across the chip's entire 384-bit wide memory bus, while the RX 7900 XT gets 20 GB of GDDR6 memory across a 320-bit wide memory bus (one of the MCDs is disabled). The disabled MCD isn't missing, though: dummy silicon dies sit in its place to provide structural support for the cooler mounting.
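A quick sketch of how the per-MCD figures aggregate; note the 20 Gbps GDDR6 data rate used for the bandwidth figure is an assumption, as AMD did not quote memory speed in this announcement:

```python
# Each MCD contributes a 64-bit GDDR6 interface and 16 MB of Infinity Cache.
MCD_BUS_BITS = 64
MCD_CACHE_MB = 16
GDDR6_GBPS = 20  # per-pin data rate; assumed, not quoted in the announcement

def memory_config(active_mcds: int):
    bus_bits = active_mcds * MCD_BUS_BITS      # total bus width, bits
    cache_mb = active_mcds * MCD_CACHE_MB      # total Infinity Cache, MB
    bandwidth_gbs = bus_bits * GDDR6_GBPS / 8  # GB/s
    return bus_bits, cache_mb, bandwidth_gbs

print(memory_config(6))  # RX 7900 XTX: (384, 96, 960.0)
print(memory_config(5))  # RX 7900 XT:  (320, 80, 800.0)
```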

Each CU also features two AI acceleration components that provide a 2.7x uplift in AI inference performance over plain SIMD, and a second-generation RT accelerator that provides new dedicated instructions and a 50% uplift in ray tracing performance. The AI cores are not exposed through software; developers cannot use them directly (unlike NVIDIA's Tensor Cores), as they are used exclusively by the GPU's internal engines. Later today AMD will give us a more technical breakdown of the RDNA3 architecture.
For the RX 7900 XTX, AMD is broadly claiming an up to 70% increase in traditional raster 3D graphics performance over the previous-generation flagship RX 6950 XT at 4K Ultra HD native resolution, and an up to 60% increase in ray tracing performance. These gains should be enough to catch the RTX 4080, but AMD was clear that it is not targeting RTX 4090 performance, which also comes at a much higher price point.
AMD attributes its big 54% generational performance-per-Watt gain to asynchronous clock domain technology, which runs the various components on the GCD at different frequencies to minimize power draw. This seems similar in concept to the "shader clock" on some older NVIDIA architectures.
AMD also announced FSR 3.0, the latest generation of its performance-enhancement technology, now featuring Fluid Motion. This is functionally similar to DLSS 3 Frame Generation, promising up to a 100% uplift in performance at comparable quality, essentially because the GPU generates every alternate frame without involving its graphics rendering pipeline.
The new dual-independent media-acceleration engines enable simultaneous encode and decode for AVC and HEVC formats, hardware-accelerated encode and decode for AV1, and AI-accelerated enhancements. The new AMD Radiance Display Engine introduces native support for DisplayPort 2.1, with 54 Gbps of display link bandwidth and 12 bpc color. This enables resolutions of up to 8K @ 165 Hz, or 4K @ 480 Hz, over a single cable.
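As a plausibility check, those display modes far exceed 54 Gbps uncompressed, so Display Stream Compression is implied; a rough back-of-envelope sketch (10 bpc color depth is an assumption here, and blanking overhead is ignored):

```python
# Raw (uncompressed) video bandwidth vs. the 54 Gbps DP 2.1 link,
# ignoring blanking intervals; 10 bpc is an assumed color depth.
def raw_gbps(w: int, h: int, hz: int, bpc: int = 10, channels: int = 3) -> float:
    return w * h * hz * bpc * channels / 1e9

LINK_GBPS = 54
for name, (w, h, hz) in {"8K @ 165 Hz": (7680, 4320, 165),
                         "4K @ 480 Hz": (3840, 2160, 480)}.items():
    need = raw_gbps(w, h, hz)
    print(f"{name}: ~{need:.0f} Gbps raw vs {LINK_GBPS} Gbps link "
          f"-> ~{need / LINK_GBPS:.1f}:1 compression needed")
```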
The "Navi 31" GPU in its full configuration has a raw compute throughput of 61 TFLOPs, compared to 23 TFLOPs of the RDNA2-based Navi 21 (a 165% increase). The shader and front-end of the GPU operate at different clock speeds, with the shaders running at up to 2.30 GHz, and the front-end at up to 2.50 GHz. This decoupling has a big impact on power-savings, with AMD claiming a 25% power-saving as opposed to running both domains at the same 2.50 GHz clock.


336 Comments on AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

#176
mahoney
mb194dc: Why would you trust the manufacturer canned benches anyway?

Just wait for the reviews under controlled conditions.
Because you could at least have an interpretation of the performance. We have nothing so far. Absolutely nothing.
#177
AusWolf
mahoney: Because you could at least have an interpretation of the performance.
No you couldn't. Manufacturer benches are always inaccurate and biased.
mahoney: We have nothing so far. Absolutely nothing
We have comparisons with the 6950 XT.
#178
mahoney
AusWolf: No you couldn't. Manufacturer benches are always inaccurate and biased.

We have comparisons with the 6950 XT.
With Raytracing and FSR enabled :roll:
#179
AusWolf
mahoney: With Raytracing and FSR enabled :roll:
Yep. And your point is...?
#180
mahoney
AusWolf: Yep. And your point is...?
Are you perhaps an AMD fanboy? If you had any logic at all you'd see 1+1=... But it seems you can't.
#181
AusWolf
mahoney: Are you perhaps an AMD fanboy? If you had any logic at all you'd see 1+1=... But it seems you can't.
Oh, the fanboy card! I was expecting this to come out sooner or later. :rolleyes:

Instead of accusations, you could maybe perhaps try to explain what you want? Just saying...

I think I've explained my point pretty well, being: manufacturer benchmarks are always flawed and biased one way or another. Intel, Nvidia, AMD, it doesn't matter. Whether they give you any numbers or not, you should always wait for independent reviews before you draw conclusions. Then you just proved it by pointing out that they used RT and FSR in the launch video. Yes, they did. Yes, it's flawed. It's always been!
#182
birdie
  • Have AMD GPU announcements always been so light on actual raw data? Almost all the charts omit raw FPS. AMD has always touted its transparency, yet now instead of raw FPS we get FSR FPS.
  • FSR 3.0 mimics DLSS 3.0, while there's no word on increased latency. Weird. At least NVIDIA adds NVIDIA Reflex to partially mitigate the issue. Also, the vague "available in 2023" sounds like they were not ready for DLSS 3.0 but had to counter it.
  • I don't understand how to read the DXR performance. It looks like AMD will again only compete with previous-generation NVIDIA cards.
  • Funnily enough, AMD did not actually explain why they used, or needed to use, the chiplet design.
I don't give two shits about high-fps 4K, 8K and other perks of being a rich 1st-world country citizen. I'm looking forward to something which costs below $350 and rivals at the very least the RTX 3070.

According to the Steam Hardware Survey, cards under $350 are what drives progress, not these expensive tech toys for the rich. I don't understand all this clamor about top-tier cards: over 95% of gamers cannot afford them, and with them you also need a top-tier CPU and quite an expensive monitor.

P.S. My next GPU will be RDNA 3.0 because I've grown tired of NVIDIA Linux drivers, NVIDIA's pricing and NVIDIA's product segmentation. The company has seemingly stopped caring about budget users.
#183
mahoney
AusWolf: Oh, the fanboy card! I was expecting this to come out sooner or later. :rolleyes:

Instead of accusations, you could maybe perhaps try to explain what you want? Just saying...

I think I've explained my point pretty well, being: manufacturer benchmarks are always flawed and biased one way or another. Intel, Nvidia, AMD, it doesn't matter. Whether they give you any numbers or not, you should always wait for independent reviews before you draw conclusions. Then you just proved it by pointing out that they used RT and FSR in the launch video. Yes, they did. Yes, it's flawed. It's always been!
They barely showed them this time, and when they did they used RT+FSR ffs. THAT'S MY POINT.
What are they hiding and not willing to show us?
#184
80-watt Hamster
mahoney: They barely showed them this time, and when they did they used RT+FSR ffs. THAT'S MY POINT.
What are they hiding and not willing to show us?
We'll know in about five weeks. Let's all chill until then.
#185
skates
I'm in for the XTX if it's within 15% of the 4090, the reasons being lower power requirements, lower cost, and DP 2.1 (I'm getting a new monitor next year). I've never gone AMD for a GPU, always NVIDIA, but I'm just overall tired of their pricing and availability, and I don't want to support them anymore.
#186
btk2k2
Some quick and dirty analysis.

So here is the cost/frame according to Techspot/HUB. I'm using their chart because they used a 5800X3D in the review, and W1zzard showed that at 4K there was an advantage to using that CPU over the vanilla 5800X, so this chart has a bit less 4K bottlenecking for the 4090.

Given that the 54% perf/watt increase provided by AMD was for the 7900XTX @ 300 W vs the 6900XT @ 300 W, we can do some funny math to get an estimate, or we can just take that figure, apply it to the 6950XT (which has slightly worse perf/watt than the 6900XT), and ignore that the 7900XTX has higher power draw than the 6950XT. That means I am going to apply a 1.54x scaling factor to the 6950XT score to estimate 7900XTX performance in this suite of games. Given AMD showed 3 games that averaged 1.57x more raster performance, it seems fair enough without being overly pessimistic or optimistic.

So with that out of the way, the 7900XTX would get an estimated 131 fps in the above suite. The 4080 looks to be about 20% ahead of the 3090Ti according to the charts NV showed (which, while covering only 3 games, seemed to be in the ballpark of where the 4090's raster improvement landed, so not cherry-picked by the looks of it), giving it an estimated 109 fps. This is all raster performance, obviously. Anyway, to get to the point, that gives us the following:

4090 cost / frame = $11.11
4080 cost / frame = $11.01
7900XTX cost / frame = $7.63

Quite an advantage for AMD there, even vs the price reduced current gen stuff.

What about RT, though? Going through the Techspot numbers, the 4090 has a 4K native RT scaling factor of 0.46x, the 3090Ti 0.42x, and the 6950XT 0.31x. I will use 0.46x for the 4080 and 0.31x for the 7900XTX. Actual numbers may be worse, given how cut down the 4080 is and that RT scaling on the 7900XTX looked to be worse than the raster improvement, but it's the best estimate we have. Anyway, that ends up with the following:

4090 RT cost / frame $24.24
4080 RT cost / frame $24.00
7900XTX RT cost / frame $24.37
3090Ti RT cost / frame $28.95

So the 7900XTX is priced about in line with the RT performance of the 40 series, but offers a large raster perf/$ advantage. The 4090 does, and the 4080 looks like it will, offer better absolute RT performance at no real premium, so it looks to me like we as customers have options based on our wants, which is always nice.
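For anyone who wants to poke at these numbers, here's a minimal sketch of the cost-per-frame arithmetic; the MSRPs ($1,599/$1,199/$999) and the 4090's ~144 fps average are assumptions back-calculated from the figures quoted above, so small differences from the posted numbers are rounding:

```python
# Rough reproduction of the cost-per-frame estimates in this post.
# Prices and the 4090's average fps are inferred, not measured data.
cards = {
    #            price ($), est. raster fps, 4K native RT scaling factor
    "RTX 4090": (1599, 144, 0.46),
    "RTX 4080": (1199, 109, 0.46),
    "7900 XTX": ( 999, 131, 0.31),
}
for name, (price, fps, rt) in cards.items():
    print(f"{name}: ${price / fps:.2f}/frame raster, "
          f"${price / (fps * rt):.2f}/frame RT")
```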

This does leave an opening for the 4070 to actually offer the best RT bang-for-buck if priced right. If you take that card's raster performance to be about 3080Ti level and apply the 4090's scaling factor, its performance looks a bit worse than the 7900XTX's, but if priced at $600-700 it would offer better RT perf/$ than the AMD cards and worse raster perf/$. At that price point, 7800XT vs 4070 could easily be a case of: go 4070 if you want better RT performance, or go 7800XT if you want better raster performance.

EDIT: The picture does not seem to display after posting; link added as well in case the issue is not just on my side.

EDIT2: Thanks, TheLostSwede.
#187
Soul_
TheLostSwede: What's with the three copper coloured "blades" in the heatsink?
RDNA3 so 3 stripes? My best guess.
#188
Hxx
mahoney: They barely showed them this time, and when they did they used RT+FSR ffs. THAT'S MY POINT.
What are they hiding and not willing to show us?
Dude, you are missing the point of these reveals. They are not a technical deep dive into fps against competitors; they never were and never will be. They are marketing fluff meant to create FOMO, strategically timed to gain market share. That's it. Anything they say will be biased. The only thing that matters is the 10-second ending where they reveal pricing and availability... that's all that's worth knowing. Same for Nvidia and their "4x faster" 4090 reveal, which was a load of crap of course. No point in overthinking what they're hiding and not willing to show, etc.
#189
TheLostSwede
News Editor
btk2k2: EDIT: The picture does not seem to display after posting; link added as well in case the issue is not just on my side.
It's because it's webp, which doesn't display properly in the forums, so you have to change it into a jpg or png.
#190
medi01
mahoney: Oh boy... When you're confident in something you show it; if not, you don't.

Why do I have to keep repeating myself?

When they launched the 6000 series they compared them with rival GPUs: 6800 vs 2080 Ti.

When they launched the 5000 series they also compared them with their rival GPUs: 5700 vs RTX 2060.

When they launched the Radeon VII, they compared it to the 2080.

Even when they launched Vega, they at least compared it to their previous flagship, the Fury X.

Nothing suspicious with not showing any actual benchmarks, right? RIGHT?!?! :oops:
You realize that rival is MISSING at the moment, don't you?

And I don't mean the knocked out rival:
www.techpowerup.com/forums/threads/nvidia-cancels-geforce-rtx-4080-12gb-to-relaunch-it-with-a-different-name.299859/page-14#post-4858912
:roll:

I mean the remaining one, that overpriced piece of something called 4080.
:D

PS

AMD's statements applied to TPU charts at 4k:

#191
Vayra86
mahoney: They're not. All they showed was some fps with "up to". What are we supposed to take from that? Average FPS? Max FPS?
Like I said, if they were confident in their product they'd have done an 8-game GPU benchmark with their cherry-picked games, but they didn't, which means the 4090 probably absolutely stomps on this thing. Even Greymon tweeted 1h before the launch that Nvidia is still king...
Nvidia may still be the king, but a lot of people question the relevance, as do I.

I don't need a 4090. I don't even want one. I need a solid, not too power hungry GPU that runs games properly at a good price/perf. Fuck RT, honestly; and similar things apply to other perceived value where there is none (DLSS3). And that is all NV wrote this gen. I'm not missing a thing going Red... only overinflated marketing to keep DOA tech afloat.
#192
mahoney
medi01: You realize that rival is MISSING at the moment, don't you?

And I don't mean the knocked out rival:
www.techpowerup.com/forums/threads/nvidia-cancels-geforce-rtx-4080-12gb-to-relaunch-it-with-a-different-name.299859/page-14#post-4858912
:roll:

I mean the remaining one, that overpriced piece of something called 4080.
:D

PS

AMD's statements applied to TPU charts at 4k:

Oh, so the 4090 wasn't the rival? Oh, OK, because all the bullshit speculation from just a few days ago, from Adored, said the performance leap was gonna be generational, and Moore's Law is Dead also BS'ed about big performance gains...
Also, TPU's benchmarks are flawed since W1zzard used the 5800X; there's almost a 7% difference between the 5800X and the 12900K at 4K:
www.techpowerup.com/review/rtx-4090-53-games-ryzen-7-5800x-vs-core-i9-12900k/2.html
It's clear as day how many AMD fanboys there are who can't even see how sus the whole presentation was.
Hxx: Dude, you are missing the point of these reveals. They are not a technical deep dive into fps against competitors; they never were and never will be. They are marketing fluff meant to create FOMO, strategically timed to gain market share. That's it. Anything they say will be biased. The only thing that matters is the 10-second ending where they reveal pricing and availability... that's all that's worth knowing. Same for Nvidia and their "4x faster" 4090 reveal, which was a load of crap of course. No point in overthinking what they're hiding and not willing to show, etc.
I've shown you proof that in the last 5 years they've ALWAYS shown their cherry-picked benchmarks. But OK.
#193
Vayra86
mahoney: Oh, so the 4090 wasn't the rival? Oh, OK, because all the bullshit speculation from just a few days ago, from Adored, said the performance leap was gonna be generational, and Moore's Law is Dead also BS'ed about big performance gains...
Also, TPU's benchmarks are flawed since W1zzard used the 5800X; there's almost a 7% difference between the 5800X and the 12900K at 4K:
www.techpowerup.com/review/rtx-4090-53-games-ryzen-7-5800x-vs-core-i9-12900k/2.html
It's clear as day how many AMD fanboys there are who can't even see how sus the whole presentation was.
Holy moly dude, maybe start your own channel? We get it, you have doubts. That's the MO for every release in tech land at this point... the numbers/reviews are what count. People have been confirming that over the course of two-plus pages and you still rant ahead :D

Is it forbidden to have an opinion on what AMD presented that differs from the YT echo chamber? I smell sheep
#194
medi01
mahoney: Oh, so the 4090 wasn't the rival?
Ah. OK, it was. That is why the 4080 12GB was murdered by AMD in the process.
Figures. :roll:
mahoney: bullshit speculation from just a few days ago, from Adored
Oh. And that is relevant in this thread, because? :D
#195
mahoney
medi01: Ah. OK, it was. That is why the 4080 12GB was murdered by AMD in the process.
Figures. :roll:

Oh. And that is relevant in this thread, because? :D
No idea why you mentioned the 4080. This was supposed to go head-to-head with the 4090, just like the 6900XT did with the 3090.
Because AMD fanboys love to hype the shit out of AMD's products.
RDNA3 was supposed to show us amazing leaps in performance, yet it hasn't. Seems like RDNA2 was it.
#196
ARF
mahoney: No idea why you mentioned the 4080. This was supposed to go head-to-head with the 4090, just like the 6900XT did with the 3090.
Because AMD fanboys love to hype the shit out of AMD's products.
RDNA3 was supposed to show us amazing leaps in performance, yet it hasn't. Seems like RDNA2 was it.
My next GFX card will be the Radeon RX 7900 XT 20 GB. Of course, I am not happy with the card's naming (it could have been called RX 7800 XT), but we have to deal with reality.
I am against the big green troll and against feeding him, so I am not giving a coin to nvidia.
#197
Hxx
mahoney: I've shown you proof that in the last 5 years they've ALWAYS shown their cherry-picked benchmarks. But OK.
Nvidia has not released a competitor to the 7900xt; the 4080 is not out. The 4090 is faster, and logically AMD will not show a bunch of graphs with the 4090 being faster; it's also not even in the same price tier. So they showed a bunch of fluff and compared against their prior lineup. Nothing new or unusual.

Now you may ask yourself why they didn't compare against the 3xxx series from Nvidia. Because of perception: if they did that, folks would take it as AMD competing with the 3000 series instead of the 4000 series. Maybe not folks on this forum, but those who are new to the hobby.

It's all just one big marketing dick-measuring contest, and nothing new here.
#198
neatfeatguy
I forget how easy it is for one side to get triggered when these presentations or possible leaks come out. This is some good shit here: listening to hardcore Nvidia fans trying hard to defend the 4090, then seeing what the hardcore AMD fans rebut with.

Wish I had some popcorn.

I like the more realistic pricing provided by AMD here, but I'm not thrilled about pricing still being so high compared to just a few generations back. I know technology advances and prices go up (inflation, scarcity, demand, wage increases, etc.), but not long ago we were all seeing high-end cards such as the 980Ti at an MSRP of $650, or the AMD R9 Fury X at an MSRP of $649.
How about the GTX 1080 at an MSRP of $599 (and the 1080Ti at $699), or the Vega 64 at $499?

Oh well, I guess I just dwell on the old pricing of better days and keep hoping things will settle down. At least AMD isn't trying to rake people over the coals with extreme pricing like Nvidia is doing now.

I truly hope the 7900 cards put Nvidia to shame. Even if they can't quite match the 4090, if they can kick the crap out of the 4080 16GB and do it for $200-300 less, that would be awesome.
#199
dragontamer5788
neatfeatguy: I like the more realistic pricing provided by AMD here, but I'm not thrilled about pricing still being so high compared to just a few generations back. I know technology advances and prices go up (inflation, scarcity, demand, wage increases, etc.), but not long ago we were all seeing high-end cards such as the 980Ti at an MSRP of $650, or the AMD R9 Fury X at an MSRP of $649.
How about the GTX 1080 at an MSRP of $599 (and the 1080Ti at $699), or the Vega 64 at $499?
So, something to note: back 10 years ago, when you shrank a transistor from 28 nm to 16 nm, you also made it about 50% cheaper (per transistor). So back then, they could offer better-and-better GPUs for cheaper-and-cheaper, because the economics supported it. Today, going from 7 nm to 5 nm is not cheaper at all. It gets you better performance (faster, less power usage, etc.), but the sheer effort that goes into the 5 nm process blows away your budget entirely.

I understand this is the reality now. Chips will get more expensive as they get more advanced from here on out. Moore's Law is dead (again, but in a different way).
#200
ARF
Hxx: Nvidia has not released a competitor to the 7900xt; the 4080 is not out. The 4090 is faster, and logically AMD will not show a bunch of graphs with the 4090 being faster; it's also not even in the same price tier.
How do you know? The TPU performance chart clearly shows that the RTX 4090 is only 53% faster than the RX 6950 XT, while AMD's slide shows up to a 70% performance increase.
With the same success I can tell you that the RX 7900 XTX will be faster than the RTX 4090, and that the RTX 4090 needs a 50% price reduction, otherwise no one will buy it except the diehard core nvidia fanboys/girls.