Thursday, November 3rd 2022

AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

AMD today announced the Radeon RX 7900 XTX and Radeon RX 7900 XT gaming graphics cards, debuting its next-generation RDNA3 graphics architecture. The two new cards come in at $999 and $899, targeting the high-end premium segment around the $1,000 price point.
Both cards will be available on December 13th. This applies not only to the AMD reference design, which is sold through AMD.com, but also to custom-design variants from the many board partners, which launch the same day. AIBs are expected to announce their products in the coming weeks.

The RX 7900 XTX is priced at USD $999 and the RX 7900 XT at $899, a surprisingly small difference of only $100 for a performance gap that will certainly be larger, probably in the 20% range. Both the Radeon RX 7900 XTX and RX 7900 XT use the PCI-Express 4.0 interface; Gen 5 is not supported with this generation. The RX 7900 XTX has a typical board power of 355 W, about 95 W less than that of the GeForce RTX 4090. The reference-design RX 7900 XTX uses two conventional 8-pin PCIe power connectors, as will custom-design cards when they come out. Some of AMD's board partners will build cards with three 8-pin power connectors, for higher out-of-the-box performance and better OC potential. The decision not to use the 16-pin power connector that NVIDIA uses was made "well over a year ago", mostly because of cost, complexity, and the fact that these Radeons don't require that much power anyway.
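As a quick sanity check on the power figures (a minimal sketch; the 150 W per 8-pin connector and 75 W slot limits are standard PCIe specifications, not numbers from AMD's presentation):

```python
# Power budget of the reference RX 7900 XTX with two 8-pin PCIe connectors.
# Assumed figures: 150 W per 8-pin connector and 75 W from the slot (PCIe spec).
CONNECTOR_W = 150
SLOT_W = 75
BOARD_POWER_W = 355  # AMD's typical board power for the RX 7900 XTX

available = 2 * CONNECTOR_W + SLOT_W               # 375 W total
print(f"Headroom: {available - BOARD_POWER_W} W")  # 20 W of headroom
# A partner card with three 8-pin connectors raises the ceiling to 525 W.
```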

The reference RX 7900-series board design has the same card height as the RX 6950 XT, but is 1 cm longer, at 28.7 cm. It is also strictly 2.5 slots thick. There are some white illuminated elements, which are controllable using the same software as on the Radeon RX 6000 series. Both cards feature two DisplayPort 2.1 outputs, one HDMI 2.1a, and one USB-C.
This is AMD's first attempt at a gaming GPU made of chiplets (multiple logic dies on a multi-chip module). The company has built MCM GPUs in the past, but those have essentially been the GPU die surrounded by HBM stacks. The new "Navi 31" GPU at the heart of the RX 7900 XTX and RX 7900 XT features seven chiplets—a central large graphics compute die (GCD), surrounded by six memory control-cache dies (MCDs). The GCD is built on the TSMC 5 nm EUV silicon fabrication process—the same one on which AMD builds its "Zen 4" CCDs—while the MCDs are each fabricated on the TSMC 6 nm process.
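To make the chiplet arrangement concrete, the following illustrative sketch describes the package as announced; the type and field names here are ours, not AMD's:

```python
from dataclasses import dataclass

@dataclass
class Chiplet:  # illustrative type, not an AMD designation
    name: str
    process_nm: int
    role: str

navi31 = [Chiplet("GCD", 5, "front-end, CUs, RT accelerators, display, media, render backends")]
navi31 += [Chiplet(f"MCD{i}", 6, "64-bit GDDR6 controller + 16 MB Infinity Cache")
           for i in range(6)]

assert len(navi31) == 7  # one 5 nm GCD surrounded by six 6 nm MCDs
```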

The GCD contains the GPU's main graphics rendering machinery, including the front-end, the RDNA3 compute units, the Ray Accelerators, the display controllers, the media engine, and the render backends. The GCD physically features 96 RDNA3 Unified Compute Units (CUs), for 6,144 stream processors. All 96 of these are enabled on the RX 7900 XTX. The RX 7900 XT has 84 of the 96 compute units enabled, which works out to 5,376 stream processors. The new RDNA3 compute unit introduces dual-issue stream processors, which essentially double throughput generation-over-generation. This is a VLIW-style approach; AMD does not double the rated shader count, though, so it's 6,144 for the full GPU (96 CUs x 64 shaders per CU, not 128 shaders per CU).
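The shader-count arithmetic works out as follows:

```python
SHADERS_PER_CU = 64  # rated shaders per RDNA3 CU; dual-issue is not double-counted

print(96 * SHADERS_PER_CU)  # 6144 stream processors (RX 7900 XTX, all CUs enabled)
print(84 * SHADERS_PER_CU)  # 5376 stream processors (RX 7900 XT, 84 of 96 CUs)
# Dual-issue effectively doubles peak instructions per clock,
# but the rated shader count stays at 64 per CU.
```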

Each of the six MCDs contains a 64-bit wide GDDR6 memory interface and 16 MB of Infinity Cache memory. Six of these MCDs add up to the GPU's 384-bit wide memory interface and 96 MB of total Infinity Cache. The GCD addresses the 384-bit memory interface as one contiguous addressable block, not as 6x 64-bit segments. Most modern GPUs of the past decade have had multiple on-die memory controllers making up a larger memory interface; "Navi 31" moves these onto separate chiplets. This approach reduces the size of the main GCD tile, which should help with yield rates. The Radeon RX 7900 XTX is configured with 24 GB of GDDR6 memory across the chip's entire 384-bit wide memory bus, while the RX 7900 XT gets 20 GB of GDDR6 memory across a 320-bit wide memory bus (one of the MCDs is disabled). The disabled MCD isn't simply absent, though: dummy silicon dies sit in its place to provide stability for the cooler mounting.
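Scaling the per-MCD figures gives the totals for both cards; note that the implied 80 MB of Infinity Cache on the RX 7900 XT is our extrapolation from the five active MCDs, not a number AMD quoted:

```python
MCD_BUS_BITS = 64   # GDDR6 interface width per MCD
MCD_CACHE_MB = 16   # Infinity Cache per MCD

for card, active_mcds in (("RX 7900 XTX", 6), ("RX 7900 XT", 5)):
    bus = active_mcds * MCD_BUS_BITS    # 384-bit / 320-bit
    cache = active_mcds * MCD_CACHE_MB  # 96 MB / 80 MB (the latter extrapolated)
    print(f"{card}: {bus}-bit bus, {cache} MB Infinity Cache")
```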

Each CU also features two AI-acceleration components that provide a 2.7x uplift in AI inference performance over SIMD, and a second-generation RT accelerator that brings new dedicated instructions and a 50% uplift in ray tracing performance. The AI cores are not exposed through software: developers cannot use them directly (unlike NVIDIA's Tensor Cores); they are used exclusively by the GPU's internal engines. Later today AMD will give us a more technical breakdown of the RDNA3 architecture.
For the RX 7900 XTX, AMD is broadly claiming an up to 70% increase in traditional raster 3D graphics performance over the previous-generation flagship RX 6950 XT at 4K Ultra HD native resolution, and an up to 60% increase in ray tracing performance. These gains should be enough to catch the RTX 4080, but AMD was clear that it is not targeting RTX 4090 performance, which comes at a much higher price point, too.
AMD attributes its big 54% generational gain in performance-per-Watt to a new asynchronous clock-domain technology that runs the various components on the GCD at different frequencies to minimize power draw. This seems similar in concept to the "shader clock" of some older NVIDIA architectures.
AMD also announced FSR 3.0, the latest generation of its performance-enhancement technology, featuring FluidMotion. This is functionally similar to DLSS 3 Frame Generation, promising up to a 100% uplift in performance at comparable quality, essentially because the GPU generates every alternate frame without involving its graphics rendering pipeline.
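Conceptually, frame generation works something like the sketch below. This is purely illustrative pseudo-logic with hypothetical callables (render_frame, interpolate, display); AMD has not yet detailed how FluidMotion is actually implemented:

```python
# Illustrative only: every second displayed frame is synthesized from the two
# most recent rendered frames, roughly doubling display rate vs. render rate.
def present_loop(render_frame, interpolate, display):
    prev = render_frame()                 # full render pipeline
    while True:
        curr = render_frame()             # full render pipeline
        display(interpolate(prev, curr))  # generated in-between frame
        display(curr)                     # real frame
        prev = curr
```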
The new dual independent media-acceleration engines enable simultaneous encode and decode for the AVC and HEVC formats, hardware-accelerated encode and decode for AV1, and AI-accelerated enhancements. The new AMD Radiance Display Engine introduces native support for DisplayPort 2.1, with 54 Gbps of display link bandwidth and 12 bpc color. This enables resolutions of up to 8K @ 165 Hz, or 4K @ 480 Hz, over a single cable.
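A quick bandwidth calculation shows why Display Stream Compression is part of these claims; the ~9 bits per pixel compressed rate is our assumption, since AMD did not publish its exact DSC settings, and link-level overhead is ignored:

```python
def gbps(width, height, hz, bits_per_pixel):
    """Raw video bandwidth in Gbps, ignoring blanking and link overhead."""
    return width * height * hz * bits_per_pixel / 1e9

print(gbps(7680, 4320, 165, 36))  # ~197 Gbps uncompressed at 12 bpc: needs DSC
print(gbps(7680, 4320, 165, 9))   # ~49 Gbps at an assumed ~9 bpp: fits in 54 Gbps
print(gbps(3840, 2160, 480, 36))  # ~143 Gbps uncompressed for 4K @ 480 Hz
```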
The "Navi 31" GPU in its full configuration has a raw compute throughput of 61 TFLOPs, compared to 23 TFLOPs of the RDNA2-based Navi 21 (a 165% increase). The shader and front-end of the GPU operate at different clock speeds, with the shaders running at up to 2.30 GHz, and the front-end at up to 2.50 GHz. This decoupling has a big impact on power-savings, with AMD claiming a 25% power-saving as opposed to running both domains at the same 2.50 GHz clock.
AMD claims the Radeon RX 7900 XTX offers up to a 70% performance increase over the RX 6950 XT.

The complete slide-deck follows.

336 Comments on AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

#101
GunShot
thesmokingman: Nvidia price cuts... will they do it?
For what, and going off what, though? Every function/feature for the 4090 was released at full launch. AMD came half-cocked like always. e.g. FSR 3.0's ETA is sometime in 2023 (WHAT?!), AI cores(?) uhm... but not dev/app controlled like Tensor cores (just GPU-engine controlled, so, half AI?), and the nerfed titles shown in favour of AMD's future promoted games showed hardly anything performance-wise (btw, why no vids, TPU?). Too many unknowns for RDNA3, and that shows insecurity; NVIDIA, and Intel, can smell fear.

AMD didn't show anything solid. Just a PowerPoint show, with tiny gaming vids with no real performance metrics, nothing lengthy anyway.
#102
TheinsanegamerN
CallandorWoT: I think $899 is a fair price for the 7800 XT. I mean it's literally a game changer for someone like me who is just sick of playing high-end AAA games at 1440p at 70 fps-ish... when all we want is to play them at 165 fps 1440p. I know that sounds stupid as fuck, but I really do like 165 fps 165 Hz gaming, so eh, it is what it is. :toast:

I'm set for a long time now, assuming I can beat the bots and get one December 13th. Doubt if I can. :(
I mean that's why I bought my 6800xt, to get that sweet 1440p144. I think the 7900xt should have been a $799 part tho. $100 less than the xtx is the same mistake AMD made with the 6800 vs the xt; the xt was so much faster there was no reason to go with the mildly cheaper 6800.

I don't plan on buying either tho. There's not much that can stress the 6800xt, and halo infinite is basically dead, so meh.
#103
cvaldes
GunShot: AMD didn't show anything solid. Just a PowerPoint show, with tiny gaming vids with no real performance metrics, nothing lengthy anyway.
For sure AMD knows exactly how the RX 7900 XTX stacks up against RTX 4090.

AMD realizes that anyone sane is going to wait for third party reviewers to run the tests anyhow. Manufacturer claims don't need to be taken with a grain of salt -- they need to be taken with a mountain of salt.

They took the unassailable approach of comparing the 7900 series with their own previous-generation products.

Presumably RDNA3 will not beat Ada Lovelace in a head-to-head showdown of game graphics benchmarks. Radeon will probably win the performance-per-watt and performance-per-dollar metrics but those just don't have the same on-screen glamour as real-time gameplay with an fps counter in the corner of the screen.
#104
Space Lynx
Astronaut
TheinsanegamerN: I mean that's why I bought my 6800xt, to get that sweet 1440p144. I think the 7900xt should have been a $799 part tho. $100 less than the xtx is the same mistake AMD made with the 6800 vs the xt; the xt was so much faster there was no reason to go with the mildly cheaper 6800.

I don't plan on buying either tho. There's not much that can stress the 6800xt, and halo infinite is basically dead, so meh.
the games i play need a 7800 xt to reach the frames i want at 1440p. but yeah everyone has a different use case.

i agree about the $100. honestly i think they should have done a $799 7800 xt and a $1099 7800 XTX, because there is always going to be that elite segment who buys the high end no matter what. basically they would be splitting the difference on profit, while still catering a high-end card to the majority gamer.
#105
Zach_01
TheinsanegamerN: I think the 7900xt should have been a $799 part tho. $100 less than the xtx is the same mistake AMD made with the 6800 vs the xt; the xt was so much faster there was no reason to go with the mildly cheaper 6800.
Pure marketing....
This way the XTX has more value and most people will go for this one. It's "only" $100 more.

No mistake here
#106
Punkenjoy
GunShot: AMD didn't show anything solid. Just a PowerPoint show, with tiny gaming vids with no real performance metrics, nothing lengthy anyway.
To be honest, they showed about the same thing as everyone else does at these events. It's almost always the same.

But I agree that right now there is no rush for Nvidia to reduce the price of the 4090 or even the 4080. They might lose people who are inclined to buy AMD, but many people never will, or at least won't until it becomes "cool".

Most people wanted AMD to release cheaper cards so Nvidia would have to reduce their prices. Well, it won't work like that. If Nvidia knows you want an Nvidia card, they will charge you as much for it no matter what AMD does.
#107
GunShot
cvaldes: For sure AMD knows exactly how the RX 7900 XTX stacks up against RTX 4090.

AMD realizes that anyone sane is going to wait for third party reviewers to run the tests anyhow. Manufacturer claims don't need to be taken with a grain of salt -- they need to be taken with a mountain of salt.

They took the unassailable approach of comparing the 7900 series with their own previous-generation products.

Presumably RDNA3 will not beat Ada Lovelace in a head-to-head showdown of game graphics benchmarks. Radeon will probably win the performance-per-watt and performance-per-dollar metrics but those just don't have the same on-screen glamour as real-time gameplay with an fps counter in the corner of the screen.
I think the pricing fight with the 4080 16GB will be another 3080 Ti vs. 6900, because at the end of the day NVIDIA comes with the full package of features (insane RT numbers, etc.) and is obviously better in neural-networking prowess.

The only thing that irks me about my Strix 4090 (birthday gift from the wife and kids) is the lack of DP 2.1.
#108
RedBear
Punkenjoy: Most people wanted AMD to release cheaper cards so Nvidia would have to reduce their prices. Well, it won't work like that. If Nvidia knows you want an Nvidia card, they will charge you as much for it no matter what AMD does.
I'm not sure that's what people really want; maybe some were hoping for even cheaper AMD cards with greater value. But if you're correct, they should have just looked at the current midrange: AMD has been offering better value for months now since the collapse of cryptocurrencies, and Nvidia still isn't lowering its prices significantly.
#109
demirael
GunShot: For what, and going off what, though? Every function/feature for the 4090 was released at full launch. AMD came half-cocked like always. e.g. FSR 3.0's ETA is sometime in 2023 (WHAT?!)
Having FSR 3.0 in 2023 is fine if it looks better than nVidia's mess called Frame Generation.
#110
GunShot
Punkenjoy: To be honest, they showed about the same thing as everyone else does at these events. It's almost always the same.

But I agree that right now there is no rush for Nvidia to reduce the price of the 4090 or even the 4080. They might lose people who are inclined to buy AMD, but many people never will, or at least won't until it becomes "cool".

Most people wanted AMD to release cheaper cards so Nvidia would have to reduce their prices. Well, it won't work like that. If Nvidia knows you want an Nvidia card, they will charge you as much for it no matter what AMD does.
I agree, but AMD did not really reduce its prices, if you think about it. Instead, one of its SKUs stayed neutral in price (the 6900 XT's initial MSRP was $1,000) while the other SKU was jacked up by $250 (the 6800 XT's initial MSRP was $650). It seems like AMD was just betting that many wouldn't notice if they threw a whole bunch of unconfirmed numbers into their slideshow, you know, just like they did with their 7000-series CPUs until the backlash from TRUE gamers exposed them. :laugh:
#111
Space Lynx
Astronaut
demirael: Having FSR 3.0 in 2023 is fine if it looks better than nVidia's mess called Frame Generation.
playing a game on PC is turning into a worse experience than writing a college essay.

frame cap certain games or y happens.
vsync in certain games or x happens.
which is better for this game, dlss or fsr? i have both, let's check a review. oh, this one is better than that one for this game, but vice versa for spiderman.
is RT worth it in this game or not? flick on, flick off, flick on, flick off. i can't decide. fuck.
in order to enable frame generation you also must bla bla bla bla.
oh, you forgot to turn gsync on in windowed mode, it's only turned on in fullscreen by default.
bla bla bla bla
im losing my fucking mind
:roll:
#112
outpt
just kinda thumbed through this and wonder if FSR3 is backward compatible with rdna2?
#113
Space Lynx
Astronaut
outpt: just kinda thumbed through this and wonder if FSR3 is backward compatible with rdna2?
from what i can tell, yes.
#114
GunShot
demirael: Having FSR 3.0 in 2023 is fine if it looks better than nVidia's mess called Frame Generation.
Nah, I do cooked stuff, not bloody rare. :laugh:
#115
AsRock
TPU addict
TheLostSwede: What's with the three copper coloured "blades" in the heatsink?
A nice touch? Maybe there are VRMs around that area? Thinking it's in direct contact with them, or close at least, as it looks a little off.
trsttte: Don't see any comparison to team green, so they probably are far from the performance crown...

... but they are cheaper, use less power and have up-to-date interfaces (DP 2.1, and they still maintain the USB-C VirtualLink). Sounds like a winner (not in marketing, but in practice probably pretty good)



Copper is pretty close to red, speed stripes :D
Yeah, but copper is better, so better than your average stripes, well, unless they are fake haha.
#116
Easo
Why the hell are so many people happy here? Just because these cards cost less than the NVidia ones? The price is still high, as was expected, plus pretty much no one will find them at MSRP.
So again - why?
#117
Space Lynx
Astronaut
Easo: Why the hell are so many people happy here? Just because these cards cost less than the NVidia ones? The price is still high, as was expected, plus pretty much no one will find them at MSRP.
So again - why?
these cards aren't for everyone. these are for high-end 4k high-refresh or 1440p extreme-high-refresh users. if you don't care about playing AAA titles maxed out at either of those settings, then budget cards are for you.

it's no different than someone who drives a 60k mercedes and someone who drives a 30k ford.

they are essentially the same thing, just different experiences. no need to complain, just get what fits your use case. if you don't care about any of that, and only play at 1080p or 60 fps is fine for you, then you are golden with a 6700 xt
#118
uftfa
I'm shocked that most think the cards are priced well. Two years ago, everyone was rightly incensed about the $999 MSRP of the 6900 XT. Not sure if this is just fatigue from the GPU-apocalypse or fallout from nvidia's 4080 pricing insanity.

These cards are further apart than the 6900 XT and 6800 XT were. Just like with the "4080 12GB", AMD saw an opening after nvidia's $1200 4080 and decided to price their 7800 XT at $900 and name it the 7900 XT, while moving the real 7900 XT up to an XTX.

In the end, we're going to end up paying about 40-50% more than the 6800 XT MSRP to get 50-60% extra performance. Minimal change in value. I'm just gonna skip this entire generation and wait two years, hoping NV/AMD have learnt a lesson and prices come back to the real world.
Ravenmaster: Nvidia is fucked if AMD's card can outperform the 4090 at that price.
It won't. AMD's own performance-difference claims relative to the 6950 XT put the 7900 XTX at roughly 10% below the 4090.
#119
Space Lynx
Astronaut
uftfa: I'm shocked that most think the cards are priced well. Two years ago, everyone was rightly incensed about the $999 MSRP of the 6900 XT.
not me. i thought the 6800 xt was also fairly priced. it's the scalpers charging $1500 for that $999 card that was the bs.
#120
Punkenjoy
GunShot: I agree, but AMD did not really reduce its prices, if you think about it. Instead, one of its SKUs stayed neutral in price (the 6900 XT's initial MSRP was $1,000) while the other SKU was jacked up by $250 (the 6800 XT's initial MSRP was $650). It seems like AMD was just betting that many wouldn't notice if they threw a whole bunch of unconfirmed numbers into their slideshow, you know, just like they did with their 7000-series CPUs until the backlash from TRUE gamers exposed them. :laugh:
Well, nobody is doing GPUs or CPUs for charity. This is why we must hope for good competition and not too much price fixing...


It's a bit more obvious when you consider that the 6800 XT had 10% fewer CUs than the top SKU, while the 7900 XT has 12.5% fewer. Also, both the 6800 XT and 6900 XT had the same amount of Infinity Cache, the same amount of memory and the same bandwidth.

That one is a bit fishy to be honest, and I see no point in buying the non-XTX version if you go for a 7900.
#121
uftfa
CallandorWoT: not me. i thought the 6800 xt was also fairly priced. it's the scalpers charging $1500 for that $999 card that was the bs.
Of course, the 6800 XT was fairly priced, I don't think there's much argument about that.

The 6900 XT at $999 however was NOT. Barely a 10% uplift over the 6800 XT for a 50% higher price. Just like 3080 -> 3090 on nvidia's side.
#122
Space Lynx
Astronaut
uftfa: Of course, the 6800 XT was fairly priced, I don't think there's much argument about that.

The 6900 XT at $999 however was NOT. Barely a 10% uplift over the 6800 XT for a 50% higher price. Just like 3080 -> 3090 on nvidia's side.
i understand what you are saying. eh, it is what it is mate. im just happy i can retire from the hardware side of the hobby and enjoy gaming for the next decade. i hope i can get one. those damn bots and third party sellers are the ones who are really ruining everything imo.
#123
Hxx
cvaldes: Presumably RDNA3 will not beat Ada Lovelace in a head-to-head showdown of game graphics benchmarks. Radeon will probably win the performance-per-watt and performance-per-dollar metrics but those just don't have the same on-screen glamour as real-time gameplay with an fps counter in the corner of the screen.
There is no Ada Lovelace competitor released yet (i.e. the 4080). AMD can't run benchmarks against it because they don't have one, and running them against a 4090 won't make sense, since that would show graphs of the 4090 beating their cards in most games. The most logical comparison is against their own previous gen.
Besides, these reveals are meant to create FOMO. Their thinking is "how do I make these cards look their best without running into legal issues?", not so much "how do they stack up against all other options?". Even Nvidia, when they unveiled the 40 series, showed some meaningless graphs where the 4090 was 4x faster lol.
#124
ymdhis
CallandorWoT: these cards aren't for everyone. these are for high-end 4k high-refresh or 1440p extreme-high-refresh users. if you don't care about playing AAA titles maxed out at either of those settings, then budget cards are for you.
The problem is that the budget cards will cost $500-600 too.
#125
GhostRyder
I am interested in the review from here. Honestly considering the RX 7900 XT this round!