Thursday, November 3rd 2022

AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

AMD today announced the Radeon RX 7900 XTX and Radeon RX 7900 XT gaming graphics cards, debuting its next-generation RDNA3 graphics architecture. The two new cards come in at $999 and $899, squarely targeting the $1,000 high-end premium price point.
Both cards will be available on December 13th: not only the AMD reference design, which is sold through AMD.com, but also custom-design variants from the many board partners on the same day. AIBs are expected to announce their products in the coming weeks.

The RX 7900 XTX is priced at USD $999 and the RX 7900 XT at $899, a surprisingly small difference of only $100 for a performance gap that will certainly be larger, probably in the 20% range. Both the Radeon RX 7900 XTX and RX 7900 XT use the PCI-Express 4.0 interface; Gen 5 is not supported with this generation. The RX 7900 XTX has a typical board power of 355 W, or about 95 W less than that of the GeForce RTX 4090. The reference-design RX 7900 XTX uses conventional 8-pin PCIe power connectors, as will custom-design cards when they come out. Some of AMD's board partners will create units with three 8-pin power connectors, for higher out-of-the-box performance and better OC potential. The decision not to use the 16-pin power connector that NVIDIA uses was made "well over a year ago", mostly because of cost, complexity, and the fact that these Radeons don't require that much power anyway.

The reference RX 7900-series board design has the same card height as the RX 6950 XT, but is just 1 cm longer, at 28.7 cm. It is also strictly 2.5 slots thick. There are some white illuminated elements, which are controllable using the same software as on the Radeon RX 6000 series. Both cards feature two DisplayPort 2.1 outputs, one HDMI 2.1a port, and one USB-C port.
This is AMD's first attempt at a gaming GPU made of chiplets (multiple logic dies on a multi-chip module). The company has built MCM GPUs in the past, but those have essentially been the GPU die surrounded by HBM stacks. The new "Navi 31" GPU at the heart of the RX 7900 XTX and RX 7900 XT features seven chiplets—a central large graphics compute die (GCD), surrounded by six memory control-cache dies (MCDs). The GCD is built on the TSMC 5 nm EUV silicon fabrication process—the same one on which AMD builds its "Zen 4" CCDs—while the MCDs are each fabricated on the TSMC 6 nm process.

The GCD contains the GPU's main graphics rendering machinery, including the front-end, the RDNA3 compute units, the Ray Accelerators, the display controllers, the media engine, and the render backends. The GCD physically features 96 RDNA3 Unified Compute Units (CUs), for 6,144 stream processors. All 96 of these are enabled on the RX 7900 XTX. The RX 7900 XT has 84 of the 96 compute units enabled, which works out to 5,376 stream processors. The new RDNA3 compute unit introduces dual-issue stream processors, which essentially double their throughput generation-over-generation. This is a VLIW-like approach; AMD does not double the rated shader count, though, so the full GPU is rated at 6,144 stream processors (96 CUs × 64 shaders per CU, not 128 per CU).
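The shader-count arithmetic above can be sketched in a few lines (an illustration only; the helper function below is made up for this example, not an AMD tool):

```python
# Illustration of the rated shader-count math: AMD rates each RDNA3 CU
# at 64 stream processors and does not double the count for dual-issue.
SHADERS_PER_CU = 64

def stream_processors(enabled_cus: int) -> int:
    """Rated stream-processor count for a given number of enabled CUs."""
    return enabled_cus * SHADERS_PER_CU

print(stream_processors(96))  # RX 7900 XTX: 6144
print(stream_processors(84))  # RX 7900 XT: 5376
```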

Each of the six MCDs contains a 64-bit wide GDDR6 memory interface and 16 MB of Infinity Cache memory. Six of these MCDs add up to the GPU's 384-bit wide memory interface and 96 MB of total Infinity Cache memory. The GCD addresses the 384-bit wide memory interface as one contiguous addressable block, not as 6x 64-bit. Most modern GPUs of the past decade have had multiple on-die memory controllers making up a larger memory interface; "Navi 31" moves these to separate chiplets. This approach reduces the size of the main GCD tile, which will help with yield rates. The Radeon RX 7900 XTX is configured with 24 GB of GDDR6 memory across the chip's entire 384-bit wide memory bus, while the RX 7900 XT gets 20 GB of GDDR6 memory across a 320-bit wide memory bus (one of the MCDs is disabled). The disabled MCD isn't simply missing, though: dummy silicon dies sit in its place to provide stability for the cooler mounting.
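The chiplet arithmetic can be sketched the same way (a hedged illustration; the helper below is invented for this example). Each active MCD contributes 64 bits of bus width and 16 MB of Infinity Cache, which would also put the RX 7900 XT, with five active MCDs, at 80 MB of cache:

```python
# Each MCD carries a 64-bit GDDR6 interface and 16 MB of Infinity Cache;
# the GPU's totals scale with the number of active MCDs.
MCD_BUS_BITS = 64
MCD_CACHE_MB = 16

def memory_config(active_mcds: int) -> tuple:
    """Return (total bus width in bits, total Infinity Cache in MB)."""
    return (active_mcds * MCD_BUS_BITS, active_mcds * MCD_CACHE_MB)

print(memory_config(6))  # RX 7900 XTX: (384, 96)
print(memory_config(5))  # RX 7900 XT: (320, 80)
```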

Each CU also features two AI acceleration components that provide a 2.7x uplift in AI inference performance over SIMD, and a second-generation RT accelerator that provides new dedicated instructions and a 50% uplift in ray tracing performance. The AI cores are not exposed through software; unlike NVIDIA's Tensor Cores, developers cannot use them directly, and they are used exclusively by the GPU's internal engines. Later today AMD will give us a more technical breakdown of the RDNA3 architecture.
For the RX 7900 XTX, AMD is broadly claiming an up to 70% increase in traditional raster 3D graphics performance over the previous-generation flagship RX 6950 XT at 4K Ultra HD native resolution, and an up to 60% increase in ray tracing performance. These gains should be enough to catch the RTX 4080, but AMD was clear that it is not targeting RTX 4090 performance, which comes at a much higher price point, too.
AMD attributes its big 54% generational performance-per-Watt gain to an asynchronous clock-domain technology that runs the various components of the GCD at different frequencies to minimize power draw. This seems similar in concept to the "shader clock" of some older NVIDIA architectures.
AMD also announced FSR 3.0, the latest generation of its performance-enhancement technology, now featuring Fluid Motion. This is functionally similar to DLSS 3 Frame Generation, promising a 100% uplift in performance at comparable quality, essentially because the GPU generates every alternate frame without involving its graphics rendering pipeline.
The new dual independent media-acceleration engines enable simultaneous encode and decode for AVC and HEVC formats, hardware-accelerated encode and decode for AV1, and AI-accelerated enhancements. The new AMD Radiance Display Engine introduces native support for DisplayPort 2.1, with 54 Gbps of display link bandwidth and 12 bpc color. This enables resolutions of up to 8K @ 165 Hz or 4K @ 480 Hz over a single cable.
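As a rough sanity check (an illustration, not AMD's published math), the raw uncompressed bandwidth of those display modes far exceeds the 54 Gbps link, so they rely on Display Stream Compression; real link budgets also account for blanking intervals and encoding overhead, which this sketch ignores:

```python
def raw_gbps(width: int, height: int, hz: int, bpc: int) -> float:
    """Raw uncompressed video bandwidth in Gbit/s (3 color components;
    blanking and encoding overhead ignored)."""
    return width * height * hz * bpc * 3 / 1e9

print(round(raw_gbps(7680, 4320, 165, 10), 1))  # 8K @ 165 Hz, 10 bpc: ~164 Gbit/s
print(round(raw_gbps(3840, 2160, 480, 10), 1))  # 4K @ 480 Hz, 10 bpc: ~119 Gbit/s
```

Both figures are well above 54 Gbit/s, which is why DSC is part of the DisplayPort 2.1 picture for these extreme modes.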
The "Navi 31" GPU in its full configuration has a raw compute throughput of 61 TFLOPs, compared to 23 TFLOPs of the RDNA2-based Navi 21 (a 165% increase). The shader and front-end of the GPU operate at different clock speeds, with the shaders running at up to 2.30 GHz, and the front-end at up to 2.50 GHz. This decoupling has a big impact on power-savings, with AMD claiming a 25% power-saving as opposed to running both domains at the same 2.50 GHz clock.
AMD claims the Radeon RX 7900 XTX offers a 70% performance increase over the RX 6950 XT.

The complete slide-deck follows.

336 Comments on AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

#26
dragontamer5788
mb194dc: Reviews will be interesting for performance. Can't trust the slides too much.
Looks like good Rasterization, bad Raytracing, mediocre compute? There's probably a good market for this.
#27
Space Lynx
Astronaut
dragontamer5788: Looks like good Rasterization, bad Raytracing, mediocre compute? There's probably a good market for this.
the only market for me. 165hz 165 fps ultra setting in red dead 2 with no frame drop. here i come.

ultra smooth gameplay baby. fuck RT
#28
natr0n
xt and xtx bring back memories of yesteryear
#29
Denver
Ravenmaster: Nvidia is fucked if AMD's card can outperform the 4090 at that price.


Comparing the techpowerup numbers and the performance gain announced in some titles, the conclusion is that the 7900XTX is only about 10% slower than the 4090 in CB2077 and WD Legion.

Of course, the methodologies must be different, making the comparison invalid, but it is already a good speculative basis. I also suspect that AMD could tie or beat the 4090 by pushing the TDP far beyond the efficiency curve, but that would result in a very small advantage and the design would have to be more robust... I'm happy with the path AMD chose; it must be the most surprising release in recent years.
#30
btk2k2
Denver: Comparing the techpowerup numbers and the performance gain announced in some titles, the conclusion is that the 7900XTX is only about 10% slower than the 4090 in CB2077 and WD Legion.

Of course, the methodologies must be different, making the comparison invalid, but it is already a good speculative basis. I also suspect that AMD could tie or beat the 4090 by pushing the TDP far beyond the efficiency curve, but that would result in a very small advantage and the design would have to be more robust... I'm happy with the path AMD chose; it must be the most surprising release in recent years.
Pushing past the sane part of the v/f curve is what AIBs are for.

Feels like the 4870 again, only not quite as aggressive on price. Still, the $999 part should be a nice amount faster than the $1,200 4080 16GB in raster.
#31
Crackong
2 x 8pin
They actually pointed that out as a selling point... and it made me laugh.
They did catch up with the news.


#32
Space Lynx
Astronaut
natr0n: xt and xtx bring back memories of yesteryear
don't worry, the bots and third party scalper bandit merchants will make sure none of us have one until summer 2023. it was a fun hour partying watching Lisa tear it up, but back to reality. shit.
#33
ARF
Crackong: 2 x 8pin
They actually pointed that out as a selling point... and it made me laugh.
They did catch up with the news.


The more I look at them (8-pin PCIe power cables), the more I appreciate them :)
#34
Space Lynx
Astronaut
ARF: The more I look at them (8-pin PCIe power cables), the more I appreciate them :)
AMDoes what NvDon't - Common Sense
#35
N3utro
These are VAT-excluded prices, right? Which means around 1200€ tax included in the EU for a 7900 XTX?

If so, then the 7900 XTX would cost 300€ less than a 16GB 4080 while being more powerful. That's impressive.
#36
oxrufiioxo
Impressive rasterized performance
Good price
Good power efficiency
Meh RT

The only difference this generation, it seems, is that the 4080 is way more gimped vs the 4090 than the 3080 was vs the 3090. Otherwise there's not much difference between the two top cards, other than Nvidia actually improving RT performance even further, while RDNA3 only seems to have gotten a 1:1 increase vs rasterized performance.

Still not appealing to me even at 600 usd cheaper but hopefully reviews paint a better picture than amd marketing slides.

The cheaper 7900XT looks like it was originally going to be the 7800XT but they wanted to increase the price by 250 usd...
#37
Ravenas
I'm impressed by this card upfront: the size and the connectors. AMD is asking me for 25 more Watts, rather than 125 or 275, depending on which BIOS you're using on the 4090. The 4090 seems rather large in comparison.

Price/performance and power consumption are the two main focuses. A 10 FPS lower average and saving $600 would be a win to me.
#38
Upgrayedd
CallandorWoT: the only market for me. 165hz 165 fps ultra setting in red dead 2 with no frame drop. here i come.

ultra smooth gameplay baby. fuck RT
Why fuck RT?
#39
dir_d
Ravenmaster: Nvidia is fucked if AMD's card can outperform the 4090 at that price.
They don't have to; they just really need to be within 15% of Nvidia at 4k, and it's a HUGE win.
#40
medi01
Looks about 15% slower than 4090 (as expected).

Perf is right where expected from 20% shader bump.

And, it seems MCM design did work, so competitor will either need to kiss goodbye to a chunk of market, or kiss goodbye to planned margins.
#41
Space Lynx
Astronaut
Upgrayedd: Why fuck RT?
because it's no different than the Nvidia PhysX days. PhysX looks great in games, true, I loved it in Batman, but it made the fps so low I couldn't enjoy the game.

RT is no different. i just want smooth gameplay. just give me that first, then work on these additional things.
#42
medi01
Upgrayedd: Why fuck RT?
Because the only reliable result from switching it on is "it tanks your FPS".
As for visuals, even proponents of it go into "oh, it's because games didn't implement it right..."

Years after introduction RT remains a clumsy gimmick.
#43
Space Lynx
Astronaut
dir_d: They don't have to; they just really need to be within 15% of Nvidia at 4k, and it's a HUGE win.
nvidia could leave the gaming market and still be fine my dudes. their money comes from AI, healthcare, data analysis, etc. we are chump change to nvidia. which is why they no longer cater to us and only the ultra rich. even though without decades of support from gamers, they wouldn't be where they are today.

funny how life works eh
#44
dir_d
CallandorWoT: nvidia could leave the gaming market and still be fine my dudes. their money comes from AI, healthcare, data analysis, etc. we are chump change to nvidia. which is why they no longer cater to us and only the ultra rich. even though without decades of support from gamers, they wouldn't be where they are today.

funny how life works eh
AMD has CDNA for that as well; they just don't make giant presentations about it like Nvidia does. Granted, Nvidia is a bigger player in the AI scene, but they aren't the only one.
#45
EatingDirt
Toss: I predict: nvidia faster up to 20% at 4k, and slower in 1080p.
Ravenmaster: Nvidia is fucked if AMD's card can outperform the 4090 at that price.
From the numbers AMD provided (~1.7x faster than the current-gen flagship), the 7900 XTX is about 13% slower than the 4090 in rasterization performance. I assume that's why they didn't actually show a comparison with the 4090. Looks like whatever they attempted to do to optimize their raytracing path didn't work, as it looks flat across the board, which is certainly disappointing from my point of view.
#46
nikoya
willcf15: Assuming the 7900XT is indeed within shooting range of the 4080 16GB (which it seems like it should be based on silicon) and cards are available for MSRP, I'll be switching teams. Even if NVidia backs off their insane pricing, I'll be going AMD on principle because it seems like they haven't lost touch with reality (both with pricing and power draw). The cost difference will nearly cover a shiny new Freesync monitor, and the 6-year-old Gsync one I'm using now will become a long-needed secondary. Well played AMD!
You won't regret it.
I prefer by far the AMD Software interface over the NVidia settings tab + GeForce Experience. I think it is much, much better.

- AMD: everything is under one piece of software, even tuning, recording, etc.
- NVidia: one old interface for settings, then GeForce Experience, which needs a login; I hated that.
- NVidia also needs Afterburner + RivaTuner.
That's basically four different interfaces/programs to manage. It was fun at the beginning.

Now I prefer the comfort/clarity of AMD; it's loaded with hundreds of settings, which I love, easy to navigate, and it looks super nice.

Then regarding bugs and crashes... over a year and a half on AMD, I never had any noticeable crash that I can remember.

For all these reasons, if the perf/price is on par with NVidia, I'll 100% stay with AMD.
#47
TheoneandonlyMrK
Ahhh, my GPU, hopefully.

Still a bit expensive for my liking, but given the market and specifications, probably fair. I'm already looking round for something to sell lol.

Do I Neeeeed a car?! I do have two lungs ATM, I suppose. Kidney, anyone...
#48
cvaldes
N3utro: These are VAT-excluded prices, right?
US MSRP never includes sales tax because the latter depends on location.

AMD did not provide pricing for anywhere other than the USA today.
#49
oxrufiioxo
CallandorWoT: nvidia could leave the gaming market and still be fine my dudes. their money comes from AI, healthcare, data analysis, etc. we are chump change to nvidia. which is why they no longer cater to us and only the ultra rich. even though without decades of support from gamers, they wouldn't be where they are today.

funny how life works eh
They are so far ahead in mindshare they can afford to price much higher than AMD.

If AMD could price their cards the same as Nvidia, they would. They are not giving us a 7900XTX at 1000 usd out of the kindness of their hearts.

Look at their CPU division: the minute they caught up to Intel, prices went to $h1+.

The same will happen if they ever have overall parity with nvidia on the gpu side of things.
#50
Space Lynx
Astronaut
well i sure as fuck am not spending $1200 on a rtx 4080. so here is to hoping i can beat the fucking bots on december 13th.

if i do, i'm retiring from the hobby for a solid 10 years. fuck the noise, time to game.