Thursday, November 3rd 2022

AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

AMD today announced the Radeon RX 7900 XTX and Radeon RX 7900 XT gaming graphics cards, debuting its next-generation RDNA3 graphics architecture. The two new cards are priced at $999 and $899, squarely targeting the $1,000 high-end premium price point.
Both cards will be available on December 13th: not only AMD's reference design, which is sold through AMD.com, but also custom-design variants from the many board partners, launching the same day. AIBs are expected to announce their products in the coming weeks.

The $100 gap between the two cards is surprisingly small given a performance difference that will certainly be larger, probably in the 20% range. Both the Radeon RX 7900 XTX and RX 7900 XT use the PCI-Express 4.0 interface; Gen 5 is not supported with this generation. The RX 7900 XTX has a typical board power of 355 W, about 95 W less than that of the GeForce RTX 4090. The reference-design RX 7900 XTX uses two conventional 8-pin PCIe power connectors, as will custom-design cards when they come out; AMD's board partners will also build units with three 8-pin connectors, for higher out-of-the-box performance and better overclocking potential. The decision not to use the 16-pin power connector that NVIDIA uses was made "well over a year ago," mostly because of cost, complexity, and the fact that these Radeons don't require that much power anyway.
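As a rough sanity check (my arithmetic, using standard PCIe connector ratings rather than AMD-supplied figures), the connector budget comfortably covers the rated board power, and triple 8-pin partner cards get plenty of headroom:

```python
# Back-of-the-envelope power budget for the reference RX 7900 XTX.
# Connector limits are the standard PCIe specification ratings.
PCIE_8PIN_W = 150   # each 8-pin PCIe connector is rated for 150 W
PCIE_SLOT_W = 75    # the x16 slot itself supplies up to 75 W

board_power_w = 355                            # typical board power, per AMD
budget_2x8pin = 2 * PCIE_8PIN_W + PCIE_SLOT_W  # reference design
budget_3x8pin = 3 * PCIE_8PIN_W + PCIE_SLOT_W  # triple-connector partner cards

print(f"2x 8-pin budget: {budget_2x8pin} W vs. {board_power_w} W TBP")  # 375 W
print(f"3x 8-pin budget: {budget_3x8pin} W")                            # 525 W
```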

The reference RX 7900-series board design has the same card height as the RX 6950 XT, but is just 1 cm longer, at 28.7 cm, and is strictly 2.5 slots thick. There are some white illuminated elements, which are controllable using the same software as on the Radeon RX 6000 series. Both cards feature two DisplayPort 2.1 outputs, one HDMI 2.1a port and one USB-C port.
This is AMD's first attempt at a gaming GPU made of chiplets (multiple logic dies on a multi-chip module). The company has built MCM GPUs in the past, but those have essentially been the GPU die surrounded by HBM stacks. The new "Navi 31" GPU at the heart of the RX 7900 XTX and RX 7900 XT features seven chiplets—a central large graphics compute die (GCD), surrounded by six memory control-cache dies (MCDs). The GCD is built on the TSMC 5 nm EUV silicon fabrication process—the same one on which AMD builds its "Zen 4" CCDs—while the MCDs are each fabricated on the TSMC 6 nm process.

The GCD contains the GPU's main graphics rendering machinery, including the front-end, the RDNA3 compute units, the Ray Accelerators, the display controllers, the media engine and the render backends. The GCD physically features 96 RDNA3 unified compute units (CUs), for 6,144 stream processors, all of which are enabled on the RX 7900 XTX. The RX 7900 XT has 84 of the 96 compute units enabled, which works out to 5,376 stream processors. The new RDNA3 compute unit introduces dual-issue stream processors, which essentially double throughput generation-over-generation. This is a VLIW-style approach; AMD does not double the rated shader count, though, so the full GPU is rated at 6,144 stream processors (96 CUs × 64 shaders per CU, not 128).
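As a quick sketch of that arithmetic:

```python
# Rated shader-count arithmetic for "Navi 31": AMD counts 64 stream
# processors per CU even though each stream processor can dual-issue.
SP_PER_CU = 64

print(96 * SP_PER_CU)  # 6144 stream processors (RX 7900 XTX, all CUs enabled)
print(84 * SP_PER_CU)  # 5376 stream processors (RX 7900 XT, 84 of 96 CUs)
```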

Each of the six MCDs contains a 64-bit wide GDDR6 memory interface and 16 MB of Infinity Cache memory. The six MCDs add up to the GPU's 384-bit wide memory interface and 96 MB of total Infinity Cache, and the GCD addresses the 384-bit bus as one contiguous block, not as 6x 64-bit. Most modern GPUs of the past decade have had multiple on-die memory controllers making up a larger memory interface; "Navi 31" moves these to separate chiplets, which shrinks the main GCD tile and should help with yield rates. The Radeon RX 7900 XTX is configured with 24 GB of GDDR6 memory across the chip's entire 384-bit wide memory bus, while the RX 7900 XT gets 20 GB of GDDR6 across a 320-bit wide bus (one of the MCDs is disabled). The disabled MCD isn't missing, though; dummy silicon dies take its place to provide stability for the cooler mounting.
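As a quick sketch of how the per-MCD figures add up for each card (the RX 7900 XT's 80 MB of Infinity Cache follows from its disabled MCD):

```python
# Memory-subsystem arithmetic for the MCD-based "Navi 31" designs.
MCD_BUS_BITS = 64   # each MCD carries a 64-bit GDDR6 interface
MCD_CACHE_MB = 16   # ...plus 16 MB of Infinity Cache

for card, mcds, vram_gb in (("RX 7900 XTX", 6, 24), ("RX 7900 XT", 5, 20)):
    print(f"{card}: {mcds * MCD_BUS_BITS}-bit bus, "
          f"{mcds * MCD_CACHE_MB} MB Infinity Cache, {vram_gb} GB GDDR6")
# RX 7900 XTX: 384-bit bus, 96 MB Infinity Cache, 24 GB GDDR6
# RX 7900 XT: 320-bit bus, 80 MB Infinity Cache, 20 GB GDDR6
```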

Each CU also features two AI acceleration components that provide a 2.7x uplift in AI inference performance over plain SIMD, and a second-generation RT accelerator that brings new dedicated instructions and a 50% uplift in ray tracing performance. The AI cores are not exposed through software; developers cannot use them directly (unlike NVIDIA's Tensor Cores), as they are used exclusively by the GPU's internal engines. Later today, AMD will give us a more technical breakdown of the RDNA3 architecture.
For the RX 7900 XTX, AMD is broadly claiming up to a 70% increase in traditional raster 3D graphics performance over the previous-generation flagship RX 6950 XT at native 4K Ultra HD resolution, and up to a 60% increase in ray tracing performance. These gains should be enough to catch the RTX 4080, but AMD was clear that it is not targeting RTX 4090 performance, which comes at a much higher price point, too.
AMD attributes its sizable 54% generational performance-per-Watt gain to an asynchronous clock domain technology that runs the various components of the GCD at different frequencies to minimize power draw. This seems similar in concept to the "shader clock" on some older NVIDIA architectures.
AMD also announced FSR 3.0, the latest generation of its performance enhancement, featuring Fluid Motion technology. This is functionally similar to DLSS 3 Frame Generation, promising a 100% performance uplift at comparable quality, essentially because the GPU generates every alternate frame without involving its graphics rendering pipeline.
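As a minimal sketch of the timing arithmetic behind that doubling (purely illustrative; this is not AMD's algorithm):

```python
# Inserting one generated frame halfway between each pair of rendered frames
# doubles the presented frame rate. Illustrative timing math only.
rendered_fps = 60
ft_ms = 1000 / rendered_fps                     # ~16.7 ms per rendered frame

rendered = [i * ft_ms for i in range(4)]        # timestamps from the renderer
generated = [t + ft_ms / 2 for t in rendered]   # interpolated midpoints
presented = sorted(rendered + generated)

print(f"{rendered_fps} rendered FPS -> "
      f"{1000 / (presented[1] - presented[0]):.0f} presented FPS")
# 60 rendered FPS -> 120 presented FPS
```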
The new dual independent media-acceleration engines enable simultaneous encode and decode for the AVC and HEVC formats, hardware-accelerated encode and decode for AV1, and AI-accelerated enhancements. The new AMD Radiance Display Engine introduces native support for DisplayPort 2.1, with 54 Gbps of display link bandwidth and 12 bpc color. This enables resolutions of up to 8K @ 165 Hz or 4K @ 480 Hz with a single cable.
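Computed naively, those modes need far more than the link's 54 Gbps, which is why such headline figures lean on Display Stream Compression; a quick sketch (my math, ignoring blanking and link-encoding overhead, which only widen the gap):

```python
# Display-link arithmetic for the quoted DisplayPort 2.1 modes.
LINK_GBPS = 4 * 13.5   # UHBR13.5: four lanes x 13.5 Gbit/s = 54 Gbit/s

def raw_gbit_per_s(w, h, hz, bpc=12, channels=3):
    """Uncompressed pixel-data rate, without blanking or encoding overhead."""
    return w * h * hz * bpc * channels / 1e9

for mode, w, h, hz in (("4K @ 480 Hz", 3840, 2160, 480),
                       ("8K @ 165 Hz", 7680, 4320, 165)):
    print(f"{mode}: ~{raw_gbit_per_s(w, h, hz):.0f} Gbit/s uncompressed "
          f"vs. {LINK_GBPS:.0f} Gbit/s link")
# 4K @ 480 Hz: ~143 Gbit/s uncompressed vs. 54 Gbit/s link
# 8K @ 165 Hz: ~197 Gbit/s uncompressed vs. 54 Gbit/s link
```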
The "Navi 31" GPU in its full configuration has a raw compute throughput of 61 TFLOPs, compared to 23 TFLOPs of the RDNA2-based Navi 21 (a 165% increase). The shader and front-end of the GPU operate at different clock speeds, with the shaders running at up to 2.30 GHz, and the front-end at up to 2.50 GHz. This decoupling has a big impact on power-savings, with AMD claiming a 25% power-saving as opposed to running both domains at the same 2.50 GHz clock.


336 Comments on AMD Announces the $999 Radeon RX 7900 XTX and $899 RX 7900 XT, 5nm RDNA3, DisplayPort 2.1, FSR 3.0 FluidMotion

#301
TheoneandonlyMrK
RandallFlagg: This is actually the thing people are arguing about. AMD pretty clearly has ceded the high end and will have nothing to compete beyond a 4080 16GB - and maybe not even that.

For this reason the price comparisons vs the 4090 are also fallacies. A $999 AMD flagship card that probably lands below the $1199 4080 16GB in performance. They are in the same ballpark in price, given that Nvidia has always commanded a 10-20% premium.

Not that it matters to 99% of folks, who are not getting a 4090 anyway. Most of the people arguing here have older mid or upper-midrange GPUs (now low end) and probably aren't in the market for a new one anyway, so they're just arguing about something they aren't going to buy from either corporation.
The irony of you spouting all that while casually making ridiculous statements, calling others out for bias while being biased yourself. No one knows which of those two GPUs wins, and no 4080 data is out, so it's hyperbolic bias at that.


Then to read your following posts, pure comedy.

I'll await reviews and the 4080, then debate which won.
#304
fb020997
wheresmycar: The 6800 XT doesn't cut it for me... I prefer a wider performance gap over my existing card for a couple of graphically intense games which I'm invested in (1440p/144 Hz). The 7800** sounds like a plan for a devised ~$800 budget, but unfortunately it wasn't announced for release, which sucks. Anyway, it looks like the GPU upgrade plan of action has shifted from 2022 to 2023. Actually, I'm kinda proud of myself too... finally drew a line by setting a budget with realistically meaningful performance targets in mind.
I'm in the same boat as you. My Vega 64 is showing her age, so I'm wanting a big upgrade. I'm also targeting your same resolution (1440p 144 Hz), as a new 27" display is on the upgrade list. My only hope is that the 78xx GPUs won't be much more than 650-700€, as I also need a waterblock (after 5 years of a full custom loop, I'm never going back to air).
BTW, my Vega allowed me to play lots of great games, very proud of that purchase.
#306
Nopa
ModEl4: Is this the reason?


Just kidding!
The below is serious:
  • New AMD Radiance Display™ Engine – Provides 12 bit-per-channel color for up to 68 billion colors and higher refresh rate displays compared to AMD RDNA 2 architecture and includes support for DisplayPort 2.1 and HDMI 2.1a.
Radeon IQ is gonna be insane, especially when gaming on a 4K QD-OLED.
#307
mechtech

DisplayPort 2.1

That's interesting. One would have guessed at ver 2.0.
#308
Ravenas
Ravenas: I'm impressed by this card upfront: the size and the connectors. AMD is asking me for 25 more Watts, rather than the 125 or 275 more (depending on which BIOS you're using) of the 4090. The 4090's size seems rather large in comparison.

Price/performance and power consumption are my two main focuses. A 10 FPS lower average while saving $600 would be a win to me.
On the flip side, I'm really disappointed with these releases from Nvidia and AMD regarding the lack of PCIe 5.0 support. That could have been something AMD rubbed in Nvidia's face at this launch. Not many reviewers list this as a negative, unfortunately.

I think I'm more disappointed in Nvidia here, because the 4090 is supposed to be an ultra-high-end GPU priced at $1,600 yet lacks PCIe 5.0 and DP 2.0+, whereas the AMD 7900 XTX is a 4080 competitor.

What incentivizes me to jump from X570 to X670/Z790 when the benefits of DDR5 are arguably negligible in 4K gaming? Obviously not PCIe 5.0 support. The only reason I'm even considering these launches is that they edge me closer to 4K 120 Hz/240 Hz gaming with consistent 120/240 FPS in the games I play most frequently.

RTX 4090 & 53 Games: Core i9-13900K vs Ryzen 7 5800X3D Review | TechPowerUp
#309
Nopa
PCIe 5.0 x16 may help gain 1-5 FPS over PCIe 4.0 x16.
DP 2.0 is really a must given that we're in Q4 2022.
#310
Ravenas
Nopa: PCIe 5.0 x16 may help gain 1-5 FPS over PCIe 4.0 x16.
DP 2.0 is really a must given that we're in Q4 2022.
That's kind of just oversimplifying the 5.0 potential. Hell, take away Resizable BAR too while we're at it; we're only getting 5 extra FPS, and who cares if we have a platform that supports it and we paid money for it. The proposition of me giving a retailer $1,600 out of my account, generation over generation, without seeing full platform support is almost equivalent to a slap in the face. It makes me wonder how long these cards have been inventoried, waiting for 3090 Ti inventories to clear.

Likely since early 2022, prior to the Z790 launch, hence no PCIe 5.0 support. What excuse does AMD have? Probably the same.

TSMC's 4nm process coming ahead of schedule - GSMArena.com news
#311
RandallFlagg
TheoneandonlyMrK: The irony of you spouting all that while casually making ridiculous statements, calling others out for bias while being biased yourself. No one knows which of those two GPUs wins, and no 4080 data is out, so it's hyperbolic bias at that.


Then to read your following posts, pure comedy.

I'll await reviews and the 4080, then debate which won.
My bias? I guess hating AMD is why I have an AMD GPU?

You guys can't even get through a couple of simple sentences without projecting your bad assumptions, and that is a direct result of your own biases.

What I'm saying here is that AMD has ceded the high end. That's obvious; they even said it themselves, indirectly, by saying their highest-end card is a competitor to the 4080.

If the 7900 XTX competes with a 4080, what does the 7800 XT compete with? A 4070? And the 7700 XT? A 4060? That's all very likely.
#312
Ravenas
RandallFlagg: My bias? I guess hating AMD is why I have an AMD GPU?

You guys can't even get through a couple of simple sentences without projecting your bad assumptions, and that is a direct result of your own biases.

What I'm saying here is that AMD has ceded the high end. That's obvious; they even said it themselves, indirectly, by saying their highest-end card is a competitor to the 4080.

If the 7900 XTX competes with a 4080, what does the 7800 XT compete with? A 4070? And the 7700 XT? A 4060? That's all very likely.
I don't think they've really ceded it for the future, but certainly for this launch generation, given inflation, lack of consumer confidence due to interest rates, and, last but not least, just starting off with their chiplet design. I mean, we're really looking at the birth of what could be a tidal wave of success similar to Ryzen.

I think they expect gamers to react to saving $600, with performance likely better than the 4080 and probably ~5-10% less than the 4090.
#313
Easo
ARF: No.

Current pricing in Germany:

Radeon RX 6400 - 169.00
Radeon RX 6500 XT - 199.00
Radeon RX 6600 - 261.99

Radeon RX 6600 XT - 339.00
Radeon RX 6650 XT - 339.00
Radeon RX 6700 XT - 439.00
Radeon RX 6750 XT - 499.90
Radeon RX 6800 - 559.00
Radeon RX 6800 XT - 635.90
Radeon RX 6900 XT - 748.00

Radeon RX 6950 XT - 899.00

The majority of people will buy up to the RX 6650 XT, which goes for 339 as of now, but its price should spiral downward because it's only really good for 1080p.
How nice for Germany. I, however, do not live in Germany. Pretty much none of the "super price cuts" ever arrived here in Latvia (and not only Latvia, if you read the comments here from time to time).
Plus - we are talking about the new gen. How much do you think they are going to cost...?
#314
gffermari
RandallFlagg: If the 7900 XTX competes with a 4080, what does the 7800 XT compete with? A 4070? And the 7700 XT? A 4060? That's all very likely.
I think AMD's lineup is made to fill the gaps between nVidia's GPUs rather than compete with a specific series. They would do that if they were competitive, like they do with their CPUs.

But now they've introduced the XTX variant, so the lineup differs from the previous 6000 series.

The only thing that may save the 7000-series GPUs is software ray tracing, and whether they manage to collaborate with Epic (Unreal Engine 5).
If they leave nVidia to put their finger in every AAA title (...to push developers to use the hardware RT cores), like Cyberpunk, Control, Metro Exodus, Alan Wake 2, etc., they are doomed.
#315
W1zzard
ModEl4: Is this the reason?


Just kidding!
The below is serious:
  • New AMD Radiance Display™ Engine – Provides 12 bit-per-channel color for up to 68 billion colors and higher refresh rate displays compared to AMD RDNA 2 architecture and includes support for DisplayPort 2.1 and HDMI 2.1a.
It's not. I saw this slide only after making the writeup. This slide is still under NDA btw, so I can't publish any info from it.
#316
medi01
The curious bit in all this noise is that Samsung 8 nm was not as bad as claimed after all.
RandallFlagg: You guys can't even get through a couple of simple sentences without projecting your bad assumptions
That's rich coming from someone who doubted the 7900 XTX could compete with the other 4080, the one NVIDIA hasn't "unlaunched" yet.
RandallFlagg: If the 7900 XTX competes with a 4080, what does the 7800 XT compete with?
I doubt even NV could answer that question.

You assume that GPU manufacturers are well aware in advance of what is going on in the other camp.
If that were the case, the 4080 unlaunch would not have happened.
Nor would the "drop a tier" on last-gen cards (the 3080 was supposed to wield 20 GB, be called at least a 3080 Ti, and cost more).

AMD just happens to have that product as fallout from the 7900 XTX, and that is pretty much it.


The "AMD tried to roll out halo, but failed" is so strange, given AMD's choices.
GDDR6, not x.
522mm2 of N5+N6 total.
Oh, also N5 not N4 (mkay, N4 is essentially enhanced N5, but it's still better eh?)

How come someone could cook that up and prepare to trump 610mm2 N4 chip, with GDDR6x and special power connector/crazy TDP?
#317
Vayra86
RandallFlagg: My bias? I guess hating AMD is why I have an AMD GPU?

You guys can't even get through a couple of simple sentences without projecting your bad assumptions, and that is a direct result of your own biases.

What I'm saying here is that AMD has ceded the high end. That's obvious; they even said it themselves, indirectly, by saying their highest-end card is a competitor to the 4080.

If the 7900 XTX competes with a 4080, what does the 7800 XT compete with? A 4070? And the 7700 XT? A 4060? That's all very likely.
But that statement is categorically wrong, because the high end comprises more than an x90 card, which apart from last gen never even existed as a high-end segment; the last x90 we had was the dual-GPU GTX 690, another card nobody in their right mind actually bought. And better yet, not even Nvidia truly targeted recent x90s at gamers: in Ampere, they targeted "the creator". It's a segment above high end, in price, in VRAM, in everything, and it realistically falls between their pro lineup and GeForce; it has been that way since GK110 (Titan).

The halo product is getting waaay too much credit here for determining where the market is. The actual market is definitely NOT at a $1,600 card that carves out its own extra-special epeen segment at a power target that melts connectors.

Who cares about the x90? If AMD plays ball with the 4080 16GB competition, they are alive and kicking and actively competing in the high end. And Nvidia's x80 offerings are 50% cancelled (because they'd look utterly ridiculous not rebadging the 12GB x80 and they knew it), while the other 50% is so far below the x90 that it makes you wonder what happened there, alongside specs relative to Ampere that are nothing to write home about. It literally only has a perf/W advantage going for it, alongside suddenly decent amounts of VRAM; that was yet another complete misfire in Ampere, as Nvidia had to re-release the entire stack with other capacities to satisfy demand.

Oh yeah, they win some epeen points on RT for having roughly 20-30% extra perf. Again... relevance to a large target market (yes, target... despite what you've said about what people are/aren't in the market for; that is your bias right there) is slim at best.

It really does depend on what you focus on in a GPU. I just want another x70~x80-range piece of silicon that runs all my shit for the next 5-7 years, much like the x80 I have now has been doing. RT? If it's there, yay; if it's not, also yay. Honestly, a GPU is a package deal of feature set, price, perf and overall quality. Nvidia's GPU quality has been taking a nosedive lately, and I'm supposed to hang a spiderweb of untrustworthy cabling off it to make it functional.

Bias. We all have it; it'd be good to consider that. When I see people who take the x90 as the be-all, end-all metric of perf in a new gen, I see people who have completely lost the plot, nothing else. Slaves to commerce, blinded by marketing. See, that marketing, too, is interpreted with bias from all ends.

AMD did cede the high end when they made Polaris and trailed Nvidia by 1.5~2 generations. Now they trail exclusively on RT performance, which they're not even making a buck off themselves; after all, AMD controls where gaming TRULY goes by controlling console performance. You might wanna re-evaluate your view of market direction here and look at actual gaming share. The PCMR might think it has market power on the gaming front, but honestly? That power is exercised NOT in the RT push... but in the games that run on toasters: indie. Yet another matter of perspective. Are you a real gamer or a hardware/spec whore? ;) The supposed high end where "AMD isn't competing", according to you, is currently that 2-3% of the market you can easily miss. And even they are tied to console ports from their non-competitor regardless. Who's kidding who...? One thing is certain: the 4090 owner is at the very bottom of that food chain, money and fool parted, on to the next one; 750 W next time? Why not?

I hope the above clarifies why there are such diametrically opposed takes on what AMD presented here. And there isn't a right or wrong; the market decides. But what I do see here is an AMD that is fully competitive again on GPUs, across arguably the whole stack, top to bottom.
#318
ARF
Easo: How nice for Germany. I, however, do not live in Germany. Pretty much none of the "super price cuts" ever arrived here in Latvia (and not only Latvia, if you read the comments here from time to time).
Plus - we are talking about the new gen. How much do you think they are going to cost...?
I also don't live in Germany but Germany is the only option for a purchase. I will find a regular traveller who will bring the GFX to me ;)

The prices are:

RX 7900 XTX 24 GB - 1200 after 19% VAT
RX 7900 XT 20 GB - <1100 after 19% VAT
#319
ModEl4
W1zzard: It's not. I saw this slide only after making the writeup. This slide is still under NDA btw, so I can't publish any info from it.
I know, I was just kidding.
The link I quoted in my response below the slide was from AMD's official site (it was live when the event ended), so there is no NDA regarding this info.
#320
W1zzard
ModEl4: I know, I was just kidding.
The link I quoted in my response below the slide was from AMD's official site (it was live when the event ended), so there is no NDA regarding this info.
Ah, I missed the 2nd link. Yeah, that's where I saw it, the press release.
#321
Nopa
Garrus: Nobody can help you; you can believe whatever you want. AMD showed off at least 50 percent faster-than-6950 XT performance today. That is not 10 percent faster than the 4080, the card that is less than 60 percent of the 4090; it could be 30 percent faster than it.



Exactly, there is no point in showing comparisons against a card that costs 60 percent more. They need to wait for the RTX 4080 to release before they can release marketing materials showing comparisons.


They showed charts claiming 50-70 percent faster performance. What more do you want? "Didn't show anything solid"... um, what?


The 4080 is the most cut-down '80 card released in the last decade; the 3080 was the least cut down. I think you will be shocked at how bad the 4080 really is compared to your 3080-set expectations.
10% faster than the 4090 in both rasterization and RT for $1,000 would have me jump and pre-order immediately. It's a dream now; AMD would charge $1,200-1,500 for such a card.
#322
Warrior24_7
AnarchoPrimitiv: A 70% performance increase for the same price sounds good... at $1,699 MSRP, do people think the 4090 is 70% faster than the 7900 XTX, justifying the 70% increase in price?
Yep!
#323
TheoneandonlyMrK
Warrior24_7: Yep!
Well, we'll have a better idea in a month or so, but I personally think: Nope! Preliminarily, anyway.

And it's not 70%; who knows what it will be, but with markup it is more than double now.

Time will tell.
#324
Warrior24_7
Everybody is focused on $1,699 when the card sold out at $2,500-$3,000 scalped! AMD doesn't have a competitor for this card. It's easy to sit back and wait for your competitor to complete something, because you lack the confidence in your own design, and then release something after the fact.
#325
TheoneandonlyMrK
Warrior24_7: Everybody is focused on $1,699 when the card sold out at $2,500-$3,000 scalped! AMD doesn't have a competitor for this card. It's easy to sit back and wait for your competitor to complete something, because you lack the confidence in your own design, and then release something after the fact.
Yes, that's how graphics cards go: AMD waited for Nvidia to eventually release the 4090, late, and then set about making a non-competing card to beat its smaller, unreleased 4080?!?

You correctly state the 4090 sold out, but then again, if it was scalped, do you happen to know if they cleared their stock? No, I guess not.

Or do you know how many 4090s hit the market? Was it a significant amount?!