Tuesday, October 25th 2022

AMD Radeon RX 7900 XTX to Lead the RDNA3 Pack?

AMD is reportedly bringing back the "XTX" brand extension to the marketing names of its upcoming Radeon RX 7000-series SKUs. The company had, until now, reserved the "XTX" moniker for internal use, to denote SKUs that max out all hardware available on a given silicon. The RX 7000 series introduces the company's next-generation RDNA3 graphics architecture, and will see the company bring its chiplet packaging design to the client-graphics space. The next-generation "Navi 31" GPU will likely be the first of its kind: while multi-chip module (MCM) GPUs aren't new, this would be the first time multiple logic chips sit on a single package in a client GPU. AMD has plenty of experience with MCM GPUs, but those paired a single logic chip with surrounding memory stacks. "Navi 31" uses multiple logic chips on a package, which is then wired to conventional discrete GDDR6 memory devices like any other client GPU.

The rumored Radeon RX 7900 XTX features 12,288 stream processors, likely across two logic tiles that contain the SIMD components. These tiles are [for now] rumored to be built on the TSMC N5 (5 nm EUV) foundry process. The Display CoreNext (DCN) and Video CoreNext (VCN) components, as well as the GDDR6 memory controllers, sit on separate chiplets that are likely built on TSMC N6 (6 nm). "Navi 31" has a 384-bit wide memory interface. This is 384-bit and not "2x 192-bit," because the logic tiles don't have memory interfaces of their own, but rely on memory-controller tiles shared between the two logic tiles, much in the same way a dual-channel DDR4 memory interface is shared between the two 8-core CPU chiplets on a Ryzen 9 5950X processor.
The RX 7900 XTX features 24 GB of GDDR6 memory across that 384-bit wide memory interface. This memory ticks at 20 Gbps, which works out to a raw memory bandwidth of 960 GB/s. AMD is also expected to deploy large on-die caches, which it calls Infinity Cache, to further lubricate the GPU's memory sub-system. The most interesting aspect of this rumor is the card's typical board power value of 420 W. Technically, this is in the same league as the 450 W typical graphics power of the GeForce RTX 4090. Since its teaser at the Ryzen 7000-series desktop processor launch event earlier this year, speculation has been rife that AMD will not deploy the 12+4 pin ATX 12VHPWR power connector on its Radeon RX 7000-series GPUs, and the reference-design board likely has up to three conventional 8-pin PCIe power connectors. You have to spare four 8-pin connectors for an RTX 4090 anyway.
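For readers who want to check the math, the raw-bandwidth figures in these rumors follow directly from bus width and data rate. A quick illustrative sketch (not from the source, and the SKU specs plugged in are the rumored ones):

```python
def gddr6_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Raw memory bandwidth in GB/s: every pin of the bus moves
    data_rate_gbps gigabits per second; divide by 8 to get bytes."""
    return bus_width_bits * data_rate_gbps / 8

# Rumored RX 7900 XTX: 384-bit bus at 20 Gbps -> 960.0 GB/s
print(gddr6_bandwidth_gb_s(384, 20))
# Rumored RX 7900 XT: 320-bit bus at 20 Gbps -> 800.0 GB/s
print(gddr6_bandwidth_gb_s(320, 20))
```

The same formula reproduces the figures quoted for both "Navi 31" SKUs below.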

AMD's second-best SKU based on "Navi 31" is expected to be the RX 7900 XT, with fewer stream processors, likely 10,752. The memory size is reduced to 20 GB, and the memory interface narrowed to 320-bit, which at 20 Gbps memory speed produces 800 GB/s of bandwidth. Keeping with the trend of AMD's second-largest GPU having half the stream processors of the largest one (e.g., "Navi 22" with 2,560 against the 5,120 of "Navi 21"), the "Navi 32" chip will likely have one of these 6,144-SP logic tiles, and a narrower memory interface.
Source: VideoCardz
Add your own comment

95 Comments on AMD Radeon RX 7900 XTX to Lead the RDNA3 Pack?

#26
Jeager
Mack4285Another big brick that draws 350-400W and costs a fortune. I'll pass.
And how do you plan to play games at 8K@240 FPS? (you know, the new standard)
Posted on Reply
#27
Unregistered
I find it funny that while AMD bought ATI, today's AMD looks more like ATI: they dropped the green for red, and the naming scheme. They should've dropped the AMD name for ATI.
dj-electricAs silly as it is, I remember when XTX was a statement. Anyone who has been around the block remembers the X1950 XTX
Yes the queen of DX9.
#28
human_error
dj-electricAs silly as it is, I remember when XTX was a statement. Anyone who has been around the block remembers the X1950 XTX
I have fond memories of my X1950XTX crossfire setup. Think I have the cards in boxes upstairs somewhere.

Nostalgia is real if they go for the name.
Posted on Reply
#29
Legacy-ZA
ZetZet
It's not so far-fetched really, some companies, unlike nVidia, care about their reputation, take EVGA for example.
Posted on Reply
#30
ratirt
Really curious about the performance, and the power draw that comes with it, but mostly about the price, as this one will tell whether AMD is following NV's greedy behavior, or rather trying to reach for dominance, or at least higher market share.
Posted on Reply
#31
Oberon
There is only one "logic die" (i.e. GCD) and six MCDs on Navi 31. Confirmed long ago after it was reported by Angstronomics.
Posted on Reply
#32
Totally
Dirt ChipLooks bad, I agree. Interesting to know under what circumstances it happened, and if it was running under spec.
Guy had literally just bought it and had it running his system for a few hours. Not sure how you'd expect one to run it out of spec.
Dirt ChipBut we also have burned 8-pins and many other connector types as well, so nothing new here.
The 12VHPWR is new and somewhat controversial, so any failure with it makes the news in a flash. No one is interested in burned 8-pins, but they happened, rest assured - LMGTFY.
Every single one of those (I checked) was either due to a bad power supply or continuous mining for a couple of years (probably the PSU again). So yeah, people aren't going to talk about a failure that occurs after years of heavy use/abuse.
Posted on Reply
#34
HBSound
So I guess AMD GPUs will never deal with CUDA? If I understand correctly, that is the main difference between Nvidia and AMD GPUs.
Posted on Reply
#35
Iocedmyself
wolfI'm not certain that's going to be the best metric to try and assume performance from.

The 3090 has 189.8/556 and the 6900XT has 359.7/899.2 - and the 6900XT certainly didn't perform over and above the 3090 relative to those numbers.

In fact, the 6900XT vs 3090, it had 188% GPixel/s, and 161% GTexel/s

This seems like 7900XTX vs 4090, 130% GPixel/s and 184% GTexel/s - weaker in pixel fill rate relatively and significantly vs the older comparison, but stronger in Texture fill rate relatively.

My takeaway? who knows, but I doubt it's a metric you could hang your hat on for comparison.


XFX Radeon RX 7900XTX Merc Speedster 6969 XXX edition :roll:
Actually, the 6900 XT has superior raster performance in a good many titles compared against both the 3090 and 3090 Ti, and it does so while sucking down 150 W less, with 8 GB less of slower RAM and an MSRP $500 lower. Mostly they trade blows, depending on the title, until you get into ray tracing, where Nvidia has a clear lead simply because they have more mature tech, heavily aided by their proprietary DLSS.

If I were going to buy a card for CAD or video editing, I'd snap up a 3080/90/Ti in a heartbeat. But I don't think anyone should be that impressed by what Nvidia, or Intel for that matter, is having to do to cling to their crowns. The planet is dying, and Nvidia is there trying to normalize 600 W for a video card so they can market themselves as the leader in 8K gaming performance. Intel is once again pushing CPU power consumption, up to 350 W this time, so they can claim a 10% performance increase (sometimes), and all they had to do was DOUBLE the power draw. This isn't the direction things are supposed to be moving in.
Posted on Reply
#36
Eva01Master
X1950XTX's MSRP was USD450 according to this.

That's what I would like to see back the most from that moniker, being top of the line in your generation without requiring me to choose between being able to stay relatively well fed or game at the highest possible settings available at the time.
Posted on Reply
#37
dyonoctis
Mack4285Another big brick that draws 350-400W and costs a fortune. I'll pass.
I mean they could make a card that tops out at 230w, but then people would complain about AMD not being competitive at the high-end. This new gen of hardware opened a can of worms: You either find a way to design an arch that is 50% more efficient than the competition on the same node for the same perf, or you have to push the limits to match them.
Posted on Reply
#38
ZetZet
Legacy-ZAIt's not so far-fetched really, some companies, unlike nVidia, care about their reputation, take EVGA for example.
If they retain the $999 price tag for the 7900 XT, that will already be generous. Their competition is pricing the 4080 16 GB at $1,200, and everyone knows it will be slower than the 7900 XT, even in RT.
Posted on Reply
#39
Space Lynx
Astronaut
@btarunr I call BS on that image, because JayzTwoCents and others have said with confidence that RDNA3 will be using the new power connector the 4090 uses, and that image shows two older-style power connectors (if my zoom was correct).

heh, we will see soon enough.
Posted on Reply
#40
wolf
Better Than Native
IocedmyselfActually, the 6900 XT has superior raster performance in a good many titles compared against both the 3090 and 3090 Ti, and it does so while sucking down 150 W less, with 8 GB less of slower RAM and an MSRP $500 lower.
In what titles does a 6900 XT have superior raster performance (by a non-insignificant margin, shall we say 10%+?) while consuming 200 W to a 3090's 350 W? I don't think I've ever seen that.

More materially to my point, the 6900XT enjoyed advantages in the metrics I quoted to the tune of 88% and 61%, I don't think it enjoys a single win over the 3090 to the tune of even 61%, let alone 88%, but I'm sure if you dig hard enough you might find an unrealistic niche example or two where that might be the case.

From what I know, the 6900XT enjoyed a minor lead at 1080p (less than 10% on average), roughly par at 1440p, and the 3090 enjoyed a minor lead at 4k (less than 10% on average)
IocedmyselfMostly they trade blows, depending on the title, until you get into ray tracing, where Nvidia has a clear lead simply because they have more mature tech, heavily aided by their proprietary DLSS.
That's the reality I remember, except most publications don't really test DLSS, at least not in like-for-like testing, because then it wouldn't be like for like... so the 3090 trounces a 6900 XT for RT, and then you have DLSS to help even more.
IocedmyselfThis isn't the direction things are supposed to be moving in.
If I were you, I'd brace for AMD being all too happy to follow this trend; hell, it's already started.
Posted on Reply
#41
Denver
ratirt@btarunr I think you mean AMD here.
#Biased Write about AMD with Nvidia in mind :p
Posted on Reply
#42
TheinsanegamerN
Dirt ChipTwo logic tiles will be very interesting to see. If it has the same Zen effect, we might see actual NV-AMD market changes toward a more 50:50 situation.
Hopefully they can do a 4090 with less (30 W?) power, but not going 12VHPWR is the wrong way, I think. We will see 4x 8-pin on OC 3rd-party cards...
The size looks like a 4090 volume follow-up, and it's not a good thing at all.
Also, the render pic only has 2x 8-pin (maybe it's for the 7900 XT, not the XTX), so I corrected it* :)



*for 3rd party cards
AMD being the good guy and eliminating the need for squid connectors that light on fire *toast*.

The 12VHPWR connector is unnecessary, frankly. 8-pins are larger and clunky, but they work well, and have for 15 years.
Eva01MasterX1950XTX's MSRP was USD450 according to this.

That's what I would like to see back the most from that moniker, being top of the line in your generation without requiring me to choose between being able to stay relatively well fed or game at the highest possible settings available at the time.
Well, that card would be $662 today. Oh, and don't forget the Core 2 at the time was a pricey bunch; a lower-end Core 2 was over $300, or $441 today.

$600 still buys you a whole lotta GPU today.
Posted on Reply
#43
ThomasK
Modern GPUs and PSUs shouldn't catch fire, especially when costing an eye-watering $1,600. That's all I'm saying.
Posted on Reply
#44
AusWolf
Very nice! All I need is news like this on the 7700 XT (including release date) before I click "buy" on the 6750 XT that I have in my basket.
Posted on Reply
#45
SOAREVERSOR
Legacy-ZAIt's not so far-fetched really, some companies, unlike nVidia, care about their reputation, take EVGA for example.
Right, but there is a flip side to that. If you position yourself as the cheaper alternative and sell for less than you could get away with, then you cement your reputation as the crappier, budget version in the market, and that's a bad thing that hurts you. Conversely, if you are the market leader, you want to overcharge for your product to cement your reputation as the leader and the better product.
Posted on Reply
#46
medi01
zenlasermanGPU-L claims this beast will have 576 GP/s and 2304 GT/s vs the 4090's 440ish/1250ish. If the drivers are sound, we might have a good game here!
With Infinity Cache, AMD was able to compete on par while on a much slower bus. I don't see why it would be different this time.
Posted on Reply
#47
kapone32
medi01With Infinity Cache, AMD was able to compete on par while on a much slower bus. I don't see why it would be different this time.
Except that the bus is also much wider. It may be difficult to compare these to any AMD card before it. They seem to be fundamentally different than RDNA2 for sure.
Posted on Reply
#48
Guwapo77
ZetZetIf they retain the $999 price tag for the 7900 XT, that will already be generous. Their competition is pricing the 4080 16 GB at $1,200, and everyone knows it will be slower than the 7900 XT, even in RT.
If AMD does this, I would be flabbergasted. Last gen, they priced their cards based on performance relative to the Nvidia cards. I hope like hell they hit that $1,000 MSRP, but I just don't see it.
Posted on Reply
#49
b1k3rdude
MarsM4NSo what? :twitch: At least basic 8pin connectors do not catch fire like Nvidia's 12VHPWR connector.

It's NOT Nvidia's connector; Intel, along with the PCI-SIG, created it. Intel are, shock horror, to blame here, but just like "USB 3.2 Gen 2x2," the 12VHPWR connector is going to be an ill-fated standard.

WHY was this undersized 12-pin connector even created with smaller pins, when existing 8-pin PCIe power connectors are capable of passing 300 W/25 A per connector? Corsair clearly demonstrated this with their 2x 8-pin to 12-pin cables - www.corsair.com/uk/en/Categories/Products/Accessories-%7C-Parts/PC-Components/Power-Supplies/600W-PCIe-5-0-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284
TotallyGuy had literally just bought it and had it running his system for a few hours. Not sure how you'd expect one to run it out of spec.
But as Jay pointed out, the end user bent the cable wrong - though clearly Jay's comment was in jest, as this is just as bad as Steve Jobsworth telling a customer they were holding the iPhone 4 wrong...

I have always bent my cables for aesthetics and tidiness, so depending on which card I upgrade to, I think I will be buying a right-angle adapter from Der8auer/Thermal Grizzly.
Posted on Reply
#50
medi01
ratirtReally curious about the performance, and the power draw that comes with it, but mostly about the price, as this one will tell whether AMD is following NV's greedy behavior, or rather trying to reach for dominance, or at least higher market share.
The main question is whether AMD has a superior product.
Superior in the sense that they can match or beat the competitor while spending less.

If yes, I'd expect a CPU-like offensive.
Posted on Reply