Friday, June 23rd 2023

Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

AMD's Radeon RX 7800 XT will be a much-needed performance-segment addition to the company's Radeon RX 7000 series, which has a massive performance gap between the enthusiast-class RX 7900 series and the mainstream RX 7600. A report by "Moore's Law is Dead" makes the sensational claim that it is based on a whole new ASIC that is neither the "Navi 31" powering the RX 7900 series, nor the "Navi 32" designed for lower performance tiers, but something in between. This GPU will be AMD's answer to the "AD103." Apparently, the GPU features the exact same 350 mm² graphics compute die (GCD) as "Navi 31," but on a smaller package resembling that of "Navi 32." This large GCD is surrounded by four MCDs (memory cache dies), which add up to a 256-bit wide GDDR6 memory interface and 64 MB of 2nd Gen Infinity Cache memory.
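As a back-of-the-envelope check on the rumored memory subsystem, the per-MCD figures below are the publicly known Navi 31 values (64-bit GDDR6 interface and 16 MB of Infinity Cache per MCD), assumed here to carry over to this rumored ASIC:

```python
# Per-MCD figures: publicly known Navi 31 values, assumed to
# carry over to the rumored RX 7800 XT ASIC.
MCD_BUS_BITS = 64    # 64-bit GDDR6 interface per MCD
MCD_CACHE_MB = 16    # 16 MB Infinity Cache slice per MCD

mcds = 4  # the rumored package carries four MCDs

bus_width_bits = mcds * MCD_BUS_BITS      # 4 x 64-bit  = 256-bit
infinity_cache_mb = mcds * MCD_CACHE_MB   # 4 x 16 MB   = 64 MB

print(f"{bus_width_bits}-bit GDDR6 bus, {infinity_cache_mb} MB Infinity Cache")
```

The arithmetic matches the report's 256-bit bus and 64 MB cache, which is consistent with a four-MCD "Navi 32"-style package.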

The GCD physically features 96 RDNA3 compute units, but AMD's product managers now have the ability to give the RX 7800 XT a CU count well above that of the "Navi 32," while staying below that of the RX 7900 XT (which is configured with 84). It's rumored that the smaller "Navi 32" GCD tops out at 60 CU (3,840 stream processors), so the new ASIC would enable the RX 7800 XT to have a CU count anywhere between 60 and 84. The resulting RX 7800 XT could have an ASIC with a lower manufacturing cost than that of a theoretical Navi 31 with two disabled MCDs (>60 mm² of wasted 6 nm dies), and even if it ends up performing within 10% of the RX 7900 XT (and matching the GeForce RTX 4070 Ti in the process), it would do so with better pricing headroom. The same ASIC could even power the mobile RX 7900 series, where the smaller package and narrower memory bus would conserve precious PCB footprint.
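The CU-to-stream-processor arithmetic above can be sketched quickly. This is only a sketch: 64 stream processors per CU is the standard RDNA3 figure, and the CU counts are the known and rumored configurations the article mentions, not confirmed specs for the new ASIC:

```python
# RDNA3 packs 64 stream processors per compute unit (standard figure).
SP_PER_CU = 64

def stream_processors(cus: int) -> int:
    """Total stream processors for a given compute-unit count."""
    return cus * SP_PER_CU

navi32_top  = stream_processors(60)   # rumored Navi 32 ceiling: 3,840 SPs
rx_7900_xt  = stream_processors(84)   # RX 7900 XT configuration: 5,376 SPs
navi31_full = stream_processors(96)   # full Navi 31 GCD: 6,144 SPs

# A rumored RX 7800 XT could land anywhere in the 60-84 CU window
window = [(cu, stream_processors(cu)) for cu in (60, 70, 84)]
```

A 70 CU configuration, for example, would yield 4,480 stream processors, comfortably between the rumored Navi 32 ceiling and the RX 7900 XT.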
Source: Moore's Law is Dead (YouTube)

169 Comments on Radeon RX 7800 XT Based on New ASIC with Navi 31 GCD on Navi 32 Package?

#2
fancucker
Promising, but the 7900 XT was essentially supposed to be the 7800 XT. This is an attempt at product differentiation and price segmentation (a cost-cutting measure). Hopefully they managed to fix the clock issue for added value, but that's unlikely.
Posted on Reply
#3
nguyen
So there is no Navi 32 GCD with the supposed bug fixes? I guess AMD would like to skip this gen and move on (while putting RDOA3 on fire sale)
Posted on Reply
#4
kapone32
nguyenSo there is no Navi 32 GCD with the supposed bug fixes? I guess AMD would like to skip this gen and move on (while putting RDOA3 on fire sale)
You keep complaining about bug fixes, yet you have a 4090.
Posted on Reply
#5
sLowEnd
kapone32You keep complaining about bug fixes, yet you have a 4090.
Throwing around opinions about parts for hypothetical builds is fun. It's probably the reason why half of us are here. lol
Posted on Reply
#6
tabascosauz
Keeping the GCD and scaling easily based on MCD count is, like, the entire point of bringing RDNA3 to chiplets; what took them so long?? :confused:

Also, "answer to the AD103" is an interesting statement considering it's hoping for parity with the 4070 Ti at best? The 4070 Ti is AD104.
kapone32You keep complaining about bug fixes, yet you have a 4090.
So I guess that precludes you from ever commenting on 12VHPWR's woes because you never touched a 40 series card? We're all armchair generals here :D

Though I am curious what these clock "bugs" are; even when it comes to problems, it seems like Navi 31 has more relevant concerns that need to be solved first.
Posted on Reply
#7
john_
I haven't spent much time reading about AMD's older Navi 32, but with RDNA3's failure to offer (really) better performance and efficiency over equivalent RDNA2 specs, the specs of the original Navi 32 were looking like a sure fail compared to Navi 21 and 22. Meaning we could have gotten a 7800 at the performance level of, or even slower than, the 6800 XT or even the plain 6800, and 7700 models slower than or on par with the 67X0 cards. I think the above rumors do hint that this could be the case. So either AMD threw the original Navi 32 into the dust bin and we are waiting for mid-range Navi cards because AMD had to build a new chip, or maybe Navi 32 still exists and will be used for the 7700 series while this new one will be used for 7800 series models.

Just random thoughts ...
Posted on Reply
#8
kapone32
tabascosauzSo I guess that precludes you from ever commenting on 12VHPWR's woes because you never touched a 40 series card? We're all armchair generals here :D

Though I am curious what these clock "bugs" are; even when it comes to problems, it seems like Navi 31 has more relevant concerns that need to be solved first.
I don't think I have ever commented on 12VHPWR. That is the thing. I have been running a 7900XT since launch basically and I have had zero issues. Indeed I have never enjoyed my PC as much as I do now. That is not hyperbole. If there are issues I expect a driver update to resolve them at some point anyway.
Posted on Reply
#9
lexluthermiester
kapone32You keep complaining about bug fixes, yet you have a 4090.
Don't feed the trolls.
Posted on Reply
#10
kapone32
john_I haven't spent much time reading about AMD's older Navi 32, but with RDNA3's failure to offer (really) better performance and efficiency over equivalent RDNA2 specs, the specs of the original Navi 32 were looking like a sure fail compared to Navi 21 and 22. Meaning we could have gotten a 7800 at the performance level of, or even slower than, the 6800 XT or even the plain 6800, and 7700 models slower than or on par with the 67X0 cards. I think the above rumors do hint that this could be the case. So either AMD threw the original Navi 32 into the dust bin and we are waiting for mid-range Navi cards because AMD had to build a new chip, or maybe Navi 32 still exists and will be used for the 7700 series while this new one will be used for 7800 series models.

Just random thoughts ...
I think we are reading too many tea leaves. There is no scenario in which a 6800 XT can compete with a 7900 XT. I also feel that the 7600 is a red herring. Everybody wants to compare it to the 6700 XT when it should be compared to the 6600, period. Now we are getting a 7800 XT, and as much info as we are getting, there is nothing that tells us what its performance will be.
Posted on Reply
#11
lexluthermiester
kapone32I don't think I have ever commented on 12VHPWR. That is the thing. I have been running a 7900XT since launch basically and I have had zero issues. Indeed I have never enjoyed my PC as much as I do now. That is not hyperbole. If there are issues I expect a driver update to resolve them at some point anyway.
I've never seen any serious issues either. There have been a few very minor game glitches as is normal, but nothing for anyone to get the panties in a bunch about.
Posted on Reply
#12
john_
lexluthermiesterI've never seen any serious issues either. There have been a few very minor game glitches as is normal, but nothing for anyone to get the panties in a bunch about.
Probably referencing AMD's slide that says Navi 31 is designed to go over 3.0 GHz, and the higher power consumption in some scenarios (if I am not mistaken: idle, multi-monitor, video playback, something like that).
kapone32I think we are reading too many tea leaves. There is no scenario in which a 6800 XT can compete with a 7900 XT. I also feel that the 7600 is a red herring. Everybody wants to compare it to the 6700 XT when it should be compared to the 6600, period. Now we are getting a 7800 XT, and as much info as we are getting, there is nothing that tells us what its performance will be.
I do read many tea leaves here; you, on the other hand, don't read correctly what I wrote. :p I never compared the 6800 XT with the 7900 XT.

Also, the 7600 should have been as fast as the 6700, just as a generational rule where the x600 should be as fast as or faster than the previous x700, for both Nvidia and AMD.
But the problem is not 7600 vs 6700. It's 7600 vs 6650 XT. Same specs, same performance, meaning RDNA2 to RDNA3 = minimal gains.
Posted on Reply
#13
lexluthermiester
john_Probably referencing AMD's slide that says Navi 31 is designed to go over 3.0 GHz, and the higher power consumption in some scenarios (if I am not mistaken: idle, multi-monitor, video playback, something like that).
But that is not a bug or glitch situation.
Posted on Reply
#14
Vayra86
The problem isn't RDNA3, it's the competitive edge. The whole RDNA3 'stack' (three SKUs lol) just doesn't lead, and it also fails to cover a relevant stack position in price/perf. Ada realistically has it beat; it's ever so slightly more power efficient, has a bigger feature set, and its sacrificed raster perf/$ compared to RDNA3 is compensated for by DLSS3, if that's what you're willing to wait for. All aspects that command a premium. It should have competed on price. The only advantage it has is VRAM, but RDNA2 has that too.

That's where people imagine that 7800 XT. But I think we already have the 7900 XT doing this; it's just too expensive. The price drops we saw in the news yesterday are finally nudging it into the right place, but now there is no wow effect, and local pricing will adjust too slowly for it to really make a dent.

AMD did a penny-wise, pound-foolish launch imho.
john_Probably referencing AMD's slide that says Navi 31 is designed to go over 3.0 GHz, and the higher power consumption in some scenarios (if I am not mistaken: idle, multi-monitor, video playback, something like that).


I do read many tea leaves here; you, on the other hand, don't read correctly what I wrote. :p I never compared the 6800 XT with the 7900 XT.

Also, the 7600 should have been as fast as the 6700, just as a generational rule where the x600 should be as fast as or faster than the previous x700, for both Nvidia and AMD.
But the problem is not 7600 vs 6700. It's 7600 vs 6650 XT. Same specs, same performance, meaning RDNA2 to RDNA3 = minimal gains.
I'm sure 3 GHz was a stretch goal for them or something, because I have actually seen my 7900 XT go over 2900 on rare occasions. All it took was nudging the max frequency slider to the right. You won't have it reliably, but it will situationally boost to it.
Posted on Reply
#15
Dr. Dro
kapone32You keep complaining about bug fixes, yet you have a 4090.
Can't blame anyone who chose a 4090 over any AMD GPU: it's faster, more feature-complete, and better maintained. It's simply a product that AMD currently can't hope to match. They have neither the engineering nor the software development required to release a product of its caliber at this time.
Vayra86The problem isn't RDNA3, its the competitive edge. The whole RDNA3 'stack' (three SKUs lol) just doesn't lead and also fails to cover a relevant stack position in price/perf.
I thought the same until I saw the RX 7600, and nope, RDNA 3 is practically irredeemable at this point. It's just bad beyond belief if you stack it against Ada Lovelace. And from my standpoint as someone with a high-end Ampere card, it's not for me either. Its marginal gains in raster and equal RT performance, at the cost of losing the whole feature set of the RTX ecosystem and Nvidia's superior driver software, make it unworthy of me even considering a replacement. This generation has failed, and short of massive price cuts, I don't think any of these are worth buying.

Unfortunately no heads will roll. Radeon needs new blood: innovative people to develop products and the brand. The boomers who developed these GPUs in the 2000s and linger in upper management need to go. They can't keep up with the industry anymore.
Posted on Reply
#16
john_
lexluthermiesterBut that is not a bug or glitch situation.
The 3 GHz part, it's not. But this power consumption in multi-monitor and video playback is unacceptable in 2023.

This is from TechPowerUp's original review, but I think it still remains a problem even today.
Vayra86I'm sure 3 GHz was a stretch goal for them or something, because I have actually seen my 7900 XT go over 2900 on rare occasions. All it took was nudging the max frequency slider to the right. You won't have it reliably, but it will situationally boost to it.
Probably.
Vayra86The problem isn't RDNA3
In my opinion, it is. Looking at 6650 XT and 7600 specs side by side and then seeing no performance change in games, that's a problem.
Posted on Reply
#17
the54thvoid
Super Intoxicated Moderator
Dr. DroRadeon needs new blood: innovative people to develop products and the brand. The boomers who developed these GPUs in the 2000s and linger in upper management need to go. They can't keep up with the industry anymore.
I imagine it's boomers who design the GPUs on both sides.
Posted on Reply
#18
lexluthermiester
john_The 3 GHz part, it's not. But this power consumption in multi-monitor and video playback is unacceptable in 2023.
That's more opinion and perspective than a conclusion about proper functionality. The point being that AMD's recent GPUs, and the drivers for same, have been exceptionally stable. Whether or not the GPU uses a given amount of power in a specific situation is not something that will affect stability in desktop use, media viewing, or gaming.
john_In my opinion it is.
Thus...
Posted on Reply
#19
Vayra86
Dr. DroCan't blame anyone who chose a 4090 over any AMD GPU: it's faster, more feature-complete, and better maintained. It's simply a product that AMD currently can't hope to match. They have neither the engineering nor the software development required to release a product of its caliber at this time.



I thought the same until I saw the RX 7600, and nope, RDNA 3 is practically irredeemable at this point. It's just bad beyond belief if you stack it against Ada Lovelace. And from my standpoint as someone with a high-end Ampere card, it's not for me either. Its marginal gains in raster and equal RT performance, at the cost of losing the whole feature set of the RTX ecosystem and Nvidia's superior driver software, make it unworthy of me even considering a replacement. This generation has failed, and short of massive price cuts, I don't think any of these are worth buying.

Unfortunately no heads will roll. Radeon needs new blood: innovative people to develop products and the brand. The boomers who developed these GPUs in the 2000s and linger in upper management need to go. They can't keep up with the industry anymore.
It's obviously not for you if you're sitting on a 3090 that is already within spitting distance :) I think you're exaggerating a little. AMD's GPU stack has been in a far worse position not too long ago, arguably from (post-) Hawaii XT up to and including RDNA1. They have had a more competitive stack since RDNA2, not a worse one. The RDNA3 features that are missing are mostly in the RT performance/DLSS camp, and even despite that, these cards run everything fine. It's really not relevant that a 4090 exists with a much higher perf cap; everything below it is samey in almost every way between these two camps.
Posted on Reply
#20
TriCyclops
I hope that's true. I don't like the SKU milking that AMD has engaged in this generation.
Posted on Reply
#21
lexluthermiester
TriCyclopsI hope that's true. I don't like the SKU milking that AMD has engaged in this generation.
You mean providing a variety of products that people can buy? No, that's never a good thing, is it.. :rolleyes:
Posted on Reply
#22
sLowEnd
Vayra86It's obviously not for you if you're sitting on a 3090 that is already within spitting distance :) I think you're exaggerating a little. AMD's GPU stack has been in a far worse position not too long ago, arguably from Hawaii XT up to and including RDNA1. They have had a more competitive stack since RDNA2, not a worse one. The RDNA3 features that are missing are mostly in the RT performance/DLSS camp, and even despite that, these cards run everything fine.
Back in '06-07ish, I had the displeasure of using their Catalyst Control Center for a while. The drivers never crashed on me, but damn was the interface awful. Thankfully ATI Tray Tools existed.
Posted on Reply
#23
nguyen
Dr. DroCan't blame anyone who chose a 4090 over any AMD GPU: it's faster, more feature-complete, and better maintained. It's simply a product that AMD currently can't hope to match. They have neither the engineering nor the software development required to release a product of its caliber at this time.

I thought the same until I saw the RX 7600, and nope, RDNA 3 is practically irredeemable at this point. It's just bad beyond belief if you stack it against Ada Lovelace. And from my standpoint as someone with a high-end Ampere card, it's not for me either. Its marginal gains in raster and equal RT performance, at the cost of losing the whole feature set of the RTX ecosystem and Nvidia's superior driver software, make it unworthy of me even considering a replacement. This generation has failed, and short of massive price cuts, I don't think any of these are worth buying.

Unfortunately no heads will roll. Radeon needs new blood: innovative people to develop products and the brand. The boomers who developed these GPUs in the 2000s and linger in upper management need to go. They can't keep up with the industry anymore.
Well, if AMD keeps dipping into the red as they did last quarter, heads will roll. Let's hope it's this guy first LOL
Posted on Reply
#24
Vayra86
sLowEndBack in '06-07ish, I had the displeasure of using their Catalyst Control Center for a while. The drivers never crashed on me, but damn was the interface awful. Thankfully ATI Tray Tools existed.
IKR. People have short memories. AMD has its shit in pretty good order lately, in a relative sense, and Nvidia's has arguably gotten worse; buzzwords do not scale linearly with quality, if you ask me. People herald the 4090 on the regular; I'm seeing a 4-slot, 450 W behemoth that primarily exists to be able to say 'I'm fastuuhhr' while still running path-traced RT (the only workload where it would really matter to put half a brick house in your PC) at sub-30 FPS and north of 1500 bucks. I'm also seeing sub-150 W cards not getting the right treatment. Ada is super efficient, but we still get 2-3 slot midrange? Okay. The same thing goes for VRAM relative to price and core oomph. I'm not seeing a fantastic Ada stack, to be honest. Every single product in it is somehow suboptimal on the hardware or perf/price front. DLSS3 and supposedly fantastic RT perf (20% gap lol) are meant to make up for what's really missing under the hood.

People parrot marketing and short-term nonsense. Long-term strategy is where it's at.
Posted on Reply
#25
john_
lexluthermiesterThat's more opinion and perspective than a conclusion about proper functionality. The point being that AMD's recent GPUs, and the drivers for same, have been exceptionally stable. Whether or not the GPU uses a given amount of power in a specific situation is not something that will affect stability in desktop use, media viewing, or gaming.


Thus...
Obviously we are exchanging opinions. And as Harry Callahan once said... :p

I don't know what a GPU needs to do to keep feeding a high-refresh-rate monitor or a dual-monitor setup, but close to 100 W power consumption in video playback is unacceptable in 2023. While an opinion, I also believe it is logical to say that. The smartphone in your pocket or on your desk can probably play back the same video files at 5-10 W. In fact, even AMD's own APUs play video while keeping power consumption for the whole SoC below 15 W. So something has gone wrong with RDNA 3 here. And even if we consider this still an opinion, there is this chart, where a 6900 XT is at half the power consumption:
AMD Radeon RX 7900 XTX Review - Disrupting the RTX 4080 - Power Consumption | TechPowerUp
Posted on Reply