Saturday, February 15th 2025

Despite Frank Azor's Dismissal, Whispers of a 32 GB Radeon RX 9070 XTX Resurface
Recent rumors hinted at a 32 GB variant of the Radeon RX 9070 XT being in the works, claims that were quickly dismissed as false by AMD's Frank Azor. However, a seemingly reliable source points to the contrary, stating that a 32 GB variant of the RX 9070 XT, likely dubbed the RX 9070 XTX, is indeed under active development. The source, as pointed out by Wccftech, has a decent track record with AMD-related claims, which does add weight to the assertion. Unlike previous XTX-class cards from AMD, which boasted higher clock speeds and core counts, the 9070 XTX is almost certain to feature the same core count as the XT, since the latter already utilizes the full Navi 48 chip - unless, of course, there is an even higher-end chip under wraps.
The VRAM amount seems to indicate that the card will be positioned to appease AI enthusiasts. There is also the possibility that the rumored card will be launched under different branding entirely, although that is not what the post at Chiphell states. Interestingly, Frank Azor specifically said that a 32 GB "RX 9070 XT" card is not on the horizon - he did not state that a higher-end XTX card isn't either, which does leave room for speculation. Benchlife has also chimed in on the matter, claiming to be aware of AIB partners working on a 32 GB RDNA 4 card based on the Navi 48 GPU, which, in some ways, corroborates the information that came out of Chiphell. The RDNA 4 cards are set to see the light of day soon, so the wait won't be much longer. However, if the 32 GB card is indeed in the pipeline, it is likely still further down the road.
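As a rough illustration of why the jump from 16 GB to 32 GB matters for that crowd, here is a minimal back-of-the-envelope sketch; it is not from the source, and the model sizes, precisions, and overhead factor are purely illustrative assumptions. Weight memory for local LLM inference scales roughly with parameter count times bytes per parameter, so a 32 GB card can hold models that simply will not fit in 16 GB.

```python
# Rough LLM weight-memory estimate vs. card VRAM.
# Model sizes, precisions, and the ~20% overhead factor are illustrative assumptions only.

BYTES_PER_PARAM = {"fp16": 2.0, "int8": 1.0, "int4": 0.5}

def weights_gib(params_billion: float, precision: str, overhead: float = 1.2) -> float:
    """Approximate VRAM for weights plus a crude allowance for KV cache/activations."""
    return params_billion * 1e9 * BYTES_PER_PARAM[precision] * overhead / 2**30

for params in (7, 13, 30, 70):
    for precision in ("fp16", "int4"):
        need = weights_gib(params, precision)
        print(f"{params:>3}B @ {precision}: ~{need:5.1f} GiB"
              f"  fits in 16 GB: {'yes' if need <= 16 else 'no':>3}"
              f"  fits in 32 GB: {'yes' if need <= 32 else 'no':>3}")
```

Under these assumptions, a ~13B model at FP16 or a ~30B model at 4-bit quantization overflows 16 GB but sits comfortably within 32 GB, which is presumably the kind of buyer such a card would target.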
Source:
Wccftech
84 Comments on Despite Frank Azor's Dismissal, Whispers of a 32 GB Radeon RX 9070 XTX Resurface
RX 9070 XT directly replaces RX 7800 XT!
makes zero sense.
>45TF of raster, 9% extra compute for upscaling (around the hit of FSR3 and/or DLSS3, so this should limit FSR3 perf impact and help FSR4 quality; DLSS3?), along with better RT (through extra tmus; ~+30% ppc?)...
...where 4070ti/5070 is limited by ability/buffer. It *should* probably work out ok...Theoretically they can hit just the right spots to keep 60fps where 4070ti/5070 will not (in raster, upscaling and/or RT)...
...But there still needs to be an XTX, because frankly we just need the option for more bandwidth/raster (because any bit between 45 up to 60TF helps as that's the current battleground).
Maybe they see it as anything more under 60TF/18GB doesn't make sense as that's the next tier over 12GB/<45TF and >45TF/16GB? I don't know.
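For context on the TF figures being thrown around here, a minimal sketch of the usual peak-FP32 arithmetic, assuming the rumored full Navi 48 configuration (4096 stream processors, dual-issue FP32) and a roughly 3 GHz boost clock; these are assumed values, not anything the article confirms.

```python
# Peak FP32 throughput: stream processors * ops per clock * clock.
# The shader counts and clocks below are rumored/assumed values, not official specs.

def peak_fp32_tflops(stream_processors: int, boost_ghz: float, dual_issue: bool = True) -> float:
    # An FMA counts as 2 FLOPs; RDNA 3/4 CUs can dual-issue FP32, doubling the peak.
    ops_per_clock = 2 * (2 if dual_issue else 1)
    return stream_processors * ops_per_clock * boost_ghz * 1e9 / 1e12

print(f"Full Navi 48 (4096 SP) @ ~2.97 GHz: ~{peak_fp32_tflops(4096, 2.97):.1f} TFLOPS")   # ~48.7
print(f"Cut-down 56 CU (3584 SP) @ ~2.5 GHz: ~{peak_fp32_tflops(3584, 2.5):.1f} TFLOPS")  # ~35.8
```

That is where the ">45TF" figure for the full chip comes from, and it also shows why "anything up to 60TF" would require either higher clocks or a bigger die than Navi 48.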
I would argue 4080/5070ti/5080 will exist with more usable raster, and they aren't going to disappear. I get it's prolly not gonna catch the capability of 7900xt (-4GB of ram) outside the new RT implementation.
They may get obsoleted just as quickly (bc <60TF/16GB) but that doesn't mean people still shouldn't have the choice.
There could still probably be instances where that compute and/or bw would save people at 1440p and/or 1080pRT (and upscaling those to 1440p/4k) even if it is just 16GB.
While I see what they're going for here, and it might work (if they can tune it so it hits all those 60fps mins where 4070ti/5070 do not), it all starts to feel really weird.
Because it feels like nVIDIA and AMD are both holding back (from an 8-cluster nVIDIA part [~11264-12288 SP / 24 GB] and a 7900xtx refresh like N48), both of which I would argue people want vs what we got.
I get it; they need to sell 7900xt/xtx's and get rid of inventory...and those parts are not bad bc of capability (1080pRT, 1440->4k raster upscaling, native 4k) and they want people to buy the next-gen...
...but that doesn't mean we should get parts that can't last an extra generation if it's feasible/possible. For nVIDIA that would have meant an extra cluster and more RAM...
...and for AMD that means catching a current or last gen 04 (not counting the dubious 4070ti super) OOTB at better pricing.
As I'm starting to understand what AMD is doing with this architecture (explained in the first sentence) it actually is very interesting and could be really good...
....but it still needs the raw perf/bw if they want to charge a premium vs just scraping by over the scenarios 4070ti/5070 are limited.
They do actually need to compete in the other areas 4080s/5070ti/5080 can excel. Which is to say either higher raster performance through compute/bw and/or higher IQ at similar raster performance.
As it stands, even with a well-balanced design to banish 12GB to the dumpster (if a decent price), it just won't (outside of overclocking an XT).
I really just do not like this situation. I'm not having a 'psychotic break'; it's just that it has been AMD's job to come in with a part like this, show the limitations of <45TF/12GB, and compete while having longevity.
It is not their job to come in with such a product and claim it is worth a similar ratio of money to a tier slightly above it from nVIDIA, which is just not correct. It just isn't. The market does not abide.
I get they'll both be relegated to similar performance tiers *for the most part* and be obsoleted in *many* instances for certain resolutions at the same time, but that again is not the point.
They are looking at this incredibly the wrong way. It is the *new* AMD, for sure. I don't know if it's Papermaster or what...but you can just tell they have a completely different methodology and I don't like it.
I don't think the market will either. I guess we'll have to see where prices actually settle. If it's anything like the 7900xt, and I think it will be, these prices will plummet pretty quickly.
AMD sees Nvidia lying with their MSRPs and pretends to be more honest by putting higher MSRPs from the start.
Nvidia might be doing for the first time what AMD has been doing for years, but in a different way. Nvidia has always kept prices steady, while AMD would start with high prices and lower them month after month. This time Nvidia sets a low MSRP to achieve the same thing, but restricts supply to push street prices much higher, keeping the option to present future price drops as initially high demand settling back to MSRP once supply improves. Nvidia wants price reductions to be associated with higher supply, not with low demand. Because you won't be able to buy the 5070 Ti at its MSRP - or at least very few will be the lucky ones.
My theory is what I post here, in this post, as a reply to Sir Beregond.
Ain't nobody out here itching to buy an AMD card at $700 man. And nobody's forcing people to buy them, either.
And if Nvidia was putting a REAL $549 MSRP price on the 5070, 9070 would be priced at $499 from the beginning.
I love how, on other occasions, people blame AMD for Nvidia's pricing, but in this case the fake MSRPs from Nvidia are not a reason to blame Nvidia for AMD's pricing.
Double standards? Well, yes, that's why we have a monopoly in the GPU market.
$600 and $700 is DOA, full stop. Nobody is buying at those prices. Maybe, if they actually want market share, AMD should've built around $500 as their target price instead of counting on NVIDIA being so greedy that AMD's product is better by default.
Now, AMD was probably thinking of selling those cards at $479 and $599, probably with slim to no profit margin, or even at a loss in the beginning, hoping to start turning a profit later when materials get cheaper. Nvidia could probably sell the RTX 5070 at $549 and still make a profit. Or maybe both companies were thinking of lower prices and then Trump happened with those tariffs. Don't know.
Now, if you think that the 9070 XT is DOA because of its MSRP and the 5070 is great at the same $700 because of its fake MSRP, OK. Buy the 5070. Not my money.
As a native English speaker and someone who works extensively in written communication, it had me thinking for a second, but I quickly realised the true meaning from context rather than punctuation and grammar.
And for that brief moment in time I experienced genuine excitement at the prospect that Frank may have been dismissed, and perhaps replaced by someone more capable.
And yes, the 5070 Ti would be pretty good at $700, thank you.
The only consistency and certainty, though, is exactly the same reasoning behind the price formation as before/during the RDNA3 launch: it is directly bound to nVidia's pricing. And this seems to be the only true thing in his entire BS speech.
Radeon XI ?
RX 9999 XT ?
or maybe just RX 9070 XTX
so much "X" on it, X rated GPU such a dumb name
AMD should change their GPU naming scheme.