Thursday, February 6th 2025
AMD's Frank Azor Expects Upcoming Presentation to Fully Detail RDNA 4 GPUs
AMD debuted its first wave of RDNA 4 graphics cards—consisting of Radeon RX 9070 XT and RX 9070 (non-XT) models—at the beginning of January. At the time, press outlets and PC gaming hardware enthusiasts were equally flummoxed by Team Red's confusing presentation strategy. Invited attendees of CES 2025 were allowed to handle demonstration samples, but board partners appeared to be sworn to secrecy regarding technical specifications and performance figures. Miscellaneous leaks and rumors have seeped out since then—according to insiders, AMD was prepping its new Radeon product line for launch late last month. A rescheduled rollout is seemingly in the works, possibly next month. Benchlife (via VideoCardz) believes that a pre-launch showcase event is lined up for late February.
Following publication of the latest RDNA 4-related leaks, a brave soul engaged with AMD's Frank Azor on social media. Dee Batch, a loyal and long-term supporter of Radeon gaming hardware, sent a query to Team Red's chief architect of gaming solutions: "can we see the RDNA 4 full presentation? I honestly feel you can prevent many gamers from getting a GeForce RTX 5070 or RTX 5070 Ti GPU...Please, do not miss this opportunity to gain gamer mind share." Azor replied with a short sentence: "yes, full details are coming soon." This brief interaction attracted additional participants—VideoCardz noted that the Team Red executive was taking on board feedback about expectations surrounding RDNA 4 MSRPs. Late last month, Azor refuted rumors of the Radeon RX 9070 XT pricing starting at a baseline of $899. NVIDIA has officially disclosed price points of $549 (RTX 5070) and $749 (RTX 5070 Ti)—AMD enthusiasts have their fingers crossed in hope of TBA competitive numbers.
Sources:
Dee_Batch Tweet, VideoCardz
31 Comments on AMD's Frank Azor Expects Upcoming Presentation to Fully Detail RDNA 4 GPUs
- 9070 XT -> $599-649, performance >= 5070 Ti
- 9070 -> $499, 15-20% faster than the 5070.
I'm not buying either; I'm pretty happy with my current sig rig, so eh. Wafers at TSMC are much more expensive than they used to be, and add to that all the other inflated costs of shipping chips around the world, etc. I don't see anything less than $699, really.
By the by, I still think this card will end up just at or under 7900 XT performance. But I'm open to surprises.
9070 - $449 - 20% faster than 4070S = 7900XT
9070XT - $599 - 20% faster than 4070TiS = 7900XTX
9070 $599 and smack in the middle of 5070 and 5070Ti.
1. 1440p native non-RT performance (including upscaling to 4K).
2. 1080p RT, plus upscaling of 960p->1440p and 1080p->4K: native 1080p for the 9070, 1440p upscaled for the 9070 XT, and the third case on a higher-end SKU (if it exists). And yes, I judge by absolute (overclocked) performance.
3. FSR4 image quality and performance hit.
4. If there is a 'xtx', and if it's priced at $550 or $600.
The first point should already make any of them a good relative value, but upscaling is a key factor if the (higher-end?) models really can replace the 7900 XT (in games that require less than 16GB). Lots of peeps are going 4K.
The second is important because that's what people want to get the most out of their experience; it is the new paradigm. While future games might exceed the capability of N48, this is a key factor right now.
The third is obvious. Not only to have a true DLSS competitor, but because 1440p->4K and ~1080p->1440p/4K RT upscaling are important metrics now. Those absolutely need to look 'not bad' and perform 'OK'.
There is a world where I can imagine FSR4 has a performance hit of ~10% (less than FSR3; more than DLSS3, less than DLSS4). There is also a world where it could be closer to 20%. I am curious to see what AMD decided to do.
The last part is obviously the Mystery Science Theater 3000 of it all: whether it can clock high / whether they make a 24 Gbps SKU; whether AMD is willing to price it against the 5070. I think AMD is probably watching sentiment on this closely...
I hope reviewers hit the above tests hard (even if it requires a lot of work and deviation from their original suites). While I don't question the (mid-tier) value, it would be interesting to see what's truly needed in those scenarios.
N48 might surprise in some (especially after overclocking); it may falter in others. I have no horse in this race; the question is whether AMD can capitalize on instances needing >12GB/45TF, and whether it's good enough vs ~60TF/16-20GB.
One is a key factor in performance capability/value vs the 12GB NVIDIA cards (and the 7800 XT). The other is whether it's 'good enough' versus higher-priced cards (7900 XT / 4080 / 5070 Ti / 5080).
These are the important questions.
I calculated that AMD would need to sell approximately 10 million chips at a profit of $90-100 each just to break even. It would be much simpler if they eliminated AIBs and retailers and sold directly to consumers. However, this would require significant investment from AMD in infrastructure, support, etc. It's not feasible in practice.
On the bright side, unlike GDDR7, GDDR6 is widely available at a lower price, with 16GB costing about $60-70. Additionally, TSMC has released a cheaper 4nm process (N4C) that maintains compatibility with current 4nm designs.
www.anandtech.com/show/21371/tsmc-preps-lower-cost-4nm-n4c-process-for-2025
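The break-even claim above can be sketched as simple arithmetic. A minimal sketch, assuming the commenter's figures (~$950M in fixed costs and $90-100 profit per chip — neither is an official AMD number):

```python
# Back-of-envelope break-even check for a GPU product line.
# All inputs are the commenter's assumptions, not disclosed AMD figures.

def units_to_break_even(fixed_costs: float, profit_per_chip: float) -> float:
    """Units needed so cumulative per-chip profit covers fixed costs."""
    return fixed_costs / profit_per_chip

# Assumed ~$950M in R&D/manufacturing/operational costs to recoup:
for profit in (90, 100):
    units = units_to_break_even(950e6, profit)
    print(f"${profit}/chip -> {units / 1e6:.1f}M units to break even")
```

At either end of the assumed profit range, the answer lands near the 10-million-unit figure quoted in the comment.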
The 5070 is to be launched... some day during March, which means that if the 9070 non-XT is released before it, it might have no one but AMD diehards and people with a severe allergy to second-hand video cards buying it... Otherwise, it doesn't matter; it's also super dead.
As for the prices... they will be as sweet and amazing as the NVIDIA ones. Like always.
(absolutely not a PS6 SOC on 3nm with 12-core Zen6 C and ~60TF of performance based on a 3nm 7900xtx UDNA replacement shrink with similar clocks to N31 at stock)
This is why your prices are WAYYY off, guys. ~$180 (max) chip + ~$160 (max price) of GDDR6 + ~40% margin = ?
24gbps might be closer to ~$200 (max); potentially binned chips worth slightly more (20%?)...do the math on that.
Similarly, a cut-down chip might be closer to $150-160, coupled with ~$150-160 of GDDR6 + ~40% margin = ?
I honestly have no idea on the value/numbering of cut-down chips, nor their pricing of GDDR6 (which is likely cheaper than spot pricing), so that's how it *could* be $400.
Yes, I know it's more complicated than that. I'm just giving you a rough way to figure it out.
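The rough "chip + memory + margin" formula above can be written out directly. A minimal sketch, where every dollar figure is the commenter's guess (not a confirmed BOM cost) and the 40% margin is applied as a simple markup:

```python
# Rough retail-price estimate: (die cost + memory cost) marked up by a margin.
# All dollar inputs are the commenter's assumptions, not confirmed BOM figures.

def estimated_price(chip_cost: float, memory_cost: float, margin: float = 0.40) -> float:
    """Apply a simple percentage markup to the two dominant BOM items."""
    return (chip_cost + memory_cost) * (1 + margin)

# Full die + 16GB of GDDR6 at the commenter's "max" figures: ~$476
print(estimated_price(180, 160))
# Cut-down die with the same memory budget: ~$434
print(estimated_price(155, 155))
```

Which is how the comment arrives at a card that "*could* be $400" for the cut-down part and something well under $699 for the full die — under these assumptions only.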
Another way to look at it is the cheapest price available for the 7900 XT, which this card will not exceed (but may come close to), while lacking 4GB of RAM in comparison. That RAM costs ~$40 (to us).
That price (according to camelcamelcamel) was $620, for both XFX and PowerColor, so it wasn't a one-off. IOW, the most one of these cards (even an XTX) is worth is $600.
I would say $550... but that's comparing the limitations of the potential die versus the capabilities of 20/24 Gbps RAM. AMD might try to segment them more with stock clocks.
Like I say, if anything, people should really be rooting for a $550 XTX, where AMD likely really wants/wanted to price it at $600 (or more).
I think AMD is preparing a full-court press with UDNA next year, and with this lackluster RTX 50-series, they do have a chance to catch up to NVIDIA in 2026 (even with the expected release of Super cards).
That approximate performance (4090) at consumer-friendly pricing and availability is THE spot to hit. Not even a question. If they can do it (at a better price than nVIDIA)...That is the question.
I stand by this: the 4090 (or a cheaper alternative on 3nm) is the Xbox One/PS4 generation wrt GPUs. 4K native or 1440p->4K RT for a very, very long time. Ain't nobody upgrading after that.
I know there are people that will lust after 32/36GB cards, maybe even up to 48GB for native 4K... but are most people going to buy them? I don't think so.
Approximate candidate dies per wafer = 181; at an assumed 85% yield rate, that's ~153 good chips.
Original N4P wafer at ~$17,000 = ~$111/chip.
N4C wafer at ~$15,555 = ~$102/chip.
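The per-chip wafer math above works out as sketched below. The gross die count (181), yield (85%), and wafer prices are all the commenter's assumptions:

```python
# Per-chip cost = wafer price / good dies per wafer.
# Gross die count (181) and 85% yield are the commenter's assumptions,
# as are the N4P/N4C wafer prices.

def cost_per_good_die(wafer_price: float, gross_dies: int, yield_rate: float) -> float:
    """Divide the wafer price across the usable dies it yields."""
    good_dies = int(gross_dies * yield_rate)  # ~153 usable chips per wafer
    return wafer_price / good_dies

print(round(cost_per_good_die(17_000, 181, 0.85)))  # N4P: ~$111/chip
print(round(cost_per_good_die(15_555, 181, 0.85)))  # N4C: ~$102/chip
```

The ~$10 gap per chip is the whole benefit of the cheaper N4C process in this estimate.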
AMD should use 20 Gbps memory (cheap, and in good supply); Samsung never put 24 Gbps chips into mass production... AMD only sells chips, so its profit margin must be calculated per unit. While the percentage margin may be high, the absolute profit per chip is relatively small. For example, a 100% margin on a $100 chip still results in just $100 of profit. However, AMD has hundreds of millions to billions in R&D, manufacturing, and operational costs that must be recouped through these sales. So $600 is possible, but AMD needs to convince everyone in the chain to tighten margins for this to work.
On Nvidia’s side, they weren’t satisfied with just selling chips. Now, they’ve secured massive quantities of GDDR7 to bundle with their GPUs, ensuring AIBs buy the complete set—further increasing their profits. :p
The 9070 is probably ~$400, guy. The 7800 XT (at $480) is the card to beat right now (comparable to the 4070 Ti/5070), and AMD may in fact beat themselves by essentially lowering the 7800 XT/9070 price to ~$400.
Because they have to compete with the 5060 Ti in price at the low end, because many people are literally ignorant. "$400 16GB NVIDIA card? Good." No, that 16GB NVIDIA card is actually quite shit, but that's what people do.
They are conceivably replacing the 7800 XT at the same price with something 'better' (for 1440p, +- upscaling to 4K / 960p->1440p upscaled RT). Yes, it is essentially a refresh with higher clocks.
I literally don't understand how this is not a fair-to-excellent value, outside of buying a $250 2080 Ti (or a used 6800 XT/7800 XT for between $250 and $400)?
I think that's the correct placement and excellent value for new, even considering the used market (especially considering the addition of 8-bit/'tensor' ops for FSR4)?
These things are going to hulk smash the 5070 in any instance needing greater than 45TF or 12GB... which is the defining line along which a lot of games are built.
Are people really going to be tricked because NVIDIA may clock the 5070 absurdly high (conceivably 3150-3165 MHz stock, with overclocking up to ~3500 [maybe ~3700, if NVIDIA lets it])?
Because that is literally what they have to do, and they will. Also, the die size of the 5070 tells us it will clock high (like N48), as do the quoted performance metrics on NVIDIA's own site, but shh.
In the end, just look at the OC clock settings of the 7168/7680-shader 16GB parts vs the 6144-shader 12GB one. It's pretty clear the 5070 will have a STOCK clock above the 9070, and maybe the XT (for comparable RT at low res), but that's why you OC / why an XTX matters.
Are people blind to these things (actual performance) versus gimmicks like a high stock clock? I ask because I don't know. Probably yes. This is why I think AMD is in part waiting to see how the 5070 clocks (stock/OC).
I think they want to know if they can clock it at something like ~3.2-3.3 GHz, or need to go all out and clock it higher at stock because they want to look similar to or better than NVIDIA in that regard.
www.techpowerup.com/329003/amd-to-skip-rdna-5-udna-takes-the-spotlight-after-rdna-4