Monday, January 29th 2024
Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power
We've known since way back in August 2023 that AMD is rumored to be retreating from the enthusiast graphics segment with its next-generation RDNA 4 graphics architecture, which means we likely won't see successors to the RX 7900 series squaring off against the upper end of NVIDIA's fastest GeForce RTX "Blackwell" series. What we'll get instead is a product stack closely resembling that of the RDNA-based RX 5000 series, with its top part providing a highly competitive price-performance mix around the $400 mark. A more recent report by Moore's Law is Dead sheds more light on this part.
Apparently, the top Radeon RX SKU based on the next-gen RDNA4 graphics architecture will offer performance comparable to that of the current RX 7900 XTX, but at less than half its price (around the $400 mark). It is also expected to achieve this performance target using smaller, simpler silicon with a significantly lower board cost, which contributes to its lower price. What's more, there could be energy-efficiency gains from the switch to a newer 4 nm-class foundry node and from the RDNA4 architecture itself, which could hit its performance target with fewer compute units than the RX 7900 XTX's 96.
When it came out, the RX 5700 XT offered an interesting performance proposition, beating the RTX 2070 and forcing NVIDIA to refresh its product stack with the RTX 20-series SUPER, and the resulting RTX 2070 SUPER. Things could go down slightly differently with RDNA4. Back in 2019, ray tracing was a novelty, and AMD could surprise NVIDIA in the performance segment even without it. There is no such advantage now; ray tracing is relevant, and so AMD could count on timing its launch ahead of the Q4-2024 debut of the RTX 50-series "Blackwell."
Sources:
Moore's Law is Dead (YouTube), Tweaktown
396 Comments on Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power
2. It was their flagship, $700+ graphics card, based on their strongest architecture at the time. It really was better at compute than gaming; no wonder it became the foundation of what is now known as CDNA. 16 GB of HBM2 across four fully enabled stacks, the first GPU to breach the 1 TB/s mark, and the one I had comfortably hit 1.25 TB/s because it also overclocked the HBM well. I think a GPU like this should have been supported for more than just about 4 years... yes, of course AMD gets a pass. I assure you, if NVIDIA came out tomorrow and said "that's it folks, your 7-year-old 1080 Ti's had enough of a good run, no more drivers for you," the pitchforks would be scorching hot and ready to poke. But then again, AMD has been no stranger to just axing hardware it doesn't want to spend resources on maintaining, like the R9 Fury X before it (the card was what, 5 years old at the time?). And don't get me started on the Vega FE, which I also had.
3. No use sugar coating it: AMD lied. All it took was Alder Lake crashing their $300+ Ryzen 5 5600X party, with $180 12400Fs outperforming them, for X370 boards to suddenly support everything and for CPU prices to crater. TRX40 owners are still waiting, btw. Oh wait, AMD alone entered the HEDT market this generation with Zen 4 Threadrippers at prices that not even Intel dared back in the day, when it had utter domination with Core i7 Extreme CPUs that were several times faster than the FX-9590... with Zen 4 HEDT starting at $1,499 MSRP and going all the way to $4,999. Not a word from the legions of AMD fans calling them the devil or "emergency edition," I see.
4. I won't criticize the pricing much, as I said before it was an unfortunate market situation (the mid-mining-craze and Covid overlap), but you should be well aware that the alleged limitations regarding maximum refresh rate were always unfounded to begin with. As long as the display on the other end is compatible and the port has enough bandwidth, you should get an image. Now, I must question the usefulness of a GPU such as Navi 24 for 120/144 Hz gaming; unless you're playing games from the mid-2000s, I don't think you're getting good frame rates on anything like that.
5. That doesn't excuse it, after all, the RTX 4090 D's sole reason for existing is US sanctions on China. It's just about powerful enough to comply with the government's regulation and thus is a legal product to export. I don't think that gamers should be punished and prevented from buying a product because of their nationality, I'd already feel quite slighted to get something nerfed simply because I am not American.
6. Overlapping with point one, the 7900X3D and 7950X3D rely on a software driver for core allocation, as only one of the CCDs is equipped with 3D V-Cache. No tangible benefit, because they refused to provide such a chip anyway - why threaten their own HEDT or server business?
If you get messages after your game crashes (it shouldn't crash to begin with), it's because you got a TDR, and this is so incredibly common with AMD that they've actually decided to use it as a point of data collection. I can't remember the last time my RTX 4080 crashed and caused a TDR, probably because it hasn't happened since I bought it. The boot times were caused by buggy AGESA, meaning this platform should never have launched in that state to begin with. It was never the DDR5 training, it was never the fact that the platform is "new"; it's just AMD's low-level firmware code being horribly broken - as it traditionally has been.
Most of the features that AMD has implemented in its graphics drivers in recent memory, if not all of them, are clones of existing NVIDIA technologies. If you look at what they introduced in the latest release, GeForce has supported HAGS, video upscaling, etc. for years now - and overlapping with your initial point, AFMF is exclusive to RDNA 3. Not that you'd want to play with frame reprojection on... since it's clearly not generative, and the unchanged frame rate counter rats that out big time.
2. I wanted a Vega 7. I too salivated over the specs. Though you seem to forget that AMD drivers are universal. I had Vega 64 in CrossFire, so what are you saying? I am pretty sure my 5600G still gets driver updates, and I am pretty sure that is Vega.
3. Yep, here we go with the 12400F. This is after 5 different chipsets on AM4 that just about every CPU was compatible with. At least you don't lose 8 lanes if you install an M.2 drive in the top slot. Oh wait, that is 12th Gen; I guess that means Z590 and no PCIe 5.0.
4. Unfounded? I had an RX 570 before, and I can promise you that a 6500 XT is much smoother with FreeSync support than that was. I was one of those people too; I used to argue in the PC store that refresh rate did not matter. Until I started playing The Division. I had a 60 Hz panel, and using the machine gun, the gun would go all over the screen. I upgraded to a 32QC, and all of a sudden I could use a scope with the machine gun to make headshots.
5. US sanctions? Who is "the Government"? You make it seem like there are no geopolitics influencing that. Seeing that as gamers being punished, in a country that drove down online gaming stocks with its tactics... and I did not know the entire stack was banned?
6. Here we go with "they rely on software". Show me a piece of PC hardware that does not rely on software to work.
I am so happy that you have had no crashes. You see, I play a modded version of TWWH. When Creative Assembly updates the game, it breaks the mod; that is what causes the issue that brings up the message. I know they were not perfect releasing a brand-new platform, because they are human, but at least memory support is better now. Long memory training times persist if you pick the 2nd EXPO profile, but it doesn't matter.
It is because NVIDIA uses propaganda to make people feel triggered when you mention that AMD could be competitive and in some cases better. Therefore, the ad nauseam talk about ray tracing (in the 2 games that supported it) made what we were only promised in one game seem moot. Ashes of the Singularity was one of the first DX12 games and is still one of the most optimized, but of course we want DLSS, and FSR is always blurry. Video upscaling? You should Google whether Sapphire TriXX supported upscaling across all games. You see, that is where you miss the argument. I have a 7900 XT. I was watching a PC World podcast and they were talking about "bad console ports." I asked in a Super Chat, "I have a 7900X3D/7900XT combo, why do I not have any of these issues?" The response from Gordon was, "Well, if you have the horsepower, it's not an issue for you." Maybe in 3 years, when my 7900 XT is older, I will look into upscaling. As per my previous argument, upscaling makes the most sense at the low end. The thing is that though NVIDIA may create, AMD makes it for the masses. AFMF is another FreeSync moment that will improve with time, and people who buy an 8700G will be happy for it.
www.amd.com/en/support/kb/release-notes/rn-rad-win-24-1-1
More on this:
www.anandtech.com/show/21126/amd-reduces-ongoing-driver-support-for-polaris-and-vega-gpus
Mate... 12th Gen is for LGA 1700, which means Z690 and Z790... they all have PCIe 5.0 support... DDR5 support... it was just AMD price gouging while Intel had no competition.
Geopolitics aren't relevant to money... Chinese gamers aren't getting cards for free... and there's no propaganda here, you're just... wrong, man.
Also, it's very stupid to say this kind of thing, because you already know it's not true and not happening. People saying "no one wants a new $100,000 Corvette" is just stupid...
But I bought one, just like many others.
If you don't buy a $1,600 GPU, then someone else will.
If someone has only $1,600 in their bank account and spends all of it on a $1,600 GPU, then that's stupid.
It's all about money, you know. You can buy the new upcoming high-end NVIDIA card.
The RDNA3 series might not have been that impressive; it will take a few generations to fully uncork its potential. It was just the start.
The good margins are in the midrange of GPUs - not the $2,000 GPUs. Failed? I think you only half understand what they are doing.
The whole CDNA lineup (i.e., the MI300X) is expected to skyrocket in terms of sales.
Of course a 6500 XT is going to feel better than an RX 570, when people all claimed that the RX 570 was the better card.
Forgive me for forgetting that Intel likes to release 2 main chipsets a year. I stand corrected forgive me.
If you think geopolitics and money are not connected... The US did not ban the 4080, 4070, or 4060. F me, they did not even ban the 3090. Just the 4090. You can go ahead and blindly trust that China is no different from the West. I guess the US only has carrier strike groups to waste resources on in that part of the world.
I saw that story too, and I still get updates for my 5600G, but go on. They are releasing new Vega SKUs anyway, but we can go on.
Sometimes you have to reprioritize your product offerings based on what's hot and on capacity-constrained manufacturing.
Compute cards have multiple compute dies because they aren't as latency-sensitive. Trust me, if AMD had it working for graphics cards, it would be released. It's been widely known that the CPU division has had the majority of the R&D allocation, and rightly so; it was where AMD could shine against a sleeping Intel. It's easier for compute, which is less latency-sensitive.
"screw that up because of greed"
It's like you guys don't understand that AMD has shareholders and they demand profit. They are not pricing it to do you a favor; Shareholders > Consumers when it comes to priority. They are a business at the end of the day.
Team Green has 4090s for $2,000; are they also greedy?
That said, rumors claim RDNA5 is going back to a monolithic chip, or a hybrid monolithic+chiplet design, as it's showing more promise in the near term, while RDNA4 is a second attempt to refine the multi-GPU chiplet route until they can improve it for RDNA6 or 7. At the same time, RDNA3 is showing very good performance in AI workloads, so it's possible that their approach there will quickly pay dividends in the rising AI sector (more so since CDNA3 was based on RDNA3 work), and if RDNA4 is just refined RDNA3, it could lead to even better AI performance.
If AMD is also leveraging two rival internal groups for GPU development, as it does for Ryzen (to ensure it always has options), then I expect we'll probably see alternating periods of performance uplift (monolithic or semi-monolithic) and efficiency gains (chiplet-based) until the chiplet method becomes good enough to compete in both while saving cost.
There are other advantages to chiplets, like modularization, increased yields, reduced costs, vastly improved binning, scalability, and design simplicity, but I assume AMD fully intended to have multiple GCDs when it first set out to design RDNA3. AMD explicitly stated that it was unable to make that work due to bandwidth restrictions, which means they put significant effort into trying before finding that out.
Maybe latency would be worse if AMD did have multiple GCDs, or maybe it would be possible to keep data from hopping between them too often. In any case, latency is not something that has regressed compared to RDNA2. I think perhaps you were just unlucky and got a defective product, or had driver issues; this is not something that has been reported as a widespread issue. I have never heard this rumor, and likely for good reason: it's the most nonsensical one I've read to date. Going back to monolithic would require them to re-incorporate the MCDs into the GCD, which makes no sense given that it increases cost, decreases yield, decreases flexibility, and brings a whole host of other negatives. MI300 would be impossible on such a design, and AMD would again have to make different tapeouts for different products, as opposed to a chiplet-based design where AMD makes two tapeouts: one for the MCD and one for the GCD.
Hybrid monolithic? If you have one big chip and smaller chips working together, that's just a chiplet-based architecture. You can't have a hybrid between a chiplet architecture and a monolithic architecture, as they are mutually exclusive. Monolithic implies one primary chip carries out all the functions of the architecture, whereas chiplet splits those functions between multiple chips mounted on an interposer. In other words, you either have one chip (monolithic) or more than one (chiplets). There are products with multiple monolithic CPUs/GPUs on the PCB, but those are just monolithic chips in a group, not a hybrid or a chiplet design. Monolithic and chiplet refer to the architecture of the silicon product (CPU/GPU), not the grouping; you wouldn't call SLI GPUs or dual-GPU cards a chiplet architecture.
- The enterprise market means more to them than consumers do.
- Shareholders take priority over consumers.
- Profit > feeling good or doing the right thing.
Pay more attention to what they do and how they operate, and less to the AMD vs. NVIDIA rivalry.