Monday, January 29th 2024

Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

We've known since way back in August 2023 that AMD is rumored to be retreating from the enthusiast graphics segment with its next-generation RDNA 4 graphics architecture, which means we likely won't see successors to the RX 7900 series squaring off against the upper end of NVIDIA's fastest GeForce RTX "Blackwell" series. What we'll get instead is a product stack closely resembling that of the RDNA-based RX 5000 series, with its top part providing a highly competitive price-performance mix around the $400 mark. A more recent report by Moore's Law is Dead sheds more light on this part.

Apparently, the top Radeon RX SKU based on the next-gen RDNA4 graphics architecture will offer performance comparable to that of the current RX 7900 XTX, but at less than half its price (around the $400 mark). It is also expected to achieve this performance target using smaller, simpler silicon with a significantly lower board cost, which is what enables that price. What's more, there could be energy-efficiency gains from the switch to a newer 4 nm-class foundry node and from the RDNA4 architecture itself, which could hit its performance target with fewer compute units than the RX 7900 XTX's 96.
When it came out, the RX 5700 XT offered an interesting performance proposition, beating the RTX 2070 and forcing NVIDIA to refresh its product stack with the RTX 20-series SUPER and the resulting RTX 2070 SUPER. Things could go down slightly differently with RDNA4. Back in 2019, ray tracing was a novelty, and AMD could surprise NVIDIA in the performance segment even without it. There is no such advantage now; ray tracing is relevant, so AMD could instead count on timing its launch ahead of the Q4 2024 debut of the RTX 50-series "Blackwell."
Sources: Moore's Law is Dead (YouTube), Tweaktown
Add your own comment

517 Comments on Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

#452
Patriot
Dr. DroUnfortunately, quite impractical (to the point of unfeasibility) for anyone writing and using a Windows application, not to mention the entire footprint of a Linux system on top.
Depends on what you want to do; it immediately unlocks all the LLM models, as they are mainly developed on Linux in the first place. And the Linux footprint is rather small...
But if you want to use it for rendering... They are working on desktop app support for Windows directly through HIP.
rocm.docs.amd.com/projects/install-on-windows/en/latest/ YMMV
I ... don't use windows for work... so...
Posted on Reply
#453
AusWolf
3valatzyDefine "noise". Define "knows". Define "facts".
Noise = online discussion.
Knows = in possession of definite information.
Facts = definite information coming straight from AMD.

Better?

Edit: typo
Dr. DroWhy is it that when anyone speaks anything that isn't strictly positive about AMD, it is thread crapping and hostility, but when it is anything related to Nvidia it's fair game to dunk on anything, prices, call them "nGreedia", Jensen, people who "willingly let themselves get ripped off", yeah, it's perfectly okay to do so?
So by your logic, if there are people throwing "Ngreedia" around, then you may as well join the fight and throw in a line about some "crappy drivers" with zero detail or explanation? How does that make you better than the "Ngreedia" people? How does that help anybody on the forum make an informed judgement? It doesn't.
Posted on Reply
#454
Dr. Dro
AusWolfSo by your logic, if there are people throwing "Ngreedia" around, then you may as well join the fight and throw in a line about some "crappy drivers" with zero detail or explanation? How does that make you better than the "Ngreedia" people? How does that help anybody on the forum make an informed judgement? It doesn't.
Let's start with the completely dysfunctional compute architecture, the inconsistent release schedule and the incomplete API support. Their habit of taking the lazy route and doing the barest minimum of work on the software. The mindset which resulted in the recent "Antilag gets you VAC banned" disgrace. Among many others; the thing is, going into each and every one of these takes more time and effort than even a dedicated topic would allow. It's not really going to result in a productive back and forth because, frankly, nobody's interested.

But I am not so conceited as to think I am above reproach or that my word is law. I too make mistakes (many of them, in fact) and, like any other human being, am highly prone to talking smack. Thus, I am still waiting: why should I give AMD my hard-earned cash and faithfully opt for their product instead?
Posted on Reply
#455
Visible Noise
evernessinceThere are three problems with this theory. The first is that AMD doesn't need multiple GCDs to reach the high end market. RDNA3 is evidence of that, where they have one GCD and multiple cache dies. The second is that it assumes AMD completely bungle their ability to have multiple GCDs on a single die for the second generation in a row.

Most of all though economically it makes zero sense for AMD to stop at the mid-range. AMD can add or subtract chiplets from a design at a near linear cost.
kapone32Alas here we go. People making stories about the narrative is crazy. Is there any statement from an AMD employee that they are not looking for the most powerful GPU to get over 50% more performance than the 7900XTX? How quickly the people in this thread forget that the 7900 series cards are faster than the 3090. I guess that does not matter in a world where the narrative is more trusted than the truth. This reminds me of the stories that purported that Ryzen was focused for the mainstream market and not for the high end.
These aged well…

;)
Posted on Reply
#456
AusWolf
Dr. DroLet's start with the completely dysfunctional compute architecture, the inconsistent release schedule and the incomplete API support. Their habit of taking the lazy route and doing the barest minimum of work on the software. The mindset which resulted in the recent "Antilag gets you VAC banned" disgrace. Among many others; the thing is, going into each and every one of these takes more time and effort than even a dedicated topic would allow. It's not really going to result in a productive back and forth because, frankly, nobody's interested.

But I am not so conceited as to think I am above reproach or that my word is law. I too make mistakes (many of them, in fact) and, like any other human being, am highly prone to talking smack. Thus, I am still waiting: why should I give AMD my hard-earned cash and faithfully opt for their product instead?
1. Not everybody cares about compute.
2. The release schedule doesn't affect the product, let alone drivers.
3. It wasn't anti-lag, but anti-lag plus, and it got fixed, as far as I know.
4. What incomplete API support?

+1. No one said you should buy AMD. But just because you have your reasons not to, it doesn't mean that those reasons aren't highly subjective.
Posted on Reply
#457
Dr. Dro
AusWolf1. Not everybody cares about compute.
2. The release schedule doesn't affect the product, let alone drivers.
3. It wasn't anti-lag, but anti-lag plus, and it got fixed, as far as I know.
4. What incomplete API support?

+1. No one said you should buy AMD. But just because you have your reasons not to, it doesn't mean that those reasons aren't highly subjective.
1. Everyone should care. What is a GPU without reliable compute in 2025? We're in the middle of the AI age. Everything uses GPGPU compute today, and it's incidentally at this time that they most sorely miss having a viable CUDA equivalent. All of their attempts have been trash on Windows. ROCm more or less took off on Linux out of sheer necessity, but remains an unfulfilled promise on Windows.

2. Yes, it does affect the product. GPUs without drivers are as good as expensive paperweights. You should demand of them the very highest standards in support software releases.

3. Semantics, and yes, of course it got fixed. But the fact remains that it should never have shipped. It's one of the most unsanitary, unsafe, utterly stupid ideas I have ever seen. I once heard a rather stupid joke about the AMD driver being developed by 4 Indian guys in a basement, paid with a goat and a crate of cheap beer. Of course, to entertain the notion that this individual was actually being serious is completely absurd, but stunts like the VAC issue truly do place them in that exact light to anyone with an actual sense of the do's and don'ts of the Windows DLL architecture.

4. There was a reason I asked whether they actually offer comprehensive support for all API extensions of a wildly popular, 16-year-old graphics API. Just as there was a reason I did not get an answer.

I take it no one will try to give a good-faith answer to a good-faith question, then. I genuinely want to understand what drives this passion, and why their products are worth it.
Posted on Reply
#458
AusWolf
Dr. Dro1. Everyone should care. What is a GPU without reliable compute in 2025? We're in the middle of the AI age. Everything uses GPGPU compute today, and it's incidentally at this time that they most sorely miss having a viable CUDA equivalent. All of their attempts have been trash on Windows. ROCm more or less took off on Linux out of sheer necessity, but remains an unfulfilled promise on Windows.
Bullshit. Why should a gamer care about compute? I've never used compute in my life and I don't intend to.
Dr. Dro2. Yes, it does affect the product. GPUs without drivers are as good as expensive paperweights. You should demand of them the very highest standards in support software releases.
You still haven't mentioned a single issue with the drivers.
Dr. Dro3. Semantics, and yes, of course it got fixed. But the fact remains that it should never have shipped. It's one of the most unsanitary, unsafe, utterly stupid ideas I have ever seen. I once heard a rather stupid joke about the AMD driver being developed by 4 Indian guys in a basement, paid with a goat and a crate of cheap beer. Of course, to entertain the notion that this individual was actually being serious is completely absurd, but stunts like the VAC issue truly do place them in that exact light to anyone with an actual sense of the do's and don'ts of the Windows DLL architecture.
Sure, be angry about a past error related to an optional feature that got fixed. It won't fix it any further, you know.
Dr. Dro4. There was a reason I asked whether they actually offer comprehensive support for all API extensions of a wildly popular, 16-year-old graphics API. Just as there was a reason I did not get an answer.
What extensions aren't supported?
Dr. DroI take it no one will try to give a good-faith answer to a good-faith question, then. I genuinely want to understand what drives this passion, and why their products are worth it.
Where are your good faith answers? All I see is some general shittalk about AMD and drivers, but no examples.
Posted on Reply
#459
Visible Noise
AusWolfBullshit. Why should a gamer care about compute? I've never used compute in my life and I don't intend to.
Is gaming the only thing you do on your PC?
AusWolfYou still haven't mentioned a single issue with the drivers.
Have they gotten all their HDR issues straightened out yet?
Posted on Reply
#460
AusWolf
Visible NoiseIs gaming the only thing you do on your PC?
I also browse the web, watch YouTube and offline videos, look at family photos, edit documents, etc. None of these tasks needs GPU compute.

Edit: I also crunch with WCG, but it doesn't have any GPU work for my Nvidia cards, either.
Visible NoiseHave they gotten all their HDR issues straightened out yet?
What HDR issues? I might have missed the news on that one. I've never had problems with HDR on my system.
Posted on Reply
#461
Dr. Dro
AusWolfno examples
Nothing more from me... I don't know how to reply while remaining cordial, really. Even if gaming is all you ever did (to the extent it'd make a gaming console blush, strictly by the books), GPU compute has been an integral part of it. It was one of the pillars of the unified shader revolution, and since DirectX 11, it's been a core function of the API. As for the rest, you know most of it already to some extent or the other. The juice is not worth the squeeze, because I know myself and I will sound patronizing. That is not something I wish to do.

I can take things in sport and learn from experience, so I was hoping that someone could shed light on why I am wrong, without resorting to anecdotal "evidence". The simple truth is, there is not one department where AMD's software ecosystem measures up, but like I said, I am not so conceited as to consider myself above reproach.

The one thing I won't accept is the whole Anti-Lag VAC issue. That was beyond unacceptable; it's not just an "error", it's going below even the lowest software quality standards anyone could have. I just won't look the other way for something like that. I'm generally lenient with blunders as long as things are made right, and made right fast, but this one was easily the most egregious of all.
Posted on Reply
#462
AusWolf
Dr. DroNothing more from me... I don't know how to reply while remaining cordial, really. Even if gaming is all you ever did (to the extent it'd make a gaming console blush, strictly by the books), GPU compute has been an integral part of it. It was one of the pillars of the unified shader revolution, and since DirectX 11, it's been a core function of the API. As for the rest, you know most of it already to some extent or the other. The juice is not worth the squeeze, because I know myself and I will sound patronizing. That is not something I wish to do.
Aha, so you personally have nothing, but you expect everyone to believe you based on evidence you don't have?

Sure, my evidence on current issues with Nvidia (one) and current issues with AMD (none) is anecdotal, but even that's better than talking out of thin air, don't you think?

Personally, I always take stuff on the internet with a grain of salt. Even reviews. That's why I go out of my way and buy the stuff I'm interested in. I even bought a 6500 XT only because everyone online was outraged about it; I wanted to see what the fuss was about. That's the kind of person I am. I've had lots of disappointments in generally loved products, and I've grown to love ones that are generally hated by the public.

So I'm sorry, but no first-hand evidence = no evidence in my eyes. It's only fan-talk or shittalk.
Dr. DroI can take things in sport and learn from experience, so I was hoping that someone could shed light on why I am wrong, without resorting to anecdotal "evidence". The simple truth is, there is not one department where AMD's software ecosystem measures up, but like I said, I am not so conceited as to consider myself above reproach.
Control panel? Linux support? Overclocking / tuning? Just to name a few.
Dr. DroThe one thing I won't accept is the whole Anti-Lag VAC issue. That was beyond unacceptable; it's not just an "error", it's going below even the lowest software quality standards anyone could have. I just won't look the other way for something like that. I'm generally lenient with blunders as long as things are made right, and made right fast, but this one was easily the most egregious of all.
Okay. Please let us know how AMD could fix it beyond it being fixed.

Edit: I'm also curious why you think that compute is essential for gaming.
Posted on Reply
#463
Visible Noise
AusWolfWhat HDR issues? I might have missed the news on that one. I've never had problems with HDR on my system.
Obviously you have. It’s been two years and they still haven’t fixed the broken HDR on their APUs.
Posted on Reply
#464
AusWolf
Visible NoiseObviously you have. It’s been two years and they still haven’t fixed the broken HDR on their APUs.
So it's only APUs, not dGPUs? I obviously missed it / haven't heard about it.
Posted on Reply
#465
Dr. Dro
AusWolfAha, so you personally have nothing, but you expect everyone to believe you based on evidence you don't have?
Right, you're quite free to think that, mate. Rants are useless, and rants that serve little purpose beyond causing bad blood are actively detrimental. Still, I appreciate you playing the devil's advocate; some of the questions you asked ill befit your status ;)
AusWolfControl panel? Linux support? Overclocking / tuning? Just to name a few.
Ain't much, I personally don't care for it, but

www.nvidia.com/en-us/software/nvidia-app/

While speaking of nice control panels, shame this isn't available on gaming drivers. Quite useful

AusWolfOkay. Please let us know how AMD could fix it beyond it being fixed.
The more appropriate question is "How did this get greenlit, vetted, supposedly tested, marketed and shipped".
Posted on Reply
#466
AusWolf
Dr. DroRight, you're quite free to think that mate. Rants are useless, and rants that serve little purpose beyond causing bad blood are actively detrimental. Still, I appreciate you playing the devil's advocate, some of the questions you made ill befit your status ;)
If rants are useless, then why are you doing it? I'm not the one complaining about some driver quality issues here. ;)
Dr. DroAin't much, I personally don't care for it, but
You're totally free not to care about any of it. But then, am I not free not to care about Nvidia's features, including CUDA, and regard them as optional instead of necessary?
Dr. DroThe more appropriate question is "How did this get greenlit, vetted, supposedly tested, marketed and shipped".
How did the 3.5 GB GTX 970 get greenlit, vetted, supposedly tested, marketed and shipped? Do you see how useless talking about the past is?
Posted on Reply
#467
Dr. Dro
Simple... it wasn't a 3.5 GB card, it was not a design fault, it was a limitation of the architecture... the alternative would have been to release it as a true 3.5 GB product, which in hindsight would have spared them a lawsuit. You could ask the same of Bulldozer; I'd argue it was an 8-core processor even with its architecture, but AMD settled in court, effectively conceding otherwise...

And are we even talking about the same thing at this point? You kinda brought up the fancy control panel. idk, I need sleep. will make sure to count the leather jackets :sleep:
Posted on Reply
#468
AusWolf
Dr. DroSimple... it wasn't a 3.5 GB card, it was not a design fault, it was a limitation of the architecture... the alternative would have been to release it as a true 3.5 GB product, which in hindsight would have spared them a lawsuit. You could ask the same of Bulldozer; I'd argue it was an 8-core processor even with its architecture, but AMD settled in court, effectively conceding otherwise...
Ah, so if Nvidia intentionally misleads customers, it's fine, but if AMD makes a genuine mistake, it should be remembered and talked about forever. Now we see which way the wind blows.
Dr. DroAnd are we even talking about the same thing at this point? You kinda brought up the fancy control panel. idk, I need sleep. will make sure to count the leather jackets :sleep:
I brought it up as an example of something that does measure up and exceed Nvidia's solution (as you claimed there was nothing such).

You're fine not to like what AMD has, and prefer Nvidia instead. Just don't try to sell it as objective fact because it's not.
Posted on Reply
#469
Dr. Dro
Bro. Neither company intentionally misled anyone in both mentioned cases. They were limitations of the technology at the time. That goes for both FX and the 970.
Posted on Reply
#470
AusWolf
Dr. DroBro. Neither company intentionally misled anyone in both mentioned cases. They were limitations of the technology at the time. That goes for both FX and the 970.
The 970 was marketed as a 4 GB 224.4 GB/s card. The last 0.5 GB operating at a much lower bandwidth was never mentioned.

But this is not my point. My point is that we shouldn't argue about the past, especially in an RDNA 4 thread. It's totally not relevant to anything here.
Posted on Reply
#471
Visible Noise
Are you really bringing up a 15 year old Nvidia card in a thread about the 9070?

Get a grip dude.
Posted on Reply
#472
AusWolf
Visible NoiseAre you really bringing up a 15 year old Nvidia card in a thread about the 9070?

Get a grip dude.
No. I'm saying it shouldn't matter now, just like AMD's anti-lag error which was fixed. No one should say that AMD's drivers are bad because they heard about some isolated problem a thousand years ago that got fixed since.
Posted on Reply
#473
Visible Noise
AusWolfNo. I'm saying it shouldn't matter now, just like AMD's anti-lag error which was fixed. No one should say that AMD's drivers are bad because they heard about some isolated problem a thousand years ago that got fixed since.
I agree. There are plenty of current issues in AMD drivers.
Posted on Reply
#474
Zach_01
AusWolfNo. I'm saying it shouldn't matter now, just like AMD's anti-lag error which was fixed. No one should say that AMD's drivers are bad because they heard about some isolated problem a thousand years ago that got fixed since.
Just another 9070 related thread that turned into a charade. Full of irrelevancy and disorientation... Kinda entertaining to be honest
Posted on Reply
#475
AusWolf
Visible NoiseI agree. There are plenty of current issues in AMD drivers.
Here we go again. *Facepalm*

Go on, then. I see you want to talk about it.
Posted on Reply