Monday, January 29th 2024

Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

We've known since way back in August 2023 that AMD is rumored to be retreating from the enthusiast graphics segment with its next-generation RDNA 4 graphics architecture, which means we likely won't see successors to the RX 7900 series squaring off against the upper end of NVIDIA's fastest GeForce RTX "Blackwell" series. What we'll get instead is a product stack closely resembling that of the RDNA-based RX 5000 series, with its top part providing a highly competitive price-performance mix around the $400 mark. A more recent report by Moore's Law is Dead sheds more light on this part.

Apparently, the top Radeon RX SKU based on the next-gen RDNA4 graphics architecture will offer performance comparable to that of the current RX 7900 XTX, but at less than half its price (around the $400 mark). It is also expected to achieve this performance target using smaller, simpler silicon with a significantly lower board cost, which is what makes that price possible. What's more, there could be energy-efficiency gains from the switch to a newer 4 nm-class foundry node and from the RDNA4 architecture itself, which could achieve its performance target using fewer compute units than the RX 7900 XTX's 96.
When it came out, the RX 5700 XT offered an interesting performance proposition, beating the RTX 2070 and forcing NVIDIA to refresh its product stack with the RTX 20-series SUPER, including the resulting RTX 2070 SUPER. Things could go down slightly differently with RDNA4. Back in 2019, ray tracing was a novelty, and AMD could surprise NVIDIA in the performance segment even without it. There is no such advantage now that ray tracing is relevant, so AMD could instead count on timing its launch ahead of the Q4 2024 debut of the RTX 50-series "Blackwell."
Sources: Moore's Law is Dead (YouTube), TweakTown

292 Comments on Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

#51
Unregistered
Will not happen. But at 599-699€ it's more realistic, if it's XTX level. I expect the 5070 to hit this level too. Nvidia will never give you that for 499€; 599€ at least, and it's Nvidia, so 659€ is more reasonable.

Maybe both surprise us, but who am I kidding.
#52
Nhonho
Sad, right when people want to play at 4K resolution...
#53
Unregistered
4K is awesome, and upscaling made it much more reasonable. I don't like FG personally, but it still helps too.

A 4070-class card will get you 60 FPS at 4K in most games with no issue, and around 80 FPS with DLSS Quality. Optimized settings on top and you are good to go.
#54
Camm
Three pages, and only Daven got close to hitting the nail on the head.

There's limited fab capacity (see Nvidia right now, where large dies for the gaming market are scarce), and AMD can make bank on CDNA, where it is competitive, rather than trying to compete in the high end, where mindshare is firmly in Nvidia's camp and where AMD has an architecture simply not built to scale to as large a die area as Nvidia's (and is still waiting for its MCM GPU approach to become viable).

Most of us should be excited: a high-performing card for cheap? That is sorely missing from today's market.
#55
Markosz
Hmm... that performance/price would definitely shake up a few things; let's hope they are talking about native performance and the claims are true.
In that case, even if AMD doesn't compete at the top end, it should push down prices significantly on the GPUs most people actually buy.

Also, nothing is stopping AMD from making a larger GPU by slapping on more CUs.
#56
Chrispy_
This is all still rumour, right? Most of the 'sources' just link back to MLID, and whilst he's often right and has plenty of contacts with AMD and Nvidia board partners, this is still just his speculation, because none of the board partners will have anything this far ahead of launch. Interestingly, the MLID video in question appears to have been pulled from YouTube, which makes the rumour even more suspect!

I *can* see dropping the high-end as a sensible option for AMD - their bread-and-butter is custom silicon for consoles and the surge of new handhelds that have hit the market. Having their desktop product stack be something that can also go into consoles and handhelds is far more relevant to them in terms of profit and market penetration, and the lower half of the RDNA2/RDNA3 product stack seems to have been selling well.

Flagships are nice and all, but the battle for flagship dominance is going to be won by Nvidia until AMD invest more heavily into raytracing - and honestly, even on Nvidia's 3rd generation of RTX cards, we're still at a point where half of the Nvidia product stack (accounting for 90%+ of their actual units sold) is too weak for acceptable raytracing/path-tracing performance in the extremely small list of games that make heavy use of it like CP2077/Alan Wake 2/Ratchet & Clank. RTX 3080/4070 and up is kind of where you need to be for 1080p60 raytracing in those games (or higher resolutions with DLSS). For AMD to beat a 4090 they need to more-than-double their raytracing performance, and that's wasted effort for anything slower than a 3080/4070. In other words, the 7800XT is the slowest card that could conceivably see the benefits of improved raytracing efforts and the vast majority of what AMD sells isn't at the performance tier where it matters (yet).
#57
RamiHaidafy
Here's to hoping they make a dual-GPU version then. Call it a Radeon 8750 X2 or Radeon 8990. I miss those days.
#58
3valatzy
Redwoodz: Nice try dude. Remove that useless, budget-busting raytracing and you see why AMD is very comfortable where they are. No one really wants a $1600 gaming GPU for a personal PC. It's just stupid.
Does AMD support ray tracing? Just because AMD is so weak in ray tracing doesn't mean they should say that it's "useless".
If they agreed to the rules of the game, then please be so kind as to play the game, or exit it.
AusWolf: Then tell us what the point is in AMD spending millions to develop a high-end chip just so that you can buy Nvidia at a reduced price?

There is nothing to compete with the 4090, and even if there was, people would still buy the 4090 over anything, so what's the point in trying to compete where you can't?


71% more performance for 95% more money? How is that better? :kookoo:
Who says they will buy Nvidia instead? That's speculation on your part. Very wrong.
#59
DeathtoGnomes
Limited production availability will drive retail prices up by as much as 150%; once the higher prices are established, we won't see anything at MSRP.
#60
AusWolf
3valatzy: Does AMD support ray tracing? Just because AMD is so weak in ray tracing doesn't mean they should say that it's "useless".
If they agreed to the rules of the game, then please be so kind as to play the game, or exit it.
Nvidia loses 50% performance with RT on, AMD loses 60%. This makes both of them weak, in my opinion. Calling Nvidia "better" instead of "slightly less crap" is marketing.
3valatzy: Who says they will buy Nvidia instead? That's speculation on your part. Very wrong.
What you said above (thinking that Nvidia's "better" RT means anything) is proof. Also, I personally know people (not one, but several) who would never buy an AMD card, no matter the price or performance.

This is representative of the general public mindshare. "Nvidia = gaming, AMD = second place."
#61
Dawora
AMD fans have to buy an Nvidia high-end GPU if they want one.
Or they can just buy a lower-res monitor or play at lower settings.

It's good to be an Nvidia user; there are no moral problems with buying a high-end GPU in the future.
#62
Beginner Macro Device
AusWolf: Nvidia loses 50% performance with RT on, AMD loses 60%.
Or about 55% vs 78% in path tracing (RTX 4070 Super vs 7900 XT). That means if an NV GPU matches the raster performance of an AMD one, it will more than double its path-tracing performance.

And I'm talking totally enjoyable vs barely playable, not garbage vs complete garbage.


I don't mind RDNA4 GPUs being unimpressive at pure raster (RDNA3 GPUs are already not impressive, due to discounts on NV products and the way UE5 treats this architecture). I don't mind them having no top-tier SKU (AMD have never competed with NV's halos with any success anyway). What I do mind is the same performance drop when it comes to RT. So if a 500-dollar RDNA4 GPU fails to outpace the 7900 XT in pure raster, that's bad but acceptably bad. But if it also fails to outperform it in RT, it's a terrible release.
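To put rough numbers on that, here is a minimal sketch using only the loss figures quoted in this thread (50%/60% with regular RT, roughly 55%/78% with path tracing); these are illustrative figures from the discussion, not benchmark results:

```python
# Sketch: at equal raster performance, how much faster the NV card ends
# up once RT is enabled, given each vendor's fractional performance loss.

def rt_advantage(nv_loss: float, amd_loss: float) -> float:
    """Ratio of retained performance after the RT hit."""
    return (1.0 - nv_loss) / (1.0 - amd_loss)

print(f"Regular RT (50% vs 60% loss):   {rt_advantage(0.50, 0.60):.2f}x")  # 1.25x
print(f"Path tracing (55% vs 78% loss): {rt_advantage(0.55, 0.78):.2f}x")  # ~2.05x
```

With the quoted path-tracing losses, the retained-performance ratio works out to roughly 2x, which is where the "more than double" claim comes from.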
#63
Vayra86
Chrispy_: This is all still rumour, right? Most of the 'sources' just link back to MLID, and whilst he's often right and has plenty of contacts with AMD and Nvidia board partners, this is still just his speculation, because none of the board partners will have anything this far ahead of launch. Interestingly, the MLID video in question appears to have been pulled from YouTube, which makes the rumour even more suspect!

I *can* see dropping the high-end as a sensible option for AMD - their bread-and-butter is custom silicon for consoles and the surge of new handhelds that have hit the market. Having their desktop product stack be something that can also go into consoles and handhelds is far more relevant to them in terms of profit and market penetration, and the lower half of the RDNA2/RDNA3 product stack seems to have been selling well.

Flagships are nice and all, but the battle for flagship dominance is going to be won by Nvidia until AMD invest more heavily into raytracing - and honestly, even on Nvidia's 3rd generation of RTX cards, we're still at a point where half of the Nvidia product stack (accounting for 90%+ of their actual units sold) is too weak for acceptable raytracing/path-tracing performance in the extremely small list of games that make heavy use of it like CP2077/Alan Wake 2/Ratchet & Clank. RTX 3080/4070 and up is kind of where you need to be for 1080p60 raytracing in those games (or higher resolutions with DLSS). For AMD to beat a 4090 they need to more-than-double their raytracing performance, and that's wasted effort for anything slower than a 3080/4070. In other words, the 7800XT is the slowest card that could conceivably see the benefits of improved raytracing efforts and the vast majority of what AMD sells isn't at the performance tier where it matters (yet).
Everybody knows that any fart about AMD leaving the high end, or populating the high end in whatever way, is internet bonus points and ad revenue.

People are going straight to 11 on this brainfart, it's amazing. Sheeple.
#64
EatingDirt
3valatzy: That's not exactly right. AMD offers worse performance for the money as it is. Look at the benchmarks.
The RTX 4090 is 71% faster than the RX 7900 XTX for approximately 95% more money (1850 euros vs 950 euros).
Given Nvidia's far superior brand recognition, AMD is the big underdog and loser here.
Second-tier manufacturer.
What? AMD's current advantage, and one of its only real advantages, is that it offers better value for pure raster performance across the board. You can go to every single GPU review on TPU, browse to the Performance Per Dollar tab, and see that essentially every single current-generation AMD card has better performance-per-dollar versus its primary Nvidia competitor.
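For what it's worth, the performance-per-dollar point is easy to check against the figures quoted earlier in the thread (4090 roughly 71% faster, 1850 euros vs 950 euros; street prices from this discussion, not official MSRPs). A minimal sketch:

```python
# Sketch: raster performance per euro, using the relative performance
# and street prices quoted in this thread (assumed, not a TPU review).

cards = {
    "RX 7900 XTX": {"perf": 1.00, "price_eur": 950},
    "RTX 4090":    {"perf": 1.71, "price_eur": 1850},  # "71% faster"
}

for name, card in cards.items():
    per_1000_eur = card["perf"] / card["price_eur"] * 1000
    print(f"{name}: {per_1000_eur:.2f} perf per 1000 EUR")

# RX 7900 XTX: 1.05, RTX 4090: 0.92 -- at these prices the XTX delivers
# roughly 14% more raster performance per euro.
```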
#65
napata
AusWolf: Nvidia loses 50% performance with RT on, AMD loses 60%. This makes both of them weak, in my opinion. Calling Nvidia "better" instead of "slightly less crap" is marketing.


What you said above (thinking that Nvidia's "better" RT means anything) is proof. Also, I personally know people (not one, but several) who would never buy an AMD card, no matter the price or performance.

This is representative of the general public mindshare. "Nvidia = gaming, AMD = second place."
50%? The post you're quoting shows 22% on a 4090 versus 41% on a 7900 XTX. It's obviously going to vary heavily from game to game, with path tracing being much worse, both in its cost and in how AMD does in it.

On Nvidia, lots of things are playable for most people while they aren't on AMD, if that's how you want to argue. Relative performance is just much worse for AMD.

It's also not like you can escape RT, as we're already seeing games where RT is the default, or where raster has broken lighting because the game was developed with RT in mind. Hell, even one of the latest AMD-sponsored games has mandatory RT.
#68
AusWolf
napata: 50%? The post you're quoting shows 22% on a 4090 versus 41% on a 7900 XTX. It's obviously going to vary heavily from game to game, with path tracing being much worse, both in its cost and in how AMD does in it.

On Nvidia, lots of things are playable for most people while they aren't on AMD, if that's how you want to argue. Relative performance is just much worse for AMD.

It's also not like you can escape RT, as we're already seeing games where RT is the default, or where raster has broken lighting because the game was developed with RT in mind. Hell, even one of the latest AMD-sponsored games has mandatory RT.
If you mean Avatar: FoP, then it uses RT lightly enough that it runs relatively well on both AMD and Nvidia. As for heavy RT, like path tracing, it runs like crap on everything, so whether you're losing half of your performance or even more doesn't matter, in my opinion.
#69
Beginner Macro Device
Recus
They're still disastrous compared to previous-gen achievements. It depends on what you're looking at.

NB: Outselling the 7600 XT is not an achievement. The only barely competitive GPUs are the 7800 XT and, to a lesser extent, the 7900 XTX. Other RDNA3 GPUs are currently seriously overpriced.
#70
evernessince
Broken Processor: They haven't abandoned it; they simply can't get multiple GCDs working correctly, if the rumours are to be believed. So this gives them room to get the following generation working correctly rather than wasting resources firefighting. It boils down to the Radeon group not having the resources it needed and being forced to make hard choices, but thankfully, with AI, it appears that's likely to change; whether that helps or hinders GPU development, who knows.
There are three problems with this theory. The first is that AMD doesn't need multiple GCDs to reach the high-end market; RDNA3 is evidence of that, with one GCD and multiple cache dies. The second is that it assumes AMD completely bungled getting multiple GCDs into a single package for the second generation in a row. It's nonsense that the Radeon group doesn't have the resources; AMD has been pouring money into them. I'd assume that after the first generation of failure they have a general idea of the bandwidth required for multiple GCDs. At the very least, if they couldn't reach the required bandwidth number, I'd expect them to further modularize their GPU, stack cache, etc. There are plenty of options for AMD to reach the high end while maintaining RDNA3's small die size and without the use of multiple GCDs.

Most of all, though, economically it makes zero sense for AMD to stop at the mid-range. AMD can add or subtract chiplets from a design at a near-linear cost. This is particularly pertinent because for Nvidia the cost increase is exponential at the high end, due to the fact that yield drastically decreases at the size of a high-end GPU. By extension, this means AMD has a large cost advantage in the high end (not that it really needs it, given Nvidia's margins have always been large on those high-end cards). AMD might not compete with Nvidia's high-end chip with a single GCD, but at the very least I'd expect them to stack up as many chiplets as needed to have a competitive high-end product, simply because that's what stands to make them the most money. There's really no reason for AMD to leave money on the table.
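The yield argument can be made concrete with a standard Poisson die-yield model, in which the fraction of good dies falls off exponentially with die area. A minimal sketch, with made-up illustrative numbers rather than actual foundry figures:

```python
import math

def die_yield(area_mm2: float, d0_per_cm2: float) -> float:
    """Poisson yield model: fraction of defect-free dies of a given area."""
    return math.exp(-(area_mm2 / 100.0) * d0_per_cm2)

D0 = 0.1  # defects per cm^2 -- illustrative, not a real foundry number

mono = die_yield(600, D0)  # one large monolithic high-end die
chip = die_yield(300, D0)  # one of two chiplets covering the same area

print(f"600 mm^2 monolithic yield: {mono:.1%}")  # ~54.9%
print(f"300 mm^2 chiplet yield:    {chip:.1%}")  # ~74.1%

# Wafer cost per mm^2 is the same either way, so cost per *good* mm^2
# scales with 1/yield: the monolithic die costs ~35% more here, and the
# penalty grows exponentially with area and defect density.
print(f"Monolithic cost penalty: {chip / mono:.2f}x")
```

This ignores packaging cost and cross-die interconnect overhead, which is exactly the part the multi-GCD rumours say AMD still has to solve.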
#71
theouto
Recus
One sold nine units, the other sold one
#72
Denver
3valatzy: For keeping the competitiveness alive.
AMD is several years behind with their RDNA architecture. They need something new, and soon, in order to stay relevant.



That's not exactly right. AMD offers worse performance for the money as it is. Look at the benchmarks.
The RTX 4090 is 71% faster than the RX 7900 XTX for approximately 95% more money (1850 euros vs 950 euros).
Given Nvidia's far superior brand recognition, AMD is the big underdog and loser here.
Second-tier manufacturer.

Years? This is one of the most absurd things I've ever read. Ada and RDNA3 are very close architectures in their capabilities, generally speaking, in rasterization, which is what 99% of games run on. RDNA3 only incurs some disadvantages in efficiency by adopting the MCM design and an inferior process, in favor of maximizing performance per area and lowering manufacturing costs.



In the real world, the average is 20-30%. I don't have the patience to get into the discussion about RT again; I'll summarize by saying that I think it's a waste of resources in every way.
#73
Chrispy_
Vayra86: Everybody knows that any fart about AMD leaving the high end, or populating the high end in whatever way, is internet bonus points and ad revenue.
I hope it's just a bad rumour.
Nvidia having no competition for the 4090 is why it's $2000 compared to the 4080S at $1000. If AMD had a 7950 XTX that was just 10% faster than the 7900 XTX, the 4090's price would adjust slightly. It would adjust more if there wasn't a CUDA-dependent AI market to serve, too.
Recus
I'm not surprised at all:
  • The 4070 Ti Super is better value than its predecessor and brings a sensible amount of VRAM to the price point for the first time in a long while.
  • The 7600XT is worse than either of its predecessors. It's too expensive and power-hungry to compete with the 7600 8GB, and it's inferior in every metric to the identically priced 6700XT.
I have no idea how well the 4070 Ti Super is selling, but they could be selling disastrously AND also be outselling the 7600XT 9:1 at the same time. The two statements are not mutually exclusive! An AMD GPU that does sell well is the RX 6600 at €209, offering plenty of 1080p performance for 40% less money. Until that becomes unavailable, it's still the best-value sub-$300 card, and if you want more performance than it offers, the next options worth looking at are the 4060 Ti or 6700 XT.
#74
evernessince
Daven: This will probably get an 80 CU config with a 256-bit memory interface and a TBP of 300 W. Ray tracing capabilities could be increased, as well as the clock speed, to match 7900 XTX performance. AMD could possibly combine two of these on a next-next-gen SKU at 3 nm.

I assume a simpler, easier-to-manufacture part (read: higher-volume yields) is because AMD is prioritizing higher-end silicon for CDNA.

There is also a good chance that AMD is betting on Nvidia abandoning the mid-range-to-budget discrete desktop GPU space. This would leave AMD (RDNA4) and Intel (Battlemage) to compete in the sub-$500 price bracket.
AMD is using chiplets, which by extension means that if there is a high-end part hogging the best dies, it doesn't impact the consumer parts. AMD EPYC is a good example of this: the high-end parts just get the better-binned chiplets, but this also means AMD has a ton of dies that didn't quite meet the higher bar and go into consumer parts.
#75
remekra
Broken Processor: They haven't abandoned it; they simply can't get multiple GCDs working correctly, if the rumours are to be believed. So this gives them room to get the following generation working correctly rather than wasting resources firefighting. It boils down to the Radeon group not having the resources it needed and being forced to make hard choices, but thankfully, with AI, it appears that's likely to change; whether that helps or hinders GPU development, who knows.
Don't know if they will implement it in their consumer cards or not, but their Instinct MI300X already has 2x GCD, and it's working and being treated by software as a single GPU. I know there is a difference between that arch and RDNA, but it still seems they are making good progress on that.

If they could make a GPU with two GCDs, each performing like a 7900 XTX, with MCDs on them, and improve RT perf, then I would sell my 7900 XTX and buy it instantly.

EDIT
I did some reading up on that Instinct card:

chipsandcheese.com/2023/12/17/amds-cdna-3-compute-architecture/

It uses XCDs, eight of them, and they are all exposed as a single GPU.
The question is whether that translates to the consumer market and a GPU used for gaming; we'll see, I guess.