
Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power

So AMD's best card will offer roughly the performance of a low-end RTX 5070 or a 5060. AMD really is incompetent. Good thing they were gifted an amazing CPU architecture that's easy to iterate on; otherwise they'd be bankrupt by now.
We know no details about Blackwell, and if Ada is anything to go by, it might be better than both. Besides, not having a high-end GPU doesn't mean you can't beat Nvidia in the other important areas. (Remember, a 4060 can't beat a 3060 Ti, and the 4070 is roughly on par with the $100-more-expensive 3080, at MSRP anyway.)
But having a good halo-class GPU does tend to make people think more highly of your lower-end products, no matter how underwhelming they are.
 
But it does mean AMD doesn't move forward at the high end for another two or so years. Is that good in your book? AMD will be three generations behind nGreedia, which puts immense pressure on them to claw back a vast performance gap. If AMD isn't careful, Intel will catch up and maybe overtake them, and then AMD/Radeon is finished. I would be very worried if I were Sony/Microsoft...

I would assume that unless nGreedia separates its AI and consumer GPU designs, we should expect at least another 40-60% performance from Blackwell. But at what cost? Well, that's only limited by Jensen's greed and arrogance.
 
I desperately hope they don't abandon the high-end. At least one other company needs to compete with nVidia there, don't care if it's Intel or AMD. Just, some kind of pressure...
For what? Monkey wants banana. People pay 200% more for 20-30% more performance; it makes no sense for AMD to spend hundreds of millions developing another high-end chip just for you to end up buying Nvidia at a discount.

Anyway, the performance of the XTX @ $400 is a great offer. If we had an effective MCM solution, it could be 2x the performance of the XTX at US$800-1000.
 
Yeah, yeah, why not? Next you'll tell me pigs can fly too :wtf:
Hand me a cannon, catapult, trebuchet, or a suspiciously strong man, and I'll make it happen.
 
Just when I was wondering if I should swap my 7800 XT for a 7900 XTX. I guess the answer is no.
 
This will probably get an 80 CU config with a 256-bit memory interface and a TBP of 300 W. Ray tracing capabilities could be improved, and the clock speed increased, to match 7900 XTX performance. AMD could possibly combine two of these on a next-next-gen SKU at 3 nm.
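For a rough sense of what that 256-bit interface implies, here's a quick back-of-the-envelope bandwidth sketch; the per-pin data rates are my own assumptions for illustration, not part of the rumor:

```python
# Peak memory bandwidth for a given bus width and per-pin data rate.
# GB/s = (bus width in bits / 8 bits per byte) * Gbps per pin.
def bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits / 8 * gbps_per_pin

# Assumed GDDR6 speeds (Gbps per pin) -- illustrative, not from the rumor.
for rate in (18.0, 20.0, 24.0):
    print(f"256-bit @ {rate:4.1f} Gbps -> {bandwidth_gbs(256, rate):4.0f} GB/s")

# The 7900 XTX's actual 384-bit bus at 20 Gbps, for comparison:
print(f"384-bit @ 20.0 Gbps -> {bandwidth_gbs(384, 20.0):4.0f} GB/s")
```

Even at 24 Gbps, a 256-bit card lands at 768 GB/s, well short of the XTX's 960 GB/s, so matching its performance would have to lean on bigger caches or higher clocks.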

I assume a simpler, easier-to-manufacture part (read: higher-volume yields) is because AMD is prioritizing high-end silicon for CDNA.

There is also a good chance that AMD is betting on Nvidia abandoning the mid-range to budget discrete desktop GPU space. This would leave AMD (RDNA4) and Intel (Battlemage) to compete in the sub-$500 price bracket.
 
For what?

To keep competition alive.
AMD is several years behind with their RDNA architecture. They need something new, and soon, in order to stay relevant.

Monkey wants banana. People pay 200% more for 20-30% more performance; it makes no sense for AMD to spend hundreds of millions developing another high-end chip just for you to end up buying Nvidia at a discount.

That's not exactly right. AMD offers worse performance for the money as it is. Look at the benchmarks.
The RTX 4090 is 71% faster than the RX 7900 XTX for approximately 95% more money (1850 euros vs 950 euros).
Given Nvidia's far superior brand recognition, AMD is the big underdog and loser here.
A second-tier manufacturer.

 
There is also a good chance that AMD is betting on Nvidia abandoning the mid-range to budget discrete desktop GPU space.
I don't see a reason why NV should abandon this segment.
 
To keep competition alive.
AMD is several years behind with their RDNA architecture. They need something new, and soon, in order to stay relevant.
Then tell us: what's the point in AMD spending millions to develop a high-end chip just so that you can buy Nvidia at a reduced price?

There is nothing to compete with the 4090, and even if there were, people would still buy the 4090 over anything, so what's the point in trying to compete where you can't?

That's not exactly right. AMD offers worse performance for the money as it is. Look at the benchmarks.
The RTX 4090 is 71% faster than the RX 7900 XTX for approximately 95% more money (1850 euros vs 950 euros).
71% more performance for 95% more money? How is that better? :kookoo:
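Just to make that concrete, here's the arithmetic on the numbers quoted above (a minimal sketch; the 1.71x figure and the euro prices are taken straight from the post being replied to):

```python
# Performance per euro using the figures quoted above.
perf_4090, perf_xtx = 1.71, 1.00     # relative performance (XTX = 1.00)
price_4090, price_xtx = 1850, 950    # street prices in euros, as quoted

value_4090 = perf_4090 / price_4090  # performance per euro
value_xtx = perf_xtx / price_xtx

print(f"4090: {value_4090 * 1000:.3f} perf per 1000 EUR")
print(f"XTX : {value_xtx * 1000:.3f} perf per 1000 EUR")
print(f"The XTX delivers {value_xtx / value_4090:.2f}x the performance per euro")
```

By those numbers the XTX comes out roughly 14% ahead per euro, so the quoted figures argue against the 4090 on value, not for it.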
 
If this is true (press X to doubt), then the only remotely reasonable explanation I can imagine is that they want to allocate more to CDNA, where the big money is. Also, why would Jensen abandon the mid-range market, where they still outsell AMD without even trying?
 
I don't see a reason why NV should abandon this segment.
This requires looking at Nvidia's revenues and at the product trajectories of both AMD and Intel. Nvidia makes the lion's share of its money on compute GPUs and gaming GPUs, with higher margins at the high end. Nvidia already abandoned the sub-$300 4050 SKU. There are now three players in the sub-$500 space: AMD, Intel and Nvidia. AMD plans to increase the CU count up to 40 in their APUs.

There is no advantage for Nvidia in competing against APUs, Battlemage and RDNA4 just to sell a low-margin SKU. It's too crowded at the low end, and the margins aren't worth it. Something similar happened when Nvidia discontinued its mobile MX processors in the laptop space as mobile APUs and SoCs became faster. I predict Nvidia will drop the RTX 5060 parts and only sell 5070/5080/5090 parts in its next GPU series.
 
That's because they're still selling the 3050 and 3060 (heck, even the 1650). What I suspect is that Nvidia will make the 5070/5080/5090 and keep selling 30 and 40 series cards in the lower segments to keep development costs to a minimum. Mid-range cards aren't and won't be abandoned; only older models are, or will be, relegated to mid-range price points.
 
Selling previous-gen parts instead of the latest series in a certain price segment is tantamount to abandonment, in my opinion. Besides, the 3050 and 3060 will be really old by year's end and don't seem to be going down much in price. They are also mostly refurbished and used. I'm not sure Nvidia will allocate fab capacity to these two SKUs for much longer, if they haven't already stopped.

Edit: If anything gets sent to the fabs from Nvidia at the low end, it will probably be Switch 2 SoCs.
 
Then we'll have the 4060 as their replacement. It's a tiny chip on a simple PCB, so I'm sure Nvidia can afford a few price drops on it.
If by "abandonment", you mean not spending money to carve a smaller chip out of the new architecture, then I agree.
 
Sounds like a big disappointment if AMD is going to give up even trying to go head-to-head with Nvidia.
 
The YouTube link is dead.
 
Sounds like a big disappointment if AMD is going to give up even trying to go head-to-head with Nvidia.
If they manage to go head to head in the mid-range, I'll consider it a win. Personally, I don't care about the top tier, and I doubt that many people do.
 
I don't see a reason why they would abandon the high end when the biggest margins are there!
There could be a number of reasons. Highest margins ≠ highest profit, and even the margins are highly debatable once you factor in things like R&D costs, yields, and of course modern-day manufacturing and logistics problems.
But given that Moore's Law Is Dead deleted that video, I assume he's not that confident in his "predictions" either. While the dude is mostly competent, he's just like other YouTubers: he may guess a few things right now and then, but he isn't a tech Nostradamus.
 
I don't see a reason why they would abandon the high end when the biggest margins are there! The issue with AMD this generation is the price: they've got cheaper silicon than Nvidia, yet their pricing has been extremely bad, especially at launch.

Imagine the RX 7900 XT launching at $800: it would have been an instant hit, the go-to card for enthusiasts. But at $900 it was overpriced and lacked value; even the 7900 XTX was better value.
Then they launched the 7600 at $270, a reduction from the $300 price they initially went for. Imagine this card launching at $240: it would have been an entry-level hit, the go-to card for people looking for value or an entry card.

They also screwed up the 7700 XT's price: it should have launched at $420, and it would have been just as popular as the 7800 XT, if not more so. At $450 it was worse value than the more expensive 7800 XT.

According to rumors, there were two reasons:
1. RDNA5 is shaping up to be promising enough that they have to prioritize more resources to finalize it ASAP.
2. RDNA4's multi-chiplet design is still inefficient compared to RDNA5's, so going high-end with RDNA4 means a lot of work for diminishing returns.

So they could either continue with high-end RDNA4, delaying RDNA5 further while gaining little from it (a competitive position no better than RDNA3's, maybe worse), or scrap high-end RDNA4 to accelerate RDNA5's development and release.
 
I don't see a reason why they would abandon the high end when the biggest margins are there! The issue with AMD this generation is the price: they've got cheaper silicon than Nvidia, yet their pricing has been extremely bad, especially at launch.
Because using these high-end chips in AI-dedicated hardware gives even higher margins, and gamers will always complain about AMD while giving their money to Nvidia anyway.

Anyway, the performance of the XTX @ $400 is a great offer. If we had an effective MCM solution, it could be 2x the performance of the XTX at US$800-1000.
I just hope it's not another DLSS 3 moment, where AMD starts using FSR3 numbers on their new cards vs FSR2 on the old ones in official PR benchmarks.
 
they've got cheaper silicon than Nvidia, yet their pricing has been extremely bad, especially at launch.
Any concrete evidence for that, or is it just a hunch?
Before replying, consider two things:
AMD's profit margins are around 40-50%, while Nvidia's are at 65-75%.
Nvidia can get more wafers from TSMC and uses them only for GPUs, whereas AMD also needs to use the majority of its TSMC wafers for EPYC and Ryzen CPUs and APUs.
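To illustrate why the margin gap matters, here's a hedged sketch of what gross margin implies about unit cost; the $1000 price is a made-up placeholder, and the margin ranges are just the ballpark figures above:

```python
# Implied unit cost from gross margin: cost = price * (1 - margin).
# The $1000 price is hypothetical; margins are the rough ranges cited above.
def implied_cost(price: float, margin: float) -> float:
    return price * (1.0 - margin)

for vendor, (lo_m, hi_m) in (("AMD", (0.40, 0.50)), ("Nvidia", (0.65, 0.75))):
    hi_cost = implied_cost(1000, lo_m)  # lower margin -> higher implied cost
    lo_cost = implied_cost(1000, hi_m)
    print(f"{vendor}: a $1000 product implies roughly ${lo_cost:.0f}-{hi_cost:.0f} in cost")
```

In other words, at the same selling price a 40-50% margin leaves roughly twice the implied cost of a 65-75% margin, which is at least consistent with AMD's silicon not being as cheap, relative to price, as it looks.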
 