Monday, January 29th 2024
Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power
We've known since way back in August 2023 that AMD is rumored to be retreating from the enthusiast graphics segment with its next-generation RDNA 4 graphics architecture, which means we likely won't see successors to the RX 7900 series squaring off against the upper end of NVIDIA's fastest GeForce RTX "Blackwell" series. What we'll get instead is a product stack closely resembling that of the RDNA-based RX 5000 series, with its top part providing a highly competitive price-performance mix around the $400 mark. A more recent report by Moore's Law is Dead sheds more light on this part.
Apparently, the top Radeon RX SKU based on the next-gen RDNA4 graphics architecture will offer performance comparable to that of the current RX 7900 XTX, but at less than half its price (around the $400 mark). It is also expected to achieve this performance target using smaller, simpler silicon with a significantly lower board cost, which feeds into its price. What's more, there could be energy efficiency gains from the switch to a newer 4 nm-class foundry node and from the RDNA4 architecture itself, which could hit its performance target with fewer compute units than the RX 7900 XTX's 96.

When it came out, the RX 5700 XT offered an interesting performance proposition, beating the RTX 2070 and forcing NVIDIA to refresh its product stack with the RTX 20-series SUPER and the resulting RTX 2070 SUPER. Things could go down slightly differently with RDNA4. Back in 2019, ray tracing was a novelty, and AMD could surprise NVIDIA in the performance segment even without it. There is no such advantage now, as ray tracing is relevant; instead, AMD could count on timing its launch before the Q4-2024 debut of the RTX 50-series "Blackwell."
Sources:
Moore's Law is Dead (YouTube), Tweaktown
396 Comments on Top AMD RDNA4 Part Could Offer RX 7900 XTX Performance at Half its Price and Lower Power
I don't think AMD will be competitive unless they somehow find a way to make it on the newer N3, and not on the older 5nm+ (4N).
- connected different screens and video playback resulting in elevated power consumption www.techpowerup.com/review/nvidia-geforce-rtx-4080-super-founders-edition/41.html
- overall lower performance per watt www.techpowerup.com/review/nvidia-geforce-rtx-4080-super-founders-edition/42.html
- higher power consumption, which results in hotter, noisier cards: high hotspot temperatures, and a need for gigantic coolers that introduce all kinds of PC case-related inconveniences, etc.
Not to mention, Nvidia cards come with the same gigantic coolers whether they need them or not.
Remember Radeon X1800 XT. Around 110 W.
Radeon X1600 PRO only 40 W.
www.techpowerup.com/gpu-specs/radeon-x1800-xt.c159
www.techpowerup.com/gpu-specs/radeon-x1600-pro.c139
Even the power hog for the time Radeon HD 2900 XT was "only" 215 W.
www.techpowerup.com/gpu-specs/radeon-hd-2900-xt.c192
1. Nvidia cards come with the same gigantic coolers as AMD ones do.
2. AMD hotspots are just as cool as Nvidia ones are when you don't take made-by-AMD cards into consideration.
www.techpowerup.com/review/amd-radeon-rx-5700-xt/32.html
I feel like this current architecture is a huge FAIL, and they are currently working on something different, but they need time, and in the meantime, they just launch the same thing made on a smaller node.
At this point, I really hope that Battlemage manages to achieve at least the same level of performance, and that at least at the lower end of dGPUs, we'll see great prices.
"AMD hotspots are just as cool as Nvidia ones are when you don't take made-by-AMD cards into consideration."
Besides, a 5700 XT review is just as relevant these days as a 2070 review would be. That is, not at all.
I think if their flagship chip isn't either iteratively faster or more efficient (say, 20% faster at the same power, the same speed at 20% less power, or 10% faster at 10% less power) for similar money, it's going to be a disappointment.
I'd struggle to be happy about 7900XTX performance for less money and power if there's not a tier above it.
I think the one thing that's up in the air until we see the silicon is whether they're closing the gap on RT performance. Can the chiplet design provide an avenue for something on-die that can be used for raster and repurposed to provide effective RT without tanking performance?
Inquiring minds (mainly mine) want to know
After years upon years of repeatedly releasing awful products in the consumer segment, a rapidly waning and increasingly irrelevant server business, plus a failed ARM venture nearly bankrupted AMD. (The Heavy Equipment line, especially Bulldozer, was so laughably bad that people who defended it at the time were simply defending the indefensible out of pure devotion to AMD. I still remember the lengthy discussions of folks trying to justify it; they all had to go nuclear the second anyone showed benchmarks from a Core i5. Many even stuck to their increasingly obsolete Phenom II CPUs because those were actually faster, despite being built on the K10 architecture dating from 2007 and having a worse instruction set than a 65 nm Conroe Core 2.) By July 2015, the situation was dire: other than maybe Radeon GPUs, all of their products were worthless garbage, the company had about as much prospect as a dry mine to show investors, and shares of $AMD reached an all-time low of $1.61. If only I had known at the time that things would turn out the way they did, I would never have had to work a day in my life and I would definitely drive a Ferrari.
There is a reason the commercial name of the Zen processors is "Ryzen", it symbolizes the rebirth and rise of AMD (like a phoenix) in the literal sense of the word.
This is where the history is complicated. Intel, had they chosen to bury AMD, could have done it post-2010, although I'm not sure if antitrust law in the US would have kicked in. If AMD had done faster node transitions post-2004, they would have been much more competitive with Phenom or Phenom II; or if Intel hadn't bribed the OEMs two decades back, AMD may have had a lot more money for R&D post-K10 or so. It's worked out in the end for AMD, but it's nothing short of a miracle they're still here today!
Even if I prefer AMD, I wouldn't want Intel, or Nvidia, to go anywhere, because we know what they'd do in the absence of competition! The days of continued growth in this sector will probably continue for a bit more, till we hit a pm or Apple/QC give up trying o_O
On the desktop CPU front, there's no need to even argue. The i5-2500K humiliated any AMD chip for gaming for practically 7 years after it was released (it was better than anything AMD had until Ryzen came out), and for workstations, 1st gen Gulftown i7's would run rings around any Piledriver FX, including the FX-9590. By the time we had Broadwell-E, the differential between the fastest CPU that AMD had on offer (9590) and the i7-6950X was so high that you'd be looking at anywhere from 3 to 10x+ the performance, depending on how many cores and which instruction sets were utilized. I recall getting benchmark scores twice as high as an FX-8350 with my i7-990X as it was.
I don't think Intel's misdeeds during the Otellini era explain K10's woes either. Remember, the original Phenom launched with a severe bug (remember the TLB bug?), and unlike the Core 2's 45 nm makeover, Phenom II is basically just K10 as it should have been from the start. Even after Thuban (worst money I ever spent on a CPU, that X6 1090T), it seemed obvious that they'd never be able to even dream of comparing to the Core i7; the architecture was just too dated, and the absence of SSSE3 and SSE4.1 support hurt these CPUs super badly. Many years later, the poor souls still using a Phenom finally got hit with software compatibility issues as their chips stopped running many games at all.
Servers? Well, at the time it was common knowledge that the enterprise world had clearly chosen Xeon. It wouldn't be until Epyc that AMD would finally earn relevance in this market. Xeons were ridiculously expensive, but scalable and performant, with Intel taking even requests from their biggest customers for tailored SKUs. Decommissioned Opterons made for fun and relatively affordable quad processor builds (I'd actually build one myself nowadays if I could get all parts including motherboard for cheap), but that's about where the "positives" ended. Opteron A series ARM CPUs essentially flopped and were discontinued with no further development, they never saw a successor.
Funny to bring up HEDT though. AMD just released a current-generation Zen 4 consumer-grade "non-Pro" HEDT... with prices that make the Core i7 Extreme look cheap, back when Intel was considered insane for asking $1700 for a CPU. The cheapest Zen 4 Threadripper is $1499, and the most expensive one tops out at $4999, with a similar outlook if you compare that chip to, say, a "pedestrian" Raptor Lake i9 or 7950X/X3D. Haven't heard a word about AMD being greedy scammers trying to rip people off with their overpriced wares, but oh well.
Your comments about how "bad" AMD products were show how influenced you were by propaganda. You see, even in the days of AMD's bad leadership, their CPUs were fine. If you were building your own PC, a $99 Phenom II 965 BE was a CPU that you could put a user in front of, and they would not tell the difference vs. an Intel CPU.
We don't need a history lesson through your lens. The fact is that none of what you say matters, as it cannot be changed. Today, AMD parts are in the leadership position.
You act like Intel is not asking a king's ransom for its HEDT parts. By the way, those CPUs you mention had value added to them, because people love shows like The Witcher and content creation has ballooned with the rise of streaming services, unlike Intel, which sold you two more cores for $1000. At those tasks, more cores and I/O matter, and people are willing to pay for them. That means there is a premium, but if I were buying the 24-core, I would appreciate that it costs less than the MSRP of the previous 24-core part.
If you want an example of unabashed greed just look at top end boards like Godlike or Ace from MSI for Z790 or X670E and then look at those boards for HEDT. Who is hosing who?
Intel also does not currently offer an HEDT lineup at all, just like AMD didn't back in the day. Historical revisionism is farcical and doesn't excuse either of them from mistakes, past or present. I agree. My father still uses his 15-year-old i5-520M laptop; he only uses it for light word processing and internet banking. It does the job, although the machine has begun to break down and parts for it aren't easy to find anymore. Still, a great run, and props to Sony for building such a quality, long-lasting laptop.
That's why Intel got the fines: because of all its illegal activities, some of which continue to this day.
By the time AMD was struggling with Phenom and then the biblical failure of the Heavy Equipment lineup, Intel would have been far, far ahead regardless, simply because they had a far superior product to offer. I often think that people don't quite realize how blessed the CPU market is today. Across all tiers and segments, Intel and AMD have equivalent processors that win out over each other in an equal number of scenarios; this brings prices down and is great news for us, the consumers.