Tuesday, November 19th 2024
AMD to Skip RDNA 5: UDNA Takes the Spotlight After RDNA 4
While the current generation of AMD graphics cards employs RDNA 3 at its core, and the upcoming RX 8000 series will feature RDNA 4, the latest leaks suggest RDNA 5 is not in development. Instead, UDNA will succeed RDNA 4, simplifying AMD's GPU roadmap. A credible source on the Chiphell forums, zhangzhonghao, reports that the UDNA-based RX 9000 series and the Instinct MI400 AI accelerator will share the same advanced arithmetic logic unit (ALU) designs, reminiscent of AMD's earlier GCN architecture before the CDNA/RDNA split. Sony's next-generation PlayStation 6 is also rumored to adopt UDNA technology. The PS5 and PS5 Pro are built on RDNA 2, with the Pro variant integrating elements of RDNA 4 for enhanced ray tracing. The PS6's CPU configuration remains unclear, but speculation revolves around the Zen 4 or Zen 5 architectures.
The first UDNA gaming GPUs are expected to enter production by Q2 2026. Interestingly, AMD's RDNA 4 GPUs are anticipated to focus on entry-level to mid-range markets, potentially leaving high-end offerings until the UDNA generation. This strategic pause may allow AMD to refine AI-accelerated technologies like FidelityFX Super Resolution (FSR) 4, aiming to compete with NVIDIA's DLSS. This unification is inspired by NVIDIA's CUDA ecosystem, which supports cross-platform compatibility from laptops to high-performance servers. As AMD sees it, the decision addresses the challenges posed by maintaining separate architectures, which complicate memory subsystem optimizations and hinder forward and backward compatibility. Putting developer resources into RDNA 5 is not economically or strategically wise, given that UDNA is about to take over. Additionally, the company is enabling ROCm software support across all products ranging from consumer Radeon to enterprise Instinct MI. Accelerating software for one platform will translate to the entire product stack.
Source: PC Guide
63 Comments on AMD to Skip RDNA 5: UDNA Takes the Spotlight After RDNA 4
I am not happy with Nvidia's prices, and I make that clear in many posts; I'm also not happy with how Nvidia under-specs VRAM in so many cards.
The reason I haven't gone out and bought a cheaper AMD card is that they don't have feature parity. With GPUs it's about software as well. AMD's lack of SGSSAA is a deal breaker for me. DLSS is the best modern upscaler, and as it turns out I now like RTX Video. On top of this, apparently AMD's encoder is even worse than NVENC.
Also, the reason I have said AMD need to drop prices is that they are the ones chasing market share; that's typically what you need to do to gain market share. Of course, one effect of AMD doing that is it can also affect Nvidia's pricing.
I am no fanboy though; I never understood the mindset of falling in love with a corporation. I don't particularly like Nvidia either, too much proprietary stuff in addition to the points mentioned above. But ultimately, if I dislike a company it doesn't stop me buying their products. My decisions are not based on emotions, life is too short for that; I buy what's suited to my needs at the moment.
I agreed with AMD's initial stance of providing better VRAM instead of silly novelty RT, but sadly Nvidia has managed to infect the AAA market with it, so it looks like AMD are having to change tack on that with future hardware.
Radeon HD 4890 = $250 in 2009
Radeon HD 5870 = $400 in 2010
Market share:
Today, when RX 7900 XTX is $1000, AMD's share has gone down from 44.5% to 12%:
pcviewed.com/nvidia-vs-amd-discrete-gpu-market-share/
AMD cannot compete if their margins turn to dust. Something many of you also cannot comprehend is that if the margins of AIBs become too low, they'll simply drop AMD. End result? You'll pay even more for that Nvidia card you've been waiting to buy.
No matter how it's spun - preferences, brand loyalties, or personal justifications - the bottom line remains the same: the pricing is outright ridiculous. From a consumer perspective, this issue should be at the forefront of every discussion about the industry's future. No level of feature sets, dominance, marketing ploys, strategic affiliations/partnerships, or mind/market-share should have consumers justifying corporate goals that amount to nothing short of unethical exploitation for profit.
Even if AMD improves their products, it'll just mean they'll follow Nvidia's price strategy for those sweet margins, and if their UDNA plan pans out, they'll be fully able to copy this strategy (full focus on the data center, leftovers for the consumer market).
Maybe Intel can provide some good-value products; we shall see. What I think will happen is the death of discrete components for most casual gamers, with integrated solutions becoming more common, since they allow products in a smaller envelope without the limitations found in our usual ATX formats. Strix Halo is a good example of that.
However, I believe it's meant to be about folks who eat Cheetos all day long.
Agreed that the 4060 sucks, but what AMD sells in the same price range is a 7600, a rebranded 6600 XT, not a 7800 XT. And it's not like Nvidia doesn't sell a 4070 Super that's faster and more efficient than the 7900 GRE. The retail price difference between them is about 40 euros; if you say that's not worth the RT performance, the efficiency, and not having to use the shimmering mess that FSR is as an upscaler at higher resolutions, you're frankly just an AMD fanboy/apologist. I've played 300 hrs of RDR2 at 5K DSR with DLSS Performance, and personally that alone would be enough for me to take the 4070S over the 7900 GRE if I were to choose again.
It very much matters what they offer; you can't blame everything on "mindshare" if there are actual disadvantages to owning their products compared to the other brand.
I do a lot of work on the PC too, using dual high-refresh monitors, and I know for a fact after owning a 6800 that it's a mess on AMD. When you have two monitors on, and god forbid you want to play a YT video in the background, the 6800 just ramps up to 40+ W. My 4070S sits at 9 W doing exactly that. When you do 10-20 hrs of such work every week, that adds up on your power bill, which will pretty much nullify any price advantage AMD has in a matter of a year or less.
If you're paying $2.92/kWh or €2.77/kWh then I can see that difference but the average price in Europe is an order of magnitude lower and that's going by the single new 6800 I can find today at $520. Using the competitive price of $400 it was at before stock ran out, you'd need to be paying $8/kWh to make up the difference between the 4070S and the 6800 in a year.
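For anyone checking the math, here's a minimal back-of-the-envelope sketch in Python. The ~31 W gap (40 W vs. 9 W) and the 20 hrs/week figure come from the posts above; the ~$610 street price for the 4070 Super is my own assumption, which is why the results land near, rather than exactly on, the $/kWh figures quoted above.

# Back-of-the-envelope: electricity price at which the 6800's extra
# dual-monitor/video power draw erases its purchase-price advantage
# within one year. The $610 4070 Super price is an assumed value,
# not a figure from this thread.
delta_w = 40 - 9                                 # reported power gap, watts
hours_per_year = 20 * 52                         # 20 hrs/week of such work
kwh_per_year = delta_w * hours_per_year / 1000   # ~32 kWh extra per year

for amd_price in (520, 400):                     # 6800 prices quoted above
    price_gap = 610 - amd_price                  # assumed 4070S street price
    breakeven = price_gap / kwh_per_year         # $/kWh to break even in 1 yr
    print(f"6800 at ${amd_price}: ~${breakeven:.2f}/kWh to break even")

This prints roughly $2.79/kWh and $6.51/kWh, the same ballpark as the figures above, while typical European household rates sit around €0.20-0.30/kWh, an order of magnitude lower.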
I don't like high power use for simple tasks; it bugs me, and Nvidia cards are better behaved in this respect. But the financial impact in most use cases is minimal.
I think the DLSS vs. FSR comparison is flawed half the time. Nvidia spends zilch on optimizing their cards for FSR, yet comparing FSR to DLSS on an Nvidia card is unfortunately done too often. I don't see any major difference between DLSS and FSR 3.0, and I think tech sites exaggerate the difference to absurdities. A control group vs. placebo group test may be an ego buster for GeForce owners.
It doesn't matter what I put in my signature, as long as I'm quoting facts. There is a night-and-day difference between saying certain things because you prefer X over Y (fanboyism), and preferring X over Y because you can say certain things about how they compare. And hey, look at yours... I have never seen anyone claim FSR looks different on AMD than on Nvidia. Can you prove it? Sounds made up to me.
Nvidia makes DLSS, not FSR. It's not on them to tinker with the FSR implementation. No one willingly chooses the other option if they have a better solution available. Just look at TPU's reviews of FSR 2/3, still the worst of the upscalers. The latest is from STALKER 2, but it's not like other FSR 2/3 games fare better against DLSS 3. Also, DLSS 3.5 includes an RT denoiser, which AMD just doesn't have. "Tech sites report it, but I don't see it" is not an objective point of view to begin a discussion with. Refer to what I wrote about saying things because of brand preference; you're doing the exact thing you accuse me of. I guess W1zzard has just been posting Nvidia/Intel/Epic/Sony-sponsored content in the DLSS/XeSS/UE5 upscaler (can't remember the name) vs. FSR reviews for years.
BTW, FSR 3.0/3.1 is available for only a handful of games, while DLSS 3 is in hundreds.
Get back to me when you're ready to discuss actual facts.