Tuesday, November 19th 2024
AMD to Skip RDNA 5: UDNA Takes the Spotlight After RDNA 4
While the current generation of AMD graphics cards is built on RDNA 3, and the upcoming RX 8000 series will feature RDNA 4, the latest leaks suggest RDNA 5 is not in development. Instead, UDNA will succeed RDNA 4, simplifying AMD's GPU roadmap. A credible source on the Chiphell forums, zhangzhonghao, reports that the UDNA-based RX 9000 series and the Instinct MI400 AI accelerator will share the same advanced Arithmetic Logic Unit (ALU) design, reminiscent of AMD's earlier GCN architectures before the CDNA/RDNA split. Sony's next-generation PlayStation 6 is also rumored to adopt UDNA. The PS5 and PS5 Pro currently use RDNA 2, with the Pro variant integrating elements of RDNA 4 for enhanced ray tracing. The PS6's CPU configuration remains unclear, but speculation centers on Zen 4 or Zen 5.
The first UDNA gaming GPUs are expected to enter production by Q2 2026. Interestingly, AMD's RDNA 4 GPUs are anticipated to focus on entry-level to mid-range markets, potentially leaving high-end offerings until the UDNA generation. This strategic pause may allow AMD to refine AI-accelerated technologies like FidelityFX Super Resolution (FSR) 4, aiming to compete with NVIDIA's DLSS. This unification is inspired by NVIDIA's CUDA ecosystem, which supports cross-platform compatibility from laptops to high-performance servers. As AMD sees it, the decision addresses the challenges posed by maintaining separate architectures, which complicate memory subsystem optimizations and hinder forward and backward compatibility. Putting developer resources into RDNA 5 is not economically or strategically wise, given that UDNA is about to take over. Additionally, the company is enabling ROCm software support across all products ranging from consumer Radeon to enterprise Instinct MI. Accelerating software for one platform will translate to the entire product stack.
Source:
PC Guide
63 Comments on AMD to Skip RDNA 5: UDNA Takes the Spotlight After RDNA 4
Whether everyone actually uses these features isn't necessarily the point; people do factor a lack of certain features or qualities into their purchasing decisions.
I don't understand why they're doing this. According to them, Instinct cards will generate some $5 billion in revenue this year alone, so they can clearly make a lot of money from the compute side; why ruin it? It made sense with Vega because they really didn't have the funds, but now it could be a disaster. It's delusional to think an RX 7900 XTX should be priced at $500. I really don't understand these obnoxious Nvidia fanboy takes; why should AMD charge peanuts while Nvidia inflates their prices with each generation? Do you people have a masochistic fetish or something? Why do you want to pay more?
Even if their gaming business weakens as a result of being derived from the datacenter lineup, this is still the smartest move and the better outcome for the company's bottom line, and I cannot fault them for focusing on the segments that make money: datacenter and client CPUs. By contrast, NVIDIA's gaming business made $3.3 billion, up 15% from last quarter, even at the tail end of a generation. NV's gaming business made almost as much as AMD's entire datacenter business, and that's just insanity.
Things really, really aren't good for Radeon right now. I'm not sure if you even noticed, but AMD hasn't released a driver this month, and it's already November 20. Meanwhile, NVIDIA released the STALKER 2 optimized Game Ready driver 8 days in advance, sometime early last week.
Instinct was repositioned as a datacenter product, and that segment's revenue was $3.5 billion according to their latest earnings call. Most of it will be immediately sunk into research and development costs for next-generation products. Nvidia's datacenter revenue for the last fiscal quarter was $30.8B by comparison, and even that will largely end up being spent on R&D. $500 for a 7900 XTX-level card may actually become reality within the next 6 months as the RTX 50 series comes out. And depending on Blackwell's feature set and power consumption figures, as long as those cards have 16+ GB of RAM, even $500 might make AMD's offerings rather undesirable...
www.statista.com/statistics/988048/nvidia-research-and-development-expenses/
A new GPU architecture is drafted roughly 4-6 years in advance, which probably means the products being released now were under active development from 2018 to 2020, with costs increasing steadily over time. AMD's in the same boat; here's the data I found, though it also covers all of their other businesses, CPUs and all:
www.macrotrends.net/stocks/charts/AMD/amd/research-development-expenses
With these figures, gaming and datacenter revenues wouldn't pay for their 2025 R&D budget at all.
We'll then see whether they can keep driver development on par with software releases for the new architecture in 2026, or whether we'll be back to the dark age of months between driver releases that fix some problems and cause others. So fun to fire people to "solve issues".
So far, it seems datacenter sales have turned the gaming consumer into a third-class citizen; it's up to AMD to prove the contrary.
Today AMD has a market-share problem. Very few customers buy CDNA GPUs, and that's partly because they're not already common. RDNA itself could attract more customers if it had compute, and if all Radeon products had compute, more people would have access to AMD's compute software, which would translate into more professional GPU sales. Ultimately, AMD has concluded that its market share can grow and its GPU architectures can be developed more cheaply if gaming and compute are re-unified, and that this justifies the higher transistor cost. This article has what AMD said when this was announced: www.techpowerup.com/326442/amd-to-unify-gaming-rdna-and-data-center-cdna-into-udna-singular-gpu-architecture-similar-to-nvidias-cuda
1. UDNA 1 doesn't flop (performance or feature regression)
2. UDNA 1 doesn't suffer from the same type of chronic issues that plagued RDNA 1 (infamous 5700 XT black screen issues for example)
3. They have more than just a pretty control panel to offer. Even then, I'll concede that asking for polished, stable, and performant drivers on day one for a new architecture is a bit much - although I expect full stability after 4 months or so
I don't think their previous woes had anything to do specifically with a unified architecture, but rather with the resulting product. At the end of the day, that is what most customers care about. The Vega series (especially the Radeon VII) was built on what was essentially an almost 10-year-old architectural base; even though those cards were refined to the extreme and used highly advanced technology for their time (such as HBM memory), the fundamental problems of GCN still affected them, all while Turing was already DX12 Ultimate capable before the standard was even made public. RDNA 1 was an attempt to restart the gaming lineup, and while the first iteration was pretty bad, RDNA 2 turned out to be a very respectable architecture, arguably the best AMD has made in the past few years.
Turing was clearly on a level all its own. Even though the prices were far higher than what AMD charged at the time, you usually got what you paid for. Perhaps it didn't mean much at launch, with rudimentary, almost demo-like support for DLSS 1 and only a handful of early RT games, but long-term, I'd say people who invested in Turing ended up way better off. We see a similar situation now: Nvidia has a more refined, feature-complete, efficient product to sell, and even though they charge for the privilege, the market has clearly - and repeatedly - chosen to support that path. The overwhelming success of the Pascal architecture, and the fact that people are still fond of and using their GTX 1080 and 1080 Ti GPUs today, 8 years later, has even prolonged the viability of RDNA 1 cards: they have incomplete DirectX 12 support just like Pascal, and game developers are still inclined to support those products today.
I wonder how UDNA will translate across the spectrum of power needs, from smartphones to servers; that usually doesn't go too well. I've started seeing articles suggesting AMD might be interested in getting into the mobile market, but I don't really see how that will turn into a thing.
But as long as AMD over-prices their cards and offers lower visual quality, they will carry on fading into the void. They need more than just speed.
Not too hopeful there will be a UDNA card that is decently more powerful than the 6600 in the same power budget. RT and AI might be heavily improved, but in these lower-end cards you still need great raster performance, as AI and RT are less valuable to utilize.
There are plenty of comparison videos that show the difference between FSR and DLSS. FSR is pretty good, but it NEVER wins against DLSS. FACT - unless you don't know what to look for, which wouldn't surprise me at this point!
But GCN did not end with Vega. CDNA is GCN, but with the ROP and TMU units removed to make way for matrix cores. By the way, CDNA 3 still keeps the GFX9 IP version, like Vega (GCN5).
=> GCN5/CDNA/CDNA2/CDNA3/CDNA4: GFX9
=> RDNA/RDNA2: GFX10 - RDNA3: GFX11 - RDNA4: GFX12
Logical to think that UDNA will be GFX13 (or GFX14 if they are superstitious) for both Radeon and Instinct.
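The architecture-to-GFX-IP mapping above can be sketched as a small lookup table. This is purely illustrative: the entries mirror the list in the comment, and the UDNA level is the commenter's speculation, not a confirmed value.

```python
# Illustrative mapping of AMD GPU architectures to their GFX IP levels,
# per the list above. The UDNA entry is speculative.
GFX_IP = {
    "GCN5 (Vega)": "GFX9",
    "CDNA": "GFX9",
    "CDNA2": "GFX9",
    "CDNA3": "GFX9",
    "CDNA4": "GFX9",
    "RDNA": "GFX10",
    "RDNA2": "GFX10",
    "RDNA3": "GFX11",
    "RDNA4": "GFX12",
}

def shares_ip(arch_a: str, arch_b: str) -> bool:
    """True if two architectures share the same GFX IP level."""
    return GFX_IP[arch_a] == GFX_IP[arch_b]
```

For example, `shares_ip("CDNA3", "GCN5 (Vega)")` returns True, reflecting the point that CDNA never left the GFX9 base, while `shares_ip("RDNA3", "RDNA4")` returns False.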