
AMD to Skip RDNA 5: UDNA Takes the Spotlight After RDNA 4

@evernessince True, but CUDA and AI are features aimed at enterprise and workstation users, not the average Joe who plays on Steam.

If AMD continues to offer an inferior product, they will continue to not sell.

The RX 7900 XTX is not inferior to the RTX 4070, for example. The obstacle AMD themselves put up is the price. No one wants to pay more than $500 for an RX 7900 XTX.
 
RDNA was a big step up from GCN in performance per watt and performance per square mm. If they go the unified route, they would have to bring some of the RDNA improvements into CDNA.
 
@evernessince True, but CUDA and AI are features aimed at enterprise and workstation users, not the average Joe who plays on Steam.

The Stable Diffusion subreddit is one of the largest on Reddit, and there is a huge hobbyist AI market. I wouldn't call this workstation or enterprise per se. The 4090 has sold so well precisely thanks to this market. Aside from AI, I think you are vastly underestimating the number of people who do more than just gaming on their PCs.

It's not even necessarily important whether everyone uses these features or not; people do factor a lack of certain features or qualities into their purchasing decisions.
 
Does this mean that they will replace the "old" graphics hardware pipeline with compute shaders and thus get parallelisation that scales better?
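For anyone wondering what "doing graphics in compute shaders" would even look like in practice, here is a toy, CPU-side Python sketch of the compute-dispatch model: a grid of workgroups, each invocation shading one pixel, with no vertex/raster/ROP stages involved. It's purely illustrative and assumes nothing about how UDNA would actually balance fixed-function hardware against compute units; the image size, workgroup size, and the gradient "shader" are all made up.

[CODE]
# Toy sketch of the compute-dispatch model: a "kernel" runs once per
# invocation over a 2D grid, and each invocation writes one pixel directly
# to an image buffer -- no vertex, raster, or ROP stages involved.
# Purely illustrative; it says nothing about how UDNA will split work
# between fixed-function hardware and compute units.

WIDTH, HEIGHT = 320, 240
GROUP = 8  # pretend workgroup size: 8x8 invocations per group

image = [[(0, 0, 0)] * WIDTH for _ in range(HEIGHT)]

def kernel(x: int, y: int) -> tuple[int, int, int]:
    """One 'compute shader' invocation: shade the pixel at (x, y)."""
    r = int(255 * x / (WIDTH - 1))   # horizontal gradient
    g = int(255 * y / (HEIGHT - 1))  # vertical gradient
    return (r, g, 128)

def dispatch(groups_x: int, groups_y: int) -> None:
    """Emulate a compute dispatch: run every invocation in every group."""
    for gy in range(groups_y):
        for gx in range(groups_x):
            for ly in range(GROUP):
                for lx in range(GROUP):
                    x, y = gx * GROUP + lx, gy * GROUP + ly
                    if x < WIDTH and y < HEIGHT:
                        image[y][x] = kernel(x, y)

dispatch((WIDTH + GROUP - 1) // GROUP, (HEIGHT + GROUP - 1) // GROUP)
print(image[HEIGHT // 2][WIDTH // 2])  # centre pixel of the gradient
[/CODE]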
 
Seems like really bad news. They're going to do the same thing as with Vega, where they made the architecture too granular, with a lot of compute and not enough ROPs, because there's really no other way to unify these two different sets of requirements into one architecture.

I don't understand why they're doing this. According to them, Instinct cards will generate some $5 billion in revenue this year alone, so they can clearly make a lot of money from the compute side, so why ruin it? It made sense with Vega because they really didn't have the funds, but now it could be a disaster.

The RX 7900 XTX is not inferior to the RTX 4070, for example. The obstacle AMD themselves put up is the price. No one wants to pay more than $500 for an RX 7900 XTX.
It's delusional to think an RX 7900 XTX should be priced at $500. I really don't understand these obnoxious Nvidia fanboy takes. Why should AMD charge peanuts while Nvidia inflates their prices with each generation? Do you people have a masochistic fetish or something? Why do you want to pay more?
 
I don't understand why they're doing this. According to them, Instinct cards will generate some $5 billion in revenue this year alone, so they can clearly make a lot of money from the compute side, so why ruin it? It made sense with Vega because they really didn't have the funds, but now it could be a disaster.

Because they still don't have the funds and can't sustain two architectural lines anymore. While the CPU business at AMD has been remarkably successful and has earned everyone's respect and recognition, they have much, and I mean much, work to do in their graphics business. Meanwhile, research and development has become extremely expensive, wafer prices are at an all-time high due to the extremely advanced nodes being used, and their resources are spread thin. They're having trouble producing results: the gaming business made only $462 million last quarter, down 69%. There have been layoffs; people lost their jobs.

Even if their gaming business weakens as a result of being derived from the datacenter lineup, this is actually the smartest move and the better outcome for the company's bottom line, and I cannot fault them for focusing on the businesses that make money: the datacenter and client CPU segments. By contrast, NVIDIA's gaming business made $3.3 billion, up 15% from last quarter, even at the tail end of a generation. NVIDIA's gaming business made almost the same money as AMD's datacenter business, and that's just insanity.

Things really, really aren't good for Radeon right now. I'm not sure if you even noticed, but AMD hasn't released a driver this month, and it's already November 20. Meanwhile, NVIDIA released the STALKER 2 optimized Game Ready driver 8 days in advance, sometime early last week.

Instinct was repositioned as a datacenter product, and this segment's revenue was $3.5 billion according to their latest earnings call. Most of it will be immediately sunk into research and development costs for the next-generation products. Nvidia's datacenter revenue for the last fiscal quarter was $30.8B by comparison, and even that will end up being largely spent on R&D.

It's delusional to think an RX 7900 XTX should be priced at $500. I really don't understand these obnoxious Nvidia fanboy takes. Why should AMD charge peanuts while Nvidia inflates their prices with each generation? Do you people have a masochistic fetish or something? Why do you want to pay more?

$500 for a 7900 XTX-level card may actually become a reality within the next 6 months as the RTX 50 series comes out. And depending on the features Blackwell offers and its power consumption figures - as long as it ships with 16+ GB of RAM - even $500 might make those cards rather undesirable...
 
Because they still don't have the funds and can't sustain two architectural lines anymore.
There is no way a revenue stream of billions of dollars is not sufficient for that; besides, if need be, it makes more sense to just keep RDNA for the time being. Something is suspect: either they expect all this datacenter AI demand to crater in the coming years, which is entirely possible, or maybe they really don't know what they're doing.
 
There is no way a revenue stream of billions of dollars is not sufficient for that; besides, if need be, it makes more sense to just keep RDNA for the time being. Something is suspect: either they expect all this datacenter AI demand to crater in the coming years, which is entirely possible, or maybe they really don't know what they're doing.

I can easily see it being insufficient, especially if they opt to spend that revenue on their stronger businesses (as CEO, I would choose this path). A quick search shows each 3 nm wafer runs around $20k, and this is set to increase next year. Engineers are very expensive as well, and software takes a lot of time and investment to develop and maintain... I'm not sure Statista is the best source, but NV seems to have spent over $8.6B on R&D this fiscal year alone.
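To put that wafer figure in perspective, here is a quick back-of-the-envelope sketch. The $20k wafer price is the one quoted above; the die size and yield are round-number assumptions of mine, not figures for any real AMD product.

[CODE]
# Rough per-die silicon cost from the ~$20k/wafer figure quoted above.
# Die size and yield are ASSUMED round numbers, not real product data.
import math

WAFER_PRICE = 20_000      # USD per 3 nm wafer (figure from this thread)
WAFER_DIAMETER = 300      # mm, standard wafer
DIE_AREA = 350            # mm^2 -- assumed, roughly "big GPU" sized
YIELD = 0.70              # assumed fraction of good dies

# Standard dies-per-wafer approximation (accounts for edge losses)
dies_per_wafer = (math.pi * (WAFER_DIAMETER / 2) ** 2 / DIE_AREA
                  - math.pi * WAFER_DIAMETER / math.sqrt(2 * DIE_AREA))
good_dies = dies_per_wafer * YIELD
cost_per_good_die = WAFER_PRICE / good_dies

print(f"~{dies_per_wafer:.0f} gross dies per wafer, ~{good_dies:.0f} good")
print(f"Silicon cost alone: ~${cost_per_good_die:.0f} per good die")
[/CODE]

And that is before packaging, memory, board, cooling, and margins, which is why wafer pricing feeds so directly into product and R&D decisions.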


[Attached chart: NVIDIA's annual R&D spending (Statista)]


A new GPU architecture is drafted roughly 4-6 years in advance, which probably means the products they are releasing now were under active development from 2018 to 2020, with costs increasing steadily over time. AMD is in the same boat; here's the data I found, but it also covers all of their other businesses, CPUs and all:


[Attached chart: AMD's annual R&D spending, covering all segments (Statista)]


With these figures, gaming and datacenter revenues wouldn't pay for their 2025 R&D budget at all.
 
RDNA 4 is their chance to prove they aren't 100% disconnected from reality and to provide what made Polaris such a great non-high-end product: competitive pricing for the feature set you get.
We'll then see if they can keep driver development on par with software releases on the new architecture in 2026, or if we will be back to the dark age of months between driver releases that fix some problems and cause others. So fun to fire people to "solve issues".

So far, it seems datacenter sales have turned the gaming consumer into a third-class citizen; it's up to AMD to prove the contrary.
 
I've only gotten back into computer hardware relatively recently, so I'm not familiar with AMD's GCN hardware. Could someone please give me a layman's answer as to why the unification will be beneficial?
GCN ("Graphics Core Next") was AMD's graphics architecture used for the Radeon HD 7000 series (GCN 1), R 200 series (GCN 2), R9 285 (GCN 3), RX 400/500 series (GCN 4, aka Polaris), and RX Vega/Radeon VII series (GCN 5, aka Vega). All AMD graphics cards of that era, professional or gaming, used GCN, but now all gaming cards use RDNA and all professional cards use CDNA. Going from Vega to RDNA 1, AMD's gaming cards delivered a nice improvement in gaming but regressed in some professional workloads like compute. Part of RDNA's success, though, was that the transistor and power budget wasn't wasted on as many non-gaming resources.

Today AMD has a market-share problem. Very few customers buy CDNA GPUs, and that's partly because they're not already common. RDNA itself could attract more customers if it had proper compute support, and if all Radeon products had compute, more people would have access to AMD's compute software, which would translate into more professional GPU sales. Ultimately, AMD has concluded that its market share can grow and its GPU architectures can be developed more cheaply if gaming and compute are re-unified, and that this justifies the higher transistor cost. This article has what AMD said when this was announced: https://www.techpowerup.com/326442/...ular-gpu-architecture-similar-to-nvidias-cuda
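If it helps, the lineage described above can be restated as a small lookup table (nothing here beyond the consumer examples the post already names):

[CODE]
# The GCN lineage described above, restated as a small lookup table.
# Only the consumer examples named in the post are listed.
GCN_GENERATIONS = {
    "GCN 1": ["Radeon HD 7000 series"],
    "GCN 2": ["R 200 series"],
    "GCN 3": ["R9 285"],
    "GCN 4 (Polaris)": ["RX 400/500 series"],
    "GCN 5 (Vega)": ["RX Vega", "Radeon VII"],
}

# After GCN the lines split; UDNA is the announced re-unification.
SUCCESSORS = {"gaming": "RDNA", "compute/professional": "CDNA",
              "announced unified line": "UDNA"}

for gen, products in GCN_GENERATIONS.items():
    print(f"{gen}: {', '.join(products)}")
[/CODE]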
 
No competition for the 5090/5080 until mid-2026, great. AMD stopped trying, and we wonder why NVIDIA is free to do whatever they want with pricing in the upper mid-range and high end.
 
No competition for the 5090/5080 until mid-2026, great. AMD stopped trying, and we wonder why NVIDIA is free to do whatever they want with pricing in the upper mid-range and high end.

Until mid-2026, assuming:

1. UDNA 1 doesn't flop (performance or feature regression)
2. UDNA 1 doesn't suffer from the same type of chronic issues that plagued RDNA 1 (infamous 5700 XT black screen issues for example)
3. They have more than just a pretty control panel to offer. Even then, I'll concede that asking for polished drivers that are stable and performant for a new architecture on day one is a bit much - although I expect full stability after 4 months or so.
 
With these figures, gaming and datacenter revenues wouldn't pay for their 2025 R&D budget at all.
This doesn't make sense no matter how you spin it; they were making a lot less money when they decided to split their architectures into CDNA and RDNA. I don't think R&D budgets are the issue here: if you spent $5 billion on R&D in the past year, you're not mandated to spend even more this year. You are blinded by the numbers; these companies always spend whatever they have, and both AMD and Nvidia have been making more money in the past few years, so their R&D spending went up proportionally.

NVIDIA is free to do whatever they want with pricing in the upper mid-range and high end
They always did that. There isn't a single example I can think of where either AMD or ATI ever had much of an influence on Nvidia's pricing schemes, no matter how competitive they were; absurdities like the "8800 Ultra Mega Super" always existed.
 
This doesn't make sense no matter how you spin it; they were making a lot less money when they decided to split their architectures into CDNA and RDNA. I don't think R&D budgets are the issue here: if you spent $5 billion on R&D in the past year, you're not mandated to spend even more this year. You are blinded by the numbers; these companies always spend whatever they have, and both AMD and Nvidia have been making more money in the past few years, so their R&D spending went up proportionally.

Sure, you're not mandated, but then you're going to stagnate even further, since research and development won't be done for free, and slashing the R&D budget ultimately means slower progress and/or a worse resulting product. Given the reliance modern games have on compute, and the fact that dedicated graphics cards are set to become the primary accelerator in the modern AI PC, reunifying CDNA and RDNA into UDNA, while also lowering development and maintenance costs and reining in R&D spending, is a winning move.

I don't think their previous woes had anything to do specifically with using a unified architecture; they were because of the resulting product. At the end of the day, that is what most customers care about. The Vega series (especially the Radeon VII) was built on what was essentially an almost 10-year-old architectural base; even though those cards were refined to the extreme and used highly advanced technology for their time (such as HBM memory), the fundamental problems of GCN still affected them, all while Turing was already DX12 Ultimate capable before the standard was even made public. RDNA 1 was an attempt to restart the gaming lineup, and while the first iteration was pretty bad, we can see that RDNA 2 was very much a respectable architecture, arguably the best AMD has made in the past few years.

Turing was clearly on a level all its own. Even though the prices were far higher than what AMD charged at the time, you usually got what you paid for. Perhaps it didn't mean much at the beginning, with rudimentary, almost demo-like support for DLSS 1 and early RT games, but long-term, I'd say people who invested in Turing ended up way better off. We see a similar situation now: Nvidia has a more refined, feature-complete, efficient product to sell, and even though they charge for the privilege, the market has clearly - and repeatedly - chosen to support that path. The overwhelming success of the Pascal architecture, and the fact that people are still fond of and using their GTX 1080 and 1080 Ti GPUs today, 8 years later, has even prolonged the viability of RDNA 1 cards, since they have incomplete DirectX 12 Ultimate support just like Pascal cards and game developers are still inclined to support these products today.
 
It will be interesting to see how rasterization moves forward with UDNA, and the same goes for RT. AI workloads don't care about these, though HPC does care a lot about shaders. I wonder how it will impact gaming; HPC/AI is where the money is, not so much gaming. It will be interesting to see how the business market shapes AMD's consumer video cards.

I also wonder how UDNA will translate across the spectrum of power needs, from smartphones to servers; that usually doesn't go too well. I've started seeing articles pop up suggesting that AMD might be interested in getting into the mobile market, but I don't really see how that will turn into a thing.
 
They always did that. There isn't a single example I can think of where either AMD or ATI ever had much of an influence on Nvidia's pricing schemes, no matter how competitive they were; absurdities like the "8800 Ultra Mega Super" always existed.
Some short memory you have.
 
Sure, you're not mandated, but then you're going to stagnate even further, since research and development won't be done for free, and slashing the R&D budget ultimately means slower progress and/or a worse resulting product. Given the reliance modern games have on compute, and the fact that dedicated graphics cards are set to become the primary accelerator in the modern AI PC, reunifying CDNA and RDNA into UDNA, while also lowering development and maintenance costs and reining in R&D spending, is a winning move.

I don't think their previous woes had anything to do specifically with using a unified architecture; they were because of the resulting product. At the end of the day, that is what most customers care about. The Vega series (especially the Radeon VII) was built on what was essentially an almost 10-year-old architectural base; even though those cards were refined to the extreme and used highly advanced technology for their time (such as HBM memory), the fundamental problems of GCN still affected them, all while Turing was already DX12 Ultimate capable before the standard was even made public. RDNA 1 was an attempt to restart the gaming lineup, and while the first iteration was pretty bad, we can see that RDNA 2 was very much a respectable architecture, arguably the best AMD has made in the past few years.

Turing was clearly on a level all its own. Even though the prices were far higher than what AMD charged at the time, you usually got what you paid for. Perhaps it didn't mean much at the beginning, with rudimentary, almost demo-like support for DLSS 1 and early RT games, but long-term, I'd say people who invested in Turing ended up way better off. We see a similar situation now: Nvidia has a more refined, feature-complete, efficient product to sell, and even though they charge for the privilege, the market has clearly - and repeatedly - chosen to support that path. The overwhelming success of the Pascal architecture, and the fact that people are still fond of and using their GTX 1080 and 1080 Ti GPUs today, 8 years later, has even prolonged the viability of RDNA 1 cards, since they have incomplete DirectX 12 Ultimate support just like Pascal cards and game developers are still inclined to support these products today.
Yep, that is why the 4090 wins in every single game. That is why the MI300 is not selling well. It does not matter how many paragraphs you write; the fact remains that you are very anti-AMD.
 
Yep, that is why the 4090 wins in every single game. That is why the MI300 is not selling well. It does not matter how many paragraphs you write; the fact remains that you are very anti-AMD.

You need to be reading very hard into it, and very wrongly, to deduce "I hate AMD" from that post.
 
It doesn't matter if AMD (ATI) puts a better product out. The Nvidia mindshare is unreal, beyond Apple. AMD offers extremely good products right now, like the RX 7800 XT and 7900 GRE, or even the 7700 XT, yet Nvidia is selling boatloads of 4060s. They can slap their logo on just about anything and your stereotypical diabetic with Cheetos fingers and greasy balding hair is going to buy it.
 
It doesn't matter if AMD (ATI) puts a better product out. The Nvidia mindshare is unreal, beyond Apple. AMD offers extremely good products right now, like the RX 7800 XT and 7900 GRE, or even the 7700 XT, yet Nvidia is selling boatloads of 4060s. They can slap their logo on just about anything and your stereotypical diabetic with Cheetos fingers and greasy balding hair is going to buy it.
True, AMD has too many drawbacks: FSR is bad, with loads of visual artifacts, and framegen is not there yet either. If AMD can get FSR and framegen to be as good as or better than NV's, and fix its terrible RT engine, then they would be seen as a viable alternative to NV.

But as long as AMD overprices their cards and offers lower visual quality, they will carry on fading into the void. They need more than just speed.
 
True, AMD has too many drawbacks: FSR is bad, with loads of visual artifacts, and framegen is not there yet either. If AMD can get FSR and framegen to be as good as or better than NV's, and fix its terrible RT engine, then they would be seen as a viable alternative to NV.

But as long as AMD overprices their cards and offers lower visual quality, they will carry on fading into the void. They need more than just speed.
I must say I've never had a single issue with AFMF (paired with Anti-Lag), but the CS2 VAC fiasco burned AMD's image seriously.
 
True, AMD has too many drawbacks: FSR is bad, with loads of visual artifacts, and framegen is not there yet either. If AMD can get FSR and framegen to be as good as or better than NV's, and fix its terrible RT engine, then they would be seen as a viable alternative to NV.

But as long as AMD overprices their cards and offers lower visual quality, they will carry on fading into the void. They need more than just speed.
Found the 4090 owner with orange fingers.
 
I wonder if this will hurt sales at all. I have less interest in RDNA 4 now, honestly. I'd be more interested in getting one just to collect. Same with Intel's cards.

I'm not too hopeful there will be an RDNA 4 card that is decently more powerful than the 6600 within the same power budget. RT and AI might be heavily improved, but in these lower-end cards you still need great raster performance, as AI and RT are less valuable to utilize.
 
I have less interest in RDNA 4 now, honestly. I'd be more interested in getting one just to collect.

Good plan. You collect, I’ll hold, and together we’ll corner the market on GPUs :D
 
Found the 4090 owner with orange fingers.
So all nVidia owners are Trump supporters now? I'm not even American, you small-minded f. But I'll tell you what: nVidia cards are better than Radeon cards, and Trump is your President, so deal with it, wipe away those delicious salty lib tears, and grow up and move on. And no, I don't own a 4090, nor do I plan on owning a 5090.

There are plenty of comparison videos that show the difference between FSR and DLSS, and FSR is pretty good, but it NEVER wins against DLSS. FACT, unless you don't know what to look for, which wouldn't surprise me at this point!
 