Sunday, August 6th 2023

AMD Retreating from Enthusiast Graphics Segment with RDNA4?

AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs without enough volume over the product lifecycle. The company's "Navi 21" GPU benefited from the crypto-currency mining swell, but just like NVIDIA, the company isn't able to move enough GPUs at the high end.

With RDNA4, the company will focus on the market segments that sell the most, which would be the x700-series and below. This generation would be essentially similar to the RX 5000 series powered by RDNA1, which did enough to stir things up in NVIDIA's lineup and trigger the introduction of the RTX 20 SUPER series. The generation after could see RDNA4's successor square off against NVIDIA's next generation and, hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz

363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?

#101
ZoneDymo
beedooGreat! No need for me to look at AMD cards anymore. No need to think or compare; just buy Nvidia I guess.
That's what most do anyway; that's why you get that 80% vs. less-than-20% split on Steam.
Posted on Reply
#102
rv8000
The armchair engineers and VPs are out in full force here. There are some laughably bad takes in this thread, based on a rumor no less. Though none of this is really a surprise, based on the TPU regulars.
Posted on Reply
#103
ZoneDymo
enb141I have to disagree with you. I tried AMD (twice), and in both cases I ended up hating them. I got a 6400 to replace my 1050 Ti; the 6400 has way more GPU power, but their support is just trash and their drivers suck. So now I'm moving back to NVIDIA, paying almost twice as much for a 4060, but knowing that most of the issues AMD has, NVIDIA doesn't have.
Out of curiosity, what problems did you have that make you say the support is trash and the drivers suck?
Posted on Reply
#104
Tropick
rv8000The armchair engineers and VPs are out in full force here. There are some laughably bad takes in this thread, based on a rumor no less. Though none of this is really a surprise, based on the TPU regulars.
Alright, Mr. Cool Guy, since our takes are so bad, what's your opinion?
Posted on Reply
#105
zo0lykas
Great

Most people play at 1080p anyway, some play at 1440p, and a very small number at 4K.

So what's the point of investing money in creating new high-end GPUs when they have upscaling software?
Posted on Reply
#106
Assimilator
thunderingroarThat's entirely false?? If anything, Zen 3 was the high-margin, high-volume product. Are people so quick to forget all the community outrage when AMD raised prices on Zen 3 (built on the same 7nm+12nm as Zen 2) while also deciding to exclude the box coolers?
That doesn't prove Zen 3 was high-margin in any way shape or form.
thunderingroarMeanwhile, Zen 4 launched with a $50 lower MSRP on the 7700X (the 5800X MSRP was $450) and a $100 lower MSRP on the 7950X (the 5950X MSRP was $800), and the non-X parts now also include box coolers. Currently, Ryzen 7000 prices are very reasonable.
Quoting Zen 3 MSRPs is asinine because none of those CPUs sell at MSRP anymore. A 5800X goes for £180 new while a 7700X is £100 more, which makes the latter part over 50% more expensive for... what? The same is true of the motherboards, and then there's the cost of DDR5 vs dirt-cheap DDR4. Like I said, AMD is competing with itself.
thunderingroarYour claims of Zen 4 being "high margin" are even less true once you realize that Zen 3 was still using that dogshit dirt-cheap GlobalFoundries 12nm for the IO die, which gave all of their CPUs awful idle power draw compared to Intel. Now Zen 4 uses a much better and more efficient (also a lot more expensive) TSMC 6nm IO die. May I also remind you that the IO die is actually bigger than the core complex die, and it heavily factors into the total BoM cost.
Fair.
thunderingroarServer market? Consoles and handhelds?
AMD has nothing in the server market. Consoles are a consistent source of revenue but not a particularly lucrative one. Handhelds are toys.
Posted on Reply
#107
AusWolf
TheinsanegamerNIt didn't though. It didn't work with GCN either.

When AMD abandoned the high-end GPU market with GCN, it handed NVIDIA's Maxwell the top tier on a silver platter. That immediately led to GCN 4.0, AKA Polaris, where AMD didn't have anything above the low tier. The result was NVIDIA's 1070 making more money per unit than any other GPU and outselling all of AMD COMBINED. As did the 1080. And the 1060 sold 4x as much as all of Polaris. rDNA1 had a decent run, with AMD fixing drivers and fixing their reputation, but at the same time high-end buyers had no options. Anyone who had a Vega card had no upgrade path, meanwhile the 1080 Ti/2080/2080 Ti sat right there with a whole market to themselves.

All of this funneled money into NVIDIA, allowing them to pull further ahead with things like RT. The last thing AMD needs to do now is abandon the high-end cards. They need to get a competent stack of cards with a true halo product out there, like NVIDIA has consistently done for 15 years. The first time AMD's market share increased since 2014 was with rDNA2, the first full stack since 2014.

Or just sell RTG to Intel with a clause that they can continue to license the GPU IP for their APUs in the future. Or maybe Samsung; they would have their own fabs.
Are halo products the main source of income for either of these companies, though? You can argue for the cult of Nvidia fans who buy the latest x90 card every generation, but there's no such thing on AMD's side. Both the 6900 XT and the 7900 XTX were under tight scrutiny from reviewers and buyers alike. I don't think putting a huge amount of money and effort into making something that can't beat Nvidia's latest flagship is a financially sound decision.

Also, the 5700 XT wasn't really a success on its own merits. Sure, it was an okay midrange card, but it was a good testing ground for AMD to see how RDNA works. We wouldn't have RDNA 2 without it. If AMD manages to mix things up with RDNA 4's architecture and make some decent midrange cards with relatively low R&D costs, it could give them a chance to draw some conclusions for RDNA 5 again. Withdrawing from the battle to save your strength for the next one is not a bad plan, imo.
Posted on Reply
#108
Assimilator
AusWolfAre halo products the main source of income for either of these companies, though?
Of course not; halo products never are. But NVIDIA has a halo-level revenue stream from the data centre teat, which means they're free to experiment with consumer product pricing and positioning. AMD lacks that strong, almost passive revenue input - which is why it's so completely bizarre to see them not only effectively give the mid-range away, but give it away to their competitor that isn't even particularly interested in playing there this generation!

Every company drops the ball once in a while, but AMD didn't just drop it, they tripped over it and faceplanted. All they had to do was introduce a moderately competent mid-range GPU series, at moderately sane pricing, and they would've eaten the lunch of 4060 and 4070. Instead it's those GPUs that are eating AMD's lunch!
AusWolfI don't think putting a huge amount of money and effort into making something that can't beat Nvidia's latest flagship is a financially sound decision.
On the one hand, it's easy to say that in hindsight; both companies develop their GPUs essentially in the dark from what the other is doing, so they can't know whether they've chosen rightly or wrongly until the other side launches theirs. You've just gotta put your head down, try to build the best, and hope it's good enough. If it isn't, then you've sunk a lot of money into being a loser, and strangely enough that isn't a compelling market position to be in.

On the other, this is again one of the reasons that it makes so much sense to compete in the mid-range first: if your product isn't as good as your competitor's, you can avoid being a loser by adjusting price. The old adage of "no bad products, only bad prices" will never cease to be true as long as we live in a capitalist world, and sometimes it's smart to take the hit to keep up market- and mindshare.
AusWolfWithdraw from the battle to save your strength for the next one is not a bad plan, imo.
It's a terrible plan, because it allows your competitors to take up the slack that you've left by stepping aside. That allows them to get more of their product onto shelves, which means their brand is further into consumer consciousness while yours recedes. And when you come back to try to compete again you'll find retailers aren't willing to give you "your" shelf space back, because they've already got competitor product there that they know they can move. Nope, absolutely the worst thing any company selling consumer goods can ever do.
Posted on Reply
#109
AusWolf
AssimilatorOf course not; halo products never are. But NVIDIA has a halo-level revenue stream from the data centre teat, which means they're free to experiment with consumer product pricing and positioning. AMD lacks that strong, almost passive revenue input - which is why it's so completely bizarre to see them not only effectively give the mid-range away, but give it away to their competitor that isn't even particularly interested in playing there this generation!

Every company drops the ball once in a while, but AMD didn't just drop it, they tripped over it and faceplanted. All they had to do was introduce a moderately competent mid-range GPU series, at moderately sane pricing, and they would've eaten the lunch of 4060 and 4070. Instead it's those GPUs that are eating AMD's lunch!
Are we still talking about RDNA 4, or midrange RDNA 3 now? I'm a bit confused. If you're talking about the lack of midrange RDNA 3 cards, then I agree. Midrange has been the main breadwinner for AMD for about a decade now, and giving it up like they have done in the last year is mind-boggling. They probably put too much faith in the 7900 series, which ended up falling terribly behind Ada. If RDNA 4 is their plan to change that, focusing resources on the area where they can still score some decent goals (the midrange) instead of chasing a halo tier they'll never reach at this pace, then I think it's a sound decision.
AssimilatorOn the one hand, it's easy to say that in hindsight; both companies develop their GPUs essentially in the dark from what the other is doing, so they can't know whether they've chosen rightly or wrongly until the other side launches theirs. You've just gotta put your head down, try to build the best, and hope it's good enough. If it isn't, then you've sunk a lot of money into being a loser, and strangely enough that isn't a compelling market position to be in.

On the other, this is again one of the reasons that it makes so much sense to compete in the mid-range first: if your product isn't as good as your competitor's, you can avoid being a loser by adjusting price. The old adage of "no bad products, only bad prices" will never cease to be true as long as we live in a capitalist world, and sometimes it's smart to take the hit to keep up market- and mindshare.
Yeah, you don't know how good your race car is until you're out in the race. But if you think it's good in the garage, and then it ends up being totally meh on the track, won't you regroup and think of an alternative strategy?

The article says AMD is planning to do exactly what you said: compete in the midrange first. Less R&D, lower manufacturing costs, more chips per wafer, higher sales numbers, and an opportunity to learn for RDNA 5.
AssimilatorIt's a terrible plan, because it allows your competitors to take up the slack that you've left by stepping aside. That allows them to get more of their product onto shelves, which means their brand is further into consumer consciousness while yours recedes. And when you come back to try to compete again you'll find retailers aren't willing to give you "your" shelf space back, because they've already got competitor product there that they know they can move. Nope, absolutely the worst thing any company selling consumer goods can ever do.
I didn't see that happen after RDNA 1, and I didn't see that happen after Bulldozer. AMD went nearly bankrupt, but they came back. Shelf space is for anything that sells.
Posted on Reply
#110
Unregistered
What's the point anyway? GPUs cost an arm and a leg now, plus people won't buy them, as nVidia has multiple reviewers doing its marketing, such as DF or HU.
#111
rv8000
TropickAlright mr cool guy since our takes are so bad what's your opinion?
It’s a rumor; AMD will stick with their traditional route. In the past few years they haven’t competed with Nvidia’s top-end GPU, and it’s not gonna change. They have plenty of competitive products otherwise.

Any negativity you took from my comment was aimed at all the nonsense that Intel is suddenly going to leapfrog AMD, while they’re currently selling larger dies with worse performance, and probably at a loss given how cheap the 750/770 are forced to sell. They’ve got a while to go.

Also at the notion that the 7900 XT/7900 XTX are somehow dogs, while they compete within their price range and beat their counterparts in traditional rasterization (the overwhelming majority of games). They don’t compete on RT, but several posts here are dooming like the cards wouldn’t even run the original Quake.

Everything this gen is terrible, Nvidia and AMD alike. But dropping out of the market? Selling the GPU division? How would any of that make even the smallest amount of business sense in the immediate future?
Posted on Reply
#112
Pierre Cyr
I think the issue may be simpler. AMD has moved to chiplets, and it works fine in the current gen, so there's no reason to think they can't go further down that route for Navi 4. So what is the issue here? That the Navi 41 and 42 GCDs have been canned. Maybe it's simply because they can do everything with the Navi 43 GCD along the full stack and aren't getting any value or performance out of using bigger GCDs. So the RX 8600 is one GCD, the RX 8700-8800 two GCDs, the RX 8900 XT four GCDs, with cut-downs along the way, so maybe 1.5 GCDs for the 8700 or 3.5 GCDs for the 8900 non-XT, etc...

But even if it's going to be just an RX 8600 and 8700, we know AMD sold GPUs fine without a high end when they only had Polaris. And we've seen the low-end 6600-7600 and NVIDIA's 60s and 50s sell well the last two gens. So much so that many are moving down the stack to those as upgrades now due to the price hikes. Not that many care about 100 vs 200 FPS. Outside a couple of titles, it's becoming mostly epeen to pay for the $1k+ class of GPUs. As is, the 600/60 range performs very well, and many expect the dual-issue FP32 shaders to be utilized at some point in game-engine updates, as both major vendors have moved to them. Personally I'm waiting on a sale for one to replace my RIP Vega 64. Wish that had lasted a few more years though. I'd bet Intel will move to dual issue as well.

But I doubt they're abandoning the high end, because the high-end 6000s and 7000s sold relatively well. I see no reason for them to can that altogether.
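Purely as an illustration of the single-GCD-scaled-across-the-stack idea above, here is a small sketch. Every SKU name, GCD count, and the CUs-per-GCD figure are the speculation in this comment (or an arbitrary assumption), not confirmed specs.

```python
# Hypothetical chiplet stack: one GCD design reused at different counts.
# All numbers below are speculative/illustrative, not real product specs.
hypothetical_stack = {
    "RX 8600": 1.0,
    "RX 8700": 1.5,
    "RX 8800": 2.0,
    "RX 8900": 3.5,
    "RX 8900 XT": 4.0,
}

BASE_CUS = 32  # assumed compute units per GCD, purely for illustration


def total_cus(gcds: float, per_gcd: int = BASE_CUS) -> int:
    """Total compute units if the stack scaled linearly with GCD count."""
    return int(gcds * per_gcd)


for sku, gcds in hypothetical_stack.items():
    print(f"{sku}: {gcds} GCDs -> ~{total_cus(gcds)} CUs")
```

The point of the sketch is that cut-down SKUs (1.5 or 3.5 GCDs) fall out of the same design for free, which is the cost argument the comment is making.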
Posted on Reply
#113
milewski1015
ARFI don't think "it worked". How many people do you know (or think) have that mid-range RX 5700 XT?

I think it would be much better if AMD sells the whole of it altogether. The agony must come to an end some day. :banghead:
I've got a 5700 XT and am very happy with it. Slightly over $400 for the top-tier Nitro+ model that's as quiet as can be. Even coming up on its 4th year in my hands this November, it still performs well in what I play - plenty of FPS for the lighter, competitive stuff, and over 60 at 1440p for single-player games. Sure, I'll probably need a new card in a couple of years as titles get even more demanding, but shelling out north of $700 for a card I'd consider more oriented towards 4K doesn't make sense when I'm happy with the performance now.
Posted on Reply
#114
HD64G
persondbTheir new arch (RDNA 3) is basically broken. The only performance uplift they got is from higher clocks, while the architectural improvements seem to have been almost nothing (likely because of broken features...).

RDNA 3 is really a repeat of Vega, where there was a lot of marketing about features that would revolutionize everything (packed math, HBCC, primitive shaders, etc.) and they were never used, or not even enabled in the drivers.
RDNA3 is mainly a way to make bigger GPU cores at lower cost, nothing to do with raw performance. And it got them 50% more FPS on average vs the 6900 XT. Not too shabby, I think, since I remember a few cases from both AMD and nVidia that didn't manage that generational performance jump.
Posted on Reply
#115
ARF
PunkenjoyA theory that hasn't been explored: AMD could be at a dead end with the RDNA architecture. They didn't get what they expected from RDNA 3, and their adjusted simulations for RDNA 4 were not much better.

They looked at where the market is heading and probably see no huge gains in RT and AI without doing a new clean-sheet architecture. So instead of spending on 3-4 chips for RDNA 4, they would just do one while they work on their next-gen architecture.

This is pure speculation, like almost everything under this post. But with the recent releases, it could be probable.
If this is even partly true, then AMD must think about hiring competent engineers like Jim Keller, who famously helped design the Zen microarchitecture and resurrect AMD.

It is actually very possible - there is a hypothesis that Navi 41 and Navi 42 got cancelled during the design stage, just before tape-out, because they didn't meet their performance targets.

They definitely need to design a new graphics architecture, and focus on RT if they can.
192 RT cores (instead of only 96) in Navi 31 could have helped a lot and beefed up the ray-tracing computation capabilities of the chip.
Posted on Reply
#116
david salsero
AMD has the best processor on the market ZEN 4 7040 Phoenix and I still don't see it in stores, what is happening?
90% of people demand very light ultrabooks with graphics power and Zen 4 Phoenix is perfect, dGPU is not needed because more ultrabooks are not coming out because Zen 4 is compatible with: DDR5 + USB 4.0 + HDMI 2.1 + RDNA 3 + artificial intelligence.
Posted on Reply
#117
evernessince
ARFAnd still nvidia posts huge profits, while AMD posts losses. Why is that? AMD's strategies don't work?
I mean it's cool all those chiplets and things, but do they actually make a difference?

www.techpowerup.com/309125/nvidia-announces-financial-results-for-first-quarter-fiscal-2024-gaming-down-38-yoy-stock-still-jumps-25#g309125

www.techpowerup.com/forums/threads/amd-reports-second-quarter-2023-financial-results-revenue-down-18-yoy.311976/
All chipmakers across the board are down due to low demand after a period of extremely high demand during the pandemic. Nvidia's operating income was down a whopping 80% in Q2 2023. Nvidia's year-over-year stats, like Q1 FY2024 revenue, are still negative, as is their net income in the non-GAAP chart you didn't provide above. That even a historic boom in the AI market couldn't fully make up for the drop in sales in Nvidia's other markets just goes to show that the trend you present is largely due to market conditions.

AMD is making very large increases in its R&D budget, and their recent Xilinx acquisition was expensive - things you definitely want to factor in when looking at quarterly reports. The Xilinx purchase alone is why there is such a big difference between the GAAP and non-GAAP reports; the non-GAAP report excludes the cost. You can see the drag on AMD's earnings from the acquisition in several of the past quarterly reports.
I mean it's cool all those chiplets and things, but do they actually make a difference?
AMD has grown from a net worth of $1.5 billion in 2016 to $183.6 billion in 2023 (they peaked at over $200 billion). It should go without saying that chiplets are important, hence why Intel is moving to a chiplet-based architecture as well.

I'm certainly not a fan of AMD's new pricing and product approach, but let's not take the current market conditions as a reason to throw the wrong things, like chiplets, under the bus.
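To put a number on the growth quoted above, here is a quick back-of-the-envelope compound-growth calculation. The start/end figures are the ones in this comment (which refer to AMD's market value), not independently verified data.

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over a span of years."""
    return (end / start) ** (1 / years) - 1


# Figures quoted in the comment: $1.5B in 2016 to $183.6B in 2023.
growth = cagr(start=1.5, end=183.6, years=2023 - 2016)
print(f"implied CAGR: {growth:.1%}")  # roughly a doubling every year
```

That works out to close to 99% per year compounded, which is the scale of the turnaround the comment is pointing at.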
Posted on Reply
#118
TheinsanegamerN
AusWolfAre halo products the main source of income of either of these companies, though? You can argue for the occult Nvidia fans who buy the latest x90 card in every generation, but there's no such thing on AMD's side. Both the 6900 XT and 7900 XTX were under tight scrutiny by both reviewers and buyers. I don't think putting a huge amount of money and effort into making something that can't beat Nvidia's latest flagship is a financially sound decision.
While halos don't shift the volume of lower-end parts, they are, undeniably, high-margin parts and make a not-so-small amount of cash. My point was more that the halo cards are brand leaders. When AMD was making competitive flagship cards like the HD 5870 and 290X, their sales in the low and mid ranges were a lot higher as well. When the top five fastest cards are all nvidia, it gives non-techies the notion that AMD is just slower than nvidia.

The RX 6000 series was the first in a long time to give the nvidia flagship (at the time the 3090) a run for its money, and AMD couldn't keep the things in stock.
AusWolfAlso, the 5700 XT wasn't really a success on its own merits. Sure, it was an okay midrange card, but it was a good testing ground for AMD to see how RDNA works. We wouldn't have RDNA 2 without it. If AMD manages to mix things up with RDNA 4's architecture and make some decent midrange cards with relatively low R&D costs, it could give them a chance to draw some conclusions for RDNA 5 again. Withdrawing from the battle to save your strength for the next one is not a bad plan, imo.
Withdrawing from a market is a TERRIBLE idea. AMD proved that with Bulldozer; even once Ryzen arrived, it took a few years to rebuild momentum.
ARFIf this is even partly true, then AMD must think about hiring competent engineers like Jim Keller who famously helped to design the Zen microarchitecture and resurrect AMD.

It is actually very possible - there is a hypothesis that the Navi 41 and Navi 42 got cancelled during the design stage just before tape-out because they didn't meet the performance targets.

They definitely need to design a new graphics architecture, and focus on RT if they can.
192 RT cores (instead of only 96) in Navi 31 could have helped a lot and beefed up the ray-tracing computation capabilities of the chip.
I could buy it being an rDNA issue. rDNA 1 didn't have RT at all, and it really does look like it was slapped onto rDNA 2.

Which is fine; R300 and Evergreen only went so far.
david salseroAMD has the best processor on the market ZEN 4 7040 Phoenix and I still don't see it in stores, what is happening?
90% of people demand very light ultrabooks with graphics power and Zen 4 Phoenix is perfect, dGPU is not needed because more ultrabooks are not coming out because Zen 4 is compatible with: DDR5 + USB 4.0 + HDMI 2.1 + RDNA 3 + artificial intelligence.
That's debatable. The 7040 series is a good product, but not the be-all and end-all.

It takes time for designs to roll out. Given that OEMs are still waiting for the official Phoenix GPU drivers, they're in no rush to get these things out there.

Not sure what that bolded part is even trying to say.
Posted on Reply
#119
enb141
ToTTenTranzBecause Nvidia had warehouses full of those Tegra chips for tablets that had failed miserably against their Snapdragon and Apple competition, which they were willing to sell for cheap.
Tegra X1 was such a failure that Nvidia never even tried to enter the Android market again.
A warehouse full of those Tegra chips? If true, they probably had 10,000 or 100,000 units; I doubt they had the millions which Nintendo has bought from Nvidia.

So if that were true, then Nvidia had been selling Tegra chips to Nintendo at a loss, which is not the case.
Posted on Reply
#120
R0H1T
The initial chip, the X1, was introduced back in 2015 ~ yes, that long ago. It was a major flop at the time, and given it was targeted at tablets (flagship mobiles?), Nvidia most definitely had at least a few million sitting somewhere in Taiwan or China. As you know, consoles are a multi-year venture, so unless you think console makers want all 122+ million chips made at once ~ that's not how it goes.
Posted on Reply
#121
TheinsanegamerN
enb141A warehouse full of those Tegra chips? If true, they probably had 10,000 or 100,000 units; I doubt they had the millions which Nintendo has bought from Nvidia.

So if that were true, then Nvidia had been selling Tegra chips to Nintendo at a loss, which is not the case.
A warehouse with only 10,000?

I can fit 10,000 tegra chips in my closet. They're not exactly big.
Posted on Reply
#122
persondb
HD64GRDNA3 is mainly a way to make bigger GPU cores with less cost, nothing to do with raw performance. And it got them 50% more FPS on average vs the 6900XT. Not too shabby I think since I remember a few cases both from AMD and nVidia that didn't manage that generational performance jump.
I wouldn't say that; there were big changes in the architecture, from the CU/WGP to the pixel and geometry pipes.

I don't know where you're getting 50%, as the TechPowerUp review of the 7900 XT shows an average 20~25% improvement. Considering how AMD improved clocks, added more CUs/WGPs, etc., that really means that, architecturally, AMD didn't manage to improve much despite obvious big goals (dual-issue, new RT instructions, MDIA, pixel wait sync, etc.).

You can see that in the 780M vs 680M, where the improvements are really mostly due to clocks (a 22% increase in clocks ended up as a 20~30% perf improvement).
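A quick sanity check of the clocks-vs-architecture argument above. This is a rough sketch that assumes performance scales linearly with clock speed; the percentages are the ones quoted in this comment, not measured data.

```python
def implied_arch_gain(perf_gain: float, clock_gain: float) -> float:
    """Per-clock improvement left over after dividing out the clock increase,
    assuming performance scales linearly with clock speed."""
    return (1 + perf_gain) / (1 + clock_gain) - 1


# 780M vs 680M: ~22% higher clocks, ~25% higher performance
# (midpoint of the 20~30% range quoted above).
gain = implied_arch_gain(perf_gain=0.25, clock_gain=0.22)
print(f"implied per-clock (architectural) gain: {gain:.1%}")  # ~2.5%
```

A residual of only a few percent is what the comment means by "improvements are really mostly due to clocks".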
Posted on Reply
#123
enb141
G777The 4060 is about 3 times as powerful as the RX 6400, so it's not like you're getting ripped off here.
I'm not complaining about the 4060. I know the AMD or Intel "equivalent" will be cheaper, but their drivers suck, even if people here say they don't.

To me, AMD and Intel suck. I have tried them all, and the best is Nvidia, so nobody will convince me that they don't suck. So now I'm glad they're releasing a low-profile 4060.

The only thing I would choose AMD for is a handheld, because Nvidia doesn't have a good Windows-based handheld; Intel has one, but it's not as popular as AMD in this field.
Posted on Reply
#124
Prime2515102
Did TPU make this rumor up themselves? There is no mention of where it came from.
Posted on Reply
#125
enb141
ZoneDymoout of curiosity, what problems did you have that makes you say the support is trash and drivers suck?
No VRR, and limited to 8-bit color on my Smart TV. The drivers, since this year, also have an issue with Kodi where you can't watch videos (only audio) if you enable HDR in Windows.

I reported it (tried to) on their forums and nobody from AMD responded; I also sent a support ticket by mail, and they told me they couldn't help me; and I reported it through their bug tool as well.

It's been a year since I reported the VRR and limited 8-bit color on my Smart TV, and about 8 months since the Kodi issue. Guess what: the bugs are still there.

Plus I got random Windows reboots.
Posted on Reply