Wednesday, September 13th 2023

AMD Accelerators Rumored to Nudge Out Higher-End Radeon RX 8000 GPUs

We heard murmurings back in early August about a slightly disappointing future for RDNA 4—multiple sources claimed that AMD had dropped development of a Navi 31 successor. Rumored reasons included "a cost justification of developing high-end GPUs to push enough volumes over the product lifecycle," as well as the apparent complexity of chiplet designs making it difficult to climb up the ladder of generational performance improvements.

The "narrowed" RDNA 4 product lineup is said to only encompass Navi 43 and Navi 44 GPUs, with a heavier focus on mid-range offerings. Lately, Bits And Chips has decided to pile on with another theory, likely obtained from their favored inside source: "AMD will sacrifice next Radeon gaming GPUs (RX 8000) output at TSMC in order to pump up FPGA and GPGPU production." The AI hardware market is in a boom phase, and Team Red is keen to catch up with NVIDIA—past reports have suggested that Team Green production priorities have shifted away GeForce RTX 4090 GPUs, in favor of an output uptick of "immensely profitable" H100 AI GPUs. Is AMD simply copying the market leader's homework?
Sources: Techspot, Bits & Chips

44 Comments on AMD Accelerators Rumored to Nudge Out Higher-End Radeon RX 8000 GPUs

#1
ZoneDymo
It would be interesting if AMD just dipped out of dedicated GPUs for a few years, to see if that affects anything.
#2
Divide Overflow
ZoneDymo: It would be interesting if AMD just dipped out of dedicated GPUs for a few years, to see if that affects anything.
Other than Nvidia's prices?
#3
oxrufiioxo
ZoneDymo: It would be interesting if AMD just dipped out of dedicated GPUs for a few years, to see if that affects anything.
They'll always have a mid-tier option due to the PS/Xbox sporting AMD hardware; the 7800XT or a variation of it seems to be going into the PS5 Pro. An 8700XT or 9700XT could be the basis for a future console, so they'll likely continue to develop the mid-range. They haven't talked about RDNA5 yet, but supposedly part of the reason high-end RDNA4 was canceled was so that its development could be expedited. I guess only time will tell.
#4
mrnagant
Gotta follow the money. Nvidia from Q1 to Q2 saw an increase of $6B in datacenter, up 141% from Q1. Per AMD's Q2 report, they are sampling MI300 and customer interest in AMD's AI increased 7x.

AMD and Nvidia combined gaming segments were $4B with limited growth, while datacenter was $11.6B with it seemingly only going to dramatically increase. Capacity is going to be consumed by AI and GPGPU hardware that prints money. A number of businesses are excelling at AI already and many other businesses don't know what AI is or what they can do with it, but they too are already all in. If AMD wants to have a shot at significantly increasing their revenue, they basically need to ignore what consumers want and go after what businesses want.

Radeon 7900XTX for example is ~530 mm² while the MI210 is ~770 mm², and the MI250 is 2 GCDs. The MI300 is going to have SKUs that are going to be absolutely huge.
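
To put rough numbers on that die-size trade-off, here is a back-of-envelope sketch using the standard gross die-per-wafer approximation; the die areas are the ones quoted above, and the selling prices are purely hypothetical placeholders, not real AMD pricing.

```python
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Standard gross die-per-wafer approximation (ignores yield and scribe lines)."""
    r = wafer_diameter_mm / 2.0
    return int(math.pi * r ** 2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

gaming = gross_dies_per_wafer(530)  # ~530 mm² figure quoted above (7900XTX)
accel = gross_dies_per_wafer(770)   # ~770 mm² figure quoted above (MI210)
print(gaming, accel)  # -> 104 vs 67 candidate dies per 300 mm wafer

# Hypothetical ASPs purely for illustration -- not real AMD pricing:
asp_gaming, asp_accel = 1_000, 10_000
print(f"${gaming * asp_gaming:,} vs ${accel * asp_accel:,} per wafer")
```

Even with roughly a third fewer candidate dies per wafer, the accelerator wafer comes out far ahead under those assumed prices, which is the whole thrust of the rumor.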
#5
AnotherReader
The AI-specific products typically use HBM3 and interposers, or CoWoS in TSMC parlance, so they are not limited by the number of wafers. Instead, they are constrained by TSMC's output of interposers. It seems unlikely that this would affect all RDNA4 GPUs. The most likely victim of an interposer shortage would be the rumoured high-end RDNA4. Any monolithic dies would be unaffected. Products using chiplets connected via in-package links like RDNA3 would be unaffected as well.
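
That constraint logic is easy to make concrete: a product ships at the rate of its scarcest input, and SKUs that skip the interposer never compete for it. A minimal sketch, with every capacity figure invented for illustration:

```python
# Hypothetical monthly supply figures, invented for illustration:
WAFER_CAPACITY = 10_000       # wafer starts available
INTERPOSER_CAPACITY = 40_000  # CoWoS interposers available

def monthly_units(wafers_allocated, dies_per_wafer, needs_interposer):
    """A product ships at the rate of whichever input runs out first."""
    from_wafers = wafers_allocated * dies_per_wafer
    return min(from_wafers, INTERPOSER_CAPACITY) if needs_interposer else from_wafers

# MI-class accelerator: plenty of wafer supply, but interposer-bound.
print(monthly_units(2_000, 60, needs_interposer=True))   # -> 40000 units
# Monolithic or in-package-link RDNA4: bound only by its wafer share.
print(monthly_units(8_000, 100, needs_interposer=False)) # -> 800000 units
```

Under that framing, only an RDNA4 SKU that itself needed an advanced package would be squeezed by accelerator demand, matching the conclusion that monolithic and in-package-link parts stay unaffected.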
#6
ZoneDymo
oxrufiioxo: They'll always have a mid-tier option due to the PS/Xbox sporting AMD hardware; the 7800XT or a variation of it seems to be going into the PS5 Pro. An 8700XT or 9700XT could be the basis for a future console, so they'll likely continue to develop the mid-range. They haven't talked about RDNA5 yet, but supposedly part of the reason high-end RDNA4 was canceled was so that its development could be expedited. I guess only time will tell.
It's not about what they will or won't have; it's more of just an interesting scenario. Or maybe it won't affect anything and nothing really will happen.

75% of Steam users are on Nvidia; it seems the market can live just fine without AMD in it.
#7
oxrufiioxo
ZoneDymo: It's not about what they will or won't have; it's more of just an interesting scenario. Or maybe it won't affect anything and nothing really will happen.

75% of Steam users are on Nvidia; it seems the market can live just fine without AMD in it.
Nvidia already acts like AMD doesn't exist, so their strategy likely wouldn't change, but the 7800XT seems to be doing fine for them, and RDNA2 is still selling OK, at least here in the States. Having fewer options is never a good thing.
AnotherReader: The AI-specific products typically use HBM3 and interposers, or CoWoS in TSMC parlance, so they are not limited by the number of wafers. Instead, they are constrained by TSMC's output of interposers. It seems unlikely that this would affect all RDNA4 GPUs. The most likely victim of an interposer shortage would be the rumoured high-end RDNA4. Any monolithic dies would be unaffected. Products using chiplets connected via in-package links like RDNA3 would be unaffected as well.
That was the first thing that popped into my head. If true, that is likely a big reason high-end RDNA4 died....
#8
Makaveli
mrnagant: Gotta follow the money. Nvidia from Q1 to Q2 saw an increase of $6B in datacenter, up 141% from Q1. Per AMD's Q2 report, they are sampling MI300 and customer interest in AMD's AI increased 7x.

AMD and Nvidia combined gaming segments were $4B with limited growth, while datacenter was $11.6B with it seemingly only going to dramatically increase. Capacity is going to be consumed by AI and GPGPU hardware that prints money. A number of businesses are excelling at AI already and many other businesses don't know what AI is or what they can do with it, but they too are already all in. If AMD wants to have a shot at significantly increasing their revenue, they basically need to ignore what consumers want and go after what businesses want.

Radeon 7900XTX for example is ~530 mm² while the MI210 is ~770 mm², and the MI250 is 2 GCDs. The MI300 is going to have SKUs that are going to be absolutely huge.
Great post and accurate.
#9
john_
"a cost justification of developing high-end GPUs to push enough volumes over the product lifecycle,"
Why spend hundreds of millions, or even billions, to build something competitive with Nvidia, when you know that gamers will just pay more to buy something slower, just to have that Nvidia logo on it? We have seen it. People paying for the RTX 3050 instead of buying the RX 6600, people paying for the RTX 3060 instead of buying the RX 6700XT, etc. The same with the top cards? The RX 7900XTX, $200 cheaper than the RTX 4080 and beating it in some titles? TOO EXPENSIVE.

With AMD selling 1/8th of what Nvidia sells, they are probably losing money even at these price levels that many consider high, because they expect AMD to sell at cost. Everyone suddenly wants CUDA support to play games, for example, or to run CP 2077 Overdrive at over 60 FPS.

Well, that "AMD, build something good so we can buy Intel and Nvidia hardware cheaper" attitude came back and bites hard.

Enjoy the next RTX 5060 with a 96-bit data bus at $400. It will be faster than the RX 8700XT and probably more efficient, considering the RX 8700XT will have a very limited budget for research and development, and it will probably be a better deal than anything Intel will be offering at that time. So, yeah, we will be reading that "$400 is a good price."
#10
TheinsanegamerN
john_: Why spend hundreds of millions, or even billions, to build something competitive with Nvidia, when you know that gamers will just pay more to buy something slower, just to have that Nvidia logo on it? We have seen it. People paying for the RTX 3050 instead of buying the RX 6600, people paying for the RTX 3060 instead of buying the RX 6700XT, etc.
Let us not forget that, during the height of GPU demand, Lisa Su limited GPU orders to focus on CPUs instead, because those made more money. But as we all know, unless you wear a leather jacket you can't be guilty of depriving markets for more $$$$.
john_: The same with the top cards? The RX 7900XTX, $200 cheaper than the RTX 4080 and beating it in some titles? TOO EXPENSIVE.
Unless you turn RT on, which AMD fans will INSIST doesn't matter, but clearly it does.
john_: Enjoy the next RTX 5060 with a 96-bit data bus at $400. It will be faster than the RX 8700XT and probably more efficient, considering the RX 8700XT will have a very limited budget for research and development, and it will probably be a better deal than anything Intel will be offering at that time. So, yeah, we will be reading that "$400 is a good price."
AMD does not exist in a vacuum. If they want to SELL, they have to COMPETE. And the sales of cards like the 7800XT show there is a large market interested in AMD hardware. Where does this myth that 7900XTX cards don't sell come from? The 7900XT, sure, it was priced wrong, but it seems that AMD is making plenty of cash from their Radeon division. Y'all are acting like it's 2015 and AMD is trying to compete with Maxwell using warmed-over 7970s.
#11
dj-electric
AMD hasn't made bank off top-end gaming GPUs in forever. You cannot be an analyst at this company and advocate that investing even more in them is a great idea. These things are costly.
#12
Denver
mrnagant: Gotta follow the money. Nvidia from Q1 to Q2 saw an increase of $6B in datacenter, up 141% from Q1. Per AMD's Q2 report, they are sampling MI300 and customer interest in AMD's AI increased 7x.

AMD and Nvidia combined gaming segments were $4B with limited growth, while datacenter was $11.6B with it seemingly only going to dramatically increase. Capacity is going to be consumed by AI and GPGPU hardware that prints money. A number of businesses are excelling at AI already and many other businesses don't know what AI is or what they can do with it, but they too are already all in. If AMD wants to have a shot at significantly increasing their revenue, they basically need to ignore what consumers want and go after what businesses want.

Radeon 7900XTX for example is ~530 mm² while the MI210 is ~770 mm², and the MI250 is 2 GCDs. The MI300 is going to have SKUs that are going to be absolutely huge.
Exactly... this is the logic of business, allocating resources to the most profitable sector. Especially when production capacity is limited.

Compared to AI and datacenter, gaming is a terrible business at the moment: you launch products below $1,000; after selling them, you have to invest in software support (drivers) for many years and pay partner studios plus promotions, in addition to sharing profits with AIBs, resellers, etc. You squeeze profit margins and still receive blows from gamers unhappy with the prices.

On the other side, you have a market where you sell a truckload of GPUs costing thousands of dollars each and no one will cry about the prices; plus, you don't share profits with any AIB partner. :rolleyes:
#13
R0H1T
TheinsanegamerN: Let us not forget that, during the height of GPU demand, Lisa Su limited GPU orders to focus on CPUs instead, because those made more money. But as we all know, unless you wear a leather jacket you can't be guilty of depriving markets for more $$$$.
At the height of that demand, Nvidia GPUs were selling for anywhere between 2-10x their MSRP, & of course everyone & their cat was trying to get in on the crypto gold rush! While it was only by chance, AMD not contributing to that madness was a good thing, & yes, I know they did badly in the previous crypto booms.

This is in some ways pretty similar to the AI fad :shadedshu:
#14
john_
TheinsanegamerN: Let us not forget that, during the height of GPU demand, Lisa Su limited GPU orders to focus on CPUs instead, because those made more money. But as we all know, unless you wear a leather jacket you can't be guilty of depriving markets for more $$$$.
Why invest in GPUs that the casual gamer will not buy because "the Nvidia logo is better than real performance at a lower price," and not invest in EPYC CPUs where buyers are like "Your CPUs are better, take my money"? I mean, why throw logic in the trash? As for the leather jacket, Nvidia dictates pricing in the GPU market. They have the performance, the feature set, the brand recognition (or should I say brand worship), and more TSMC wafers to spare.
TheinsanegamerN: Unless you turn RT on, which AMD fans will INSIST doesn't matter, but clearly it does.
I was screaming about RT performance when the RX 7900XTX reviews came out, and everyone was calling me an Nvidia shill, saying that RT is not as important as raster. Well, I will keep insisting that it is a huge marketing advantage and, when combined with the love for Nvidia's logo, a "must have" feature, even when the card is only intended to run... MAME.

Now you here, and every Nvidia fan in general, point at RT performance like it is color, as if games were in B&W format without RT enabled and set at the "ultra super duper high pathtracing, no restriction, real life man" setting. And this, through marketing and online posts and YouTubers and any other way, passes as the norm to consumers, making them buy an RTX 3050 because "it is better in RT than the RX 6600."
TheinsanegamerN: AMD does not exist in a vacuum. If they want to SELL, they have to COMPETE. And the sales of cards like the 7800XT show there is a large market interested in AMD hardware. Where does this myth that 7900XTX cards don't sell come from? The 7900XT, sure, it was priced wrong, but it seems that AMD is making plenty of cash from their Radeon division. Y'all are acting like it's 2015 and AMD is trying to compete with Maxwell using warmed-over 7970s.
They do compete. Not in every market segment, but they do compete. I am pretty sure that many will throw a dozen excuses at someone to make them choose an RTX 3050 over an RX 6600. This is a reality. JPR numbers are also a reality. Steam survey numbers are also a reality. AMD's and Nvidia's financial reports are also a reality. All those say that AMD does NOT sell as many GPUs as needed to keep them interested in this market, throw more resources at it, make more and better models, and put out better prices. Of course, they also make mistakes. I'll keep repeating myself about RT performance; raster performance from RDNA3 was also a failure, performing like RDNA2. But the tech press, YouTubers, even individuals and trolls should realize that the future will be expensive Nvidia hardware and subpar AMD and Intel hardware if things don't change. And unfortunately today, most of them are celebrating heading into a monopoly, as if their favorite team is winning.
#15
pavle
Oh, so now they whine about chiplet design complexity? Who started it? Adios, my dineros, for ya.
#16
TheinsanegamerN
john_: Why invest in GPUs that the casual gamer will not buy because "the Nvidia logo is better than real performance at a lower price," and not invest in EPYC CPUs where buyers are like "Your CPUs are better, take my money"? I mean, why throw logic in the trash? As for the leather jacket, Nvidia dictates pricing in the GPU market. They have the performance, the feature set, the brand recognition (or should I say brand worship), and more TSMC wafers to spare.
Because any GPU that was available was selling. There was very hot demand for GPUs, as was evident when every 6800XT that retailers got sold out in 30 seconds, often for 80%+ over MSRP.

AMD has some good reasons to prioritize the higher margins of EPYC, but if we're going to admit that AMD chased the higher-margin parts, let's stop this childish "muh Nvidia fans" excuse for AMD GPUs not selling. They COULD have made millions of additional GPUs and sold every single one; they CHOSE not to. That is on AMD, not Nvidia, not the consumer.
john_: I was screaming about RT performance when the RX 7900XTX reviews came out, and everyone was calling me an Nvidia shill, saying that RT is not as important as raster. Well, I will keep insisting that it is a huge marketing advantage and, when combined with the love for Nvidia's logo, a "must have" feature, even when the card is only intended to run... MAME.

Now you here, and every Nvidia fan in general, point at RT performance like it is color, as if games were in B&W format without RT enabled and set at the "ultra super duper high pathtracing, no restriction, real life man" setting. And this, through marketing and online posts and YouTubers and any other way, passes as the norm to consumers, making them buy an RTX 3050 because "it is better in RT than the RX 6600."
Pretty sure I have advocated, more than once, for AMD to either focus more on RT OR focus on beating the snot out of Nvidia at rasterization so they could stand out. I simply pointed out that the general market, when given the choice of two similar GPUs, one cheaper and one offering far superior RT performance, will choose the one with better performance.
john_: They do compete. Not in every market segment, but they do compete. I am pretty sure that many will throw a dozen excuses at someone to make them choose an RTX 3050 over an RX 6600. This is a reality. JPR numbers are also a reality. Steam survey numbers are also a reality. AMD's and Nvidia's financial reports are also a reality. All those say that AMD does NOT sell as many GPUs as needed to keep them interested in this market, throw more resources at it, make more and better models, and put out better prices. Of course, they also make mistakes. I'll keep repeating myself about RT performance; raster performance from RDNA3 was also a failure, performing like RDNA2. But the tech press, YouTubers, even individuals and trolls should realize that the future will be expensive Nvidia hardware and subpar AMD and Intel hardware if things don't change. And unfortunately today, most of them are celebrating heading into a monopoly, as if their favorite team is winning.
So AMD DOES compete, and as we can see with GPUs like the 7800XT, they sell well when they do so.

So is it Nvidia holding AMD down? Da "troolz"? Or could it be that AMD seems to have a serious issue competing consistently? Which story are you going with here? Maybe go outside and calm down a bit; nobody here is cheering on an Nvidia monopoly, only you are saying that.
R0H1T: At the height of that demand, Nvidia GPUs were selling for anywhere between 2-10x their MSRP, & of course everyone & their cat was trying to get in on the crypto gold rush! While it was only by chance, AMD not contributing to that madness was a good thing, & yes, I know they did badly in the previous crypto booms.

This is in some ways pretty similar to the AI fad :shadedshu:
Producing less, making less, and allowing your main competitor to print out billions in extra $$ is now a "good thing". Must have missed my clown world classes.
#17
R0H1T
Nvidia was on an inferior node, had better(?) mining GPUs, & probably had a lot more capacity at Samsung than AMD had at TSMC ~ AMD certainly had a lot less capacity just for graphics cards because they were shipping a lot more EPYCs back then. Must've also missed that class where a once-in-a-century event, one that also closed 90% of the world, would suddenly allow AMD to print GPUs out of thin air. That, & the fact that the miners & scalpers probably made a whole lot more than Nvidia on those cards! Good job skipping those classes :ohwell:
#18
john_
TheinsanegamerN: Because any GPU that was available was selling. There was very hot demand for GPUs, as was evident when every 6800XT that retailers got sold out in 30 seconds, often for 80%+ over MSRP.

AMD has some good reasons to prioritize the higher margins of EPYC, but if we're going to admit that AMD chased the higher-margin parts, let's stop this childish "muh Nvidia fans" excuse for AMD GPUs not selling. They COULD have made millions of additional GPUs and sold every single one; they CHOSE not to. That is on AMD, not Nvidia, not the consumer.

Pretty sure I have advocated, more than once, for AMD to either focus more on RT OR focus on beating the snot out of Nvidia at rasterization so they could stand out. I simply pointed out that the general market, when given the choice of two similar GPUs, one cheaper and one offering far superior RT performance, will choose the one with better performance.

So AMD DOES compete, and as we can see with GPUs like the 7800XT, they sell well when they do so.

So is it Nvidia holding AMD down? Da "troolz"? Or could it be that AMD seems to have a serious issue competing consistently? Which story are you going with here? Maybe go outside and calm down a bit; nobody here is cheering on an Nvidia monopoly, only you are saying that.

Producing less, making less, and allowing your main competitor to print out billions in extra $$ is now a "good thing". Must have missed my clown world classes.
Really? I mean REALLY? Pointing at a period of mining craze and saying that Radeon was selling is NOT an argument. We are talking about gaming, or at least I was.

Reality is not "childish "muh nvidia fans" excuse". Now if you insist of being just that, a childish excuse, remember to argue in a few years that "AMD fans are responsible for graphics cards becoming too expensive, because they only had "childish "muh nvidia fans" excuses" instead of pressing AMD to start giving away graphics cards at cost, to pressure Nvidia to lower their prices".

Well, AMD tried to increase performance in raster, and they failed. Not increasing performance in RT much more got them into a position where RDNA3 looks like RDNA2. By not considerably increasing RT performance, instead expecting incremental gains at the high end from the higher number of CUs and in the mid-to-low end from who knows where, they lost the opportunity of having RDNA3 look like a good upgrade over RDNA2 on RT performance alone.

Are you going to totally distort the meaning of what I wrote? "A truck has wheels, a bicycle has wheels, so you admit that a bicycle is a truck." Stop that.
#19
ARF
oxrufiioxo: They'll always have a mid-tier option due to the PS/Xbox sporting AMD hardware; the 7800XT or a variation of it seems to be going into the PS5 Pro. An 8700XT or 9700XT could be the basis for a future console, so they'll likely continue to develop the mid-range. They haven't talked about RDNA5 yet, but supposedly part of the reason high-end RDNA4 was canceled was so that its development could be expedited. I guess only time will tell.
Why do you think that the next consoles will have AMD hardware? Is it already decided?
ZoneDymo: It would be interesting if AMD just dipped out of dedicated GPUs for a few years, to see if that affects anything.
They don't have that much time. In a few years there won't be such a business at all.
"AMD will sacrifice next Radeon gaming GPUs (RX 8000) output at TSMC in order to pump up FPGA and GPGPU production.
This actually means that AMD will not release any RX 8000 GPUs.
#20
oxrufiioxo
ARF: Why do you think that the next consoles will have AMD hardware? Is it already decided?
It's the only solution that offers full backwards compatibility and a large APU. If they switched to Intel, it would break backwards compatibility, and Nvidia is too expensive for a console, the whole reason Microsoft ditched them almost 20 years ago. Well, unless Microsoft goes the Switch route and uses ultra-low-end hardware in their next console, although, again, that would break backwards compatibility. I'm not a betting man, but I would bet on the PS6 and Xbox Series XXX using AMD hardware.
#21
ARF
oxrufiioxo: It's the only solution that offers full backwards compatibility and a large APU. If they switched to Intel, it would break backwards compatibility, and Nvidia is too expensive for a console, the whole reason Microsoft ditched them almost 20 years ago. Well, unless Microsoft goes the Switch route and uses ultra-low-end hardware in their next console, although, again, that would break backwards compatibility. I'm not a betting man, but I would bet on the PS6 and Xbox Series XXX using AMD hardware.
I don't get why you think that Intel's x86-64 architecture won't work.
Also, if AMD switches to a chiplet design (CPU core die + GPU core die + some I/O bridge), it won't matter if those "chiplets" are delivered by Nvidia + Intel instead.

And I'm not sure that they need "backwards" compatibility. Maybe they will drop it in order to make some monster with IBM, ARM, or whatever else is available, like the previous Cell processor.
#22
R0H1T
No one's gonna make them as cheap/good as AMD ~ they have a history & neither Intel nor Nvidia generally play nice at those margins!
#23
Xaled
ZoneDymo: It's not about what they will or won't have; it's more of just an interesting scenario. Or maybe it won't affect anything and nothing really will happen.

75% of Steam users are on Nvidia; it seems the market can live just fine without AMD in it.
Yeah, nothing would really happen, except you'd be buying 4050s (or later xx50s) for the price of 4090s.
#24
Hoopi
It's almost as if both AMD and Nvidia are literally handing over the keys to Intel and abandoning the consumer gaming GPU market.
#25
Random_User
I don't want to hurt anyone's feelings. I know I would be beaten for my rubbish layman's opinion, and I definitely shouldn't respond to the rumour thread, but...

AMD is their own worst enemy. They have chosen the path they are now on. This is not news. They were always sluggish, leaving many deeds and areas unfinished when they obviously could do otherwise. There were better investments to be made, but AMD knew about the AI boom beforehand, and they bought Xilinx for this very reason. They could have heavily invested in their software departments and improved their drivers to no worse a degree than Intel did in less than a year. But they didn't. If AMD doesn't want their stuff to be purchased, whose fault is that? And at this pace, one thing is known for sure: AMD doesn't care about regular consumers.

Surely there are many talented engineers working at AMD, making brilliant devices and stuff. But all these achievements are moot if the heads of AMD keep defining the future of the products the way they do.
It doesn't matter if the product is good if it doesn't have a well-developed ecosystem and support. I don't say AMD products are bad, quite the reverse, but the support and PR are lacking. They should do more. That requires investment, though, and why do that if the enterprise market has more money and doesn't whine on Reddit and forums about driver bugs or about a lack of fake frames?

Cutting production of high-end dGPUs is not a very good idea, considering reasonable iGPUs are as yet non-existent. It would be understandable if AMD had flooded the market with APUs and laptop iGPUs of RX 6600 level. But this is not the case, and I sincerely doubt it ever will be. There surely will be better iGPUs, but their quantity will be scarce. And even then, there are people requiring more performant GPUs.

As other people wrote before, AMD is interested only in the highest possible margins. And that's AI. So unless some powerful dedicated ASIC appears, GPU allocation will be moved to enterprise, and nobody at the consumer level will be able to do anything about it.

Eventually all GPUs will be allocated to AI and datacenters only, and regular consumers are already being pushed toward subscriptions, meaning in the end people will get only some weak tablets/laptops/portable PCs to look at the screen, while the Nvidia and AMD GPGPUs generate fake frames over remote access/streaming.

At the end of the day, the whole Ryzen and Radeon thing is just a bigger sandbox and a by-product of EPYC and MI, and it gives AMD a big pool of free beta testers to shorten enterprise R&D troubleshooting. It may seem unrelated, but Ryzen is still a heavily cut-down EPYC; the feature set is way smaller, but the core architecture is the same. And gaming Radeon is just crumbs compared to GPGPU profits.