Wednesday, September 13th 2023
AMD Accelerators Rumored to Nudge Out Higher-End Radeon RX 8000 GPUs
We heard murmurings back in early August about a slightly disappointing future for RDNA 4—multiple sources claimed that AMD had dropped development of a Navi 31 successor. Rumored reasons included "a cost justification of developing high-end GPUs to push enough volumes over the product lifecycle," as well as the apparent complexity of chiplet designs making it difficult to climb up the ladder of generational performance improvements.
The "narrowed" RDNA 4 product lineup is said to only encompass Navi 43 and Navi 44 GPUs, with a heavier focus on mid-range offerings. Lately, Bits And Chips has decided to pile on with another theory, likely obtained from their favored inside source: "AMD will sacrifice next Radeon gaming GPUs (RX 8000) output at TSMC in order to pump up FPGA and GPGPU production." The AI hardware market is in a boom phase, and Team Red is keen to catch up with NVIDIA—past reports have suggested that Team Green production priorities have shifted away from GeForce RTX 4090 GPUs, in favor of an output uptick of "immensely profitable" H100 AI GPUs. Is AMD simply copying the market leader's homework?
Sources:
Techspot, Bits & Chips
44 Comments on AMD Accelerators Rumored to Nudge Out Higher-End Radeon RX 8000 GPUs
AMD's and Nvidia's combined gaming segments were $4B with limited growth, while datacenter was $11.6B and seemingly only going to increase dramatically. Capacity is going to be consumed by AI and GPGPU hardware that prints money. A number of businesses are already excelling at AI, and many others don't know what AI is or what they could do with it, yet they too are already all in. If AMD wants a shot at significantly increasing their revenue, they basically need to ignore what consumers want and go after what businesses want.
The Radeon RX 7900 XTX, for example, is ~530 mm², while the MI210 is ~770 mm² and the MI250 is two GCDs. The MI300 is going to have SKUs that are absolutely huge.
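To put those die sizes in perspective, here is a rough sketch (my own back-of-the-envelope estimate, not figures from the post) of gross candidate dies per 300 mm wafer, using the common π-based approximation. It ignores defect yield, scribe lines, and reticle limits, so treat the numbers as illustrative only:

```python
import math

def gross_dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """Classic approximation: wafer area divided by die area,
    minus an edge-loss correction term. Ignores defect yield,
    scribe lines, and reticle constraints."""
    radius = wafer_diameter_mm / 2
    usable = math.pi * radius * radius / die_area_mm2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(usable - edge_loss)

# Hypothetical comparison using the die sizes quoted above
print(gross_dies_per_wafer(530))  # ~104 candidate dies (7900 XTX-sized)
print(gross_dies_per_wafer(770))  # ~67 candidate dies (MI210-sized)
```

Same wafer, roughly 1.5x as many consumer-sized dies, yet each datacenter part sells for several times the price of a consumer card, which is the margin argument in a nutshell.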
75% of Steam users are on Nvidia; it seems the market can live just fine without AMD in it.
With AMD selling 1/8th of what Nvidia sells, they are probably losing money even at price levels that many consider high, because people expect AMD to sell at cost. Everyone suddenly wants CUDA support to play games, for example, or to run CP 2077 Overdrive at over 60 fps.
Well, that
"AMD, build something good so we can buy cheaper Intel and Nvidia hardware"
attitude has come back and bitten hard.
Enjoy the next RTX 5060 with a 96-bit data bus at $400. It will be faster than the RX 8700 XT and probably more efficient, considering the RX 8700 XT will have had a very limited research and development budget, and it will probably be a better deal than anything Intel will be offering at that time. So, yeah, we will be reading that "$400 is a good price".
These things are costly.
Compared to AI and datacenter, gaming is a terrible business at the moment: you launch products below $1,000, then after the sale you have to invest in software support (drivers) for many years, pay partner studios and promotions, and on top of that share profits with AIBs, resellers, etc. You squeeze your profit margins and still take blows from gamers unhappy with the prices.
On the other side, you have a market where you sell a truckload of GPUs costing thousands of dollars each and no one cries about the prices; plus, you don't share profits with any AIB partner. :rolleyes:
Crypto gold rush! It was only by chance, but AMD not contributing to that madness was a good thing, and yes, I know they did badly in the previous crypto booms. This is in some ways pretty similar to the AI fad :shadedshu:
Now you here, and every Nvidia fan in general, point at RT performance like it is color, as if games were in B&W format without RT enabled and set to the "ultra super duper high path tracing, no restrictions, real life man" setting. Through marketing, online posts, YouTubers and every other channel, this passes as the norm to consumers, making them buy an RTX 3050 because "it is better in RT than the RX 6600". They do compete. Not in every market segment, but they do compete. I am pretty sure many will throw a dozen excuses at someone to make them choose an RTX 3050 over an RX 6600.
This is a reality. JPR numbers are also a reality. Steam survey numbers are also a reality. AMD's and Nvidia's financial reports are also a reality. All of those say that AMD does NOT sell as many GPUs as needed to keep them interested in this market, throw more resources at it, make more and better models, and set better prices. Of course they also make mistakes. I'll keep repeating myself about RT performance; raster performance from RDNA3 was also a failure, performing like RDNA2. But the tech press, YouTubers, even individuals and trolls should realize that the future will be expensive Nvidia hardware and subpar AMD and Intel hardware if things don't change. And unfortunately, today most of them are celebrating heading into a monopoly, as if their favorite team were winning.
AMD has some good reasons to prioritize the higher margins of EPYC, but if we're going to admit that AMD chased the higher-margin parts, let's stop this childish "muh Nvidia fans" excuse for AMD GPUs not selling. They COULD have made millions of additional GPUs and sold every single one; they CHOSE not to. That is on AMD, not Nvidia, not the consumer. Pretty sure I have advocated, more than once, for AMD to either focus more on RT OR focus on beating the snot out of Nvidia at rasterization so they could stand out. I simply pointed out that the general market, when given the choice of two similar GPUs, one cheaper and one with far superior RT performance, will choose the one with better performance. So AMD DOES compete, and as we can see with GPUs like the 7800 XT, they sell well when they do so.
So is it nvidia holding AMD down? da "troolz"? Or could it be that AMD seems to have a serious issue competing consistently? Which story are you going with here? Maybe go outside and calm down a bit, nobody here is cheering on an nvidia monopoly, only you are saying that. Producing less, making less, and allowing your main competitor to print out billions in extra $$ is now a "good thing".
Reality is not a childish "muh Nvidia fans" excuse. Now, if you insist on being just that, a childish excuse, remember to argue in a few years that "AMD fans are responsible for graphics cards becoming too expensive, because they only had childish 'muh Nvidia fans' excuses instead of pressing AMD to start giving away graphics cards at cost to pressure Nvidia into lowering their prices".
Well, AMD tried to increase raster performance and failed. Not increasing RT performance much more got them into a position where RDNA3 looks like RDNA2. If they had considerably increased RT performance, instead of expecting incremental gains at the high end from the higher CU count, and in the mid-to-low end from who knows where, RDNA3 could have looked like a good upgrade over RDNA2 on RT performance alone, but they lost that opportunity.
Are you going to totally distort the meaning of what I wrote?
"Truck has wheels, bicycle has wheels, so you admit that bicycle is a truck". Stop that.
Also, if AMD switches to a chiplet design (CPU core die + GPU core die + some I/O bridge), it won't matter if those "chiplets" are delivered by Nvidia + Intel instead.
And I'm not sure that they need "backwards" compatibility. Maybe they will drop it in order to make some monster with IBM, ARM, or whatever else is available. Like the Cell processor before.
AMD is their own worst enemy. They have chosen the path they are now moving down. This is no news. They have always been sluggish, leaving many areas unfinished, when they obviously could do otherwise. There were better investments, but AMD knew about the AI boom beforehand, and they bought Xilinx for exactly this reason. They could have heavily invested in their software departments and delivered driver improvements no worse than what Intel managed in less than a year. But they didn't. If AMD doesn't want their stuff to be purchased, whose fault is that? And at this pace, one thing is known for sure: AMD doesn't care about regular consumers.
Surely there are many talented engineers working at AMD, making brilliant devices and technology. But all these achievements are moot if the heads of AMD keep defining the future of the products the way they do.
It doesn't matter if the product is good if it doesn't have a well-developed ecosystem and support. I'm not saying AMD products are bad. Quite the reverse, but the support and PR are lacking. They should do more. That requires investment, though, and why bother if the enterprise market has more money, and its customers don't whine on Reddit and forums about driver bugs or a lack of fake frames?
Cutting production of high-end dGPUs is not a very good idea, considering that reasonable iGPUs are still non-existent. It could be understandable if AMD flooded the market with APUs and laptop iGPUs of RX 6600 level. But this is not the case, and I sincerely doubt it ever will be. There will surely be better iGPUs, but their quantity will be scarce. And even then, there are people who require more performant GPUs.
As other people wrote before, AMD is interested only in the highest possible margins. And that's AI. So unless some powerful dedicated ASIC appears, GPU allocation will move to the enterprise, and nobody at the consumer level will be able to do anything about it.
Eventually, all GPU allocation will go to AI and datacenters only, and regular consumers are already being pushed toward subscriptions, meaning in the end people will get only some weak tablets/laptops/portable PCs to look at the screen, while the Nvidia and AMD GPGPUs generate fake frames over remote access/streaming.
At the end of the day, the whole Ryzen and Radeon thing is just a bigger sandbox and a by-product of EPYC and MI, giving AMD a huge pool of free beta testers to troubleshoot enterprise R&D in shorter periods. It may seem unrelated, but Ryzen is still a heavily cut-down EPYC. The feature set is far smaller, but the core architecture is the same. And gaming Radeon is just crumbs compared to GPGPU profits.