Sunday, August 6th 2023
AMD Retreating from Enthusiast Graphics Segment with RDNA4?
AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs against the volumes they would move over the product lifecycle. The company's "Navi 21" GPU benefited from the crypto-currency mining swell, but just like NVIDIA, the company isn't able to push enough GPUs at the high end.
With RDNA4, the company will focus on the specific segments of the market that sell the most, which would be the x700 series and below. This generation would be essentially similar to the RX 5000 series powered by RDNA1, which did enough to stir things up in NVIDIA's lineup and trigger the introduction of the RTX 20 SUPER series. The next generation could see RDNA4 square off against NVIDIA's next generation and, hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz
363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?
Most people are playing at 1080p anyway, some at 1440p, and a very small number at 4K.
So what's the point of investing money in creating new high-end GPUs when they have upscaling software?
Also, the 5700 XT wasn't really a success on its own merits. Sure, it was an okay midrange card, but it was a good testing ground for AMD to see how RDNA works. We wouldn't have RDNA 2 without it. If AMD manages to mix things up with RDNA 4's architecture and make some decent midrange cards with relatively low R&D costs, it could give them a chance to draw some conclusions for RDNA 5 again. Withdrawing from the battle to save your strength for the next one is not a bad plan, imo.
Every company drops the ball once in a while, but AMD didn't just drop it, they tripped over it and faceplanted. All they had to do was introduce a moderately competent mid-range GPU series at moderately sane pricing, and they would've eaten the lunch of the 4060 and 4070. Instead, it's those GPUs that are eating AMD's lunch! On the one hand, it's easy to say that in hindsight; both companies develop their GPUs essentially in the dark about what the other is doing, so they can't know whether they've chosen rightly or wrongly until the other side launches theirs. You've just got to put your head down, try to build the best, and hope it's good enough. If it isn't, then you've sunk a lot of money into being a loser, and strangely enough, that isn't a compelling market position to be in.
On the other hand, this is again one of the reasons it makes so much sense to compete in the mid-range first: if your product isn't as good as your competitor's, you can avoid being a loser by adjusting price. The old adage of "no bad products, only bad prices" will never cease to be true as long as we live in a capitalist world, and sometimes it's smart to take the hit to keep up market- and mindshare. Withdrawing from a segment entirely, though, is a terrible plan, because it allows your competitors to take up the slack you've left by stepping aside. That lets them get more of their product onto shelves, which means their brand pushes further into consumer consciousness while yours recedes. And when you come back to try to compete again, you'll find retailers aren't willing to give you "your" shelf space back, because they've already got competitor product there that they know they can move. Nope, absolutely the worst thing any company selling consumer goods can ever do.
The article says AMD is planning to do exactly what you said: compete in the midrange first. Less R&D, lower manufacturing costs, more chips per wafer, higher sales numbers, and an opportunity to learn for RDNA 5. As for permanently losing shelf space, I didn't see that happen after RDNA 1, and I didn't see it happen after Bulldozer. AMD went nearly bankrupt, but they came back. Shelf space is there for anything that sells.
Any negativity you took from my comment was aimed at all the nonsense that Intel is suddenly going to leapfrog AMD, while they're currently selling larger dies with worse performance, and probably at a loss given how cheap the A750/A770 are forced to sell. They've got a while to go.
Also at the idea that the 7900 XT/7900 XTX are somehow dogs, when they compete within their price range and beat their counterparts in traditional rasterization (the overwhelming majority of games). They don't compete on RT, sure, but several posts here are dooming like the cards wouldn't even run the original Quake.
Everything this gen is terrible, NVIDIA and AMD alike. But dropping out of the market? Selling the GPU division? Like, how would any of that make even the smallest amount of business sense in the immediate future?
But even if it's going to be just an RX 8600 and 8700, we know AMD sold GPUs fine without chasing the high end back when they only had Polaris. And we've seen the low-end 6600 through 7600, and NVIDIA's 60- and 50-class cards, do well over the last two generations. So much so that many people are now moving down the stack to those as upgrades due to the price hikes. Not that many care about 100 vs. 200 FPS; outside of a couple of titles, paying for the $1k+ class of GPUs is becoming mostly e-peen. As it is, the x600/xx60 range performs very well, and many expect the dual-issue FP32 units in the shaders to be utilized at some point via game engine updates, as both major vendors have moved to that design. Personally, I'm waiting on a sale on one to replace my Vega 64, RIP. Wish that had lasted a few more years, though. I'd bet Intel will move to dual-issue as well.
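On that dual-issue point, here's a minimal C sketch of the kind of shader math that dual-issue FP32 hardware (like RDNA 3's paired VALU ops) can exploit; this is illustrative only, the function and names are mine, not anything from AMD's documentation. The idea engine updates would chase is keeping two independent FP32 streams in flight so the compiler can co-issue them:

```c
#include <stddef.h>

/* Illustrative sketch only: two data-independent FP32 multiply-add
 * streams per iteration. A compiler targeting dual-issue hardware
 * can pack pairs of ops like these into a single issue slot,
 * because neither result feeds the other. */
void blend_two_streams(const float *a, const float *b,
                       float *out0, float *out1,
                       float k0, float k1, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        out0[i] = a[i] * k0 + b[i];  /* stream 0 */
        out1[i] = b[i] * k1 + a[i];  /* stream 1, independent of stream 0 */
    }
}
```

By contrast, one long dependent chain (each op waiting on the previous result) leaves the second issue slot empty, which is why actual utilization depends on how engines structure their shader workloads.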
But I doubt they're abandoning the high end, because the high-end 6000s and 7000s sold relatively well. I see no reason for them to can that altogether.
It is actually very possible - there is a hypothesis that the Navi 41 and Navi 42 got cancelled during the design stage just before tape-out because they didn't meet the performance targets.
They definitely need to design a new graphics architecture and focus on RT if they can.
192 RT cores (instead of only 96) in Navi 31 could have helped a lot and beefed up the ray-tracing capabilities of the chip.
90% of people demand very light ultrabooks with graphics power, and Zen 4 "Phoenix" is perfect for that; a dGPU is not needed, which is why ultrabooks aren't coming out with one anymore: Zen 4 already combines DDR5 + USB 4.0 + HDMI 2.1 + RDNA 3 + AI acceleration.
AMD is making very large increases in its R&D budget, and their recent Xilinx acquisition was expensive; these are things you definitely want to factor in when looking at quarterly reports. The Xilinx purchase alone is why there is such a big difference between the GAAP and non-GAAP reports; the non-GAAP report excludes the cost. You can see the drag on AMD's earnings from the acquisition in several of the past quarterly reports. AMD has grown from a market value of 1.5 billion in 2016 to 183.6 billion in 2023 (they peaked at over 200 billion). It should go without saying that chiplets are important, hence why Intel is moving to a chiplet-based architecture as well.
I'm certainly not a fan of AMD's new pricing and product approach, but let's not take the current market conditions as a reason to throw the wrong things, like chiplets, under the bus.
The RX 6000 series was the first in a long time to give the NVIDIA flagship (at the time the 3090) a run for its money, and AMD couldn't keep the things in stock. Withdrawing from a market is a TERRIBLE idea. AMD proved that with Bulldozer; even once Ryzen arrived, it took a few years to rebuild momentum. I could buy it being an RDNA issue. RDNA 1 didn't have RT at all, and it really does look like it was slapped onto RDNA 2.
Which is fine; R300 and Evergreen only went so far. That's debatable. The 7040 series is a good product, but not the be-all-end-all.
It takes time for designs to roll out. Given OEMs are still waiting for the official Phoenix GPU drivers, they are in no rush to get these things out there.
Not sure what that bolded part is even trying to say.
So if that were true, then NVIDIA would have been selling Tegra chips to Nintendo at a loss, which is not the case.
I can fit 10,000 Tegra chips in my closet. They're not exactly big.
I don't know where you're getting 50%, as the TechPowerUp review of the 7900 XT shows an average 20-25% improvement. Which, considering how AMD improved clocks, added more CUs/WGPs, etc., would really mean that architecturally AMD didn't manage to improve much, despite obvious big goals (dual-issue, new RT instructions, MDIA, pixel-wait sync, etc.).
You can see that in the 780M vs. the 680M, as the improvements are mostly down to clocks (a 22% increase in clocks ended up as a 20-30% performance improvement).
To me, AMD and Intel suck. I have tried them all, so nobody will convince me that they don't, and the best is NVIDIA. So now I'm glad they are releasing a low-profile 4060.
The only thing I would choose AMD for is a handheld, because NVIDIA doesn't have a good Windows-based handheld. Intel has one, but it's not as popular as AMD in this field.
I reported it (or tried to) on their forums and nobody from AMD responded; I also sent a support ticket by mail, and they told me they couldn't help me; and I reported it through their bug tool as well.
It has been a year since I reported the VRR and limited 8-bit color issues with my smart TV, and about 8 months since the issue with Kodi. Guess what: the bugs are still there.
Plus I got random Windows reboots.