Sunday, August 6th 2023

AMD Retreating from Enthusiast Graphics Segment with RDNA4?

AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs against the volumes it can move over a product's lifecycle. The company's "Navi 21" GPU benefited from the crypto-currency mining swell, but just like NVIDIA, the company isn't able to move enough GPUs at the high end.

With RDNA4, the company will focus on the specific segments of the market that sell the most, which would be the x700 series and below. This generation would be essentially similar to the RDNA1-powered RX 5000 series, which did enough to stir things up in NVIDIA's lineup and trigger the introduction of the RTX 20 SUPER series. The next generation could see RDNA4 square off against NVIDIA's next generation and, hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz

363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?

#127
R0H1T
Oh don't worry this will be picked up by MLID eventually & when everyone forgets about it we'll also forget about the "internal" sources :slap:

Next up ~ Intel rumored to exit x86 business, leaving AMD/Via to fight for scraps :pimp:
#128
bug
But let's be honest here, we've all been saying $1,000+ video cards must go. As long as we see competition in the $200-500 segment, I'm good.
#129
rv8000
bugBut let's be honest here, we've all been saying $1,000+ video cards must go. As long as we see competition in the $200-500 segment, I'm good.
This won't happen any time soon. While sales haven't been stellar for either this gen, people are still buying; not to mention the 4090 and 7900 XTX seem to have done fairly well considering their exorbitant prices. Crypto, AI, and semi-professional use of consumer GPUs have done lasting damage to the market, and Nvidia's most recent stance on AI doesn't promise any relief in the near future.
#130
SCP-001
enb141No VRR and limited to 8-bit color on my Smart TV, and the drivers since this year have an issue with Kodi where you can't watch videos (only audio) if you enable HDR on Windows.

I reported it (or tried to) on their forums; nobody from AMD responded. I also sent a support ticket by email, and they told me they couldn't help me. I also reported it through their bug tool.

It's been a year since I reported the VRR and limited 8-bit color issues on my Smart TV, and about 8 months since the Kodi issue. Guess what, the bugs are still there.

Plus I get random Windows reboots.
Have you made sure that you were plugged into an HDMI 2.1 port on the TV? I believe VRR only works on the 2.1 ports, or in a very limited sense, on HDMI 2.0.
#131
R0H1T
rv8000Crypto, ai, and semi professional GPU use of consumer GPUs has done lasting damage on the market.
You mean we've spoilt these companies even more? At least 2 of the 3 things were/are majorly consumer driven!
#132
rv8000
R0H1TYou mean we've spoilt these companies even more? At least 2 of the 3 things were/are majorly consumer driven!
Why buy Quadros or other professional line GPUs when consumer GPUs can fit the bill for a quarter the cost?
#133
R0H1T
Ok, I was omitting that one from the three because Crypto &, yes, even AI (growth) is fueled by regular consumers across the globe. How do you think TikTok became probably the biggest virus this century?
#134
ARF
bugBut let's be honest here, we've all been saying $1,000+ video cards must go. As long as we see competition in the $200-500 segment, I'm good.
$200 is an RX 6600 8GB; $500 is an RX 6800 XT, three (!!) years after its release.
$200 is an RTX 3050 8GB; $500 is an RTX 3070 Ti 8GB, more than two (!!) years after its release.

What are you going to do with these cards? Slide-show "gaming" at 1080p? :banghead:

I don't think the $1,000+ cards must go; actually the opposite - everyone should focus on them and try to buy only them. Instead of upgrading every year or two, just buy that $1,000 beast and stay with it for the next five to seven (!!) years with ease.
#135
enb141
SCP-001Have you made sure that you were plugged into an HDMI 2.1 port on the TV? I believe VRR only works on the 2.1 ports, or in a very limited sense, on HDMI 2.0.
I tried DisplayPort to HDMI and HDMI to HDMI, even with 8K cables; same problem.
R0H1TThe initial Tegra X1 chip was introduced back in 2015 ~ yes, that long back. It was a major flop at the time &, given it was targeted at tablets (flagship mobiles?), Nvidia most definitely had at least a few million sitting somewhere in Taiwan or China. As you know, consoles are a multi-year venture, so unless you think console makers want all 122+ million chips to be made at once ~ but that's not how it goes.
Even if they had 10 million units, I doubt they would sell those 10 million units at a loss, and they also keep producing and selling them to Nintendo, so they're definitely making some money from those chips.
TheinsanegamerNA warehouse with only 10,000?

I can fit 10,000 tegra chips in my closet. They're not exactly big.
Just a fictitious number; my point is, whatever amount they had, I doubt they sold it at a loss.
#136
Tek-Check
rv8000Also that somehow the 7900/7900 XTX are dogs while they compete within their price range and beat their counterparts in traditional rasterization (the overwhelming majority of games). They don't compete on RT, but several posts here are dooming like the cards wouldn't even run the original Quake.

Everything this gen is terrible, Nvidia and AMD alike.
I have a 7900 XTX reference model. It's a fantastic card for 4K gaming, especially on an OLED display. I couldn't be happier, and I highly recommend it. I can't see anything "terrible" with this card. Such complaints are misplaced.
#137
LabRat 891
Looking at the MI250 and MI300, I can't help but think this implies combining multiple Navi 43/44 dies with a shared IOD. The thing that's been the issue with multi-GPU is that each GPU is discrete and cannot seamlessly share resources.
AMD has demonstrated with the Instinct series that this limitation has been overcome.

community.amd.com/t5/instinct-accelerators/advancing-hpc-to-the-next-level-of-sustainability-with-amd/ba-p/611507
  • MCM - world’s first multichip GPU, designed to maximize compute and data throughput in a single package. The MI250 and MI250X use two AMD CDNA 2 Graphic Core Dies (GCD) in a single package to deliver 58 billion transistors in a highly condensed package with 1.8X more cores and 2.6X higher memory bandwidth vs. AMD previous generation accelerators (2). The two GCDs are tied together by a high-speed interface for chip-to-chip communication.
www.anandtech.com/show/18721/ces-2023-amd-instinct-mi300-data-center-apu-silicon-in-hand-146b-transistors-shipping-h223

Alternatively, there's potentially insider info that nVidia is pushing to "12 on a 10 scale" (power, heat, size, die area, etc.), and AMD will fall back on high-yield, high-margin parts for a gen.
IMO, the RX 480/580 and 5700 XT weren't failures, and those were similar scenarios.

Another possibility is that (like with Polaris) there's another chip entirely being developed to take up the high end. Say, bringing some CDNA derivative and HBM onto a Radeon?
I find that pretty unlikely, since Fury, Vega, and (especially) the VII didn't work out so great.
#138
rv8000
Tek-CheckI have a 7900 XTX reference model. It's a fantastic card for 4K gaming, especially on an OLED display. I couldn't be happier, and I highly recommend it. I can't see anything "terrible" with this card. Such complaints are misplaced.
It's the card to buy at its price range imo, unless you really need RT for some reason.
#139
Minus Infinity
Vayra86Right, so Intel is going for that fabled 250~300% performance boost by 2025? Noice! And even then they're just looking at AMD's last gen offering pretty much.



I'm in popcorn/wait and see mode here in every possible way. Just glad I bought a 7900XT as it is because the coming gens aren't looking to get much better and the lower you go in current gen stacks, the worse it gets. That still strikes me as a novelty many fail to recognize. High end purchases are equally effective cost/frame, and therefore better as they resell much more easily too and hold value.

What's not a novelty here though is AMD's 'consistency' wrt their GPU updates. Mother of god, what a mess.
Why are we comparing the A770, which is at best a 3060-ish class card, to these cards? Also, Alchemist was a debacle, and they're at least making a go of fixing the drivers and getting the hardware to perform. Sure, they have a long way to go, but I'm presuming they'll execute Battlemage a lot better. With Raja gone and the GPU group under new leadership, Intel won't tolerate another cluster fcuk. If Intel can deliver on their claims about Battlemage and price it well, IMO they will do very well against AMD. But time will tell.
#140
AusWolf
ARF$200 is an RX 6600 8GB; $500 is an RX 6800 XT, three (!!) years after its release.
$200 is an RTX 3050 8GB; $500 is an RTX 3070 Ti 8GB, more than two (!!) years after its release.

What are you going to do with these cards? Slide-show "gaming" at 1080p? :banghead:

I don't think the $1,000+ cards must go; actually the opposite - everyone should focus on them and try to buy only them. Instead of upgrading every year or two, just buy that $1,000 beast and stay with it for the next five to seven (!!) years with ease.
What are you talking about? The 6600 is a perfectly fine 1080p card.

Instead of buying $1,000+ GPUs for 10 years, how about buying $200-300 GPUs for 5-7 years? If we followed your logic, we would still be stuck on a 1080 Ti, or the equivalent Titan, with no RT or DLSS for the next 3-4 years, whereas one could upgrade from a 1070 to a 4060 or an RX 7600.

If companies see that $1000 GPUs are selling, they'll try to push us to buy $2000 GPUs next time. I'd rather have an RX 6600, thanks.

And I haven't even talked about price depreciation yet, which gets worse as you move up the tiers.
#141
Dr. Dro
The high-end market is sensitive to performance and the latest cool features, because that's the segment where most enthusiasts are buying. AMD just can't compete here right now. In my humble opinion, it's good to know when to retreat, and the 5700 XT strategy worked - sans its launch disasters, such as the infamous black screen problems and enough hardware bugs to have 5 steppings issued in the first year, that is.

Although initially I believed in it, RDNA 3 has proven to be a laughing stock of an architecture next to Ada Lovelace, and the product stack has become a positioning nightmare for the company to sort through.

Ada's problem is the opposite: it's cut-down hardware sold one performance tier down and two price tiers up, which just makes it horrible value. Both can be fixed by lowering prices, but as long as AMD isn't keeping up with Nvidia's highly marketable techs, they have no incentive to do so.

Horrible value as they may be, the 40 series GPUs are... refined. That's the word I'd use to describe them, coming from the RTX 3090... which was already an all-around well-developed product, mind you.
#142
Space Lynx
Astronaut
Dr. DroAlthough initially I believed in it, RDNA 3 has proven to be a laughing stock of an architecture next to Ada Lovelace
not sure how you can logically say that, when my 7900 XT goes toe to toe with your 4080 for half the price (if you factor in the most recent Prime Day sales). 14-19% improvements in fps gains since launch, drivers that are rock solid (for me anyway), all my games smooth as butter. considering the price I paid, I can't complain, and kudos to AMD for the driver improvements, I expect more will come.



www.techspot.com/review/2717-amd-radeon-7900-xt-again/
#143
ModEl4
Navi 41 is probably cancelled for good, but there is a chance that Navi 42 will be redesigned as a monolithic chip.
I'm expecting the following 5 or 4 nm monolithic designs:
Navi 42: 128 RBEs / 80 RDNA4 dual CUs / 64 MB cache
Navi 43: 64 RBEs / 40 RDNA4 dual CUs / 32 MB cache
Navi 44: 32 RBEs / 20 RDNA4 dual CUs / 16 or 24 MB cache
10-15% higher clocks vs RDNA3
5-10% better CU efficiency vs RDNA3
GDDR7
Essentially, Navi 42 would match Navi 31's 4K performance and be better at lower resolutions, with the rest at 1.4-1.5X vs RDNA3 (Navi 43 vs Navi 33).
#144
G777
enb141No VRR and limited to 8-bit color on my Smart TV, and the drivers since this year have an issue with Kodi where you can't watch videos (only audio) if you enable HDR on Windows.

I reported it (or tried to) on their forums; nobody from AMD responded. I also sent a support ticket by email, and they told me they couldn't help me. I also reported it through their bug tool.

It's been a year since I reported the VRR and limited 8-bit color issues on my Smart TV, and about 8 months since the Kodi issue. Guess what, the bugs are still there.

Plus I get random Windows reboots.
Your issues may be limited to the RX 6400, which is a pretty middling card. Even something like an RX 6600 would've given you a much better impression.
#145
ratirt
I think AMD wants to put more effort into the chiplet design for RDNA, and maybe that's why they announced skipping the high end for now. Considering where prices are headed, I'm not surprised. AMD can't rely on customers buying cards for thousands of dollars.
#146
AusWolf
Space Lynxnot sure how you can logically say that, when my 7900 XT goes toe to toe with your 4080 for half the price (if you factor in the most recent Prime Day sales). 14-19% improvements in fps gains since launch, drivers that are rock solid (for me anyway), all my games smooth as butter. considering the price I paid, I can't complain, and kudos to AMD for the driver improvements, I expect more will come.



www.techspot.com/review/2717-amd-radeon-7900-xt-again/
I think what he meant was that RDNA 3 didn't achieve the performance AMD was targeting, and their high end still can't compete with Nvidia's highest end, forcing AMD to compete on price alone, which isn't really profitable. They're not bad cards for us consumers, but they're bad at recouping AMD's investment in architectural development.
#147
Space Lynx
Astronaut
AusWolfI think what he meant was that RDNA 3 didn't achieve the performance AMD was targeting, and their high end still can't compete with Nvidia's highest end, which forced AMD to compete on price alone, which isn't really profitable. They're not bad cards for us, consumers, but they're bad at recouping AMD's investment in the architectural developments.
I mean, once OC'd, I beat an XTX by 200 points, which would have me tying a 4090 in Assassin's Creed Valhalla at 1440p, according to TechSpot's reviews. A $580 GPU (I got it on sale) vs a $1,700 GPU... I mean, yeah, it's only one game, but still.
#148
Vayra86
Minus InfinityWhy are we comparing A770 which is at best a 3060'sih class card to these cards? Also Alchemist was a debacle and they are at least making a go of fixing the drivers and getting the hardware to perform. Sure they have a long way to go, but I'm presuming they will execute Battlemage a lot better. With Raja gone and the gpu group under new leadership, Intel won't tolerate another cluster fcuk. If Intel can deliver on their claims about Battlemage and price it well, IMO they will do very well against AMD. But time will tell.
Because there isn't a faster gaming GPU in the Intel stable? Right?

And deliver on claims yeah, sure, if AMD could deliver on its claims a few times in history Nvidia would be under the bus by now. But here we are.
#149
Gica
Minus InfinityWell, Nvidia feels they can do what they want. PC GPUs are a side-show now; they don't even need to release products. AMD insiders say AMD doesn't even want to be number 1, even if possible. AMD has actually lost share to Intel despite its woeful release of Alchemist, but they appear to have steadied the ship. If they can deliver on promises about Battlemage's performance targets and the price is right, I think they will eat a lot of AMD's share. AMD will do well with APUs, but Meteor Lake's iGPU should be decent and push it past Phoenix. Not sure about Arrow Lake's iGPU vs Sarlak though.

If AMD is going to have the 7900s as their flagship against Nvidia's 5000 series and only offer 8600-class or lower RDNA4, they are in for a world of pain unless they can deliver huge upgrades over the 7600 for similar money. Not bothering with a 7700/7800 replacement is very depressing if true, unless RDNA5 is not a long time after RDNA4.
Not quite, not exactly.
It's not news that AMD can't keep up, and nVidia has seen its way. There were the GTX 1080 and RTX 2080 at reference prices, even though AMD did not cover those segments; only the miners disrupted the market. It was an illusion that AMD could compete with nVidia in terms of performance, created when nVidia chose the Samsung node. Having lost that manufacturing-node advantage, AMD now offers solutions that are weaker in performance and consume more. The only weapon that remains is price, but how much can you cut without destroying the profit?
They will probably return to the 5700 XT era as a flagship and leave nVidia to handle the enthusiasts alone. They have big problems competing with DLSS, CUDA, OptiX, and ray tracing. The RTX 4060 is equal to the RX 7600 in rasterization, but it effectively destroys it when using these technologies.
#150
Vayra86
Space LynxI mean, once oc'd, I beat a XTX by 200 points, which would have me tying Assassins Creed Valhalla with a 4090 at 1440p according to techspot reviews. $580 gpu (i got on sale) vs a $1700 gpu... i mean, yeah its only one game, but still.
Ehh, yeah. Next you're going to say the 7900 XT took you to Mars. It's not only one game, it's no single game.
Let's not exaggerate and try to see things for what they are. There is nearly 25% between the XT and the XTX :) There are no OCs on a 7900 XT worth more than 15% perf, and even then you're doing something special.