Monday, September 9th 2024
AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share
AMD, in an interview with Tom's Hardware, confirmed that its next generation of gaming GPUs based on the RDNA 4 graphics architecture will not target the enthusiast graphics segment. Speaking with Paul Alcorn, AMD's Computing and Graphics Business Group head Jack Huynh said that with its next generation, AMD will focus on gaining market share in the PC gaming graphics market. That means winning price-performance battles against NVIDIA in the key mainstream and performance segments, much as it did with the Radeon RX 5000 series based on the original RDNA graphics architecture, rather than entering the enthusiast segment, which is low-margin given the die sizes at play and moves low volumes. AMD currently holds only 12% of the gaming discrete GPU market, something it sorely needs to turn around, given that its graphics IP is contemporary.
On a pointed question on whether AMD will continue to address the enthusiast GPU market, given that allocation for cutting-edge wafers is better spent on data-center GPUs, Huynh replied: "I am looking at scale, and AMD is in a different place right now. We have this debate quite a bit at AMD, right? So the question I ask is, the PlayStation 5, do you think that's hurting us? It's $499. So, I ask, is it fun to go King of the Hill? Again, I'm looking for scale. Because when we get scale, then I bring developers with us. So, my number one priority right now is to build scale, to get us to 40 to 50 percent of the market faster. Do I want to go after 10% of the TAM [Total Addressable Market] or 80%? I'm an 80% kind of guy because I don't want AMD to be the company that only people who can afford Porsches and Ferraris can buy. We want to build gaming systems for millions of users. Yes, we will have great, great, great products. But we tried that strategy [King of the Hill]—it hasn't really grown. ATI has tried this King of the Hill strategy, and the market share has kind of been...the market share. I want to build the best products at the right system price point. So, think about price point-wise; we'll have leadership." Alcorn pressed: "Price point-wise, you have leadership, but you won't go after the flagship market?" To which Huynh replied: "One day, we may. But my priority right now is to build scale for AMD. Because without scale right now, I can't get the developers. If I tell developers, 'I'm just going for 10 percent of the market share,' they just say, 'Jack, I wish you well, but we have to go with Nvidia.' So, I have to show them a plan that says, 'Hey, we can get to 40% market share with this strategy.' Then they say, 'I'm with you now, Jack. Now I'll optimize on AMD.' Once we get that, then we can go after the top."
The exchange seems to confirm that AMD's decision to withdraw from the enthusiast segment is driven mainly by the low volumes it is seeing relative to the engineering effort and large wafer costs of building enthusiast-segment GPUs. The company saw great success with its Radeon RX 6800 series and RX 6900 series mainly because the RDNA 2 generation benefited from the GPU-accelerated cryptomining craze, when high-end GPUs were in demand. This demand had disappeared by the time AMD rolled out its next-generation Radeon RX 7900 series powered by RDNA 3, and the lack of performance leadership against the GeForce RTX 4090 and RTX 4080 with ray tracing enabled hurt the company's prospects. News of AMD focusing on the performance segment (and below) aligns with rumors that with RDNA 4, AMD is making a concerted effort to improve its ray tracing performance and reduce the performance impact of enabling ray tracing. This, along with raster performance and efficiency, could be the company's play for gaining market share.
The grand assumption AMD is making here is that it has a product problem, not a distribution problem, and that with a product that strikes the right performance-per-Watt and performance-per-price equations, it will gain market share.
Catch the full interview in the source link below.
Source:
Tom's Hardware
272 Comments on AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share
Didn't know buying a $1200 gpu was the equivalent of buying a $120k car (911 carrera base model) o_O; i'll gladly trade my 4080 for a used 911 if anyone is out there! :D
Where are AMD's current mid-range cards on that list? It's still dominated by Nvidia.
I'd say it's still going to be difficult for AMD to break into that mid-range tier when Nvidia will still have an opposing product next gen.
I am wishing Radeons well. We need it!
And still viable and in support, ergo still around.
"Incomplete DirectX API support"
Ah yes, lack of support for feature level 12_2, which includes... things that weren't in a design drafted in 2018, when DX12 Ultimate only came out in 2020. Be real.
"No matrix multiplication/tensor/RT support"
When the competition barely had all of that in the card's heyday? When the people that have/are interested in a 5700XT probably won't be considering workloads like that?
"Buggy drivers"
This I'll concede, but that applies more to the card's earlier years than it does now.
"Deficient encoding hardware"
I will also concede VCN kind of sucking but that fact has not changed in comparison to NVENC/QSV at any point in the last decade or so. Didn't kneecap its sales nor sales of RX 6000 and RX 7000.
"Reliability issues requiring several revisions"
Much of what I could even dig up is solved by now or had existing workarounds at the time, cards that are still around can be/are fixed.
"Limited VRAM size."
Lemme grip you by the ear and rattle off some models you might be familiar with. 2070. 2070 SUPER. 2080. 2080 SUPER. 3060Ti. 3070. 3070Ti. 4060. 4060Ti 8G. A580. A750. A770 8G. Released around the same time or newer, or far newer, all hampered by the same 'lack of VRAM' you rest the blame on AMD for as if they were supposed to have some moment of divine providence to realize that The Last of Us Part 1 Remastered: Extra Shitty Port Edition (2023) will need more than 8GB. When the cards that roughly match it in performance have the same memory sizes and are still being used TODAY.
I highlighted how long the 5700XT has lasted as a card that you can slap into a PC and still use within its means, specifically in the segment that consumer Radeon targets: value-conscious gaming. Even its geriatric Polaris predecessor the RX 480/580 is still seeing use. Much of what you cite as it being a 'bad example' are issues that were either relevant only in its youth or a result of the card, shocker, being old. Get a grip.
That's why AMD is limiting their investment in the PC gaming market. Why invest in a market where the consumers will go with any logical or illogical excuse and buy the competitor's product? Why invest in a hostile market?
what they need to adjust is price - fix it at $500
otherwise the whole interview is wishful thinking
People also complained that AMD was forcing its sponsored games to use a large amount of VRAM, again with zero proof.
Meanwhile, when Nvidia has a bug or an issue like the terrible 12VHPWR adapter, 3000 series transient spikes and noise feedback in the 12 V sense pin, New World bricking cards, or the Discord bug that lowered clocks, people blamed everyone but Nvidia. Yep, AMD needs a Ryzen moment for their GPUs. They need to provide enough of a value advantage to make customers take notice, because most aren't even considering AMD.
That said, I'm not sure they could have a Ryzen moment, because Nvidia has been very aggressive in the past with its pricing to prevent AMD from gaining marketshare. Nvidia could lower mid- to low-end GPU prices temporarily just to crush AMD, and then things would return to normal the gen after. As we've seen with their AIBs and the AI market, they aren't afraid of coercion and other illegal tactics either.
My last Geforce was the NVIDIA GeForce 8800 GTS 320 in 2007,
I loved it, but kept buying ATi then AMD because it supported 3 displays (Eyefinity), sometimes it was better and sometimes it was cheaper.
Now I'm starting to hate it, since AMD is doing nothing to make the cool stuff accessible with their GPUs (mostly AI, which I sometimes use), and the drivers are incredibly s*#t.
They just don't deserve my money anymore... I don't have recent experience with Nvidia drivers, but I can tell AMD is not doing well in that department.
Nothing wrong with that, but as rumors have it, RDNA 4 will top out at RX 7900 XT to RX 7900 XTX performance, with the RT performance of a 4070 to 4070 SUPER and a price of $500-600. At best, it will be an OK performance-per-dollar improvement over last gen with stronger RT performance. All Nvidia has to do is price their card within 10-15% above the AMD equivalent and people will pay the extra.
Those products (the 4090) not only sell relatively well but mainly serve as a flagship and a technological demonstrator, and it works really well for Nvidia.
Now tbh, a 2000-2500 EUR RTX 5090, is that really relevant? It's a very restricted class of consumers that can purchase one; it's way more than one month of minimum wage (the "smic", around 1300 EUR) in France, just to give a reference.
...but you see an RTX 5090, a marvel of engineering, and it's gigantic soft power, a gigantic ad for many consumers.
I think they should be in between: no need to pursue the 4090/5090, but a 5080 Ti would be the perfect spot for an 8950XTX, something with a smaller delta than the 7900XTX/4090... not a direct 5090 contender, but a cheaper in-between one that shows you can still pull a fight.
4070 Ti Super's 16 GB memory is not a hindrance to its performance, even at the highest resolutions that go beyond this class of hardware's capabilities in today's most VRAM hungry titles
4070 Ti Super will match the 7900 XT's raster performance
4070 Ti Super should be more than 20% faster at RT than the 7900 XT
4070 Ti Super is more power efficient than the 7900 XT even in its strongest factory overclocked models, while pushing the heaviest ray tracing workloads - generating less heat to do so - in this specific case, the 7900 XT is likely power limited, but even at full tilt it's still ~20 watts of difference under full load
4070 Ti Super features AD103 dual-engine NVENC which will record in 4:4:4 and HDR at high resolutions and frame rates at near zero latency
developer.nvidia.com/video-encode-and-decode-gpu-support-matrix-new
4070 Ti Super has access to NVIDIA Studio drivers which have stability and productivity application compatibility in mind
www.nvidia.com/en-us/studio/
4070 Ti Super supports the full breadth of functionality in NVIDIA's RTX ecosystem, including Canvas, RTX AI, DLSS frame generation and ray reconstruction technologies, as well as full support for Intel's XeSS and AMD FidelityFX
I could go on, but it's obvious that AMD cannot compete, and this gap will significantly widen as RDNA 4 targets the RTX 4080's performance, lacks most of its feature set, and Blackwell's release is imminent. The return isn't worth the investment, Radeon buyers' mindshare doesn't line up, the wafers can be used in more profitable segments, and the software-side investment needed to get up to par is almost as Herculean as the hardware's, if not more... it's good that they came clean. AMD just cannot do it.
In competitive games like Counter-Strike and DOTA 2, where high frames matter, you will see 200+ frames at 4K max settings. I have a 4K monitor with a high 240 Hz refresh rate, but it's just an investment for future cards. The monitor really isn't a use case quite yet.
I honestly felt like AMD already gave this market away with their absence in competition for the 4090. Going further would be a deal breaker for me on future cards with AMD... This generation, I would have loved to have seen a 7950 XTX (similar to the ASRock version I have, www.techpowerup.com/review/asrock-radeon-rx-6950-xt-oc-formula/) or, at minimum, a binned Navi 31 similar to the XTXH (found in the PowerColor Ultimate version I have, www.techpowerup.com/gpu-specs/powercolor-red-devil-rx-6900-xt-ultimate.b8752). With that being said, again, they have already given up the enthusiast market. I'm already disappointed, and going further would be a bad move by AMD in my opinion.
Right now I have multiple AMD cards: 7900 XTX, 7900 XT, 6950 XT, 6900 XTXH, 6700 XT, and 6750 XT. These cards perform well for all their use cases. However, for my main PC, I'm looking for very good raster in 4K. AMD giving this market away loses me as a customer.
AMD is in great position to compete, but investors want gains in AI.
I've just moved to all 4K (after our monitor discussion I pulled the trigger on an MSI MPG321URX), so 4K/high refresh is the aim. I'll be playing League of Legends and Solitaire at high refresh, I guess.
I think after seeing what Wukong, Star Wars Outlaws, and now Space Marine 2 take to run at 4K and not even get near 120 fps, I'll be moderating my expectations for a generation or 3.
And exactly what are we referring to as the "enthusiast GPU segment"? You've got the apex flagship, higher tier, mid, and low. Not competing (or being unable to) with the 5090 makes sense, but when did 5080/5070-level performance fall short of "enthusiast GPU"? Seems like we have to blow insane amounts of money on the fastest brick in town to get a green card for the enthusiast graphics club?
People jumped to conclusions when AMD's marketing didn't explicitly say they weren't blocking DLSS, but AMD's marketing department has got to be the worst marketing department of any company in the world. You can't rule out that it's just incompetence. Frank Azor came in shortly after that and clarified that they do not mandate FSR only in AMD-sponsored games.
Could AMD have restricted DLSS prior to public outcry? Sure, but why haven't these people also complained on any of the occasions a game released with DLSS and not FSR? Why were these people silent about any of the numerous worse things Nvidia has done, like GameWorks, GPP, etc.? GPP never went away; all the top SKUs from Gigabyte, MSI, and ASUS are Nvidia-only. Public and press response? Nothing. I explicitly remember people giving Nvidia a pass for anti-competitive GameWorks features when there was explicit proof provided showing games like Crysis 2 used way too high levels of tessellation (far beyond any visual benefit, and even tessellated objects outside of view; there's a video on YouTube demonstrating this, still up), or when games like The Witcher 3 were running HairWorks on models that weren't even visible.
The cope was either "Nvidia has a right to push its proprietary features" or outright denial. I pointed out to TPU multiple times that the top SKUs from MSI, Gigabyte, and ASUS weren't available on AMD cards anymore and they said 'wait a few weeks'. Well, I let them know a few months after that that they still weren't available, but I guess TPU didn't care for the story. If we are going to hang AMD for Starfield not launching with both DLSS and FSR based on some frankly pretty weak evidence, it's a complete double standard to not do the same to Nvidia, particularly when in many cases the evidence is stronger.
And no matter what the full truth is, the fallout was actually great for everyone, since the Starfield fiasco there's been a perceptible shift and more titles are getting all 3 major competing upscalers and nobody is intentionally blocking anything, everybody wins.
Frank Azor from AMD did clarify a few days later but obviously by then the damage was already done.
None of your comment touches on the fact that the same auspices which started the uproar over a lack of DLSS at launch were present in reverse in many titles that supported only DLSS at launch. It's a clear double standard. Notwithstanding any of the other things Nvidia has done (as pointed out in my last comment) where an equal or greater level of evidence was provided than what you are using to spear AMD here, yet I don't see people complaining about those. You could say that of every company nowadays, Nvidia included. Tech outlets didn't come to that conclusion, nor should they, given the lack of concrete evidence. It would be insanely bad journalism to publish an article with a definitive conclusion based on mere observation. It's one thing to report on a story; it's another to pass something off as fact.
Yes, clearly I'm unlikely to change your mind, because your own idea of what happened is mismatched with what actually happened. In your mind, every outlet confirmed AMD blocked DLSS (they didn't), and so did every reasonable person. By extension, you are dismissing people who disagree as unreasonable. Your logic doesn't leave any room for disagreement or room to change your mind. You are completely assuming said titles wouldn't have had them regardless. This is classic confirmation bias; you are seeing this in a way that confirms your self-admitted non-flexible version of events.
Why do you insist I am non-flexible on the version of events? I told you why I believe what I believe and readily concede it isn't definitive proof; it just seems to me to be the most likely thing that happened when I consider all the evidence I have consumed. You came to a different conclusion. That doesn't make your version fact, however, and if anything, you are coming across as completely non-flexible on the topic. I would readily entertain irrefutable proof that this didn't happen. Hell, I'd have even taken AMD's word for it at the time, but they didn't even want to offer that.
So now that we've opened this can of worms, do we keep repeating ourselves, or is there something new to add that might sway you or me? If there is nothing new, well, I think we both said our piece on why we believe what we believe to have occurred, and in the absence of something revelatory this conversation is going nowhere.
All I heard was "RDNA 3 is cheaper to produce", and yet the 7600/7700 series was nothing special, both performance- and price-wise. Even the 7800 XT (power consumption aside) was the same performance as the 6800 XT.
But since the subject is AMD/Nvidia, that is not a surprise (and both fanboys are guilty)
But anyway, AMD is somewhat lying, or at least masking the truth.
- Current Gen is not competitive at the spec level with Nvidia.
- They declared in other news today that they want to unify RDNA/CDNA because they have too many issues designing all these chips. Also, they changed their whole architecture way too much. If you look at the Zen side, they redesign the I/O die every 2 gens. Since RDNA, the whole GPU has been redesigned each generation, and they had to do something similar for CDNA. This is way too much for them and it's not efficient at all.
- I assume that RDNA 4 is below initial expectations (from when they did the first design). They aren't expecting to be able to compete with it on the high end no matter what they do; it would be a loss.
- They probably have low hopes for RDNA 5 too, but they need to focus quickly and get back on track to regain market share and be competitive for the next generation of consoles.
- They then made the decision to scrap what would be unprofitable anyway, the high end, and focus on the mid range where they can still sell some GPUs.
This just seems like a strategy to survive for a few years while they get their shit together.
That is fine; they were not on the path to winning anyway with Radeon. They do not seem to be divesting from the adventure, they just seem to be focusing. Let's hope they get back on their feet quickly. Maybe in a few years they can have a "real Ryzen moment". I do not think they currently know what needs to be done to regain market leadership.
Nvidia having no competition is great for them in the short term. But companies that have near-monopolistic market share and low competition in tech have always failed in the long run. They generally transform themselves after a few years into a market-milking machine instead of an innovation machine, and they get caught flat-footed when competition comes back.
It has happened to Intel, IBM, etc.
Personally, I don't care much if the market slows down. I would for sure like to be able to upgrade to a newer higher-end GPU cheaply and frequently, but at the same time, I recall how long I kept my i7 2600K, and I wouldn't mind keeping a GPU that is capable of playing games greatly for that long.
There is a positive in everything.