Monday, September 9th 2024
AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share
In an interview with Tom's Hardware, AMD confirmed that its next generation of gaming GPUs, based on the RDNA 4 graphics architecture, will not target the enthusiast graphics segment. Speaking with Paul Alcorn, Jack Huynh, head of AMD's Computing and Graphics Business Group, said that with its next generation, AMD will focus on gaining market share in the PC gaming graphics market. That means winning price-performance battles against NVIDIA in the key mainstream and performance segments, similar to what it did with the Radeon RX 5000 series based on the original RDNA graphics architecture, rather than entering the enthusiast segment, which is low-margin given the die sizes at play and moves low volumes. AMD currently holds only 12% of the gaming discrete GPU market, something it sorely needs to turn around, given that its graphics IP is contemporary.
Asked pointedly whether AMD will continue to address the enthusiast GPU market, given that its allocation of cutting-edge wafers is better spent on data-center GPUs, Huynh replied: "I am looking at scale, and AMD is in a different place right now. We have this debate quite a bit at AMD, right? So the question I ask is, the PlayStation 5, do you think that's hurting us? It's $499. So, I ask, is it fun to go King of the Hill? Again, I'm looking for scale. Because when we get scale, then I bring developers with us. So, my number one priority right now is to build scale, to get us to 40 to 50 percent of the market faster. Do I want to go after 10% of the TAM [Total Addressable Market] or 80%? I'm an 80% kind of guy because I don't want AMD to be the company that only people who can afford Porsches and Ferraris can buy. We want to build gaming systems for millions of users. Yes, we will have great, great, great products. But we tried that strategy [King of the Hill]—it hasn't really grown. ATI has tried this King of the Hill strategy, and the market share has kind of been...the market share. I want to build the best products at the right system price point. So, think about price point-wise; we'll have leadership." Alcorn pressed: "Price point-wise, you have leadership, but you won't go after the flagship market?" To which Huynh replied: "One day, we may. But my priority right now is to build scale for AMD. Because without scale right now, I can't get the developers. If I tell developers, 'I'm just going for 10 percent of the market share,' they just say, 'Jack, I wish you well, but we have to go with Nvidia.' So, I have to show them a plan that says, 'Hey, we can get to 40% market share with this strategy.' Then they say, 'I'm with you now, Jack. Now I'll optimize on AMD.' Once we get that, then we can go after the top."
The exchange seems to confirm that AMD's decision to withdraw from the enthusiast segment is driven mainly by the low volumes it sees relative to the engineering effort and large wafer costs of building enthusiast-segment GPUs. The company saw great success with its Radeon RX 6800 and RX 6900 series mainly because the RDNA 2 generation benefited from the GPU-accelerated cryptomining craze, when high-end GPUs were in demand. That demand had disappeared by the time AMD rolled out its next-generation Radeon RX 7900 series powered by RDNA 3, and the lack of performance leadership against the GeForce RTX 4090 and RTX 4080 with ray tracing enabled hurt the company's prospects. News of AMD focusing on the performance segment (and below) aligns with rumors that with RDNA 4, AMD is making a concerted effort to improve its ray tracing performance and reduce the performance impact of enabling ray tracing. This, along with raster performance and efficiency, could be the company's play for gaining market share.
The grand assumption AMD is making here is that it has a product problem, not a distribution problem, and that with a product that strikes the right performance/Watt and performance/price equations, it will gain market share.
Catch the full interview in the source link below.
Source:
Tom's Hardware
272 Comments on AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share
Ironically AMD then rested on their laurels and got caught off guard by the 580. Good times.
This "abandon the high end" approach is the same strategy AMD used with their CPUs in the early 2010s, and with their GPUs with both Polaris and RDNA 1. In none of those instances did it work. AMD lost market share and their competition made bucket loads of cash every time.
What I am trying to say is that decent performance at a reasonable price and reasonable energy efficiency is what most people care about. If AMD provides that at the right prices, even at 40% less than Nvidia's top offerings performance-wise, many people (not you, the uber-rich 0.000001% of the market) will vote with their wallets. Not to mention that with the current state of PC emulators (more CPU- than GPU-dependent) you can get AAA games for X360, PS2/3/4, Wii/GC and what not else that run at 120+ FPS, with super effective upscaling, outstanding visuals and addons (many of which were never released, or will never be released, for PC), which will give you everything you want for years to come, compared to the unfinished, unoptimized, glitch-ridden cut-down versions of what is considered an AAA game at the moment. And by the way, I bought my 7900XTX for half the RTX 4090's price, and unless I'm making money with it, I will never pay 1/3 more over the current crazy prices for 20-30% more generational FPS and +15% RT. 99% of the so-called gaming community will do the same and will keep their rig for 5-7 years, with a possible CPU/GPU upgrade half-way, if at all.
The nomenclature is arbitrary, sure, but it was 400 bucks MSRP (the 470 was 350, if I recall), and I personally consider mid-range anything in the 200-400 bracket. Did then, still do now, although obviously things have shifted a fair bit. The 5870 was what the 4070 is today, but the latter is 50% more expensive, yet people still routinely call it mid-range. More to the topic at hand, the 5700 XT was also 400 bucks, and THAT card is currently discussed as an example of AMD focusing on mid-range. Plus, when the 5870 was coming out, NVidia had already firmly entrenched $500+ as high-end/flagship territory.
7900xtx is 23% slower than 4090 and cost half of 4090. will you keep your word and buy one? :p
:p
For whatever reason, it sucked for Blender and yes, Ngreedia had that software side taken care of, so AMD decided to go the same route.
But I do recall reading people saying that they (Blender devs) simply didnt care about AMD due to being fanbois.
True or not, I don't know, since it's not something I looked into or participated in deeply enough, as I never used it. And it sucks, because it puts them in this place where you want to have a conversation with someone and they just blindly shout "it's competing with the xx90!"
The 5870 was a high-end GPU. It was near the top of every performance chart, with only the 480 consistently ahead. That's a high-end card. Prices have changed because of inflation; since the 5870 came out we have roughly 12x as much cash in circulation, and that's gonna affect prices. Ahh, the 6900 series. AMD's first emergency GPU. AMD really thought that Nvidia wasn't going to do much with Fermi that time around. I remember the response and rushed release, it was hilarious.
6970 was still a good card, shame about the drivers. Oh I know. It's really funny to see people whine about 600w being irresponsible and how we never should go over 350 watt, when people were saying the same thing in 2009 with the 480's 350 watt power draw.
The 4090 is a totally different beast. Those fermi coolers SUCKED. And gave birth to the likes of the MSI twin frozr design. Really, most of the custom GPU cooler design we see today is thanks to fermi's amp suckage.
The golden age of GPUs was propped up by artificially low prices stemming from the 2008 crash. Had it not been for 2008, the Fermis would have been much closer to $1000 than $500. Dual GPU flagships were $1000 for over a decade, yet when Titans came out that outperformed those older setups for $1000, people flipped out. Because GPUs do not exist in a bubble. EVERYTHING has gotten massively more expensive. Wages have gone up significantly since the days of Fermi. The cost of wafers has exploded by an order of magnitude compared to 2009. It goes further back. The x1950xtx. x1900. x950. Even the 9800 pro, which was going against the FX 5800 series. The 7900xtx sold well. The 7900xt would have been a lot more successful if AMD hadn't been dicks about pricing. The 7900 GRE sold well, when you could find it. The 6800xt/6900 series also sold well, relative to their product lines' sales anyway.
So long as AI is a huge profit center, we're gonna keep seeing this, unless there's a major expansion in fab availability or the AI market crashes.
I won't deny there IS value. But I'd value that at perhaps 60,- for two great games, because frankly that's what you can buy them for shortly after release, more often than not.
RDNA3 competes with Ada... but not quite.
- DLSS is superior and evolves faster
- RT works better
- Cards are slightly more power efficient
- Cuda as you mentioned...
So the reality is, Nvidia simply has a better product to sell, and people throughout the years have clearly shown preference for the biggest featureset, more so than a slightly lower price. And let's not forget AMD's terrible pricing strategy, waiting far too long with undercutting Nvidia hard, and instead trying to get maximum dollar for what is essentially a lesser offering. Customers don't like that.
Another aspect that cannot be overlooked is AMD's lacking consistency. You're buying a GPU, so you're also buying into an ecosystem of patches and feature updates throughout the years. AMD is not the best partner for a long term investment that way, every gen we're left to wonder what their new stack will look like; whether they will even compete in segment X or Y... or whether they'll even release anything other than rebrands. Nvidia is a lot more consistent that way, and this inspires trust. Customers are clearly ready to pay for that assurance as well.
So yes, I would say it was, is and will always be about the actual offerings. People clearly look straight through these silly game bundles and clearly value featureset, consistency and quality of the experience higher than you think they do. It's not "the brilliant Nvidia marketing" - it's that marketing alongside actually delivering. For that, all you need to compare is the development of FSR vs DLSS. You can of course not be a fan of the proprietary approach (I'm not, anyway), but the reality is, the overall experience with DLSS is better, so if you're just gaming, what do you pick? Principles, or optimal gaming?