Monday, September 9th 2024

AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share

In an interview with Tom's Hardware, AMD confirmed that its next generation of gaming GPUs, based on the RDNA 4 graphics architecture, will not target the enthusiast graphics segment. Speaking with Paul Alcorn, Jack Huynh, head of AMD's Computing and Graphics Business Group, said that with its next generation, AMD will focus on gaining market share in the PC gaming graphics market. That means winning price-performance battles against NVIDIA in key mainstream and performance segments, much as it did with the Radeon RX 5000 series based on the original RDNA graphics architecture, rather than entering the enthusiast segment, where the die sizes at play make for low margins and low volumes. AMD currently holds only 12% of the gaming discrete GPU market, something it sorely needs to turn around, given that its graphics IP is contemporary.

On a pointed question on whether AMD will continue to address the enthusiast GPU market, given that allocation for cutting-edge wafers are better spent on data-center GPUs, Huynh replied: "I am looking at scale, and AMD is in a different place right now. We have this debate quite a bit at AMD, right? So the question I ask is, the PlayStation 5, do you think that's hurting us? It's $499. So, I ask, is it fun to go King of the Hill? Again, I'm looking for scale. Because when we get scale, then I bring developers with us. So, my number one priority right now is to build scale, to get us to 40 to 50 percent of the market faster. Do I want to go after 10% of the TAM [Total Addressable Market] or 80%? I'm an 80% kind of guy because I don't want AMD to be the company that only people who can afford Porsches and Ferraris can buy. We want to build gaming systems for millions of users. Yes, we will have great, great, great products. But we tried that strategy [King of the Hill]—it hasn't really grown. ATI has tried this King of the Hill strategy, and the market share has kind of been...the market share. I want to build the best products at the right system price point. So, think about price point-wise; we'll have leadership."
Alcorn pressed: "Price point-wise, you have leadership, but you won't go after the flagship market?," to which Huynh replied: "One day, we may. But my priority right now is to build scale for AMD. Because without scale right now, I can't get the developers. If I tell developers, 'I'm just going for 10 percent of the market share,' they just say, 'Jack, I wish you well, but we have to go with Nvidia.' So, I have to show them a plan that says, 'Hey, we can get to 40% market share with this strategy.' Then they say, 'I'm with you now, Jack. Now I'll optimize on AMD.' Once we get that, then we can go after the top."

The exchange seems to confirm that AMD's decision to withdraw from the enthusiast segment is driven mainly by the low volumes it sees for the engineering effort and large wafer costs involved in building enthusiast-segment GPUs. The company saw great success with its Radeon RX 6800 and RX 6900 series mainly because the RDNA 2 generation benefited from the GPU-accelerated cryptomining craze, which put high-end GPUs in demand. That demand had disappeared by the time AMD rolled out its next-generation Radeon RX 7900 series powered by RDNA 3, and the lack of performance leadership against the GeForce RTX 4090 and RTX 4080 with ray tracing enabled hurt the company's prospects. News of AMD focusing on the performance segment (and below) aligns with rumors that with RDNA 4, AMD is making a concerted effort to improve its ray tracing performance and reduce the performance impact of enabling ray tracing. This, together with raster performance and efficiency, could be the company's play for gaining market share.

The grand assumption AMD is making here is that it has a product problem, not a distribution problem, and that with a product that strikes the right performance/Watt and performance/price equations, it will gain market share.

Catch the full interview in the source link below.
Source: Tom's Hardware

272 Comments on AMD Confirms Retreat from the Enthusiast GPU Segment, to Focus on Gaining Market-Share

#51
chrcoluk
Onasi@chrcoluk
That’s just Durante and his PH3 experimenting with cool ideas and dabbing on the so-called AAA porting efforts as usual. Sadly, we are unlikely to see anything similar from big budget PC ports, you will eat poorly optimized blurry upscaled slop and like it.
I know it's Durante, but it was still nice to find and post it.
Posted on Reply
#52
mikesg
Real headline: "AMD saw the RTX 3060 top of the Steam hardware survey and now wants that share of the market".
Posted on Reply
#53
Super XP
As the 1st poster stated, AMD is no longer going to follow Nvidia into price oblivion with the worst price/performance. Hopefully AMD's mainstream parts match or exceed the 7900 XT/XTX, with that chiplet design arriving via RDNA 5, or RDNA 4+ getting a high-end GPU.
Posted on Reply
#54
TheinsanegamerN
OnasiIt’s a good strategy on paper, as people already pointed out - that’s what drove the success of 5870. Problem is, it also required somewhat of a fumble on NVidias part. Since I assume this go around NV isn’t going to mess up and they also, despite all the talk about “2000 bucks flagship”, will have products at all price points, AMD has to have something truly compelling to actually grab that market share. The “like NV, but slightly worse on features and 10-15% cheaper” hasn’t worked for them so far.
The 5870 was a high end GPU tho? It beat out the 470. AMD didn't just waste the high end back then. Evergreen was a very small design that was quite efficient, and it worked. The 480 was faster but such a hot potato nobody wanted to deal with it.

Ironically AMD then rested on their laurels and got caught off guard by the 580. Good times.

This "abandon the high end" is the strategy AMD used with their CPUs in the early 2010s, and with their GPUs with both Polaris and RDNA 1. In none of those instances did it work. AMD lost market share and their competition made bucketloads of cash in all instances.
Posted on Reply
#55
d@wn
The number of people upgrading their systems and GPUs every generation, or even every two, is minimal. I have a 7900XTX and previously had a 1080Ti, which I managed to sell for a quarter of its purchase price in the end. Many people still have 1080p or 2K monitors and won't go UHD for years, while many gave up on PC gaming and went console years ago. Believing that you (a hyper-minority of rich PC enthusiasts, in their 30s and 40s+, who go for the top-end GPU/CPU every year) represent 90% of the PC community is, to put it mildly, very wrong. Also, the world of gaming is an exaggeration in itself: especially those with money use their rigs for work, media production, or other CPU/GPU-intensive tasks plus gaming, rather than gaming per se. They are not children or teenagers who live for gaming, and even in that case the number of those going for top-end hardware is close to zero. Another thing is the games themselves. Could you name any absolutely unique or outstanding game released recently, compared even to console games from years ago, that deserves buying the newest and most powerful hardware every generation? Ray tracing? I'm not denying it improves the visuals in some games, but a big portion of them is still a mess, very taxing even on Nvidia's flagships, and I wouldn't call it a game changer.

What I am trying to say is that decent performance at a reasonable price and with reasonable energy efficiency is what most people care about. If AMD provides that at the right prices, even at 40% less than Nvidia's top offerings performance-wise, many people (not you, the uber-rich 0.000001% of the market) will vote with their wallets. Not to mention that with the current level of PC emulators (more CPU- than GPU-dependent), you can play AAA games for the X360, PS2/3/4, Wii/GC and whatnot at 120+ FPS, with super-effective upscaling, outstanding visuals, and addons (many of which were never released, or will never be released, for PC) that will give you everything you want for years to come, compared to the unfinished, unoptimized, glitch-ridden cut-down versions of what is considered an AAA game at the moment. And by the way, I bought my 7900XTX for half the RTX 4090's price, and unless I'm making money with it, I will never pay 1/3 more over the current crazy prices for 20-30% more generational FPS and +15% RT. 99% of the so-called gaming community will do the same and will keep their rigs for 5-7 years, with a possible CPU/GPU upgrade halfway through, if at all.
Posted on Reply
#56
Ruru
S.T.A.R.S.
Neo_MorpheusI hate the fact that AMD used the 9 moniker, since everyone thinks that they are competing with the 4090.
They've been using it more or less since the HD 6900 series (2010) :D
TheinsanegamerNThe 5870 was a high end GPU tho? It beat out the 470. AMD didn't just waste the high end back then. Evergreen was a very small design that was quite efficient, and it worked. The 480 was faster but such a hot potato nobody wanted to deal with it.
What's funny is that it actually had pretty tame TDP when compared to modern flagships. The cooler just sucked and I remember all the memes well.
Posted on Reply
#57
Onasi
@TheinsanegamerN
The nomenclature is arbitrary, sure, but it was 400 bucks MSRP (the 470 was 350, if I recall), and I personally consider mid-range anything in the 200-400 bracket. Did then, still do now, although obviously things have shifted a fair bit. The 5870 was what the 4070 is today, but the latter is 50% more expensive, yet people still routinely call it mid-range. More to the topic at hand, the 5700 XT was also 400 bucks, and THAT card is currently discussed as an example of AMD focusing on mid-range. Plus, by the time the 5870 was coming out, NVidia had already firmly entrenched $500+ as high-end/flagship territory.
Posted on Reply
#58
cerulliber
BwazeI'd buy a truckload of cards that perform 15 to 20% slower than 4090 and cost 100% less.

:p

7900xtx is 23% slower than 4090 and cost half of 4090. will you keep your word and buy one? :p
Posted on Reply
#59
Bwaze
cerulliber
7900xtx is 23% slower than 4090 and cost half of 4090. will you keep your word and buy one? :p
Once we're clear that "costing 100% less" means I get it for free.

:p
Posted on Reply
#60
cerulliber
BwazeOnce we're clear that "costing 100% less" means I get it for free.

:p
there's no free pizza. you either buy one or eat bread
Posted on Reply
#61
Neo_Morpheus
BwazeI'd buy a truckload of cards that perform 15 to 20% slower than 4090 and cost 100% less.

:p
You got me there, meant 50%. lol
chrcolukThat is how competition works, it affects prices, but it doesnt necessarily mean everyone buy a specific product.
Competition is one thing, rabid brand loyalty is another, and yes, it is ironic that I say that, but there is a method to my particular madness.
Onasi…no, nobody sane speculated that Blender developers are “fanbois”. That’s a ridiculous take. They have started implementing HIP as soon as it became available (based on ROCm, yes), it’s just that a lot of features turned out to be actually a challenge to implement (like HIP RT) due to the API being incredibly raw and having issues. To quote one of Blender contributors - it’s a bit like pulling teeth. OptiX, on the other hand, mostly seamlessly was plopped down on top of existing CUDA support because it just worked. I know people have a hard time accepting this, but NV IS actually pretty damn good at the whole “support for the software side of things”.
I don't use Blender, but I read a bit about it out of curiosity. Since AMD trusted in the goodwill of the community, they mistakenly placed all their eggs in OpenCL.
For whatever reason, it sucked for Blender, and yes, Ngreedia had that software side taken care of, so AMD decided to go the same route.
But I do recall reading people saying that they (the Blender devs) simply didn't care about AMD due to being fanbois.
Whether that's true or not, I don't know, since it's not something I looked into or participated in deeply enough, having never used it.
RuruThey've been using it more or less since the HD 6900 series (2010) :D
I know, and it sucks because it puts them in this place where you want to have a conversation with someone and they will blindly shout "it's competing with the xx90!"
Posted on Reply
#62
TheinsanegamerN
Onasi@TheinsanegamerN
The nomenclature is arbitrary, sure, but it was 400 bucks as MSRP (470 was 350, if I recall) and I personally consider the mid-range anything in the 200-400 bracket. Did then, do still so now, although obviously things shifted a fair bit. 5870 was what 4070 is today, but the latter is 50% more expensive, yet people still routinely call it mid-range. More to the discussed topic, the 5700XT was also 400 bucks and THAT card is currently discussed as an example of AMD focusing on mid-range. Plus, when the 5870 was coming out NVidia has already firmly entrenched 500$+ as the high-end/flagship territory.
You can claim whatever you want for prices, I could say that anything over $100 is enthusiast tier, that doesnt make a RX 6600 a high end card.

5870 was a high end GPU. It was near top of every performance chart, with only the 480 consistently ahead. Thats a high end card. Prices have changed because of inflation. Since the 5870 came out we have roughly 12x as much cash in circulation. Thats gonna affect prices.
RuruThey've been using it more or less since the HD 6900 series (2010) :D
Ahh, the 6900 series. AMD's first emergency GPU. AMD really thought that Nvidia wasn't going to do much with Fermi that time around. I remember the response and the rushed release; it was hilarious.

The 6970 was still a good card; shame about the drivers.
RuruWhat's funny is that it actually had pretty tame TDP when compared to modern flagships. The cooler just sucked and I remember all the memes well.
Oh, I know. It's really funny to see people whine about 600 W being irresponsible and how we should never go over 350 W, when people were saying the same thing in 2009 about the 480's 350 W power draw.

The 4090 is a totally different beast. Those Fermi coolers SUCKED, and they gave birth to the likes of the MSI Twin Frozr design. Really, most of the custom GPU cooler designs we see today exist thanks to Fermi's amp suckage.
Posted on Reply
#63
Ruru
S.T.A.R.S.
Neo_MorpheusI know, and it sucks because it puts them in this place where you want to have a conversation with someone and they will blindly shout "it's competing with the xx90!"
Yeah, it feels like the majority just doesn't realize that. Actually, we even had the HD 2900 before that, when I think about it.
Posted on Reply
#64
Darmok N Jalad
Personally, I don't have the budget or desire to own a premium card. What I'm looking for is something good in the x600/x700 segment that doesn't require a big PSU. I think there are a lot of buyers in this space that don't have 1KW PSUs and probably don't even have 4K monitors. The GPU market is definitely a place where I see diminishing returns. You'll spend a lot, consume a lot of power, and produce a lot of heat just to make a game look a little bit better. I know I'll get responses that say "I totally can tell the difference," and I don't doubt that, but the juice is just not worth the squeeze for these eyes.
Posted on Reply
#65
Onasi
TheinsanegamerN5870 was a high end GPU. It was near top of every performance chart, with only the 480 consistently ahead. Thats a high end card.
That’s a point, sure, although if performance is the only metric we use we have a terrible situation on our hands where an 800 dollar GPU can be called a “not high end” one since, you know, it’s not near the top of the charts, really. Yes, the 4070Ti, I am looking at you.
TheinsanegamerNPrices have changed because of inflation. Since the 5870 came out we have roughly 12x as much cash in circulation. Thats gonna affect prices.
I have no idea why people keep banging the inflation drum. It's a thing, sure, but far from the main reason for insane prices. The real one is that NV has a near monopoly and can set prices at whatever the market can bear. No other PC DIY segment of the tech market has seen such a ridiculous price increase (aside from maaaaaybe motherboards, though there are legitimate-ish production difficulties involved there). Look at CPUs. The 4770K was 350 bucks. The same-tier 14700K is 400. On the AMD side, the 9700X is 360. I don't see inflation blowing out the prices there, not to the degree we are seeing with GPUs. It's mostly the lack of competition. If AMD was an actual contender, this would not be an issue. People unironically thinking otherwise are coping. One company holding 80%+ of AIB GPU market share is not a healthy market, and THAT is driving prices up. Not inflation.
Posted on Reply
#66
Ruru
S.T.A.R.S.
OnasiThat’s a point, sure, although if performance is the only metric we use we have a terrible situation on our hands where an 800 dollar GPU can be called a “not high end” one since, you know, it’s not near the top of the charts, really. Yes, the 4070Ti, I am looking at you.
And first they were going to release it branded as "RTX 4080 12GB" :rolleyes:
Posted on Reply
#67
Niceumemu
If AMD can't release mid-range cards for less than half the cost of Nvidia's competing cards, this sounds to me like a PR cover-up for AMD's engineers being inferior and simply incapable of making high-end cards.
Posted on Reply
#68
Dr. Dro
It is not a "grand assumption", AMD does have a product problem and its fans are the only people who don't see that. This has been argued ad nauseam on this forum anyway - AMD needs this "retreat" to rearchitect and reevaluate its market positioning. Their high-end cards never stood a chance.
Posted on Reply
#69
InVasMani
cerulliberthere's no free pizza. you either buy one or eat bread
Is there pizza sauce and cheese on the bread? Also, what type of bread are we talking about here!!? Perhaps the former and a bit of garlic on some Mario-made bread. It's a mii Mariiio, I makeah tha breadza, mayyybe adda the ball-o-meat.
Posted on Reply
#70
TheinsanegamerN
OnasiThat’s a point, sure, although if performance is the only metric we use we have a terrible situation on our hands where an 800 dollar GPU can be called a “not high end” one since, you know, it’s not near the top of the charts, really. Yes, the 4070Ti, I am looking at you.
The 8800 Ultra was an $880 GPU in 2006. That's $1,100 adjusted for inflation. People always forget their history.

The golden age of GPUs was built on artificially low prices stemming from the 2008 crash. Had it not been for 2008, the Fermi cards would have been much closer to $1,000 than $500. Dual-GPU flagships were $1,000 for over a decade, yet when Titans came out that outperformed those older setups for $1,000, people flipped out.
OnasiI have no idea why people keep banging the inflation drum.
Because GPUs do not exist in a bubble. EVERYTHING has gotten massively more expensive. Wages have gone up significantly since the days of Fermi. The cost of wafers has exploded by an order of magnitude compared to 2009.
RuruYeah, it feels like the majority just doesn't realize that. Actually, we even had the HD 2900 before that, when I think about it.
It goes further back: the X1950 XTX, the X1900, the X850. Even the 9800 Pro, which was going against the FX 5800 series.
Dr. DroIt is not a "grand assumption", AMD does have a product problem and its fans are the only people who don't see that. This has been argued ad nauseam on this forum anyway - AMD needs this "retreat" to rearchitect and reevaluate its market positioning. Their high-end cards never stood a chance.
The 7900xtx sold well. The 7900xt would have been a lot more successful if AMD hadn't been dicks about pricing. 7900 GRE sold well, when you could find it. The 6800xt/6900 series also sold well, relative to their product lines sales anyway.

so long as AI is a huge profit center, we're gonna keep seeing this, unless theres major expansion in fab availability or the AI market crashes.
Posted on Reply
#71
Vayra86
Neo_MorpheusActually, i was happy with the games that i was going to buy anyway, so in my case, i did save 170.

Normally, those bundles are meh, but if it is a good game that you were planning on buying anyway, then yes, it does have value, since you won't spend that extra money.

I would argue that RDNA3 competes with Ngreedias current offerings except on RT (still haven’t found a definitive reason to care for it yet), CUDA if needed (ROCm is slowly removing that), dlss (FSR is good enough, so im good there) and perhaps the 4090, which in raster and optimized code, a 7900xtx offers a compelling alternative.
So no, it's not the offerings; it's the mediocre marketing by AMD plus the bribed influencers that keep pushing Ngreedia's GPUs down everyone's throats on a daily basis.

So when someone not that technical sees that, they inevitably think that all AMD GPUs are absolute trash.

Hell, even more technically adept buyers are falling for it. See how everyone is ok with Ngreedias monopoly, crazy prices and anti consumer practices.

Either way, this will go on for ever since as many times before, you, like others in here, will simply laugh at my points.
Perhaps it's personal then, but buying two games at full launch price? I'd have to be very heavily hyped to even consider that, but then, I learned that buying at launch is generally not really a benefit to begin with. To be very honest, I think there's a healthy load of cognitive dissonance in saying you got 170,- worth of value out of these games. Wait a few months and you'll pay 30,- per game. You can't even play both at the same time, so there's no reason not to postpone at least one. Think about this for a minute and reflect ;) Perhaps it's really worth it to you, and that's fine. But I strongly doubt this is a rational calculation.

I won't deny there IS value. But I'd value that at perhaps 60,- for two great games, because frankly that's what you can buy them for shortly after release, more often than not.

RDNA3 competes with Ada... but not quite.
- DLSS is superior and evolves faster
- RT works better
- Cards are slightly more power efficient
- Cuda as you mentioned...

So the reality is, Nvidia simply has a better product to sell, and over the years people have clearly shown a preference for the biggest feature set over a slightly lower price. And let's not forget AMD's terrible pricing strategy: waiting far too long to undercut Nvidia hard, and instead trying to get maximum dollar for what is essentially a lesser offering. Customers don't like that.

Another aspect that cannot be overlooked is AMD's lacking consistency. You're buying a GPU, so you're also buying into an ecosystem of patches and feature updates throughout the years. AMD is not the best partner for a long term investment that way, every gen we're left to wonder what their new stack will look like; whether they will even compete in segment X or Y... or whether they'll even release anything other than rebrands. Nvidia is a lot more consistent that way, and this inspires trust. Customers are clearly ready to pay for that assurance as well.

So yes, I would say it was, is, and will always be about the actual offerings. People clearly look straight through these silly game bundles and value feature set, consistency, and quality of the experience more than you think they do. It's not "the brilliant Nvidia marketing"; it's that marketing alongside actually delivering. For that, all you need to compare is the development of FSR vs DLSS. You can of course dislike the proprietary approach (I'm not a fan, anyway), but the reality is that the overall experience with DLSS is better, so if you're just gaming, what do you pick? Principles, or optimal gaming?
Posted on Reply
#72
Dr. Dro
TheinsanegamerNThe 7900xtx sold well. The 7900xt would have been a lot more successful if AMD hadn't been dicks about pricing. 7900 GRE sold well, when you could find it. The 6800xt/6900 series also sold well, relative to their product lines sales anyway.

so long as AI is a huge profit center, we're gonna keep seeing this, unless theres major expansion in fab availability or the AI market crashes.
And let's be frank - we have AI to thank for that. The XTX sold less than the RTX 4080 - and due to its high cost, the 4080 is the worst performing 80-class card Nvidia has ever released, if we're talking about market performance. It sold a lot less units than its 80-class predecessors.
Posted on Reply
#73
Onasi
TheinsanegamerNThe 8800 Ultra was an $880 GPU in 2006. That's $1,100 adjusted for inflation. People always forget their history.
It was a halo card like the 4090. It can cost whatever, price is irrelevant in that segment. The actual high end 8800 GTX was actually reasonably priced.
TheinsanegamerNBecause GPUs do not exist in a bubble. EVERYTHING has gotten massively more expensive. Wages have gone up significantly since the days of Fermi. The cost of wafers has exploded by an order of magnitude compared to 2009.
I like how you conveniently ignored my point about CPUs. Where are the 700 dollar i7/R7s? I don’t see them. Inflation isn’t supposed to be selective, right?
TheinsanegamerNThe 7900xtx sold well. The 7900xt would have been a lot more successful if AMD hadn't been dicks about pricing. 7900 GRE sold well, when you could find it. The 6800xt/6900 series also sold well, relative to their product lines sales anyway.
Define “well”. From my understanding, the 4090 alone has sold more than the entire RDNA 3 product stack put together. AMD needs a massive win in the next generation or two to be an actual competitor. Console chips are a nice source of constant income, but it’s not enough and is very low-margin.
Posted on Reply
#74
Ruru
S.T.A.R.S.
OnasiIt was a halo card like the 4090. It can cost whatever, price is irrelevant in that segment. The actual high end 8800 GTX was actually reasonably priced.
And 8800 Ultra was just a factory-overclocked GTX with a redesigned cooler.
Posted on Reply
#75
Onasi
RuruAnd 8800 Ultra was just a factory-overclocked GTX with a redesigned cooler.
Yeah, halo cards have varied over the years. 4090 is actually an example of a GOOD one since it’s actually a top dog and by a noticeable amount. The Ultra was meh and the janky dual-chip cards that also had been in that segment at various points were questionable as well.
Posted on Reply