Sunday, August 6th 2023

AMD Retreating from Enthusiast Graphics Segment with RDNA4?

AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD isn't able to justify the cost of developing high-end GPUs, as they don't move enough volume over the product lifecycle. The company's "Navi 21" GPU benefited from the crypto-currency mining swell, but just like NVIDIA, the company isn't able to push enough GPUs at the high end.

With RDNA4, the company will focus on the specific segments of the market that sell the most, which would be the x700 series and below. The generation would be essentially similar to the RDNA-powered RX 5000 series, which did enough to stir things up in NVIDIA's lineup and trigger the introduction of the RTX 20 SUPER series. The coming generation could see RDNA4 square off against NVIDIA's next-generation GPUs and, hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz

363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?

#351
Ruru
S.T.A.R.S.
ViperXZYea, that's typical multi-GPU stuttering for you (a game not being supported/no frame pacing produces this issue): fps is high, but it's not producing the quality you'd expect, i.e. the fluidity that usually goes hand in hand with high fps.
Anyway, can't confirm your point since that was years ago and I don't have any newer dual cards than two HD 4890s currently. :/
Posted on Reply
#352
ViperXZ
KissamiesAnyway, can't confirm your point since that was years ago and I don't have any newer dual cards than two HD 4890s currently. :/
I didn't play BF1, but with the games I tried back then (7800 XTs in CF), it ran flawlessly; they were fully supported, though. Before they introduced frame pacing into the driver, it was a mess and not worth using (for me), as it was stuttery unless your fps was sky-high.
Posted on Reply
#353
HIGHLANDER58
Perhaps if AMD, and Nvidia as well, lowered the prices of their high-end cards out of the stratosphere and down to a level where many more people could actually afford them, a lot more people would be buying them.
Posted on Reply
#354
AusWolf
HIGHLANDER58Perhaps if AMD, and Nvidia as well, lowered the prices of their high-end cards out of the stratosphere and down to a level where many more people could actually afford them, a lot more people would be buying them.
The high end has never been about sales numbers. Most people don't need a high end graphics card anyway.
Posted on Reply
#355
Fishworldwar
As an RX 6600 XT owner and budget-build user, I honestly think we need more budget-level and mid-range GPUs. Intel can handle the low end (which they kinda already do), AMD can have the mid-range (like they did with the RX 480), and NVIDIA can remain in the enthusiast market. This is only theoretical, of course.
Posted on Reply
#356
Melvis
Smart move, as the mid-range is where the market is anyway, and only the rich or dumb would buy any 4000-series GPUs from Nvidia. AMD gives a lot more VRAM, which is already proven to be a must these days with modern games, at a lower overall cost, especially in AUS. If I wanted an NVIDIA card now with over 10GB of VRAM, it would literally cost me twice as much as an AMD card, even if the AMD card didn't get as much FPS; still good enough for 1440p gaming or under. I think this is also why the 1080 Ti was such a good card, even today: good performance and a good amount of VRAM. RIP my 1080 Ti.
Posted on Reply
#357
Tek-Check
Let's focus on what matters. AMD is also rumoured to be developing a halo RDNA4 SKU. It could look like this.
Posted on Reply
#358
Denver
Tek-CheckLet's focus on what matters. AMD is also rumoured to be developing a halo RDNA4 SKU. It could look like this.
I was thinking of something similar (with just two mid-range GCDs running together). I can't fathom how they'd pull it off, but it would indeed be an impressive feat.

It would replicate the successful multi-chip strategy used in Ryzen, EPYC, TR, etc.
Posted on Reply
#359
95Viper
Keep it on topic.
Stop the personal arguing.
Posted on Reply
#360
TheoneandonlyMrK
OP.

After much consideration this story true or not is offensive.

Untrue.

Weirdly biased.


I and many many others were Enthusiasts, Way way before most of you, way way before I ever had money and way way before the PRESS.

YOU OP.

Suggest you have to pay X amount to be an Enthusiast.


HIGH END is the appropriate phrase.

Stop creating fake narratives that ensure people feel they should be selling kidneys for GPU.

That's why we're here.
Posted on Reply
#361
R-T-B
TheoneandonlyMrKOP.

After much consideration this story true or not is offensive.

Untrue.

Weirdly biased.


I and many many others were Enthusiasts, Way way before most of you, way way before I ever had money and way way before the PRESS.

YOU OP.

Suggest you have to pay X amount to be an Enthusiast.


HIGH END is the appropriate phrase.

Stop creating fake narratives that ensure people feel they should be selling kidneys for GPU.

That's why we're here.
My man, that's just what the industry has called the ultra high-end bracket for some time now. Don't shoot the messenger.
Posted on Reply
#362
Tek-Check
DenverI was thinking of something similar (with just two mid-end GDCs running together), I can't fathom how they'd pull it off, but it would indeed be an impressive feat.
it would replicate the successful multi-chip strategy used in Ryzen, EPYC, TR etc...
Two GCDs could be the N42 package; entirely possible. It looks like they are gunning for flexible modularity, i.e. two dies clocking high, and then combining those dies in different packages. It will be fascinating to see this. They already have a similar concept implemented in MI300.
Posted on Reply