Sunday, August 6th 2023

AMD Retreating from Enthusiast Graphics Segment with RDNA4?

AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs, because it doesn't sell enough of them over a product lifecycle. The company's "Navi 21" GPU benefited from the cryptocurrency mining swell, but just like NVIDIA, the company isn't able to move enough GPUs at the high end.

With RDNA4, the company will focus on the segments of the market that sell the most, which would be the x700 series and below. This generation would be broadly similar to the RDNA-powered RX 5000 series, which did enough to stir things up in NVIDIA's lineup and trigger the introduction of the RTX 20 SUPER series. The next generation could see RDNA4 square off against NVIDIA's next generation and, hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz

363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?

#26
AnotherReader
Minus InfinityEven if true that doesn't appear to be happening until RDNA5, so 3 years at least and I wouldn't bet on there ever being a RDNA5. AMD will struggle next year against Intel's Battlemage IMO and drop to third in a few years. They won't even bother trying to compete other than with APU's.
More than a year after release, Arc is very hit or miss. Most of the time, it struggles to beat the RX 7600 despite a die that's twice as large and consumes 50% more power. I don't believe Battlemage will overcome such a large deficit in performance per mm² and per watt.
Posted on Reply
#27
Tsukiyomi91
They prefer to sell other products or scale back most of their stuff coz they realize not everyone is defending their shitty decisions and complacency with their wallets or opinions anymore.
Posted on Reply
#28
TechLurker
I highly doubt this. Between AMD having learned how to do MCM GPUs, on top of now having Xilinx FPGAs supposedly being integrated with AMD's next-gen datacenter GPUs to provide a boost to AI learning, I see it more as AMD staying the course with lessons learned from RDNA3 and using FPGAs to maybe improve ray-tracing or provide AI/modeling calculations while improving upon their MCM GPU process.

Moreso when their GPUs are still pretty popular alternatives to NVIDIA and they're capable of reaching anywhere from 90 to 100% of the non-raytraced performance of NVIDIA's XX80 equivalent at lower cost, depending on the game and secondary features activated (DLSS/RIS for example).
Posted on Reply
#29
Post Nut Clairvoyance
AnotherReaderMore than a year after release, Arc is very hit or miss. Most of the time, it struggles to beat the RX 7600 despite a die that's twice as large and consumes 50% more power. I don't believe Battlemage will overcome such a large deficit in performance per mm² and per watt.
But putting events on the scale of the GPU history timeline, Intel has put up an impressive display so far. Of course they will still be playing catch-up, but the moment it sparks serious competition with AMD, the dreaded duopoly will end. I don't think AMD will give Intel free market share, and the competition between them should see NV stop its anti-everyone attitude.

I mean, personally I don't mind Intel spending money on GPU R&D. It only serves to potentially benefit me down the line. I do hold 10 shares but don't really care about that potential loss.
TechLurkerI highly doubt this. Between AMD having learned how to do MCM GPUs, on top of now having Xilinx FPGAs supposedly being integrated with AMD's next-gen datacenter GPUs to provide a boost to AI learning, I see it more as AMD staying the course with lessons learned from RDNA3 and using FPGAs to maybe improve ray-tracing or provide AI/modeling calculations while improving upon their MCM GPU process.

Moreso when their GPUs are still pretty popular alternatives to NVIDIA and they're capable of reaching anywhere from 90 to 100% of the non-raytraced performance of NVIDIA's XX80 equivalent at lower cost, depending on the game and secondary features activated (DLSS/RIS for example).
I just wonder what their xx70 XT offering involves this time.
It would probably be a 256-bit memory controller? I think it would make no sense to kneecap yourself at 6/7700XT level of die size. And 5700XT was a 256-bit card.
I think it's a sensible choice, but time will tell. RTG is not exactly on a "not dissatisfying everyone" streak with their recent performance.
Imma "Let RTG cook".
Posted on Reply
#30
BoboOOZ
TechLurkerI highly doubt this. Between AMD having learned how to do MCM GPUs, on top of now having Xilinx FPGAs supposedly being integrated with AMD's next-gen datacenter GPUs to provide a boost to AI learning, I see it more as AMD staying the course with lessons learned from RDNA3 and using FPGAs to maybe improve ray-tracing or provide AI/modeling calculations while improving upon their MCM GPU process.

Moreso when their GPUs are still pretty popular alternatives to NVIDIA and they're capable of reaching anywhere from 90 to 100% of the non-raytraced performance of NVIDIA's XX80 equivalent at lower cost, depending on the game and secondary features activated (DLSS/RIS for example).
Yepp, same here. It makes no sense, other than to increase the attractiveness of AMD's current high-end offerings (the 7900 XTX will be more "future proof"). It just looks like somebody misinterpreted the information that AMD will stop making large dies (because of MCM), or it's some guerrilla marketing tactic. I'm waiting for a confirmation or dismissal from a leaker with multiple sources inside AMD.
Post Nut ClairvoyanceBut putting events on the scale of the GPU history timeline, Intel has put up an impressive display so far. Of course they will still be playing catch-up, but the moment it sparks serious competition with AMD, the dreaded duopoly will end. I don't think AMD will give Intel free market share, and the competition between them should see NV stop its anti-everyone attitude.

I mean, personally I don't mind Intel spending money on GPU R&D. It only serves to potentially benefit me down the line. I do hold 10 shares but don't really care about that potential loss.


I just wonder what their xx70 XT offering involves this time.
It would probably be a 256-bit memory controller? I think it would make no sense to kneecap yourself at 6/7700XT level of die size. And 5700XT was a 256-bit card.
I think it's a sensible choice, but time will tell. RTG is not exactly on a "not dissatisfying everyone" streak with their recent performance.
Imma "Let RTG cook".
Intel has less money to spend on GPUs than AMD does, and it has a longer way to go. I would love for Intel to deliver great-performance, great-value GPUs in the future, but I think the best we can expect from them is one or two mid-range GPUs per generation while they polish up their drivers.
Posted on Reply
#31
gffermari
…that means the NVIDIA 5000 series will just be an Ada refresh with more/a different type of VRAM and more bandwidth.

The 5080 may be a full AD103 with 10240 CUDA cores and 22 GB of GDDR7 on a 352-bit bus.
Posted on Reply
#32
kondamin
That would be long-term suicide. Putting things on the back burner for a year as the economy collapses and the AI craze dies down is more like it.
Posted on Reply
#33
Zubasa
And as usual, the original "source" is Twitter "leaks".
Posted on Reply
#34
evernessince
This rumor makes little sense. Two major reasons AMD switched to chiplets were the cost savings on combined total die size and scalability. Not utilizing either of those advantages seems extremely unlikely. AMD can produce high-end GPUs at a fraction of the cost Nvidia can because its individual dies are much smaller, so the amount of wasted wafer is drastically reduced. On the flip side, Nvidia's costs increase exponentially as it increases die size.
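(A minimal, purely illustrative sketch of that wafer-economics point, assuming a simple Poisson yield model and a made-up defect density; these are hypothetical numbers, not AMD's or Nvidia's actual figures.)

```python
# Toy Poisson yield model: yield = exp(-D0 * die_area).
# D0 is an assumed defect density; all numbers here are illustrative only.
import math

D0 = 0.1  # assumed defects per cm^2 (hypothetical)

def defect_free_fraction(area_mm2: float) -> float:
    """Expected fraction of dies with zero defects for a given die area."""
    return math.exp(-D0 * area_mm2 / 100.0)  # convert mm^2 to cm^2

for area in (150, 300, 450, 600):
    y = defect_free_fraction(area)
    rel_cost = area / y  # relative silicon cost per good die (ignores packaging, binning)
    print(f"{area:>3} mm^2 die: ~{y:.0%} good dies, relative cost per good die ~{rel_cost:.0f}")
```

With these toy numbers, quadrupling the die area cuts yield from roughly 86% to 55% and raises the silicon cost per good die by about 6x rather than 4x, which is the effect chiplets sidestep.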

It doesn't make sense from a business standpoint for AMD to retreat from the high end when that's exactly where chiplets lend their benefits. Even if you assume that AMD does allocate most of its wafer supply towards AI or other segments, it still makes sense for AMD to have a card for the high end.

This is just another rumor at the end of the day and 99% of them turn out to be false.
Posted on Reply
#35
Assimilator
tajoh111Gamers want prices to stay the same even if costs are going up.
Completely untrue. All we want is for prices to be sane for what's on offer. That's not "toxic" in any way, shape, or form, unless you're one of the greedy parasites in charge at AMD or NVIDIA.
Posted on Reply
#36
ARF
evernessinceThis rumor makes little sense. Two major reasons AMD switched to chiplets were the cost savings on combined total die size and scalability. Not utilizing either of those advantages seems extremely unlikely. AMD can produce high-end GPUs at a fraction of the cost Nvidia can because its individual dies are much smaller, so the amount of wasted wafer is drastically reduced. On the flip side, Nvidia's costs increase exponentially as it increases die size.

It doesn't make sense from a business standpoint for AMD to retreat from the high end when that's exactly where chiplets lend their benefits. Even if you assume that AMD does allocate most of its wafer supply towards AI or other segments, it still makes sense for AMD to have a card for the high end.

This is just another rumor at the end of the day and 99% of them turn out to be false.
And still nvidia posts huge profits, while AMD posts losses. Why is that? AMD's strategies don't work?
I mean it's cool all those chiplets and things, but do they actually make a difference?



www.techpowerup.com/309125/nvidia-announces-financial-results-for-first-quarter-fiscal-2024-gaming-down-38-yoy-stock-still-jumps-25#g309125


www.techpowerup.com/forums/threads/amd-reports-second-quarter-2023-financial-results-revenue-down-18-yoy.311976/
Posted on Reply
#37
Dan.G
Maybe they want to focus on iGPUs more? :)
Posted on Reply
#38
TriCyclops
I wonder if that's true. Seems like a really bad idea...
Posted on Reply
#39
Dan.G
Subscription-based GPUs? Seems like an industry trend, to be honest.
Take BMW, for example: subscription-based heated seats!
They might launch a mid-range GPU, and for a "measly" $20/month you could turn it into a graphics monster!
Don't know if that's possible, but it wouldn't surprise me if it happened. Though I don't think many (if any) people would pay for that...
Dark times are coming...
Posted on Reply
#40
Chomiq
ARFAnd still nvidia posts huge profits, while AMD posts losses. Why is that? AMD's strategies don't work?
I mean it's cool all those chiplets and things, but do they actually make a difference?



www.techpowerup.com/309125/nvidia-announces-financial-results-for-first-quarter-fiscal-2024-gaming-down-38-yoy-stock-still-jumps-25#g309125


www.techpowerup.com/forums/threads/amd-reports-second-quarter-2023-financial-results-revenue-down-18-yoy.311976/
Both companies saw a nearly identical drop in gaming segment revenue. That's to be expected, seeing how last year both companies were still riding the mining wave and booking it under the "gaming" segment.
Posted on Reply
#41
john_
Finally. All those worshiping Nvidia, no matter what, all those years, will be asked to pay twice the price for their next GPU. They will be blaming AMD obviously while begging Intel to come and rescue them.
Going to buy some champagne to celebrate.


Obviously I'm headed for the madhouse with this comment, but understand this: 15 years of reading the same crap. "Don't buy AMD. This AMD model is faster than the Nvidia one and cheaper too, but AMD this and AMD that and AMD the other... and excuses."
Posted on Reply
#42
ViperXTR
KissamiesReminds me of Polaris.


38x0 wasn't that far from 8800 GT(S) and they had 3870X2 as the flagship. Though multi-GPU is dead and buried so no X2 this time.
Thanks for reminding me, I forgot the X2 existed.
Posted on Reply
#43
AusWolf
ARFI don't think "it worked". How many people do you know (or think) have that mid-range RX 5700 XT?
Not many now, but it was a decent competitor to the 2070.
Posted on Reply
#44
Vya Domus
What do "enthusiast" and "performance" segments even mean ?
Posted on Reply
#45
Dan.G
Vya DomusWhat do "enthusiast" and "performance" segments even mean ?
For me:
nVidia lineup:
xx50 - entry level;
xx60 - low-end;
xx70 - mid-range;
xx80 - high-end (performance);
xx90 - enthusiast.
AMD lineup:
x400, x500 - entry level;
x6x0 - low-end;
x7x0 - mid-range;
x8x0 - high-end (performance);
x9x0 - enthusiast.
Posted on Reply
#46
R0H1T
Tsukiyomi91They prefer to sell other products or scale back most of their stuff coz they realize not everyone is defending their shitty decisions and complacency with their wallets or opinions anymore.
How the eff do you explain Nvidia's record profits then? OK, most of it wasn't from gaming, but even if JHH sells his t*** for a discount, the Nvidia zealots will buy it at a premium :nutkick:

It's only lose-lose for AMD at this point :shadedshu:
ARFAnd still nvidia posts huge profits, while AMD posts losses. Why is that? AMD's strategies don't work?
I mean it's cool all those chiplets and things, but do they actually make a difference?
I wouldn't say they don't work, otherwise they wouldn't be making the inroads they have with Zen into servers, and now Xilinx as well. Nvidia has an inherent advantage with CUDA and they spend gazillions on that; even Intel vastly outspends AMD on the software support front, but that is changing, albeit slowly.
Posted on Reply
#47
bug
Keep in mind this is a rumor.

That said, this is routine by now, coming from AMD. When they lag behind, they claim something like this. If memory serves me well, they did it with the Radeon HD 3000 or 4000 series, they did it with the Radeon 200 series, and they did it with the initial RDNA. It never stuck.
Posted on Reply
#48
john_
KissamiesReminds me of Polaris.


38x0 wasn't that far from 8800 GT(S) and they had 3870X2 as the flagship. Though multi-GPU is dead and buried so no X2 this time.
The 38x0 was considered mostly a fix for the power-hungry 2900 cards, nothing more. While the 8800 GTS cards weren't so great and the 38x0 had some chance against them, Nvidia had also released the 8800 GT, and that card was clearly ahead. Only with the 4800 series did AMD put real pressure on Nvidia.
Posted on Reply
#49
Jism
HisDivineOrderSo AMD waits a year-ish to release their midrange product line only to, according to the rumors, create an entire generation of just midrange product line to come out next year to replace it?
The market is saturated right now. You've got two generations competing against each other.

I mean, it's great value for everyone, but there's no need to push a new bunch of cards out every six months.
Posted on Reply