Sunday, August 6th 2023
AMD Retreating from Enthusiast Graphics Segment with RDNA4?
AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs against the volumes it expects to push over the product lifecycle. The company's "Navi 21" GPU benefited from the crypto-currency mining swell, but just like NVIDIA, the company hasn't been able to move enough GPUs at the high end since.
With RDNA4, the company will focus on the market segments that sell the most, which would be the x700 series and below. That would make this generation essentially a repeat of the RDNA-based RX 5000 series, which did enough to stir things up in NVIDIA's lineup and trigger the introduction of the RTX 20 SUPER series. The next generation could see RDNA4 square off against NVIDIA's next generation and, hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz
363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?
More so when their GPUs are still pretty popular alternatives to NVIDIA, and they're capable of reaching anywhere from 90 to 100% of the non-ray-traced performance of NVIDIA's xx80 equivalent at a lower cost, depending on the game and which secondary features are active (DLSS/RIS, for example).
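As a rough illustration of that value argument (the prices and performance numbers below are hypothetical placeholders, not benchmark data), performance-per-dollar can be compared like this:

```python
# Back-of-the-envelope value comparison; all figures are hypothetical.
# rel_perf = non-ray-traced performance relative to the NVIDIA card (1.00).
cards = {
    "NVIDIA xx80": {"price_usd": 1200, "rel_perf": 1.00},
    "AMD x900 XT": {"price_usd": 1000, "rel_perf": 0.95},  # somewhere in the 90-100% range
}

for name, c in cards.items():
    value = c["rel_perf"] / c["price_usd"] * 1000
    print(f"{name}: {value:.2f} relative performance per $1000")
```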
I mean, personally I don't mind Intel spending money on GPU R&D. It only serves to potentially benefit me down the line. I do hold 10 shares but don't really care about that potential loss. I just wonder what their xx70 XT offering involves this time.
It would probably get a 256-bit memory controller? I think it would make no sense to kneecap yourself at the 6700 XT/7700 XT level of die size, and the 5700 XT was a 256-bit card.
I think it's a sensible choice, but time will tell. RTG is not exactly on a "not dissatisfying everyone" streak with its recent performance.
Imma "Let RTG cook".
The 5080 may be a 352-bit GDDR7 22 GB full AD103 with 10240 CUDA cores.
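For what it's worth, the memory math behind bus widths like the 256-bit and 352-bit figures floated above is easy to sanity-check (the per-pin data rates here are my assumptions; GDDR7 speeds aren't finalized):

```python
# Rough GDDR capacity/bandwidth math for the bus widths mentioned above.
# Per-pin data rates are assumptions, not confirmed specs.
def memory_config(bus_bits: int, gbps_per_pin: float, gb_per_chip: int = 2):
    chips = bus_bits // 32                       # one 32-bit channel per chip
    capacity_gb = chips * gb_per_chip            # 16 Gbit (2 GB) chips assumed
    bandwidth_gbs = bus_bits / 8 * gbps_per_pin  # bytes/s across the whole bus
    return capacity_gb, bandwidth_gbs

for bus, rate in [(256, 20.0), (352, 28.0)]:     # GDDR6-ish and GDDR7-ish rates
    cap, bw = memory_config(bus, rate)
    print(f"{bus}-bit @ {rate} Gbps -> {cap} GB, {bw:.0f} GB/s")
```

A 352-bit bus with 2 GB chips lands exactly on the rumored 22 GB, so the spec is at least internally consistent.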
It doesn't make sense from a business standpoint for AMD to retreat from the high end when that's where chiplets specifically lend their benefits. Even if you assume that AMD allocates most of its wafer supply toward AI or other segments, it still makes sense for AMD to have a card for the high end.
This is just another rumor at the end of the day, and 99% of them turn out to be false.
I mean, all those chiplets and things are cool, but do they actually make a difference?
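On paper they can, and mostly at the high end. A common back-of-the-envelope yield model (Poisson: yield = e^(-D*A) for defect density D and die area A) shows why splitting a big die helps; the numbers below are illustrative, not actual foundry figures:

```python
import math

# Back-of-the-envelope chiplet vs. monolithic silicon-cost model.
# Poisson yield: yield = exp(-D * A); all numbers are illustrative.
D = 0.1  # defect density in defects per cm^2 (assumed)

def wafer_area_per_good_die(area_mm2: float) -> float:
    """Wafer area consumed per good die, assuming defective dies are discarded."""
    die_yield = math.exp(-D * area_mm2 / 100)  # area converted from mm^2 to cm^2
    return area_mm2 / die_yield

# One hypothetical 500 mm^2 monolithic die vs. a 300 mm^2 compute die
# plus six ~37 mm^2 memory/cache chiplets (roughly Navi 31-shaped).
monolithic = wafer_area_per_good_die(500)
chiplet = wafer_area_per_good_die(300) + 6 * wafer_area_per_good_die(37)

print(f"monolithic: {monolithic:.0f} mm^2 of wafer per good GPU")
print(f"chiplet:    {chiplet:.0f} mm^2 of wafer per good GPU")
```

Smaller dies waste less silicon to defects, and the advantage grows with total die area, which is exactly the point the comment above is making about the high end.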
www.techpowerup.com/309125/nvidia-announces-financial-results-for-first-quarter-fiscal-2024-gaming-down-38-yoy-stock-still-jumps-25#g309125
www.techpowerup.com/forums/threads/amd-reports-second-quarter-2023-financial-results-revenue-down-18-yoy.311976/
Take BMW, for example: subscription-based heated seats!
They might launch a mid-range GPU, and for a "measly" $20/month you can turn it into a graphics monster!
Don't know if that's possible, but it wouldn't surprise me if it happened. Though I don't think many (if any) people would pay for that...
Dark times are coming...
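Purely as a sketch of how such a scheme could work technically (a hypothetical driver-side entitlement check; no vendor has announced anything like this):

```python
# Hypothetical subscription-gated GPU performance tiers; illustrative only.
from dataclasses import dataclass

@dataclass
class GpuProfile:
    power_limit_w: int
    boost_clock_mhz: int

TIERS = {
    "base":       GpuProfile(power_limit_w=180, boost_clock_mhz=2100),
    "subscribed": GpuProfile(power_limit_w=300, boost_clock_mhz=2700),
}

def active_profile(subscription_valid: bool) -> GpuProfile:
    # Same silicon either way; only the driver-enforced limits change.
    return TIERS["subscribed" if subscription_valid else "base"]

print(active_profile(subscription_valid=False))
```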
Going to buy some champagne to celebrate.
NVIDIA lineup:
xx50 - entry level;
xx60 - low-end;
xx70 - mid-range;
xx80 - high-end (performance);
xx90 - enthusiast.
AMD lineup:
x400, x500 - entry level;
x6x0 - low-end;
x7x0 - mid-range;
x8x0 - high-end (performance);
x9x0 - enthusiast.
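The same mapping expressed as data (just the naming patterns from the two lists above, not specific SKUs):

```python
# Segment map built from the two lineup lists above.
SEGMENTS = {
    "entry level":            {"nvidia": ["xx50"], "amd": ["x400", "x500"]},
    "low-end":                {"nvidia": ["xx60"], "amd": ["x6x0"]},
    "mid-range":              {"nvidia": ["xx70"], "amd": ["x7x0"]},
    "high-end (performance)": {"nvidia": ["xx80"], "amd": ["x8x0"]},
    "enthusiast":             {"nvidia": ["xx90"], "amd": ["x9x0"]},
}

# Example lookup: which AMD tier competes with NVIDIA's xx80?
print(SEGMENTS["high-end (performance)"]["amd"])  # ['x8x0']
```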
It's only lose-lose for AMD at this point :shadedshu: I wouldn't say the strategy doesn't work, otherwise they wouldn't be making the inroads they have with Zen into servers, and now with Xilinx as well. NVIDIA has an inherent advantage with CUDA and spends gazillions on it; even Intel vastly outspends AMD on the software-support front, though that is changing, albeit slowly.
That said, this is routine by now coming from AMD: when they lag behind, they claim something like this. If memory serves me well, they did it with the Radeon HD 3000 or 4000 series, they did it with the Radeon 200 series, and they did it with the initial RDNA. It never stuck.
I mean, it's great value for everyone, but there's no need to push out a new bunch of cards every six months.
Crypto miners → AI data centers. It won't get any better; just get used to overpriced mid-tier cards that barely deliver the performance of 3-year-old tech.