Sunday, August 6th 2023
AMD Retreating from Enthusiast Graphics Segment with RDNA4?
AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs against the volumes it expects to move over the product lifecycle. The company's "Navi 21" GPU benefited from the cryptocurrency mining swell, but just like NVIDIA, the company isn't able to move enough GPUs at the high end.
With RDNA4, the company will focus on the specific segments of the market that sell the most, which would be the x700 series and below. This generation would essentially mirror the RDNA-based RX 5000 series, which did enough to stir things up in NVIDIA's lineup and trigger the introduction of the RTX 20 SUPER series. The generation after could see RDNA4's successor square off against NVIDIA's next generation and, hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz
363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?
videocardz.com/newz/amd-rumored-to-be-skipping-high-end-radeon-rx-8000-rdna4-gpu-series
Next up ~ Intel rumored to exit x86 business, leaving AMD/VIA to fight for scraps :pimp:
$200 gets you an RTX 3050 8GB, and $500 an RTX 3070 Ti 8GB, more than two (!) years after its release.
What are you going to do with these cards? Slide-show "gaming" at 1080p? :banghead:
I don't think the $1,000+ cards must go; actually the opposite - everyone should focus on them and try to buy only them. Instead of upgrading every year or two, just buy that $1,000 beast and stay with it for the next five to seven (!) years with ease.
AMD has demonstrated with the Instinct series that this limitation has been overcome.
community.amd.com/t5/instinct-accelerators/advancing-hpc-to-the-next-level-of-sustainability-with-amd/ba-p/611507
www.anandtech.com/show/18721/ces-2023-amd-instinct-mi300-data-center-apu-silicon-in-hand-146b-transistors-shipping-h223
Alternatively, there's potentially insider info that nVidia is pushing to "12 on a 10 scale" (power, heat, size, die area, etc.), and AMD will 'fall back' on high-yield, high-margin parts for a gen.
IMO, the RX 480/580 and 5700 XT weren't failures, and those were similar scenarios.
Another possibility is that (like with Polaris) there's another chip entirely being developed to take up the high end. Say, bringing some CDNA derivative and HBM onto a Radeon?
I find that 'pretty unlikely' since Fury, Vega, and (especially) the VII didn't work out so great.
Instead of buying $1,000+ GPUs for 10 years, how about buying $200-300 GPUs for 5-7 years? If we followed your logic, we would still be stuck on a 1080 Ti, or the equivalent Titan, with no RT or DLSS for the next 3-4 years, whereas one could upgrade from a 1070 to a 4060 or an RX 7600.
If companies see that $1000 GPUs are selling, they'll try to push us to buy $2000 GPUs next time. I'd rather have an RX 6600, thanks.
And I haven't even touched on depreciation, which gets worse as you move up the tiers - some rough numbers below.
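To put rough numbers on that, here's a quick back-of-the-envelope sketch. All prices and resale values are purely illustrative assumptions, not market data:

```python
# Back-of-the-envelope upgrade-cost comparison. Every figure here is an
# illustrative assumption, not real market pricing.

def cost_per_year(price, resale, years):
    """Net ownership cost per year after selling the card on."""
    return (price - resale) / years

# Strategy A: one $1,000 flagship kept for 7 years; high-end cards
# tend to shed a larger share of their value (assumed $250 resale).
flagship = cost_per_year(price=1000, resale=250, years=7)   # ~$107/yr

# Strategy B: a $250 midrange card replaced every 5 years
# (assumed $80 resale).
midrange = cost_per_year(price=250, resale=80, years=5)     # ~$34/yr

print(f"Flagship: ${flagship:.0f}/yr vs midrange: ${midrange:.0f}/yr")
```

Even if you double the flagship's assumed resale value, the midrange route still comes out well ahead per year - which is the depreciation point.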
Although I initially believed in it, RDNA 3 has proven to be a laughing stock of an architecture next to Ada Lovelace, and the product stack has become a positioning nightmare for the company to sort through.
Ada's problem is the opposite: it's cut-down hardware sold a performance tier down and a price tier up at the same time, which just makes for horrible value. Both could be fixed by lowering prices, but as long as AMD isn't keeping up with Nvidia's highly marketable techs, Nvidia has no incentive to do so.
Horrible value as they may be, the 40 series GPUs are... refined. That's the word I'd use to describe them, coming from the RTX 3090... which was already an all-around well-developed product, mind you.
www.techspot.com/review/2717-amd-radeon-7900-xt-again/
I'm expecting the following 5 or 4 nm monolithic designs:
Navi 42: 128 RBEs / 80 RDNA4 dual CUs / 64 MB cache
Navi 43: 64 RBEs / 40 RDNA4 dual CUs / 32 MB cache
Navi 44: 32 RBEs / 20 RDNA4 dual CUs / 16 or 24 MB cache
10-15% higher clocks vs. RDNA3
5-10% better CU efficiency vs. RDNA3
GDDR7
Essentially, Navi 42 would match Navi 31's 4K performance and beat it at lower resolutions, with the rest of the stack landing at 1.4-1.5x their RDNA3 counterparts (e.g. Navi 43 vs. Navi 33) - rough arithmetic below.
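A minimal sanity check of those figures, reading the "80/40/20 dual CUs" as plain CU counts of a dual-issue design (my assumption, like everything else in the post - none of these are confirmed specs):

```python
# Naive throughput scaling: CU ratio x clock uplift x per-CU efficiency.
# All inputs are the speculated figures above, not confirmed specs.

def uplift(new_cus, old_cus, clock=1.10, efficiency=1.05):
    """Low-end guess by default; pass the high-end factors explicitly."""
    return (new_cus / old_cus) * clock * efficiency

# Navi 42 (80 CUs, speculated) vs Navi 31 (96 CUs): roughly a wash,
# consistent with the "matches Navi 31 at 4K" claim.
print(f"Navi 42 vs Navi 31: {uplift(80, 96):.2f}x-{uplift(80, 96, 1.15, 1.10):.2f}x")

# Navi 43 (40 CUs, speculated) vs Navi 33 (32 CUs): brackets the
# quoted 1.4-1.5x window.
print(f"Navi 43 vs Navi 33: {uplift(40, 32):.2f}x-{uplift(40, 32, 1.15, 1.10):.2f}x")
```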
And deliver on its claims? Yeah, sure - if AMD could have delivered on its claims a few more times in history, Nvidia would be under the bus by now. But here we are.
It's not news that AMD can't keep up, and nVidia has had it its own way. There were the GTX 1080 and RTX 2080 at reference prices, although AMD did not cover those segments. Only the miners disrupted the market. It was an illusion that AMD could compete with nVidia on performance; that only happened while nVidia was on the Samsung node. Having lost the manufacturing-node advantage, AMD offers solutions that are weaker in performance and consume more. The only weapon left is price, but how much can you cut without destroying the profit?
They will probably return to the 5700 XT era, with a midrange flagship, and leave nVidia to handle the enthusiast segment alone. They have big problems competing with DLSS, CUDA, OptiX, and ray tracing. The RTX 4060 is equal to the RX 7600 in rasterization, but it effectively destroys it when these technologies come into play.
Let's not exaggerate; try to see things for what they are. There is nearly 25% between the XT and the XTX :) There are no OCs on a 7900 XT worth more than 15% perf, and even then you're doing something special.
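For reference, the raw spec deltas behind that gap (public specs; I'm using the advertised boost clocks, so treat the exact ratios as approximate):

```python
# Raw spec ratios, RX 7900 XTX vs RX 7900 XT. CU counts and memory
# bandwidth are public specs; boost clocks are the advertised ~2.5 GHz
# vs ~2.4 GHz, and real sustained clocks vary.
xtx = {"cus": 96, "boost_ghz": 2.5, "bandwidth_gbs": 960}
xt = {"cus": 84, "boost_ghz": 2.4, "bandwidth_gbs": 800}

compute = (xtx["cus"] * xtx["boost_ghz"]) / (xt["cus"] * xt["boost_ghz"])
bandwidth = xtx["bandwidth_gbs"] / xt["bandwidth_gbs"]

# ~+19% compute and +20% bandwidth, which is why a ~15% OC on the XT
# still can't quite reach a stock XTX.
print(f"Compute: +{(compute - 1) * 100:.0f}%, bandwidth: +{(bandwidth - 1) * 100:.0f}%")
```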