Sunday, August 6th 2023

AMD Retreating from Enthusiast Graphics Segment with RDNA4?

AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes at the high end with NVIDIA, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs when they don't move enough volume over the product lifecycle. The company's "Navi 21" GPU benefited from the crypto-currency mining swell, but just like NVIDIA, the company isn't able to move enough GPUs at the high end.

With RDNA4, the company will focus on the specific segments of the market that sell the most, which would be the x700 series and below. This generation would essentially mirror the RDNA1-powered RX 5000 series, which did enough to stir things up in NVIDIA's lineup and trigger the introduction of the RTX 20 SUPER series. The next generation could see RDNA4 square off against NVIDIA's next-generation GPUs and, hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz

363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?

#176
ARF
Tek-CheckNames are less important. Marketing departments of both companies use them to confuse people and make comparisons harder. We need to take official names at face value for what they are, and simply keep a healthy distance from them by comparing performance and features.
Names are extremely important. This is the same as putting a low-performance 1.4-litre engine in your brand new S-Class Mercedes. Would you accept that? Of course not!
Posted on Reply
#177
Colddecked
ARFNames are extremely important. This is the same as putting a low-performance 1.4-litre engine in your brand new S-Class Mercedes. Would you accept that? Of course not!
But if Mercedes could, they would love to put in a 1.4 L turbocharged engine that gives you the same "feel" as the 5 L V8 and charge the same as the previous model. This is essentially what NVIDIA is doing: finding new, more efficient ways to generate graphics, but not passing the savings on to consumers, since it's still a "premium" product.
Posted on Reply
#178
Assimilator
rv8000Until there's a GPU and engine capable of full path tracing at 60 FPS minimum, rasterization and/or hybrid rendering will never be dead. Unless either company can magically quintuple RT performance gen to gen, we're years away from that being any sort of reality.
You've very obviously never looked at RTX 4090 ray-traced benchmarks. It is capable of over 60 FPS in 4K in every title tested but one. There's no need for them to quintuple RT performance each generation when it's increasing by close to 50% in 4K generation-on-generation.

Inb4 you try to move the goalposts with "I meant a mainstream GPU": 4060 Ti handily beats my 2080 Ti in RT, that's equivalent RT performance moving from the ultra-high-end to the mid-upper-end in a mere two generations. No reason to expect that to slow down anytime soon.
TheoneandonlyMrK@Assimilator RT full path tracing being THE way is years off IMHO, and even then indie raster games will still happen, so I disagree.
Sure, rasterised games will continue to be made, but rasterisation performance of new hardware won't continue to increase, because it makes far more sense to spend that precious die space on a technology that isn't on terminal life support. Once games that actually matter (i.e. not indie ones) start using RT rendering exclusively, nobody will care about rasterisation performance ever again.
Tek-CheckThere are serious doubts about it, unless both companies substantially improve RT performance across all classes of GPUs, and fast, without breaking customers' piggy-banks.
Is the 4060 Ti capable of "acceptable RT performance" for $500? No. Even the 4070 chokes with RT in more demanding titles and becomes a stuttering mess. So RT performance on mainstream GPUs is still in its infancy. Raster is dead - long live the raster.
Which is why you turn on DLSS.
Posted on Reply
#179
R0H1T
ColddeckedBut if Mercedes could, they would love to put in a 1.4 L turbocharged engine that gives you the same "feel" as the 5 L V8 and charge the same as the previous model. This is essentially what NVIDIA is doing: finding new, more efficient ways to generate graphics, but not passing the savings on to consumers, since it's still a "premium" product.
But wouldn't you rather get a 1,000-mile-range solid-state-battery EV instead?
Posted on Reply
#180
Dan.G
enb141When I had my 1070 it was worse
Yes, but I'd say that's because you have a 13th-gen Core i9 now and had an 8th-gen Core i7 then.
TheoneandonlyMrKYou sort of prove yourself wrong
You're right. I'm sorry. I shall strike-through that phrase. :toast:
Posted on Reply
#181
rv8000
AssimilatorYou've very obviously never looked at RTX 4090 ray-traced benchmarks. It is capable of over 60 FPS in 4K in every title tested but one. There's no need for them to quintuple RT performance each generation when it's increasing by close to 50% in 4K generation-on-generation.

Inb4 you try to move the goalposts with "I meant a mainstream GPU": 4060 Ti handily beats my 2080 Ti in RT, that's equivalent RT performance moving from the ultra-high-end to the mid-upper-end in a mere two generations. No reason to expect that to slow down anytime soon.


Sure, rasterised games will continue to be made, but rasterisation performance of new hardware won't continue to increase, because it makes far more sense to spend that precious die space on a technology that isn't on terminal life support. Once games that actually matter (i.e. not indie ones) start using RT rendering exclusively, nobody will care about rasterisation performance ever again.


Which is why you turn on DLSS.
There are no goalposts to move. 99.99% of games use either traditional rendering or hybrid rendering.

The list of "path traced" games (actual ray tracing) is so infinitesimally small it's not even worth mentioning. Last time I checked, the 4090 was averaging, what, 12 FPS at 4K in CP2077 with Overdrive settings (path tracing). So yes, they absolutely need to quintuple it, if not more. Keep living in some warped green reality.
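
For what it's worth, the two figures being argued over here are not actually far apart. A minimal back-of-the-envelope sketch, assuming the roughly 12 FPS native path-tracing figure and the "close to 50% per generation" RT uplift quoted above are both in the right ballpark:

```python
import math

# Rough sketch only: assumed starting point of ~12 FPS native path tracing
# at 4K, a 60 FPS target, and a hypothetical ~50% RT uplift per generation.
current_fps = 12.0
target_fps = 60.0
gain_per_generation = 1.5

required_speedup = target_fps / current_fps                       # 5.0x overall
generations_needed = math.log(required_speedup) / math.log(gain_per_generation)

print(f"required speedup: {required_speedup:.1f}x")
print(f"generations at +50% each: {generations_needed:.1f}")      # ~4 generations
```

In other words, a one-off quintupling and roughly four generations of ~50% gains describe the same gap; the real disagreement is how long four GPU generations take to arrive, not the arithmetic itself.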
Posted on Reply
#182
TheoneandonlyMrK
ARFNames are extremely important. This is the same as putting a low-performance 1.4-litre engine in your brand new S-Class Mercedes. Would you accept that? Of course not!
Oh dear.

That's exactly what's happening with ICE cars this generation.

1.0-litre engines masquerade as "30"s; Audi is the first, but not the last, to do this.

We had BOGOF deals, we bought. Twofers and 3-for-2 deals.

Now marketers are literally selling less for more, all over.
Posted on Reply
#183
Colddecked
R0H1TBut wouldn't you rather get a 1,000-mile-range solid-state-battery EV instead?
Sure, but not at the price that something like that will cost when first released. I'd give it a few years to mature/get costs down.
Posted on Reply
#184
TechLurker
On the subject of Tegras, it was an easy deal for Nintendo and Nvidia. No one wanted Nvidia's Tegras, so Nintendo could buy them in bulk for cheap, convert them into a basic handheld, and make bank. Heck, the first few generations of Switches were easily hackable because they were literally a modified Tegra tablet. For Nvidia, their throwaway SoC now had some value and the development cost was already paid for, so it was easy to just keep supplying more as Nintendo needed them. It's also why it's taken forever for a Switch 2 to become a thing: it requires Nintendo to fork over the cash to get Nvidia to produce a one-off SoC design for them, as Nvidia doesn't really do custom solutions. Rather, Nvidia prefers to make customers conform to its ecosystem, much like Apple does.

Meanwhile, for a broader audience, the Steam Deck, ROG Ally, and the various other handhelds all run on either custom or semi-custom AMD APUs/SoCs, and AMD can support this because it has a dedicated team that can customize a solution for a buyer, which is also what allowed AMD to win over MS and Sony after both got burned by Nvidia. The PS3 ran on an Nvidia GPU, but Nvidia wouldn't do the kind of custom design work Sony wanted going forward, pushing Sony to ATI/AMD for the PS4; likewise, MS couldn't convince Nvidia to produce a custom GPU for the Xbox 360, leading MS to strike a deal with ATI instead, a relationship that carried over when AMD acquired ATI.

Which brings to mind another point: given that mobile gaming is on the rise now that there's infrastructure to support it (the PS Vita was too far ahead of its time), AMD refocusing on this growing market could be one reason behind the rumors that AMD is just aiming for the mid-range. Barely a few months go by without news of a new handheld with an AMD SoC, or of AMD's SoCs edging into the space where Intel NUCs used to reign, so it makes sense to saturate that market with AMD IP and get more code optimized for AMD, just like AMD has been doing in the enterprise/corporate sector via EPYC and Threadripper. Of course, with RDNA3 having proven that MCM GPUs are possible, AMD could now combine multiple mid-range chiplets into a high-end product rather than producing a separate line of chiplets just for the high end, and continue to refine the MCM approach that way (more so if they delve into 3D V-Cache, or even an FPGA or mini-CPU block, as some rumors claim).
Posted on Reply
#185
R0H1T
If AMD+Exynos takes off, QC/Apple could be in some trouble at the top end. Though Apple will remain the most profitable phone maker for years to come, it won't necessarily be on the back of their uber-powerful & efficient SoCs.
Posted on Reply
#186
Tek-Check
AssimilatorWhich is why you turn on DLSS
No. DLSS should always be treated as an additional perk to be used at your convenience, not as a feature that masks hardware deficiencies and must be turned on just to bring performance up to a barely acceptable level in demanding scenarios.
Posted on Reply
#187
AusWolf
ARFNames are extremely important. This is the same as putting a low-performance 1.4-litre engine in your brand new S-Class Mercedes. Would you accept that? Of course not!
So you would buy a Mercedes S-class with a low-performance 1.4 l engine just for its name?
Tek-CheckNo. DLSS should always be treated as an additional perk to be used at your convenience, not as a feature that masks hardware deficiencies and must be turned on just to bring performance up to a barely acceptable level in demanding scenarios.
Finally someone with common sense! :toast:
Posted on Reply
#188
Dr. Dro
Tek-CheckTough, then I will not be their customer.
A 5080 for $1,200 can pass my test only if it has: 24 GB VRAM, a 50% uplift in 4K over the 4080, and DisplayPort 2.1 ports (including one USB-C).
Which is likely going to happen, as the 4080 met those same criteria over the 3080. USB-C is likely not coming back; Turing had it, and while AMD caught up to that in RDNA 2 (RDNA 1 did not have it), NVIDIA removed the port in Ampere. I understand NVIDIA's reasoning: the trend is for HMDs to go fully wireless, and with the extremely high bandwidth and low latency afforded by next-gen networking protocols like Wi-Fi 7, I can easily see true wireless HMDs debuting sometime soon.
TheoneandonlyMrK@Assimilator RT full path tracing being THE way is years off IMHO, and even then indie raster games will still happen, so I disagree.
Which brings the conversation full circle: as I pointed out initially, indie raster games already run well on Pascal, often on Maxwell, and in some cases even on Fermi and earlier. A focus on increased rasterization performance is not necessary. The RTX 4080 is knocking on the door of terapixel and teratexel fillrates (at roughly 300 Gpixel/s and 800 Gtexel/s, its raster fillrates are high enough that they'd never be the bottleneck on a 1440p display, and in many cases not even on a 4K one), and the 4090 actually ventures into that domain - its texture fillrate is so high that you no longer measure it in gigatexels per second; you can safely use the tera scale for that.
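
For anyone wondering where those fillrate figures come from, here is a minimal sketch of the arithmetic. The unit counts and boost clocks below are approximate published spec-sheet numbers (treat them as assumptions); theoretical fillrate is simply units multiplied by clock:

```python
# Rough fillrate sketch: theoretical pixel fillrate = ROPs x clock,
# theoretical texture fillrate = TMUs x clock. Unit counts and boost
# clocks are approximate published specs, used here as assumptions.
gpus = {
    # name: (ROPs, TMUs, boost clock in GHz)
    "RTX 4080": (112, 304, 2.505),
    "RTX 4090": (176, 512, 2.520),
}

for name, (rops, tmus, clock_ghz) in gpus.items():
    pixel_rate = rops * clock_ghz   # Gpixel/s
    texel_rate = tmus * clock_ghz   # Gtexel/s
    print(f"{name}: ~{pixel_rate:.0f} Gpixel/s, ~{texel_rate:.0f} Gtexel/s "
          f"(~{texel_rate / 1000:.2f} Ttexel/s)")
```

That works out to roughly 280 Gpixel/s and 760 Gtexel/s for the 4080, and about 1.3 Ttexel/s for the 4090, which is the "tera scale" referred to above.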
ColddeckedI'll see your 1070 and raise you a Nintendo Switch. It's crazy what devs can run on that thing. The thing has the specs of a flagship phone from 2013!
Indeed, and while we won't claim that the Switch doesn't involve a compromised experience (it most definitely does), it can pull off some wondrous things. I'd say the NieR: Automata port to Switch is one of the most marvelous achievements ever, as the game was clearly designed to run on a much more robust system.
TechLurkerOn the subject of Tegras, it was an easy deal for Nintendo and Nvidia. No one wanted Nvidia's Tegras, so Nintendo could buy them in bulk for cheap, convert them into a basic handheld, and make bank. Heck, the first few generations of Switches were easily hackable because they were literally a modified Tegra tablet. For Nvidia, their throwaway SoC now had some value and the development cost was already paid for, so it was easy to just keep supplying more as Nintendo needed them. It's also why it's taken forever for a Switch 2 to become a thing: it requires Nintendo to fork over the cash to get Nvidia to produce a one-off SoC design for them, as Nvidia doesn't really do custom solutions. Rather, Nvidia prefers to make customers conform to its ecosystem, much like Apple does.
I don't think it's even that: the console is selling well, the ecosystem is rich, and developers and gamers alike are interested in it. Thus there's little need to rush the release of a new system - does Nintendo want a repeat of the Wii U? Even the move to high definition didn't make it as popular as the original Wii, because the U didn't have the original's mojo - and mojo is Nintendo's specialty.

Speaking of exploits though, it'd be absolutely hilarious if the PS5 were jailbroken because of the newly discovered Zen 2 Zenbleed vulnerability... which AMD seems to be having a high degree of difficulty patching; some Zen 2 platforms will only receive AGESA updates next year.
R0H1TIf AMD+Exynos takes off, QC/Apple could be in some trouble at the top end. Though Apple will remain the most profitable phone maker for years to come, it won't necessarily be on the back of their uber-powerful & efficient SoCs.
I don't see it happening... at least it didn't with Samsung, which already went back to Qualcomm SoCs and even scored a few tailored, buffed-up "Snapdragon for Galaxy" versions of the 8 Gen 2. The Exynos 2200 has the RDNA 2-derived Xclipse 920, and it's only about 10% faster than the Snapdragon 888's GPU - keeping in mind the 888 was released in late 2020 and was used in 2021 flagships (Galaxy S21, Z Flip 3/Z Fold 3).
AssimilatorAmen.

So many so-called technology enthusiasts simply don't understand that rasterisation is dead. The fact that games, even new ones, still use it is entirely down to the fact that the console GPUs are simply not capable of acceptable RT performance. Assuming AMD manages to mostly address that shortcoming in the next console generation (2027-2028 timeline), we will then finally see the end of rasterisation as the primary graphics rendering technology.
And that's pretty much it. The GPU is evolving, and the hyperfixation on raster is clearly a way to stonewall and deflect from the reality AMD currently finds itself in.

Posted on Reply
#189
The Jniac
Very disappointing if true, although I definitely understand that it might not be financially feasible to produce a high-end card every year or so. I wonder if it might be better for AMD to switch to an alternating cycle where they produce a new generation of cards one year, complete with high-end cards, and then do the budget/midrange cards the other year.

I *really* do not want to be stuck with only nVidia producing high-end cards.
Posted on Reply
#190
AusWolf
I'm not against RT by any means, but if it's really as important as we're led to believe, then why do both AMD and Nvidia offer the same raster-to-RT hardware and performance ratio as they did in their last generation?
Posted on Reply
#191
The Jniac
darakianAnother read of this is that the high-end Radeon will be built from multiple smaller dies.
Exactly what I was thinking. AMD has really been leaning into building stuff out of multiple chiplets as opposed to monolithic designs, so at least with my minimal knowledge of chip design and manufacturing, it makes sense to bring that to GPUs as well.
Posted on Reply
#192
rv8000
Dr. DroAnd that's pretty much it. The GPU is evolving, and the hyperfixation on raster is clearly a way to stonewall and deflect from the reality AMD currently finds itself in.

The market, both hardware and software, clearly proves the opposite of this conclusion/delusion.

Full path tracing becoming the norm, let alone being playable on a relatively affordable GPU, is still several years away. It's easy to forget that tech snobs (myself included) have a massively warped view of the GPU market. 4090/4080/3090/6900 XT/7900 XTX owners don't even come close to being common outside the tech forum niche.
Posted on Reply
#193
Dr. Dro
rv8000The market, both hardware and software, clearly proves the opposite of this conclusion/delusion.

Full path tracing becoming the norm, let alone being playable on a relatively affordable GPU, is still several years away. It's easy to forget that tech snobs (myself included) have a massively warped view of the GPU market. 4090/4080/3090/6900 XT/7900 XTX owners don't even come close to being common outside the tech forum niche.
It's not a delusion. It takes several generations for new standards to be widely adopted. Just because we now have third-generation ray-tracing cards that can finally more or less path trace without instantly croaking (as long as you have the highest-end models) doesn't mean that several generations' worth of traditional hardware will just up and disappear. People are and will continue to happily use their GTX 1080 Ti, like I brought up earlier - and as that hardware ages, not having superfluous eye candy is a very small price to pay if you're not interested in the bleeding edge.

When DirectX 11 cards finally arrived and unified shaders (introduced 3 hardware generations prior with G80) started to become mandatory because games started using (at the time) advanced GPU computing technologies, it didn't mean that DirectX 9 games suddenly disappeared. Indeed, they were still widely released, with some high profile releases coming out as late as 2014 (Borderlands Pre-Sequel), still fully compatible with Windows XP. By that time it was 13 years old and just out of its quite protracted extended support stage, and Windows 7 and DirectX 11 were well over half a decade old. And by then... a lot of people were still happily gaming on GTX 200 series and HD 4000 cards.

Due to the enormous complexity of undertaking this evolution and due to market realities (video game fidelity has more or less plateaued because of ever-rising development costs and the struggle to raise prices for end users), the recent crypto and AI bubbles, as well as economic hardship among the wider user base, have obviously slowed adoption tremendously, which brings my point full circle yet again: if you don't really care about or need the newest graphics technologies, you can happily stay with your Pascal card until something else comes along.

However, if you want to compete at the high end, where your primary market is technology enthusiasts and people who want to experience the latest and greatest, you must deliver the goods. If AMD retreats to the midrange and offers modest support for these newer techs while it builds its portfolio, it could prove to be quite the wise move - at the expense of its loyal fanbase, which will be put to the test: will they still buy AMD hardware if AMD doesn't stake a claim to the upper range?
Posted on Reply
#194
TheoneandonlyMrK
Yeah, no - Polaris, the 5700 XT - it would be the norm.

They're busy is my take.
Posted on Reply
#195
PapaTaipei
VIPERSRTTo keep competition alive, I bought only AMD CPUs and AMD Radeon GPUs. Without that we would still be stuck at 4-core/8-thread CPUs forever, and you would not see any cheap GPUs on the market.
There is no competition.
Posted on Reply
#196
kapone32
I have owned both the XT and the XTX and they are both absolutely fine for gaming. Raster is dead? I guess Baldur's Gate 3 is focused on RT and DLSS? I guess I don't enjoy my games because I don't use or have them. People talking about chiplets being done are not appreciating what Ryzen did and how parallel processing could be even faster with chiplets. The best is today's HU video comparing the 6800 XT to the 4060 Ti 16 GB - and guess what? The fact (for me) is that people who use DLSS and RT in the same sentence should have at least a 4080 if they are going to complain about the performance of AMD. The other thing is that the 7900 XT (Sapphire Pulse) is still the same price today on Newegg as it was at launch. The XTX, on the other hand, can be found for up to $200 less than launch price.

The narrative is so entrenched that even with the launch notification of the 7800 XT with 16 GB, people will still say the 7900 XT should have been called the 7800 XT. The truth is gaming has never been more enjoyable, with nostalgic, high-fidelity, unique experiences and new content coming all the time. Baldur's Gate 3 and Armored Core 6 are two that come to mind, but there are also plenty of games from before the age of DLSS, like Shadow of War, Grim Dawn, and plenty of racing games like the Grid series, to enjoy. If you want to go to the arcade, Everspace 2 and Redout 2 are fun to play, but even Red Faction Guerrilla Remastered is sweet for arcade satisfaction.
Posted on Reply
#197
rv8000
Dr. DroIt's not a delusion. It takes several generations for new standards to be widely adopted. Just because we now have third-generation ray-tracing cards that can finally more or less path trace without instantly croaking (as long as you have the highest-end models) doesn't mean that several generations' worth of traditional hardware will just up and disappear. People are and will continue to happily use their GTX 1080 Ti, like I brought up earlier - and as that hardware ages, not having superfluous eye candy is a very small price to pay if you're not interested in the bleeding edge.

When DirectX 11 cards finally arrived and unified shaders (introduced 3 hardware generations prior with G80) started to become mandatory because games started using (at the time) advanced GPU computing technologies, it didn't mean that DirectX 9 games suddenly disappeared. Indeed, they were still widely released, with some high profile releases coming out as late as 2014 (Borderlands Pre-Sequel), still fully compatible with Windows XP. By that time it was 13 years old and just out of its quite protracted extended support stage, and Windows 7 and DirectX 11 were well over half a decade old. And by then... a lot of people were still happily gaming on GTX 200 series and HD 4000 cards.

Due to the enormous complexity of undertaking this evolution and due to market realities (video game fidelity has more or less plateaued because of ever-rising development costs and the struggle to raise prices for end users), the recent crypto and AI bubbles, as well as economic hardship among the wider user base, have obviously slowed adoption tremendously, which brings my point full circle yet again: if you don't really care about or need the newest graphics technologies, you can happily stay with your Pascal card until something else comes along.

However, if you want to compete at the high end, where your primary market is technology enthusiasts and people who want to experience the latest and greatest, you must deliver the goods. If AMD retreats to the midrange and offers modest support for these newer techs while it builds its portfolio, it could prove to be quite the wise move - at the expense of its loyal fanbase, which will be put to the test: will they still buy AMD hardware if AMD doesn't stake a claim to the upper range?
There's no reason to put all their eggs in the RT/path-tracing basket. Even Nvidia can't provide a viable and affordable solution for the non-existent library of path-traced games. Both Nvidia and AMD have bigger fish to fry in the AI and server spaces.

Rasterization and hybrid ray tracing are here to stay as the main means of rendering for a while. AMD isn't "stonewalling" anything.
Posted on Reply
#198
Dr. Dro
rv8000There's no reason to put all their eggs in the RT/path-tracing basket. Even Nvidia can't provide a viable and affordable solution for the non-existent library of path-traced games. Both Nvidia and AMD have bigger fish to fry in the AI and server spaces.

Rasterization and hybrid ray tracing are here to stay as the main means of rendering for a while. AMD isn't "stonewalling" anything.
Read again, that was directed at their "loyal fans".
Posted on Reply
#199
Luke357
ColddeckedI'll see your 1070 and raise you a Nintendo Switch. It's crazy what devs can run on that thing. The thing has the specs of a flagship phone from 2013!
My Nexus 6 was more powerful and it was 32 bit! One of the first 1440p phones with a GPU powerful enough to push Android without stuttering!
Posted on Reply
#200
R0H1T
Dr. DroIt's not a delusion. It takes several generations for new standards to be widely adopted.
You'll hit the physics wall before you end up anywhere close to full RT. Wanna bet on that?
Posted on Reply