Sunday, August 6th 2023

AMD Retreating from Enthusiast Graphics Segment with RDNA4?

AMD is rumored to be withdrawing from the enthusiast graphics segment with its next RDNA4 graphics architecture. This means there won't be a successor to its "Navi 31" silicon that competes with NVIDIA at the high end, but rather one that competes in the performance segment and below. It's possible AMD can't justify the cost of developing high-end GPUs, as it can't push enough volume over the product lifecycle. The company's "Navi 21" GPU benefited from the crypto-currency mining swell, but just like NVIDIA, the company isn't able to move enough GPUs at the high end.

With RDNA4, the company will focus on the specific segments of the market that sell the most, which would be the x700 series and below. This generation would essentially mirror the RX 5000 series powered by RDNA1, which did enough to stir things up in NVIDIA's lineup and trigger the introduction of the RTX 20 SUPER series. The next generation could see RDNA4 square off against NVIDIA's next generation and, hopefully, Intel's Arc "Battlemage" family.
Source: VideoCardz

363 Comments on AMD Retreating from Enthusiast Graphics Segment with RDNA4?

#201
enb141
Dan.GYes, but I'd say it's because you have a Core i9 13th gen now and a Core i7 8th gen then.
Yes, but I switched my 1070 to a 3070 Ti before I upgraded the CPU, so for a while I had a Core i7 8th gen with a 3070 Ti. The CPU helped, but the GPU helped way more.
Posted on Reply
#202
Dr. Dro
R0H1TYou'll hit the physics wall before you end up anywhere close to full RT, wanna bet on that?
It's well known we're at the end of silicon and that new materials will be needed to continue onwards, but I would never bet against the tech industry; that is most unwise ;)
Posted on Reply
#203
R0H1T
I'd rather go with Physics.

Posted on Reply
#204
TheoneandonlyMrK
Dr. DroIt's well known we're at the end of silicon and that new materials will be needed to continue onwards, but I would never bet against the tech industry; that is most unwise ;)
You're being extreme. Silicon is still being worked on and with, and will be for at least 7 more generations of shrink; it likely won't be replaced quickly or easily. Few materials have a suitably workable bandgap, and even GAi devices are very new and niche, and they're in the lead experimentally over other silicon alternatives.

I actually think innovation, possibly photonics, will extend the viability of silicon way beyond the Angstrom era, thanks to the vast number of working processes already developed for silicon fabrication.
Posted on Reply
#205
Dr. Dro
TheoneandonlyMrKYou're being extreme. Silicon is still being worked on and with, and will be for at least 7 more generations of shrink; it likely won't be replaced quickly or easily. Few materials have a suitably workable bandgap, and even GAi devices are very new and niche, and they're in the lead experimentally over other silicon alternatives.

I actually think innovation, possibly photonics, will extend the viability of silicon way beyond the Angstrom era, thanks to the vast number of working processes already developed for silicon fabrication.
If anything, that kind of reinforces my point, though. It won't be that long.
Posted on Reply
#206
Tek-Check
Dr. DroUSB-C is likely not coming back
The USB-C video port on GPUs may actually evolve in the future to transmit USB and/or PCIe data alongside video.

Asus recently showed an NVMe drive attached to a GPU's PCB transmitting data over the PCIe link. By extension, it's enough to install a USB4 controller on the GPU's PCB to inject USB and PCIe data flowing to/from the USB-C port alongside the DisplayPort video data.
Posted on Reply
#207
Dr. Dro
Tek-CheckThe USB-C video port on GPUs may actually evolve in the future to transmit USB and/or PCIe data alongside video.

Asus recently showed an NVMe drive attached to a GPU's PCB transmitting data over the PCIe link. By extension, it's enough to install a USB4 controller on the GPU's PCB to inject USB and PCIe data flowing to/from the USB-C port alongside the DisplayPort video data.
This is already possible; the CPU graphics on my motherboard are wired to a USB-C DisplayPort source. But even if that is the case, it's still a rather unusual format for monitors.
Posted on Reply
#208
yannus1
I remember a rumour that Intel would stop Arc production after the first gen. It wasn't true.
Posted on Reply
#209
enb141
yannus1I remember a rumour that Intel would stop Arc production after the first gen. It wasn't true.
Intel still only has one Arc gen, so it could still turn out to be true; unlikely, but possible.
Posted on Reply
#210
AusWolf
Dr. DroIt's well known we're at the end of silicon and that new materials will be needed to continue onwards, but I would never bet against the tech industry; that is most unwise ;)
How about using one GPU only for raster and one only for RT, similar to the earliest physics accelerators before Nvidia bought Ageia, the company that made them (or 3DFX cards that didn't have 2D support)? ATX is basically just a ton of unused space in a modern gaming PC, so why do raster and RT have to be on the same chip, or even the same card?
Posted on Reply
#211
Dr. Dro
AusWolfHow about using one GPU only for raster and one only for RT, similar to the earliest physics accelerators before Nvidia bought Ageia, the company that made them (or 3DFX cards that didn't have 2D support)? ATX is basically just a ton of unused space in a modern gaming PC, so why do raster and RT have to be on the same chip, or even the same card?
I'm sure it was thought of, but I doubt we have any interconnect technology that's fast enough for that, and it'd also need perfect synchronization with the GPU itself... Seems like its own can of worms to me.
Posted on Reply
#212
Gica
Space Lynxwell it doesn't beat 4090 in any other game... valhalla is just a heavy amd game. your refusal to admit it doesn't even beat it in one game though is troubling
He's an AMD Taliban and there's nothing to discuss with him. I have him on ignore.


The RTX 4090 is the undisputed king at the moment. An expensive king, it's true, and the price is the only weapon these Taliban have, while they deny the army of technologies that add value to an nVidia video card. Anyway, their desperation can be seen in how they use a game sponsored by AMD to hide the drama from everyone else.

Something funny that stuck in my mind from a well-known compatriot's review of the RTX 4060 Ti / RX 7600:
4060 Ti - driver installation and games ran without problems.
7600 - driver installation and... error, error, error. It was solved after several wasted hours, including reinstalling the OS.
He also found that he had to use nVidia software to capture results for the AMD video card in games that did not have a built-in benchmark, because AMD has nothing similar.

I extracted his results from the two reviews because he "omitted" to compare them; they were too unflattering for AMD. How do you compare the 34 fps obtained by the RX 7600 in Cyberpunk (an iGP-level disaster) with the 111 fps obtained by the 4060?
The video cards were tested only at 1080p.



The sources are here
Posted on Reply
#213
AusWolf
GicaHe's an AMD Taliban and there's nothing to discuss with him. I have him on ignore.


The RTX 4090 is the undisputed king at the moment. An expensive king, it's true, and the price is the only weapon these Taliban have, while they deny the army of technologies that add value to an nVidia video card. Anyway, their desperation can be seen in how they use a game sponsored by AMD to hide the drama from everyone else.

Something funny that stuck in my mind from a well-known compatriot's review of the RTX 4060 Ti / RX 7600:
4060 Ti - driver installation and games ran without problems.
7600 - driver installation and... error, error, error. It was solved after several wasted hours, including reinstalling the OS.
He also found that he had to use nVidia software to capture results for the AMD video card in games that did not have a built-in benchmark, because AMD has nothing similar.

I extracted his results from the two reviews because he "omitted" to compare them; they were too unflattering for AMD. How do you compare the 34 fps obtained by the RX 7600 in Cyberpunk (an iGP-level disaster) with the 111 fps obtained by the 4060?
The video cards were tested only at 1080p.



The sources are here
Comparing anything with DLSS 3 FG on sounds more like an Nvidia advert than a review to me. One must not forget about the possible input latency issues with FG.
Posted on Reply
#214
Tek-Check
Dr. DroThis is already possible; the CPU graphics on my motherboard are wired to a USB-C DisplayPort source. But even if that is the case, it's still a rather unusual format for monitors.
iGPUs have been using USB-C for quite some time. I have a Z390 motherboard from Asrock with three monitor ports from the CPU - DP, HDMI and Thunderbolt 3.

The new thing would be installing a USB4 controller on the GPU's PCB, so that the USB-C port carries not only DP video data but other protocols too. We have never had this solution on a GPU.
Posted on Reply
#215
Dr. Dro
Tek-CheckiGPUs have been using USB-C for quite some time. I have a Z390 motherboard from Asrock with three monitor ports from the CPU - DP, HDMI and Thunderbolt 3.

The new thing would be installing a USB4 controller on the GPU's PCB, so that the USB-C port carries not only DP video data but other protocols too. We have never had this solution on a GPU.
Aye, I get you. It's a possibility I suppose, although it'll largely depend on how USB 4 is received, given that the 3.x versions didn't even supplant USB 2.0 (and likely due to the branding mess).

Sadly, on the MEG Z690 Ace the USB-C port that carries a video signal doesn't work with HDMI. I bought a USB-C to HDMI dongle on Amazon to use after I sold my 3090 and had the 4080 on the way; unfortunately, it did not work at all... then I read in the manual that the port only supports DisplayPort. I guess even us nerds need to RTFM sometimes. Fortunately, it's not all a waste; apparently it works just fine with my laptop, and provides a way to use the integrated Radeon graphics with an external display, as the laptop's native HDMI port is wired directly to the RTX 3050.
Posted on Reply
#216
Colddecked
If AMD really wants this chiplet approach to work for GPUs, they need to be more aggressive in securing advanced nodes from TSMC, or they'd better be sure Samsung can really deliver and go as advanced as possible with them.

They also really need their FG tech to be as good as DLSS 3. I just wish Nvidia would toss us 3000 series people a bone with FG... but I guess I'll have to wait for the AMD/Intel solutions...
Posted on Reply
#217
Dr. Dro
ColddeckedThey also really need their FG tech to be as good as DLSS 3. I just wish Nvidia would toss us 3000 series people a bone with FG... but I guess I'll have to wait for the AMD/Intel solutions...
I bet that shortly after FSR 3.0 goes public, if it's really completely hardware agnostic and you can run it on a GTX 980, they'll just announce "DLSS 3.5" with some Ada improvements and the "new ability to run on Ampere, Turing, Pascal and Maxwell", with the implication that it's going to be the send-off gift for the 900 series GPUs... It'd basically be a repeat of what they did with the image sharpening feature: they added it as far back as Kepler shortly after AMD rolled it out to Polaris and announced "other GPUs were coming soon", basically stealing the spotlight there.
Posted on Reply
#218
Tek-Check
ColddeckedIf AMD really wants this chiplet approach to work for GPUs, they need to be more aggressive in securing advanced nodes from TSMC, or they'd better be sure Samsung can really deliver and go as advanced as possible with them.
It's a no and a no, currently.
At TSMC, Apple always has priority and currently uses 90% of 3nm capacity. Plus, for the first time they don't pay for defective dies, which suggests 3nm yields are lower than expected, perhaps 65-70% at the moment.

At Samsung, yields on the GAAFET 3nm node are unknown; unclear if it's 50 or 60% currently, which is low. Nvidia really had problems with them on 8nm.

Everybody wants to be on a cutting-edge node for their most advanced products, but it takes a few years to improve yields towards 90%. It's a painfully slow process...
Posted on Reply
#219
Colddecked
Tek-CheckIt's a no and a no, currently.
At TSMC, Apple always has priority and currently uses 90% of 3nm capacity. Plus, for the first time they don't pay for defective dies, which suggests 3nm yields are lower than expected, perhaps 65-70% at the moment.

At Samsung, yields on the GAAFET 3nm node are unknown; unclear if it's 50 or 60% currently, which is low. Nvidia really had problems with them on 8nm.

Everybody wants to be on a cutting-edge node for their most advanced products, but it takes a few years to improve yields towards 90%. It's a painfully slow process...
That's why I'm saying they need to be more aggressive in securing an advanced node. It doesn't need to be the most advanced, but they cannot afford a disparity with Nvidia/Intel. It's an investment that pays off. Isn't that supposed to be an advantage of the chiplet approach, that you get better yields because each chip is not as big/complex?
Posted on Reply
#220
Kyan
GicaSomething funny that stuck in my mind from a well-known compatriot's review of the RTX 4060 Ti / RX 7600:
4060 Ti - driver installation and games ran without problems.
7600 - driver installation and... error, error, error. It was solved after several wasted hours, including reinstalling the OS.
Did he use DDU? It's well known that Nvidia and AMD don't like each other. Nothing surprising.

It makes me wonder if there's a situation where it works fine. Like starting fresh with AMD and going Nvidia, does it work fine? Or does starting fresh with Nvidia and then going AMD work fine? I've never seen experimentation on this; maybe Guru3D has some tests.
GicaHe also found that he had to use nVidia software to capture results for the AMD video card in games that did not have a built-in benchmark, because AMD has nothing similar.
I'm pretty sure I have seen a perf summary somewhere in Adrenalin. Aren't a ton of tests done without any manufacturer software anyway?
Posted on Reply
#221
Tek-Check
ColddeckedThat's why I'm saying they need to be more aggressive in securing an advanced node. It doesn't need to be the most advanced, but they cannot afford a disparity with Nvidia/Intel. It's an investment that pays off.
I don't think AMD could be more "aggressive". They are currently TSMC's second most preferred client, with access to several of the latest customised nodes.

People need to realise that Apple pays in advance, effectively building entirely new factories for TSMC's next-best node. No one else has that kind of money. Perhaps Nvidia in two years.

AMD has secured 3nm for several Zen 5 products, such as the Turin and Turin Dense CPUs. Server takes priority for the latest and greatest nodes.

There is currently no major disparity between AMD, Nvidia and Intel in process nodes. Intel is behind in server chips; Nvidia is in front in AI chips.
ColddeckedIsn't that supposed to be an advantage of the chiplet approach, that you get better yields because each chip is not as big/complex?
Yes, they can get higher chiplet yields per wafer due to the smaller die size, but the 3nm wafer itself is so much more expensive at the moment that only Apple can afford the capacity they booked and paid for two years ago.
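As a rough back-of-envelope illustration of that yield advantage, here's a minimal sketch using the textbook Poisson defect model; the defect density and die sizes below are made-up example numbers, not actual TSMC figures:

# Poisson defect model: yield ~= exp(-defect_density * die_area)
# Hypothetical example values, not real foundry data.
import math

def die_yield(defects_per_cm2, die_area_mm2):
    """Expected fraction of defect-free dies for a given die area."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)  # mm^2 -> cm^2

d0 = 0.1  # assumed defect density, defects per cm^2
print(f"600 mm^2 monolithic die: {die_yield(d0, 600):.1%}")  # ~54.9%
print(f"75 mm^2 chiplet:         {die_yield(d0, 75):.1%}")   # ~92.8%

Even before counting how many more small dies fit on a wafer, the smaller die is far more likely to come out defect-free, which is the advantage being described.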
Posted on Reply
#222
AusWolf
Dr. DroI bet that shortly after FSR 3.0 goes public, if it's really completely hardware agnostic and you can run it on a GTX 980, they'll just announce "DLSS 3.5" with some Ada improvements and the "new ability to run on Ampere, Turing, Pascal and Maxwell", with the implication that it's going to be the send-off gift for the 900 series GPUs... It'd basically be a repeat of what they did with the image sharpening feature: they added it as far back as Kepler shortly after AMD rolled it out to Polaris and announced "other GPUs were coming soon", basically stealing the spotlight there.
Nvidia has already said that the hardware necessary to make DLSS 3 work is there in Turing and Ampere, just "probably" not fast enough to do it at the proper speed (whatever that means). That, to me, is a hint that DLSS 3 for Ampere and Turing is coming soon - probably when Ada sales have reached or exceeded Nvidia's expectations.
Posted on Reply
#223
BoboOOZ
yannus1I remember a rumour that Intel would stop Arc production after the first gen. It wasn't true.
That's not what that rumour was saying. The rumour was that Intel would do just one or two cards each generation, with very low ambitions, instead of having a complete lineup and competing at all levels. For all intents and purposes, it seems to have been true. But that doesn't mean this rumour will turn out to be true too; hopefully not.
Posted on Reply
#224
R0H1T
There were multiple rumors; I'm sure MLID/WTF tech probably cooked up a few :ohwell:
Posted on Reply
#225
SCP-001
KyanI'm pretty sure I have seen a perf summary somewhere in Adrenalin. Aren't a ton of tests done without any manufacturer software anyway?
There is a version built into Adrenalin. Hit "Alt+R" and it'll bring up a performance metrics overlay that you can toggle on and off. I can't remember if there is a logging function in it, though.
Posted on Reply