Monday, February 20th 2023

AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA

AMD's next-generation RDNA4 graphics architecture will retain a design focus on gaming performance, without being drawn into an AI feature-set competition with rival NVIDIA. David Wang, SVP of the Radeon Technologies Group, and Rick Bergman, EVP of AMD's Computing and Graphics Business, gave an interview to Japanese tech publication 4Gamer in which they dropped the first hints about the direction the company's next-generation graphics architecture will take.

While acknowledging NVIDIA's momentum in the GPU-accelerated AI space, AMD said it doesn't believe that image processing and performance upscaling are the best use of the GPU's AI-compute resources, and that the client segment still hasn't found extensive use for GPU-accelerated AI (or, for that matter, even CPU-based AI acceleration). AMD's own image processing tech, FSR, doesn't leverage AI acceleration. Wang said that, with the company introducing AI acceleration hardware in its RDNA3 architecture, he hopes AI will be leveraged to improve gameplay itself, such as procedural world generation, NPCs and bot AI, adding the next level of complexity, rather than spending the hardware resources on image processing.
AMD also stressed the need to make the GPU more independent of the CPU in graphics rendering. The company has taken several steps in this direction over the past few generations, the most recent being the multi-draw indirect accelerator (MDIA) introduced with RDNA3. Using it, software can batch multiple instanced draw commands into a single submission that the GPU works through on its own, greatly reducing CPU-level overhead. RDNA3 is up to 2.3x more efficient at this than RDNA2. Expect more innovations along these lines with RDNA4.
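
To illustrate the kind of CPU offload MDIA targets, below is a minimal C++ sketch of the generic multi-draw indirect path as exposed by Vulkan; it is not AMD's hardware mechanism itself, and it assumes the device, pipeline, command buffer and a host-visible indirect buffer have already been created elsewhere. The point is that the CPU records a single command while the GPU consumes an arbitrarily long array of draw arguments.

#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

// Sketch only: 'indirectBuffer' was created with VK_BUFFER_USAGE_INDIRECT_BUFFER_BIT,
// 'mapped' points at its host-visible memory, and the pipeline, index and vertex
// buffers are bound elsewhere. drawCount > 1 needs the 'multiDrawIndirect' feature.
void recordMultiDrawIndirect(VkCommandBuffer cmd,
                             VkBuffer indirectBuffer,
                             void* mapped,
                             const std::vector<VkDrawIndexedIndirectCommand>& draws)
{
    // The CPU writes one small argument struct per draw instead of recording
    // thousands of individual draw calls.
    std::memcpy(mapped, draws.data(),
                draws.size() * sizeof(VkDrawIndexedIndirectCommand));

    // A single recorded command covers the whole array; the GPU front end walks
    // the buffer and issues every instanced draw without further CPU involvement.
    vkCmdDrawIndexedIndirect(cmd, indirectBuffer, /*offset*/ 0,
                             static_cast<uint32_t>(draws.size()),
                             sizeof(VkDrawIndexedIndirectCommand));
}

Hardware blocks like RDNA3's MDIA accelerate exactly this pattern, letting the GPU front end churn through the argument buffer without per-draw CPU work.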

AMD understandably didn't say anything about the "when," "what," and "how" of RDNA4, as its latest RDNA3 architecture is just off the ground and awaiting a product ramp through 2023 into various market segments, spanning iGPUs, mobile GPUs, and mainstream desktop GPUs. RDNA3 currently powers the Radeon RX 7900 series high-end graphics cards and the iGPUs of the company's latest 5 nm "Phoenix Point" Ryzen 7000-series mobile processors. You can catch the 4Gamer interview at the source link below.
Sources: 4Gamers.net, HotHardware

221 Comments on AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA

#101
AusWolf
fevgatos: Whoever is calling them gimmicks is clueless / hasn't tried them or is sworn amd fan. It's not debatable among normal sane people
I've tried them all except for frame generation, I'm not an AMD fan, and I'm calling them gimmicks.
#102
Vayra86
fevgatos: Whoever is calling them gimmicks is clueless / hasn't tried them or is sworn amd fan. It's not debatable among my circlejerk-bubble normal sane people
FTFY. Funnily the first sentence up here proves the last. You should buy a mirror. Free RT.
#103
Dr. Dro
Honestly, AMD seems to have enough on their plate. RDOA 3 missed the mark tremendously (it fell short of internal projections, AMD's own marketing claims and end-user performance expectations, and still had that manufacturing defect at launch), and in the consumer space, Zen 4 was far from a resounding commercial success.

No matter how much you may like AMD, this round fell flat in comparison to Navi 21 and Zen 3, which are probably AMD's most solid products in a very, very long time. The extremely poor sales numbers for the RDNA 3 lineup in general, plus the crazy pack-ins and bundles on Zen 4, show exactly what's happening.

They can either stick to their guns or chase innovation, which they cannot afford at the moment. Perhaps, as defeatist as Wang's argument is, it may very well be the best course of action for the company right now, even if the market has clearly spoken that it is interested in Nvidia's technologies and messaging: the 4090 is outselling everything else on the market combined right now, even at its absurdly high price. The aforementioned Nokia analogy may very well apply here.

If anything, they have some soul searching to do. Godspeed.

Late edit: I don't hate AMD... Why would I?
#104
pavle
What's all this banter about AI? AI is a non-issue. Just find a way (whatever it proves to be) to make ray tracing usable in graphics, now that it's so widely adopted that it's part of Direct3D.
With so many graphics people ATi has, one has to wonder what they are all doing there; or are they peradventure busy patching up all the deficiencies their architecture has accumulated through the ages?
Inquiring minds want to know. :)
#105
evernessince
nguyen: Would be funny if Nvidia delivered an AI model for NPC and bots but run like crap on AMD (4090 has 5x the tensor ops throughput of 7900XTX), basically what David Wang said himself

Then it will become gimmick according to some people :rolleyes:.
The thing is that the AI model would be trained by the developers, not the end user.

I don't really see game engines implementing an AI feature that's going to screw over console compatibility, especially for something that you can't turn off like NPC AI.
#106
Denver
Dr. Dro: Honestly, AMD seems to have enough on their plate. RDOA 3 missed the mark tremendously (falling short of internal projections, AMD's own marketing claims, end-user performance expectations and still had that manufacturing defect at launch), and in the consumer space, Zen 4 was far from a resounding commercial success.

No matter how much you may like AMD, this round fluked in comparison to Navi 21 and Zen 3, which are probably AMD's most solid products in a very, very long time. The extremely poor sales numbers for RDNA 3 lineup in general plus the crazy pack-ins and bundles on Zen 4 show exactly what's happening.

They can either stick to their old guns, or run after innovation, which they cannot afford at the moment. Perhaps, as defeatist as Wang's argument is, it may very well be the best course of action for the company right now, even if the market has clearly spoken that it is interested in Nvidia's technologies and statements: the 4090 is outselling everything else in the market combined right now, even at its absurdly high price. The aforementioned Nokia analogy may very well apply here.

If anything, they have some soul searching to do. Godspeed.
Zen 4 is an absolute success, but only in servers, where margins are very high and AVX-512 sees effective use. Meanwhile, the AM5 platform is relatively expensive; here, the motherboards cost twice as much as the Intel models.

In my opinion, given that CPUs like the R5 and R7 are aimed at gamers, the X3D line should have been released from the start, with the non-X parts following later as the cheaper alternatives. That would make more sense.
#107
ThrashZone
Hi,
Yeah, that's exactly what we need: AIsplaining :laugh:
#108
Dr. Dro
Denver: Zen4 is an absolute success, but only for servers whose margin is very high and AVX512 has effective use. While the AM5 platform is relatively expensive, here the motherboards cost twice as much as the Intel models.

In my opinion, knowing that CPUs like R5 and R7 are aimed at gamers, the X3D line should have been released from the beginning and the Non X line would be the cheaper alternatives released later. That would make more sense.
Only for servers; the client segment is a biblical flop, to the point that bundles giving away motherboards or RAM (in some cases both) were needed to shift stock. The high cost of the motherboards and the DDR5 requirement are a pretty effective deterrent to adoption, especially since gamers could just buy the 5800X3D and have a flawless experience already.
#109
ixi
RH92: Me ? Nope ... my GPU does though www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/ :

I mean you guys need to wake up , it's 2023 we are well past 2018 , both ray tracing and machine learning anti aliasing have seen wide adoption and aren't going anywhere if anything else they are gaining importance over raster every year ... In risk of repeating myself , AMD is failing to read the room big time !
Well, I didn't say that your GPU can't do it; I was wondering whether you actually use those features. If you have them but don't use them, I dunno why we should brag about it. "You need it" - blah, blah. :D

Of course it's great to have more options rather than fewer, but on the other hand, like I mentioned: if you don't use it, then meh.

On my 3060 Ti I have never used the tensor cores, and I've tried ray tracing only in Minecraft, where I got 30 fps... cubes, cubes give you 30 fps... at that moment I was like: yeah, I have it, but it's not enjoyable, so meh. Not to mention it only looks beautiful on the few maps Nvidia created, and that's it.
#110
JohH
I think it's the right approach. RDNA should focus on gaming performance.

In the future, when AI performance becomes important, AMD would be much better off adding a Xilinx AI accelerator as a separate chiplet.
#111
fevgatos
Vayra86: FTFY. Funnily the first sentence up here proves the last. You should buy a mirror. Free RT.
Okay, do you think for example w1z considers them gimmicks? Dodge the question, go ahead
#113
AusWolf
fevgatos: Okay, do you think for example w1z considers them gimmicks? Dodge the question, go ahead
What has that got to do with what I, @Vayra86, or anyone else consider them? I'm under the impression that we're all adults, fully capable of forming independent opinions.
#114
evernessince
RH92: Me ? Nope ... my GPU does though www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/ :

I mean you guys need to wake up , it's 2023 we are well past 2018 , both ray tracing and machine learning anti aliasing have seen wide adoption and aren't going anywhere if anything else they are gaining importance over raster every year ... In risk of repeating myself , AMD is failing to read the room big time !
Dimitriman: All I read is: "We will not have an answer to DLSS 3.0 with RDNA 4". The speech seems entirely geared towards expectation management.
Seems AMD GPU division is happy to continue living with Nvidia scraps.
From the article:

"While acknowledging NVIDIA's momentum in the GPU-accelerated AI space, AMD said it doesn't believe that image processing and performance upscaling are the best use of the GPU's AI-compute resources, and that the client segment still hasn't found extensive use for GPU-accelerated AI (or, for that matter, even CPU-based AI acceleration). AMD's own image processing tech, FSR, doesn't leverage AI acceleration. Wang said that, with the company introducing AI acceleration hardware in its RDNA3 architecture, he hopes AI will be leveraged to improve gameplay itself, such as procedural world generation, NPCs and bot AI, adding the next level of complexity, rather than spending the hardware resources on image processing."

For the love of god, people, read the short article. AMD is not giving up on AI; they are just focusing on what will have the biggest impact for gaming on their gaming GPUs.

@Dimitriman AMD announced a DLSS 3.0 competitor some time ago, during the launch of RDNA3 GPUs. Technically you are correct that AMD won't introduce a DLSS 3.0 competitor with RDNA4, but that's because they will already have it under RDNA3.
#115
fevgatos
Steevo: www.techpowerup.com/review/hogwarts-legacy-benchmark-test-performance-analysis/7.html

Maybe read this conclusion and much like art draw from it what you want to see. Asking another person to speak for someone else is strange.
Can you be more exact about what you want me to read? I didn't find any mention of DLSS or FG. Go to the trouble of quoting the part you're talking about, please.
AusWolf: What has that got to do with what I, @Vayra86 or anyone else considers them? I'm under the impression that we're all adults, fully capable of forming independent opinions.
Uhm, because he literally said that they are only considered features in circlejerk bubbles. Anyway, it doesn't matter; thinking that upscalers are gimmicks just screams ignorance or agenda, and you can't argue with either.
#116
AsRock
TPU addict
Vya Domus: Who really cares, they're both right and wrong, besides upscaling the ML hardware accelerators really are worthless for the consumer space, at the same time they wont be used for anything else any time soon.

You're both beyond utterly wrong though, over 3 billion in revenue is not insignificant by any stretch of the imagination.

They've always made a huge chunk of their money from consumer products, sadly for Nvidia marketing and sponsorship deals don't work very well outside of the consumer market. You can't buy your way to success as easily and actually have to provide extremely competitive pricing because ROI is critical to businesses as opposed to regular consumers so you can't just price everything to infinity and provide shit value.
Nvidia will keep at it though, as it gives them a selling point; kinda like ray tracing, it's all about getting you to keep buying with each release.
#117
AusWolf
fevgatos: Uhm, because he literally said that they are only considered features in circlejerk bubbles. Anyways, doesn't matter, to think that up scalers are gimmicks just screams ignorance or agenda, and you can't argue with either
I've tried both DLSS and FSR, and concluded that they look like crap on my 1080p monitor. Where's the agenda? :roll:

If forming an opinion based on first-hand experience means ignorance to you, then maybe you're the one with the agenda and there's nothing left to talk about.
#118
Patriot
AusWolf: I've tried both DLSS and FSR, and concluded that they look like crap on my 1080p monitor. Where's the agenda? :roll:

If forming an opinion based on first-hand experience means ignorance to you, then maybe you're the one with the agenda and there's nothing left to talk about.
Having a stack of GPUs... DLSS 1.0 and FSR 1.0 were like smearing Vaseline on your screen... DLSS 2.1 is fucking black magic and makes things look better than without it... I also have a 4K 144 Hz panel, and crispness is required.

The stack of cards: Turing-gen RTX Quadros, RTX 3080/Ti/3090, RX 6700 XT/6900 XT.
FSR 2.0 is pretty much on par with DLSS 2.0/2.1, depending on the game.

I have not messed with DLSS 3, as that is frame gen and needs a 4000-series card, which I don't have... I also don't see frame gen as an advantage, as I am into competitive low-latency play.
#119
AusWolf
Patriot: Having a stack of gpus... DLSS 1.0 and FSR 1.0 were like smearing Vaseline on your screen... DLSS 2.1 is fucking black magic and makes things better than without... I also have a 4k 144hz panel and crispness is required.

Stack of cards are Turing gen RTX Quadros, RTX3080/ti/90, RX6700xt/6900xt
FSR2.0 is pretty much on par with DLSS 2.0/2.1 depending on the game.
For the record, I played Cyberpunk 2077 with an RTX 2070 and DLSS 2.4. The 'Ultra Quality' mode was bearable, but nothing like 1080p native. All the other modes looked like a Van Gogh painting instead of a moving, breathing game. Your experience might differ at higher resolutions, but 1080p and upscaling just aren't meant to be, imo. So for me, it's a gimmick.
#120
Dr. Dro
AusWolf: For the record, I played Cyberpunk 2077 with an RTX 2070 and DLSS 2.4. The 'Ultra Quality' mode was bearable, but nothing like 1080p native. All the other modes looked like a Van Gogh painting instead of a moving, breathing game. Your experience might differ at higher resolutions, but 1080p and upscaling just aren't meant to be, imo. So for me, it's a gimmick.
Sorry mate, Van Gogh's majestic artworks have far more life than Cyberflop 2077 at 8K native ever would. I personally don't understand how this game still has any traction; I suppose it's people's way of coping with the fact that we were lied to for years and that they never delivered even half the game they promised, after years upon years of delays.
#121
AusWolf
Dr. Dro: Sorry mate, Van Gogh's majestic artworks have far more life than Cyberflop 2077 at 8K native would ever have. I personally don't understand how this game still has any traction, I suppose it's people's way to cope with the fact we've been lied to for years and that they never delivered even half the game they promised after years upon years of delays.
It was not a comment on the game (although I didn't find it as bad as people commonly do). It was a reference to the image being blurry as heck with basically any DLSS setting (except for off). I didn't think I had to explain, but here we go.
#122
ratirt
Patriot: I have not messed with DLSS 3 as that is frame gen and needs 4k gen that I don't have... I also don't see frame gen as an advantage as I am into competitive low latency play.
That is a bummer. DLSS 3 needs the 4th gen, and then it'll be DLSS 4 with the 5th gen. See a pattern here? That is one of the reasons it is hard for me to pay NV hard cash. They change their products every gen, chasing something, I guess. DLSS 3 would have been nice for a low-end card that struggles, but the problem is you have to use it on a top-end one. RT is the future, but we are not there yet, and that is why: even if it is all cotton-candy sweet, it is still in the making and not ready.
AusWolf: For the record, I played Cyberpunk 2077 with an RTX 2070 and DLSS 2.4. The 'Ultra Quality' mode was bearable, but nothing like 1080p native. All the other modes looked like a Van Gogh painting instead of a moving, breathing game. Your experience might differ at higher resolutions, but 1080p and upscaling just aren't meant to be, imo. So for me, it's a gimmick.
Upscaling is good for 4K, or for a low-end card that can barely play anything. 1080p plus an upscaler is a misunderstanding at best.
AusWolf: It was not a comment on the game (although I didn't find it as bad as people commonly do). It was a reference to the image being blurry as heck with basically any DLSS setting (except for off). I didn't think I had to explain, but here we go.
I think you can sharpen the image. The problem is that FSR and DLSS use different render scales even when set to the same preset, so you have to find a sweet spot for each one separately. DLSS does not have to be as blurry.
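
For anyone wondering what "different scale" means in practice, here is a rough C++ sketch using the per-axis ratios commonly published for the FSR 2 presets (DLSS 2 uses broadly similar ones); the exact factors vary per game and per version, so treat the numbers as illustrative rather than vendor-exact.

#include <cstdio>

int main()
{
    // Per-axis render-scale divisors commonly published for the FSR 2 presets
    // (DLSS 2 uses broadly similar ratios); treat them as illustrative, not exact.
    const struct { const char* mode; double divisor; } presets[] = {
        {"Quality",           1.5},
        {"Balanced",          1.7},
        {"Performance",       2.0},
        {"Ultra Performance", 3.0},
    };
    const int outW = 1920, outH = 1080;  // a 1080p output target

    for (const auto& p : presets)
        std::printf("%-17s renders ~%4d x %3d internally, upscaled to %d x %d\n",
                    p.mode,
                    static_cast<int>(outW / p.divisor),
                    static_cast<int>(outH / p.divisor),
                    outW, outH);
    return 0;
}

At a 1080p target, even the Quality preset drops the internal resolution to roughly 1280x720, which goes a long way toward explaining the softness being described here, while the same preset at 4K still renders at around 1440p.
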
fevgatos: Then 4090 is in fact very efficient. Actually it is the most efficient card out there, especially for heavier workloads, not just gaming. I have it with a 320w power limit and it performs better than at stock, I can post you some record breaking numbers at just 320w.
There are plenty of showcases of that, so there's no need to do it. If you limit it, you will lose performance. The 4090 is a very powerful card, and I'm sure chopping off some performance for less power draw is not a disaster, but you still lower the performance for some power saving. It is a good tradeoff, though.
#123
AusWolf
ratirt: Upscale is good for 4k or low end card that barely can play anything. 1080p and upscaler is a misunderstanding at best.

I think you can sharpen the image. The problem is FSR and DLSS have different scale even though set the same. You have to find a sweet spot for each one separately. DLSS does not have to be blurry as much.
My sweet spot is DLSS/FSR turned off. Like I said, mileage may vary at higher resolutions, but for me at 1080p with no plans to upgrade, it's not gonna be a thing. Spending hundreds on a new monitor only to be forced to spend hundreds more on a graphics card and/or use some upscaling trickery for playable framerates sounds counter-intuitive to me. That's why I think it's a gimmick.
#124
Dr. Dro
AusWolf: It was not a comment on the game (although I didn't find it as bad as people commonly do). It was a reference to the image being blurry as heck with basically any DLSS setting (except for off). I didn't think I had to explain, but here we go.
Oh, I know. Really needed to vent though, sorry ;):laugh:
#125
fevgatos
ratirt: There is plenty of showcases about that so there's no need to do that. If you limit it you will lose performance. 4090 is a very powerful card and I'm sure, chopping off some performance for less power draw is not a disaster but still you do lower the performance for some power saving. It is a good tradeoff though.
Yeah, I lost around 1.3% :roll:
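
Not a description of the actual setup discussed above, just a hedged C++ sketch of how such a cap can be applied programmatically through NVIDIA's NVML library (the same mechanism nvidia-smi builds on); it assumes the NVML headers and driver library are installed and the process has administrator rights, and most people would simply use nvidia-smi or a tuning utility instead.

#include <nvml.h>
#include <cstdio>

int main()
{
    // Sketch: cap GPU 0 at 320 W (NVML expects milliwatts).
    // Needs administrator/root rights; the driver rejects values outside
    // the board's allowed power-limit range.
    if (nvmlInit_v2() != NVML_SUCCESS) {
        std::puts("NVML init failed");
        return 1;
    }

    nvmlDevice_t gpu;
    if (nvmlDeviceGetHandleByIndex_v2(0, &gpu) == NVML_SUCCESS) {
        unsigned int currentMw = 0;
        nvmlDeviceGetPowerManagementLimit(gpu, &currentMw);

        nvmlReturn_t rc = nvmlDeviceSetPowerManagementLimit(gpu, 320000);
        std::printf("current limit: %u mW, set to 320000 mW -> %s\n",
                    currentMw, nvmlErrorString(rc));
    }

    nvmlShutdown();
    return 0;
}
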
AusWolf: I've tried both DLSS and FSR, and concluded that they look like crap on my 1080p monitor. Where's the agenda? :roll:

If forming an opinion based on first-hand experience means ignorance to you, then maybe you're the one with the agenda and there's nothing left to talk about.
At 1080p, I dunno, it might be terrible, but it's not really needed at 1080p anyway. At 4K you really can't tell the difference. Actually, the magical part is that in lots of games it actually increases image quality: plenty of games with a poor TAA implementation look like crap at native. So not only do you get a noticeable performance increase, and not only does your GPU draw much less power with it active, but in some games you also get better graphics.