Monday, February 20th 2023

AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA
AMD's next-generation RDNA4 graphics architecture will retain a design focus on gaming performance, without being drawn into an AI feature-set competition with rival NVIDIA. David Wang, SVP of the Radeon Technologies Group, and Rick Bergman, EVP of the Computing and Graphics Business at AMD, gave an interview to Japanese tech publication 4Gamer, in which they dropped the first hints about the direction the company's next-generation graphics architecture will take.
While acknowledging NVIDIA's momentum in the GPU-accelerated AI space, AMD said it doesn't believe that image processing and performance upscaling are the best uses of a GPU's AI-compute resources, and that the client segment still hasn't found extensive use for GPU-accelerated AI (or, for that matter, even CPU-based AI acceleration). AMD's own image-processing tech, FSR, doesn't leverage AI acceleration. Wang said that with the company introducing AI acceleration hardware in its RDNA3 architecture, he hopes AI will be leveraged to improve gameplay itself, such as procedural world generation, NPCs, and bot AI, adding the next level of complexity, rather than spending the hardware resources on image processing.

AMD also stressed the need to make the GPU more independent of the CPU in graphics rendering. The company has taken several steps in this direction over the past several generations, the most recent being the multi-draw indirect accelerator (MDIA) component introduced with RDNA3. Using this, software can dispatch multiple instanced draw commands in a single batch to the GPU, greatly reducing CPU-level overhead. RDNA3 is up to 2.3x more efficient at this than RDNA2. Expect more innovations along these lines with RDNA4.
AMD understandably didn't say anything about the "when," "what," and "how" of RDNA4, as its latest RDNA3 architecture is just off the ground and awaiting a product ramp through 2023 into market segments spanning iGPUs, mobile GPUs, and mainstream desktop GPUs. RDNA3 currently powers the Radeon RX 7900 series high-end graphics cards, as well as the iGPUs of the company's latest 4 nm "Phoenix" Ryzen 7000-series mobile processors. You can catch the 4Gamer interview in the source link below.
Sources:
4Gamer.net, HotHardware
221 Comments on AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA
Instead of talking about RDNA4, how about actually releasing a new RDNA3 card? The RX 7900 cards came out over two months ago and we haven't really gotten much more than a sniff of what is to come. Thus far, NVIDIA has released the RTX 4090, 4080, and 4070 Ti, is primed to release three versions of the 4070, and has been talking about the 4060 for a while now. We've heard a few things about the RX 7800 XT, but that was a long time ago. We haven't heard anything about when the 7800 cards will be available, let alone the 7700 and 7600 cards!
Maybe AMD should fix the RDNA3 situation before talking about what comes next. I'm sure people care more about what's happening now than what will happen in a few years' time.
Developers don't care about margins; that's not relevant unless....
Is this an NVIDIA vs AMD troll discussion, or a discussion on whether AMD is right not to invest in dedicated AI hardware?
That $750 price would put it on par with the 6950 XT on performance. The headline will be "last gen $1100 performance for only $750!!! what a steal!!"
Keep in mind NVIDIA released a GT 1030 with DDR4 years after the GDDR5 version. The GDDR5 1030 was already bad, and the DDR4 model took a massive performance hit, running 30-40% worse in games under the same name. It deceived the average tech user; look it up. And 85% of people couldn't tell you the difference between GDDR5 and DDR4 if asked that as a straight question.
Major game projects are generally in development for several years, or so we've been told. If that is accurate, then what sort of hardware is in use at the developer's studio during that long development time? Is everyone using the x86 Cray in the basement? Tell me, how do you construct a game that will be dependent on hardware and drivers that don't yet exist, which the game will require to run well upon release? (Yes, that is a rhetorical question.) But it seems that frame generation and upscaling are being employed as the "Get Out Of Jail Free" card to smooth over game-development shortcomings.
As for RT, it won't be a thing for me personally until a $300 card made by IDon'tCare can run a heavily ray-traced, AAA game using Very High IQ settings at 1080p and never drop below 60 fps. And, without resorting to any sort of upscaling or frame-generation sleight-of-hand. Until then, I can't be bothered. I refuse to be Captain Ahab chasing the globally illuminated White Whale, Moby Dick, all over the Atlantic Ocean that is thousands of dollars deep. For what? To play Assassin's Creed AI 2077? Forget that.
I don't know precisely what AMD's plans are, but I hope they see the obvious place where they could move some serious product; namely, that lower mid-tier range where most PC gamers live. Something akin to the RX 6750 XT class, decently built, maybe with the new encoder and HDMI standard, and with enough AIB incentives to ensure being in the $329 ~ $349 USD range at retail. We don't need no stinkin' AI.
Except games are not developed to run on hardware of tomorrow - more often than not, games are in fact tuned to hardware of yesteryear. Because that's where the market is - like you say yourself - the midrange and 300 eur card is the moment a performance level is truly 'enabled' in gaming. But the idea to tune to old hardware is in conflict with the approach of shiny graphics to impress and attract audience for a game. Upscaling technologies kinda solve that conflict.
RT is a funny one though, much like numerous heavy post-processing effects, and that's where the FSR/DLSS approaches truly shine. Anything with volumetric or dynamic lighting, whether raster or RT based, simply incurs a heavy performance hit because it's dynamic, in real time. Anything that is calculated in real time will take render time - continuously! - and incurs an FPS hit. No matter how fast the GPU. Upscaling technologies claw back some of that lost time, and in a way, enable more effects rather than applying a lower quality pass to make the game playable. Now, you can run through Cyberpunk's areas without losing half your FPS because of a volumetric patch of fog - the dip is less pronounced - even though the rest of the game might run a solid 60 FPS.
As for AI... I think it's a dangerous development in many ways. It doesn't inspire creativity, but rather tunnel vision, much like typical algorithms do: in the end, whether programmed or trained 'entities', human input is the basis for its decision-making ability, and as such it is limited by it. We've already seen some shining examples from ChatGPT. For gaming, I see a purpose in game development, but not in the consumer space, and there is the risk of an overall decline in quality titles as the barrier to entry gets lower.
If you say "yeah, but they have RT," well, AMD's RDNA 3 can do RT as well; its AI accelerators can handle that.
But what about other things, like NPC and bot optimization? So what we're looking at is which platform is more appealing to game devs for making games more fun and immersive.
Do the same with any of the AI processing, make it vendor-specific, and you'll lose out. To sell your game well, you'll need to make sure as many people as possible can run it. And since games are mostly developed once and ported over, that applies not just to PC but also to consoles.
So we're back to RDNA2 compatibility for the given time.
Any developer even remotely interested in using hardware to improve live AI in games would be smart to do it in a way that runs on all of them without hitting performance too much.
Realtime raytracing has to look at it the same way. If it doesn't run sufficiently well on the midrange cards of both (or now all three) vendors, it won't be more than an interesting gimmick.
In the current market that would be a RTX 3060, RX 6650 XT and A750.
Make it run well there at high settings without upsampling and we're back where it used to be for many years.
The sub 350 cards always used to be able to run the then current games at then common resolutions at high settings. And without upsampling, because there was no upsampling.
In short, no developer would bring out a game in 2023 that uses all the AI and RT stuff only to set the hardware requirements to "RDNA3/Ada Lovelace or higher".
Last quarter:
greedia GPU business - 1.57 billion
amdia GPU + console business - 1.6 billion
how were the margins, cough?
FWIW I paid 'extra' for the performance per form factor on this one, didn't want an ageing 1650 or anaemic RX6400. Curious, from your unique perspective, is there a use case/buying case in your mind where it is acceptable to buy Nvidia without the head shake? Or does every buyer fit in the same bucket. I'm not after an argument, I'm genuinely curious what you think.
"Sure, let's make it prettier with RT but uglier with DLSS or FSR!" <- Somehow that makes no sense.
I alluded to this exactly in my post lol, except you cut all of that out, including my rationale, given I don't draw an arbitrary line where I think the tech is unacceptable despite the results and just conclude it's ugly. Well done. Second question ignored too, nice.