
AMD RDNA4 Architecture to Build on Features Relevant to Gaming Performance, Doesn't Want to be Baited into an AI Feature Competition with NVIDIA

You are wrong there. Society is already addicted to instant results; it won't take a couple of generations.
Would you jump off a cliff if you thought you could get a better TimeSpy score?
At the moment we still have enough people running companies who consider "long-term benefits". People applying as pure "AI concept artists/designers" will get turned down by big companies.
But I don't know if this will still be the case in 50 years, once the millennials/early Gen Z retire...
 
Sure, Nvidia has more stuff in their GPUs, but whether you call them features or gimmicks is highly debatable.
Whoever is calling them gimmicks is clueless, hasn't tried them, or is a sworn AMD fan. It's not debatable among normal, sane people.
 
Modders have already incorporated ChatGPT into Bannerlord; AI-enhanced storytelling will be neat.
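Purely as an illustration of what such a mod boils down to (this is not the actual mod's code, and the model name, prompt, and helper are hypothetical placeholders), wiring an NPC's dialogue to a chat-completion API can look roughly like this:

```python
# Hypothetical sketch: generate in-character NPC dialogue from a chat model.
# Not the Bannerlord mod's actual code; model name and prompt are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def npc_reply(npc_background: str, player_line: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat-completion model would do
        messages=[
            {"role": "system",
             "content": f"You are a medieval NPC. Background: {npc_background}. "
                        "Stay in character and answer in one or two sentences."},
            {"role": "user", "content": player_line},
        ],
        max_tokens=80,
    )
    return response.choices[0].message.content

print(npc_reply("a grumpy blacksmith", "Can you repair my sword?"))
```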

So yeah, AMD is way behind in everything
Myeah... too bad the game isn't fun. It's a repetitive POS copied over from part 1. AI fits right in; generic, randomly generated BS is the whole game. The game isn't even properly finished, btw, but the devs say it is.

If anything, this proves AI has emerged in a great place: bottom-barrel content :)

Also, how does this affect what GPU you run it on? This is Gaming As A Service, buddy, not client-side AI.

Excellent example..
Yeah, it really is lol.

This is how buzzwords get people's fantasies to run wild. There isn't a single good game with AI in it, and Bannerlord isn't a better game because of it.
 
Whoever is calling them gimmicks is clueless, hasn't tried them, or is a sworn AMD fan. It's not debatable among normal, sane people.
I've tried them all except for frame generation, I'm not an AMD fan, and I'm calling them gimmicks.
 
Whoever is calling them gimmicks is clueless, hasn't tried them, or is a sworn AMD fan. It's not debatable among my circlejerk-bubble normal, sane people.
FTFY. Funnily enough, the first sentence up here proves the last. You should buy a mirror. Free RT.
 
Honestly, AMD seems to have enough on their plate. RDOA 3 missed the mark tremendously (falling short of internal projections, AMD's own marketing claims, and end-user performance expectations, and it still had that manufacturing defect at launch), and in the consumer space, Zen 4 was far from a resounding commercial success.

No matter how much you may like AMD, this round flopped in comparison to Navi 21 and Zen 3, which are probably AMD's most solid products in a very, very long time. The extremely poor sales numbers for the RDNA 3 lineup in general, plus the crazy pack-ins and bundles on Zen 4, show exactly what's happening.

They can either stick to their guns or chase innovation, which they cannot afford at the moment. Perhaps, as defeatist as Wang's argument is, it may very well be the best course of action for the company right now, even if the market has clearly spoken that it is interested in Nvidia's technologies and statements: the 4090 is outselling everything else in the market combined right now, even at its absurdly high price. The aforementioned Nokia analogy may very well apply here.

If anything, they have some soul searching to do. Godspeed.

Late edit: I don't hate AMD... Why would I?
 
What's all this banter about AI? AI is a non-issue. Just find a way (whatever it proves to be) to make ray tracing usable in graphics, given it's so widely adopted that it's part of Direct3D.
With as many graphics people as ATi has, one has to wonder what they are all doing there; or are they peradventure busy patching up all the deficiencies their architecture has accumulated through the ages?
Inquiring minds want to know. :)
 
It would be funny if Nvidia delivered an AI model for NPCs and bots that ran like crap on AMD (the 4090 has 5x the tensor throughput of the 7900 XTX), which is basically what David Wang said himself.



Then it will become a gimmick according to some people :rolleyes:.

The thing is that the AI model would be trained by the developers, not the end user.

I don't really see game engines implementing an AI feature that's going to screw over console compatibility, especially for something that you can't turn off like NPC AI.
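To make the console-compatibility point concrete: a developer-trained NPC/bot model can be shipped as a plain asset and run on the CPU of any platform, with no vendor-specific hardware involved. A minimal sketch, assuming ONNX Runtime and a made-up model file and feature layout:

```python
# Minimal sketch: run a developer-shipped bot policy on the client CPU.
# "assets/bot_policy.onnx" and the 32-feature input are illustrative only.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "assets/bot_policy.onnx",
    providers=["CPUExecutionProvider"],  # works on consoles and any GPU vendor
)

def pick_action(game_state: np.ndarray) -> int:
    """game_state: float32 feature vector describing what the bot can observe."""
    input_name = session.get_inputs()[0].name
    logits = session.run(None, {input_name: game_state[None, :].astype(np.float32)})[0]
    return int(np.argmax(logits))  # index of the highest-scoring action

# Example call with 32 dummy features
action = pick_action(np.random.rand(32).astype(np.float32))
```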
 
Honestly, AMD seems to have enough on their plate. RDOA 3 missed the mark tremendously (falling short of internal projections, AMD's own marketing claims, and end-user performance expectations, and it still had that manufacturing defect at launch), and in the consumer space, Zen 4 was far from a resounding commercial success.

No matter how much you may like AMD, this round flopped in comparison to Navi 21 and Zen 3, which are probably AMD's most solid products in a very, very long time. The extremely poor sales numbers for the RDNA 3 lineup in general, plus the crazy pack-ins and bundles on Zen 4, show exactly what's happening.

They can either stick to their guns or chase innovation, which they cannot afford at the moment. Perhaps, as defeatist as Wang's argument is, it may very well be the best course of action for the company right now, even if the market has clearly spoken that it is interested in Nvidia's technologies and statements: the 4090 is outselling everything else in the market combined right now, even at its absurdly high price. The aforementioned Nokia analogy may very well apply here.

If anything, they have some soul searching to do. Godspeed.
Zen 4 is an absolute success, but only in servers, where margins are very high and AVX-512 sees effective use. Meanwhile, the AM5 platform is relatively expensive; here, the motherboards cost twice as much as the Intel models.

In my opinion, knowing that CPUs like the R5 and R7 are aimed at gamers, the X3D line should have been released from the start, with the non-X line as the cheaper alternatives released later. That would make more sense.
 
Hi,
Yeah that's exactly what we need AIsplaining :laugh:
 
Zen 4 is an absolute success, but only in servers, where margins are very high and AVX-512 sees effective use. Meanwhile, the AM5 platform is relatively expensive; here, the motherboards cost twice as much as the Intel models.

In my opinion, knowing that CPUs like the R5 and R7 are aimed at gamers, the X3D line should have been released from the start, with the non-X line as the cheaper alternatives released later. That would make more sense.

Only in servers; the client segment is a biblical flop, to the point that bundles giving away motherboards or RAM, in some cases both, were needed to shift stock. The high cost of the motherboards and the DDR5 requirement are a pretty effective deterrent to adoption, especially since gamers could just buy the 5800X3D and have a flawless experience already.
 
Me? Nope... my GPU does, though: https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/ :

geforce-rtx-dlss-rt-games-ces-2023.jpg


I mean, you guys need to wake up. It's 2023; we are well past 2018. Both ray tracing and machine-learning anti-aliasing have seen wide adoption and aren't going anywhere; if anything, they are gaining importance over raster every year... At the risk of repeating myself, AMD is failing to read the room big time!

Well, I didn't say that your GPU can't do it; I was wondering whether you actually use those features. If you have them but don't use them, then I dunno why we should brag about them. "You need it" - blah, blah. :D

Of course it's great to have more options rather than fewer, but on the other hand, like I mentioned: if you don't use it, then meh.

On my 3060 Ti I have never used the tensor cores, and I've only tried ray tracing in Minecraft, where I got 30 fps... cubes, cubes give you 30 fps... At that moment I was like: yeah, I have it, but it's not enjoyable, so meh. Not to mention it only looks beautiful on the few maps Nvidia created, and that's it.
 
I think it's the right approach. RDNA should focus on gaming performance.

In the future, when AI performance is important, AMD is much better off adding a Xilinx AI accelerator as a separate chiplet.
 
FTFY. Funnily enough, the first sentence up here proves the last. You should buy a mirror. Free RT.
Okay, do you think for example w1z considers them gimmicks? Dodge the question, go ahead
 
Okay, do you think for example w1z considers them gimmicks? Dodge the question, go ahead

Maybe read this conclusion and, much like art, draw from it what you want to see. Asking another person to speak for someone else is strange.
 
Okay, do you think for example w1z considers them gimmicks? Dodge the question, go ahead
What has that got to do with what I, @Vayra86 or anyone else considers them? I'm under the impression that we're all adults, fully capable of forming independent opinions.
 
Me? Nope... my GPU does, though: https://www.nvidia.com/en-us/geforce/news/nvidia-rtx-games-engines-apps/ :

geforce-rtx-dlss-rt-games-ces-2023.jpg


I mean, you guys need to wake up. It's 2023; we are well past 2018. Both ray tracing and machine-learning anti-aliasing have seen wide adoption and aren't going anywhere; if anything, they are gaining importance over raster every year... At the risk of repeating myself, AMD is failing to read the room big time!
All I read is: "We will not have an answer to DLSS 3.0 with RDNA 4". The speech seems entirely geared towards expectation management.
Seems the AMD GPU division is happy to keep living off Nvidia's scraps.

From the article

"While acknowledging NVIDIA's movement in the GPU-accelerated AI space, AMD said that it didn't believe that image processing and performance-upscaling is the best use of the AI-compute resources of the GPU, and that the client segment still hasn't found extensive use of GPU-accelerated AI (or for that matter, even CPU-based AI acceleration). AMD's own image processing tech, FSR, doesn't leverage AI acceleration. Wang said that with the company introducing AI acceleration hardware with its RDNA3 architecture, he hopes that AI is leveraged in improving gameplay—such as procedural world generation, NPCs, bot AI, etc; to add the next level of complexity; rather than spending the hardware resources on image-processing."

For the love of god, people, read the short article. AMD is not giving up on AI; they're just focusing on what will have the biggest impact for gaming on their gaming GPUs.


@Dimitriman AMD announced a DLSS 3.0 competitor some time ago during the launch of RDNA3 GPUs. Technically you are correct that AMD won't have a DLSS 3.0 competitor with RDNA4, but that's because they will already have it under RDNA3.
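As a toy illustration of the "procedural world generation" use case from the quoted article (not AMD's or any engine's actual code), a value-noise heightmap is the kind of workload being talked about:

```python
# Toy value-noise heightmap, purely illustrative of procedural world generation.
import numpy as np

def value_noise_heightmap(size: int = 64, octaves: int = 4, seed: int = 0) -> np.ndarray:
    rng = np.random.default_rng(seed)
    height = np.zeros((size, size))
    for octave in range(octaves):
        cells = 2 ** (octave + 2)                  # coarse grid for low octaves, finer for high
        grid = rng.random((cells + 1, cells + 1))  # random values at grid corners
        coords = np.linspace(0.0, cells, size, endpoint=False)
        i0 = coords.astype(int)
        i1 = i0 + 1
        t = coords - i0
        # bilinear interpolation of the coarse grid up to the full map resolution
        top = grid[np.ix_(i0, i0)] * (1 - t) + grid[np.ix_(i0, i1)] * t
        bot = grid[np.ix_(i1, i0)] * (1 - t) + grid[np.ix_(i1, i1)] * t
        layer = top * (1 - t)[:, None] + bot * t[:, None]
        height += layer / (2 ** octave)            # finer octaves contribute less
    return height / height.max()                   # normalise to 0..1 terrain heights

terrain = value_noise_heightmap()
```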
 

Maybe read this conclusion and, much like art, draw from it what you want to see. Asking another person to speak for someone else is strange.
Can you be more exact about what you want me to read? I didn't find any mentions of DLSS or FG. Go to the trouble of quoting the part you are talking about, please.

What has that got to do with what I, @Vayra86 or anyone else considers them? I'm under the impression that we're all adults, fully capable of forming independent opinions.
Uhm, because he literally said that they are only considered features in circlejerk bubbles. Anyway, it doesn't matter; to think that upscalers are gimmicks just screams ignorance or agenda, and you can't argue with either.
 
Who really cares? They're both right and wrong. Besides upscaling, the ML hardware accelerators really are worthless in the consumer space; at the same time, they won't be used for anything else any time soon.




You're both utterly wrong, though; over 3 billion in revenue is not insignificant by any stretch of the imagination.
View attachment 284681


They've always made a huge chunk of their money from consumer products; sadly for Nvidia, marketing and sponsorship deals don't work very well outside of the consumer market. You can't buy your way to success as easily, and you actually have to provide extremely competitive pricing, because ROI is critical to businesses as opposed to regular consumers, so you can't just price everything to infinity and provide shit value.

Nvidia will keep at it though, as it gives them a selling point; kinda like ray tracing, it's all to get you to keep buying at each release.
 
Uhm, because he literally said that they are only considered features in circlejerk bubbles. Anyway, it doesn't matter; to think that upscalers are gimmicks just screams ignorance or agenda, and you can't argue with either.
I've tried both DLSS and FSR, and concluded that they look like crap on my 1080p monitor. Where's the agenda? :roll:

If forming an opinion based on first-hand experience means ignorance to you, then maybe you're the one with the agenda and there's nothing left to talk about.
 
I've tried both DLSS and FSR, and concluded that they look like crap on my 1080p monitor. Where's the agenda? :roll:

If forming an opinion based on first-hand experience means ignorance to you, then maybe you're the one with the agenda and there's nothing left to talk about.
Having a stack of GPUs... DLSS 1.0 and FSR 1.0 were like smearing Vaseline on your screen... DLSS 2.1 is fucking black magic and makes things look better than without... I also have a 4K 144 Hz panel, and crispness is required.

The stack of cards is Turing-gen RTX Quadros, an RTX 3080/3080 Ti/3090, and an RX 6700 XT/6900 XT.
FSR 2.0 is pretty much on par with DLSS 2.0/2.1, depending on the game.

I have not messed with DLSS 3, as that is frame gen and needs a 40-series card that I don't have... I also don't see frame gen as an advantage, as I am into competitive low-latency play.
 
Having a stack of GPUs... DLSS 1.0 and FSR 1.0 were like smearing Vaseline on your screen... DLSS 2.1 is fucking black magic and makes things look better than without... I also have a 4K 144 Hz panel, and crispness is required.

The stack of cards is Turing-gen RTX Quadros, an RTX 3080/3080 Ti/3090, and an RX 6700 XT/6900 XT.
FSR 2.0 is pretty much on par with DLSS 2.0/2.1, depending on the game.
For the record, I played Cyberpunk 2077 with an RTX 2070 and DLSS 2.4. The 'Ultra Quality' mode was bearable, but nothing like 1080p native. All the other modes looked like a Van Gogh painting instead of a moving, breathing game. Your experience might differ at higher resolutions, but 1080p and upscaling just aren't meant to be, imo. So for me, it's a gimmick.
 
For the record, I played Cyberpunk 2077 with an RTX 2070 and DLSS 2.4. The 'Ultra Quality' mode was bearable, but nothing like 1080p native. All the other modes looked like a Van Gogh painting instead of a moving, breathing game. Your experience might differ at higher resolutions, but 1080p and upscaling just aren't meant to be, imo. So for me, it's a gimmick.

Sorry mate, Van Gogh's majestic artworks have far more life than Cyberflop 2077 at 8K native would ever have. I personally don't understand how this game still has any traction, I suppose it's people's way to cope with the fact we've been lied to for years and that they never delivered even half the game they promised after years upon years of delays.
 
Sorry mate, Van Gogh's majestic artworks have far more life than Cyberflop 2077 at 8K native would ever have. I personally don't understand how this game still has any traction, I suppose it's people's way to cope with the fact we've been lied to for years and that they never delivered even half the game they promised after years upon years of delays.
It was not a comment on the game (although I didn't find it as bad as people commonly do). It was a reference to the image being blurry as heck with basically any DLSS setting (except for off). I didn't think I had to explain, but here we go.
 
I have not messed with DLSS 3, as that is frame gen and needs a 40-series card that I don't have... I also don't see frame gen as an advantage, as I am into competitive low-latency play.
That is a bummer. 4th gen, and then DLSS 4 with 5th gen. See a pattern here? That is one of the reasons it is hard for me to pay NV hard cash. They change their products every gen, looking for something, I guess. DLSS 3 would have been nice for a low-end card that struggles, but the problem is you have to use it on a top-end one. RT is the future, but we are not there yet, and that is why. Even if it is all cotton-candy sweet, it is still in the making and not ready.

For the record, I played Cyberpunk 2077 with an RTX 2070 and DLSS 2.4. The 'Ultra Quality' mode was bearable, but nothing like 1080p native. All the other modes looked like a Van Gogh painting instead of a moving, breathing game. Your experience might differ at higher resolutions, but 1080p and upscaling just aren't meant to be, imo. So for me, it's a gimmick.
Upscaling is good for 4K, or for a low-end card that can barely play anything. 1080p and an upscaler is a misunderstanding at best.

It was not a comment on the game (although I didn't find it as bad as people commonly do). It was a reference to the image being blurry as heck with basically any DLSS setting (except for off). I didn't think I had to explain, but here we go.
I think you can sharpen the image. The problem is that FSR and DLSS scale differently even when set to the same mode. You have to find a sweet spot for each one separately. DLSS does not have to be as blurry.

Then the 4090 is in fact very efficient. Actually, it is the most efficient card out there, especially for heavier workloads, not just gaming. I have it with a 320 W power limit and it performs better than at stock; I can post you some record-breaking numbers at just 320 W.
There are plenty of showcases of that, so there's no need to. If you limit it, you will lose performance. The 4090 is a very powerful card, and I'm sure chopping off some performance for less power draw is not a disaster, but you still lower the performance for some power savings. It is a good tradeoff, though.
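For reference, the 320 W limit described above can be set with nvidia-smi -pl 320, or programmatically through NVML, which is the interface nvidia-smi itself uses. A hedged sketch, assuming the nvidia-ml-py (pynvml) package, device index 0, and admin rights:

```python
# Sketch: cap GPU 0 at 320 W via NVML (roughly what nvidia-smi -pl does).
# Assumes nvidia-ml-py (pynvml) is installed and the script runs with admin rights.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # assumption: the 4090 is device 0

min_mw, max_mw = pynvml.nvmlDeviceGetPowerManagementLimitConstraints(handle)
target_mw = 320_000                              # NVML works in milliwatts
target_mw = max(min_mw, min(target_mw, max_mw))  # clamp to the card's allowed range

pynvml.nvmlDeviceSetPowerManagementLimit(handle, target_mw)
print(f"Power limit set to {target_mw / 1000:.0f} W")

pynvml.nvmlShutdown()
```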
 