Thursday, January 23rd 2025

AMD is Taking Time with Radeon RX 9000 to Optimize Software and FSR 4

When AMD announced its upcoming Radeon RX 9000 series of GPUs based on RDNA 4 IP, we expected general availability to follow soon after the CES announcement. However, it turns out that AMD has scheduled its Radeon RX 9000 series availability for March, as the company is allegedly optimizing the software stack and its FidelityFX Super Resolution 4 (FSR 4) for a butter-smooth user experience. In a response on X to Hardware Unboxed, AMD's David McAfee shared, "I really appreciate the excitement for RDNA 4. We are focused on ensuring we deliver a great set of products with Radeon 9000 series. We are taking a little extra time to optimize the software stack for maximum performance and enable more FSR 4 titles. We also have a wide range of partners launching Radeon 9000 series cards, and while some have started building initial inventory at retailers, you should expect many more partner cards available at launch."

AMD is taking its RDNA 4 launch more cautiously than before, as it now faces a significant problem with NVIDIA and its vast portfolio of software optimization and AI-enhanced visualization tools. FSR 4 introduces a new machine learning (ML) based upscaling component to handle Super Resolution. This will be paired with Frame Generation and an updated Anti-Lag 2 to make up the FSR 4 feature set. Optimizing this is the number one priority, and AMD plans to get more games on FSR 4 so gamers experience out-of-the-box support.
Source: David McAfee

185 Comments on AMD is Taking Time with Radeon RX 9000 to Optimize Software and FSR 4

#101
Vayra86
AusWolfI don't disagree, but selling one's sympathy for DLSS as a "must-have feature" rather than a personal opinion is a bit daft, especially when 45% of people on a front page poll say they don't use any upscaling.
I do think the market is pretty clear on that. Upscaling is here to stay. It was clear from the onset. The advantages are there, it's a huge technological leap, and it does enable new levels of graphical fidelity. AMD knows this too, and they actively deploy the tech to sell their console APUs and keep doing so. RDNA4 and FSR4 are entirely in service to that. But again: followers, not leaders. Why the f*ck do we get a single FSR4 preview just moments after Nvidia showed us their meat? What does that implicitly tell us?

It certainly does one thing: instill the idea that FSR4, while in the works, is still not really ready, which again is a repeat of the past that some fools tend to call 'fine wine' when it's in fact just AMD finishing its products years post-release. Forget beta. You just don't know what it is until they say they're done, and then you can still be left with something that doesn't deliver like the competing tech does. This is the story of FSR in a nutshell so far.
Posted on Reply
#102
AusWolf
AssimilatorClaiming something is a "must-have feature" is a personal opinion, though. It's just an opinion that happens to be borne out by the market as a whole.
"Personal opinion" is a bit different when you're a highly respected reviewer. This is where my most hated word "influencer" comes into play. You influence people's buying decisions. As such, you should be aware that not everyone's needs are the same. This is why you can't use words like "must-have feature" without sounding like a marketing guy.

Personally, I always try my best to recommend products that match the person's needs that I'm recommending for, and not my personal taste.
Assimilator45% of people on (a) an enthusiast forum (b) an enthusiast forum with a historically pro-AMD tilt.
I see that on a few people, but on the forum as a whole... nah.
Posted on Reply
#103
Neo_Morpheus
Chrispy_Will they be working with game developers to migrate popular existing titles to FSR4?

One of the problems with FSR is that if a game shipped with FSR1 or FSR2, it's usually stuck on that ancient, sub-par version of the upscaler. Nvidia's DLSS overrides coming to the Nvidia app are a big deal, especially if DLSS4 fixes the worst problems (smear, ghosting, and detail loss) of DLSS3. AMD kind of need this too, because people don't exclusively play the latest, newest AAA titles. I'm regularly firing up games from the last 5 years or so and my Steam backlog is no joke.
I don't know exactly how that can be achieved, but I know that many devs will only implement DLSS and ignore FSR, mainly due to market share, maybe by ngreedia paying them or helping them for free, which I have to say, as a cash-strapped dev, would be a sweet deal.

Being old school, I hate any “feature” that will lock me to a vendor, and will gladly take an inferior option if it means that it's an option open to everyone.

That was something that old reviewers and smart consumers would warn you about and would act upon.

Today? Well, you can see for yourself the nice monopoly and its benefits that we are enjoying.
Posted on Reply
#104
AusWolf
Vayra86it's a huge technological leap
Yeah, it's slightly better than lowering your game resolution. Wow.
Vayra86and it does enable new levels of graphical fidelity.
Like what? Blur and artefacts? :confused:
Posted on Reply
#105
Assimilator
Chrispy_Will they be working with game developers to migrate popular existing titles to FSR4?
According to the AMD crowd here, working with a developer to implement vendor-specific features into their game is bribing that developer. At least, it is when NVIDIA does it.
AusWolfI don't think the 5090 and 5080 are that important in this regard. They're both priced way out of reach of people looking for a 5070-level card. We've also learned in the last 2-3 generations that the performance of the halo card has little to no effect on the rest of the product stack. Personally, I won't even read their reviews in entirety, just the architectural differences, because how much faster, hungrier and more expensive we can go above the 4090, I honestly don't care.
It's not about affordability but mindshare: basic marketing/psychology 101, get your brand into the collective consciousness as often as possible. That's why we see at least 1 announcement per week from NVIDIA that they are adding DLSS to more games.

And 5090 and 5080 performance will absolutely shape consumer expectations as to the lower cards in the stack. If you consistently deliver market-leading performance, consumers come to associate your brand as a whole with market-leading performance, even if the lower-tier SKUs don't necessarily offer said performance. Again, psychology.
Posted on Reply
#106
CyberPomPom
Neo_MorpheusSee above and it's actually worse.
I was merely joking about the typo in the news, "waste portfolio" for "vast portfolio".

I don't believe DLSS and FSR techs to be either the waste "real frame" enjoyers think they are or the panacea NVidia tries to market.
Posted on Reply
#107
Krit
dartuilreally hope 7900xt perf for 9070xt for 600
You have some strange thinking, because the RX 7900 XT already costs less than 649€. By your logic there should not be a p/p improvement over last gen? Wake up!
Posted on Reply
#108
AusWolf
AssimilatorIt's not about affordability but mindshare: basic marketing/psychology 101, get your brand into the collective consciousness as often as possible. That's why we see at least 1 announcement per week from NVIDIA that they are adding DLSS to more games.

And 5090 and 5080 performance will absolutely shape consumer expectations as to the lower cards in the stack. If you consistently deliver market-leading performance, consumers come to associate your brand as a whole with market-leading performance, even if the lower-tier SKUs don't necessarily offer said performance. Again, psychology.
People think I'm an Nvidia-hater, but this is what I hate, not Nvidia or their products (which I own quite a few of). Herd mentality. Disgusting.
Posted on Reply
#109
Vayra86
AusWolfYeah, it's slightly better than lowering your game resolution. Wow.


Like what? Blur and artefacts? :confused:
It is more performance out of your hardware at the expense of minor graphical impact. It provides a very useful extra option to tweak games to desired performance, especially on lower end cards. Resolution alone cannot achieve that without actually losing a lot of on screen information. Upscaling doesn't limit the viewport in a 3D game, lower resolution does.

And if the input res is high enough, it's a superior form of AA and enables RT; so there is also a place for it in the high end.

I can still personally not prefer it, and I still don't. But I'm not blind to its pros.
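The performance side of the argument above is easy to put in numbers. As a back-of-the-envelope sketch (the mode-to-ratio mapping follows FSR 2's published per-axis scale factors; the function name is just for illustration), rendering at a reduced input resolution and upscaling cuts the shaded pixel count by the square of the scale factor:

```python
# Rough arithmetic on why upscaling buys performance: "Quality" mode
# renders at 1/1.5 of the output resolution per axis, so only ~44% of
# the output pixels are actually shaded before being upscaled.

def render_cost_fraction(scale_per_axis: float) -> float:
    """Fraction of output pixels rendered at the reduced input resolution."""
    return 1.0 / (scale_per_axis ** 2)

for mode, scale in [("Quality", 1.5), ("Balanced", 1.7), ("Performance", 2.0)]:
    out_w, out_h = 3840, 2160                      # 4K output target
    in_w, in_h = round(out_w / scale), round(out_h / scale)
    print(f"{mode}: renders {in_w}x{in_h}, "
          f"{render_cost_fraction(scale):.0%} of the output pixels")
```

So at 4K, Quality mode shades roughly 44% of the pixels, which is where the frame-rate headroom comes from; the upscaler's job is to reconstruct the rest without the viewport loss a plain resolution drop causes.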
Posted on Reply
#110
AusWolf
Vayra86It is more performance out of your hardware at the expense of minor graphical impact.
You know what else does that? Lowering your graphics settings. And it doesn't introduce blur, either.
Vayra86It provides a very useful extra option to tweak games to desired performance, especially on lower end cards. Resolution alone cannot achieve that without actually losing a lot of on screen information.
Yes, it's very useful on lower end cards. But how high-end cards can sell on this feature is beyond me.
Posted on Reply
#111
Assimilator
AusWolfI see that on a few people, but on the forum as a whole... nah.
TPU really exploded when W1zz created the cross-flasher for the ATI 5850/5870, and naturally a lot of the people who joined up at that time were owners of ATI hardware who wanted to ask questions about that tool (that's how I started here). Then those people stuck around, and as a lot of other tech sites closed down, the pro-ATI and then pro-AMD members of those sites migrated here, and the end result is a forum that very much skews further to the AMD side of the fence than the median.
AusWolfPeople think I'm an Nvidia-hater, but this is what I hate, not Nvidia or their products (which I own quite a few of). Herd mentality. Disgusting.
Unfortunately that's a basic human psychology issue, and good luck changing it.
Vayra86It is more performance out of your hardware at the expense of minor graphical impact. It provides a very useful extra option to tweak games to desired performance, especially on lower end cards. Resolution alone cannot achieve that without actually losing a lot of on screen information.

And if the input res is high enough, it's a superior form of AA.
Let's not get into the whole upscaling/framegen debate and what is or isn't fake/bad/low-quality/etc. here, please? All that's relevant is that the market has spoken and said that it wants upscaling and framegen; whether you personally like or dislike it, and why, is irrelevant because it's here to stay.
Posted on Reply
#112
Vayra86
AusWolfYou know what else does that? Lowering your graphics settings. And it doesn't introduce blur, either.


Yes, it's very useful on lower end cards. But how high-end cards can sell on this feature is beyond me.
Ehhh lowering graphics settings doesn't introduce blur and missing on screen information? I'll remember that next time I look at a block that should have already become a circle but its LOD didn't update yet and the texture isn't there either.

Come on man.
Posted on Reply
#113
Onasi
AusWolfPeople think I'm an Nvidia-hater, but this is what I hate, not Nvidia or their products (which I own quite a few of). Herd mentality. Disgusting.
It’s not herd mentality. It’s actual company strategy at work. One company knows how to sell their products. The other is run by what feels like genuine cretins who would fail to sell bottled water to a dehydrated man in the desert. No amount of calling people “sheeple” will change that simple fact.
Posted on Reply
#114
Neo_Morpheus
AusWolfWhy would you need any DDU skill for AMD? I haven't used DDU for at least 5 years. It has given me more grief than happiness lately anyway.
Perhaps the urban legend that all AMD drivers suck?
Vayra86It is more performance out of your hardware at the expense of minor graphical impact. It provides a very useful extra option to tweak games to desired performance, especially on lower end cards. Resolution alone cannot achieve that without actually losing a lot of on screen information. Upscaling doesn't limit the viewport in a 3D game, lower resolution does.

And if the input res is high enough, it's a superior form of AA and enables RT; so there is also a place for it in the high end.

I can still personally not prefer it, and I still don't. But I'm not blind to its pros.
I understand and agree to a point on the pluses that you mention, but the only thing I like of both is the image sharpening, which is better than all the previously used techniques.

The rest is simply “cheating”, in my book.

That said, it reminds me of when Imagination Tech introduced tiles with their PowerVR chips.
Before that, polygons were drawn completely regardless of whether they were visible to the player, which took away from the GPU's available performance, but with that technique, only the visible polygons were drawn, hence providing a boost.

The downside? Everyone needed to pay them for the patent, but we the consumers all benefited from it without being locked into one vendor.
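The trick can be sketched in a few lines (a loose illustration in Python, not PowerVR's actual algorithm): an immediate-mode renderer shades fragments as triangles arrive, so overdrawn, hidden surfaces still cost shading work, while a tile-based deferred renderer resolves depth for the whole tile first and shades each visible pixel once.

```python
# Illustrative comparison of shading work: immediate-mode rendering vs.
# tile-based deferred rendering. Fragments are (pixel, depth) pairs;
# smaller depth means closer to the camera.

def immediate_mode(fragments):
    """Shade every submitted fragment; overdraw wastes work."""
    shaded = 0
    depth = {}
    for pixel, z in fragments:
        shaded += 1                      # shaded before the depth result is known
        if pixel not in depth or z < depth[pixel]:
            depth[pixel] = z
    return shaded

def tile_deferred(fragments):
    """Resolve visibility for the whole tile first, then shade survivors."""
    depth = {}
    for pixel, z in fragments:           # pass 1: depth only, cheap
        if pixel not in depth or z < depth[pixel]:
            depth[pixel] = z
    return len(depth)                    # pass 2: one shade per visible pixel

# three overlapping triangles covering the same 4 pixels at different depths
frags = [(p, z) for z in (3.0, 2.0, 1.0) for p in range(4)]
print(immediate_mode(frags))  # 12 fragments shaded
print(tile_deferred(frags))   # 4 fragments shaded
```

With three layers of overdraw, the deferred approach does a third of the shading work, which is exactly the "work smarter, not harder" boost described above.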
Posted on Reply
#115
AusWolf
Vayra86Ehhh lowering graphics settings doesn't introduce blur and missing on screen information? I'll remember that next time I look at a block that should have already become a circle but its LOD didn't update yet and the texture isn't there either.

Come on man.
That doesn't really happen in modern games. The difference between high/medium (sometimes even low) and ultra is minuscule. Come on, man. :)
OnasiIt’s not herd mentality. It’s actual company strategy at work. One company knows how to sell their products. The other is run by what feels like genuine cretins who would fail to sell bottled water to a dehydrated man in the desert. No amount of calling people “sheeple” will change that simple fact.
Yet, we're talking about marketing, not the actual product itself. Why people care about marketing is beyond me.
Neo_MorpheusThe rest is simply “cheating”, in my book.
I wouldn't say cheating because it isn't trying to create the illusion of something better. It gives you some extra performance, at a lesser expense of image quality compared to lowering your resolution.

I just don't get how and why it is seen as the holy grail of modern gaming by many. It's a crutch for weaker GPUs, and for people getting their feet wet in 4K. That's it.
Neo_MorpheusPerhaps the urban legend of all AMD drivers suck?
Ah, ok. No comment.
Posted on Reply
#116
Vayra86
AusWolfThat doesn't really happen in modern games. The difference between high/medium (sometimes even low) and ultra is minuscule. Come on, man. :)


Yet, we're talking about marketing, not the actual product itself. Why people care about marketing is beyond me.
Well then I say you're not being quite fair to upscaling either if it cannot have minor artifacts and 'must be perfect' instead. They each deliver different impact on the image quality, but it depends a lot on the game in question what you might see of it.

Again, I don't personally prefer it over lowering settings in a smart way, but I can see why people do, and have seen instances where indeed it is preferable to lowering individual settings. Cyberpunk on my GTX 1080 was a neat example of that; I already WAS lowering settings, but it still wasn't enough, and upscale (FSR, so indeed very sub par, especially in that game!) still allowed me to have a pretty decent experience playing it. 50 FPS. And yes, it all looked like it was rendered at lower input res, certainly. But the fact you can keep a lot of effects in the image and you can still run high textures, helped me a lot more than I would have been helped with running the game on Low.
Posted on Reply
#117
Neo_Morpheus
AusWolfI just don't get how and why it is seen as the holy grail of modern gaming by many
Seems like they have hit a performance wall and are resorting to tricks like this and fake frames to compensate.
Posted on Reply
#118
Frick
Fishfaced Nincompoop
Vayra86Exactly this. AMD never takes a leadership position, never guides something to a new level, they always follow, copy and even fail to make a good copy sometimes. At that point they've got a product that's beyond saving but still developed and ready to go to market, not only less performant than the competition, but also too late. And even while they're too late, they need more time to bring features up to date.

It's like a trifecta of shitty decision-making.
Putting out fires mode.
Posted on Reply
#119
Neo_Morpheus
Vayra86Cyberpunk on my GTX 1080 was a neat example of that
I am curious, did they ever implement FSR3 or 3.1?

If yes, did it make the visuals better?

Lastly, that's why I prefer FSR, since it's open for everyone and I’m willing to trade a bit of visual quality just for “the greater good”.
Posted on Reply
#120
Vayra86
Neo_MorpheusI am curious, did they ever implement FSR3 or 3.1?

If yes, did it make the visuals better?

Lastly, that's why I prefer FSR, since it's open for everyone and I’m willing to trade a bit of visual quality just for “the greater good”.
Last I heard, Cyberpunk got a long overdue FSR update, but it's still not great, and people on TPU wondered what had changed at all.
AusWolfYes, it's very useful on lower end cards. But how high-end cards can sell on this feature is beyond me.
Read this just now... but that's very simple, Nvidia shows us how that works, right? They introduce features that make games run like you bought a low-end card (create the problem), and then they introduce features that fix that performance and still allow you to use the performance-hogging features and feel like you didn't actually buy something underspecced for the features on offer.

And the kicker is, Nvidia isn't wrong because they pull this forward and innovate on it. Because they lead, they get to dictate how the paradigm works. AMD never takes that risk/chance, and a big reason they do not do this is because they feel they can't = lack of relevant competence in the company.
Posted on Reply
#121
AusWolf
Neo_MorpheusSeems like they have hit a performance wall and are resorting to tricks like this and fake frames to compensate.
Let's see if AMD has anything more than that. I sure hope so.
Posted on Reply
#122
Hecate91
AusWolf"Personal opinion" is a bit different when you're a highly respected reviewer. This is where my most hated word "influencer" comes into play. You influence people's buying decisions. As such, you should be aware that not everyone's needs are the same. This is why you can't use words like "must-have feature" without sounding like a marketing guy.

Personally, I always try my best to recommend products that match the person's needs that I'm recommending for, and not my personal taste.
I think personal opinions should be excluded from a review, unless the reviewer distinctly states their opinion at the end of the review. The reviewer can't be trusted if they present opinions as objective fact, and to me that's where the term influencer comes in.
AusWolfI see that on a few people, but on the forum as a whole... nah.
I don't see it; most people in threads like these are Nvidia users, which is understandable given overall market share, but on a tech forum Nvidia users do it to poke at the brand they don't like.
Neo_MorpheusSeems like they have hit a performance wall and are resorting to tricks like this and fake frames to compensate.
It seems like Nvidia doesn't want to take a hit on profit margins to allocate larger dies, or doesn't want to spend the R&D on creating a chiplet architecture for their consumer GPUs. I said it in another thread: if AMD can make a chiplet design for consumers, then Nvidia definitely can. It's done significantly for cost reasons, but it also allows more efficient use of die space. If Nvidia wants to keep their profit margins astronomically high, they need to implement a chiplet architecture to increase raster performance, instead of resorting to graphics trickery with fake frames.
Posted on Reply
#123
Neo_Morpheus
AusWolfLet's see if AMD has anything more than that. I sure hope so.
I have the weird suspicion that we might need another PowerVR moment, where the industry finds another “work smarter, not harder” way, especially with RT.
Posted on Reply
#124
Hecate91
Chrispy_"AMD is taking its RDNA 4 launch more cautiously than before, as it now faces a significant problem with NVIDIA and its waste portfolio of software optimization and AI-enhanced visualization tools."

I'm sure that's a typo and the word should be "vast" but I think the typo should stay.
I would assume that's a typo, but given what TPU thinks of AMD GPUs, I'm not so sure.
Posted on Reply
#125
Chrispy_
AusWolfYou should be able to afford to play those old games on a new GPU without FSR.
"old games" includes titles like CP2077, God of War Ragnarok, Horizon Forbidden West, Elden Ring (with RT), Ratchet and Clank, Jedi Fallen Order - these are all old titles (up to 6 years old) that will cause a 4070 to stumble at 4K, and fail to qualify as "high-refresh" in 1440p.

We have very little info on the 9060 models yet, but presumably they'll outsell the 9070 models and need FSR4 even more.
Posted on Reply