Thursday, January 23rd 2025

AMD is Taking Time with Radeon RX 9000 to Optimize Software and FSR 4

When AMD announced its upcoming Radeon RX 9000 series of GPUs based on RDNA 4 IP, we expected general availability to follow soon after the CES announcement. However, it turns out that AMD has scheduled Radeon RX 9000 series availability for March, as the company is optimizing the software stack and its FidelityFX Super Resolution 4 (FSR 4) for a butter-smooth user experience. In a response to Hardware Unboxed on X, AMD's David McAfee shared, "I really appreciate the excitement for RDNA 4. We are focused on ensuring we deliver a great set of products with Radeon 9000 series. We are taking a little extra time to optimize the software stack for maximum performance and enable more FSR 4 titles. We also have a wide range of partners launching Radeon 9000 series cards, and while some have started building initial inventory at retailers, you should expect many more partner cards available at launch."

AMD is taking its RDNA 4 launch more cautiously than before, as it now faces a significant challenge from NVIDIA and its vast portfolio of software optimizations and AI-enhanced visualization tools. FSR 4 introduces a new machine learning (ML) based upscaling component to handle Super Resolution. This will be paired with Frame Generation and an updated Anti-Lag 2 to make up the FSR 4 feature set. Optimizing this stack is the number one priority, and AMD plans to get more games on FSR 4 so gamers get out-of-the-box support.
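AMD has not published FSR 4's network details, but the general shape of ML-based super resolution can be sketched in a few lines. The toy model below is purely illustrative and is not AMD's architecture; production temporal upscalers also consume motion vectors and frame history, which this sketch omits.

```python
# Toy stand-in for an ML upscaler -- NOT AMD's FSR 4 network, whose
# architecture is unpublished. A small conv net followed by a 2x pixel
# shuffle turns a 1080p frame into a 2160p one.
import torch
import torch.nn as nn

class ToyUpscaler(nn.Module):
    def __init__(self, channels: int = 32):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
            # 3 RGB channels x 2^2 sub-pixels feed the 2x pixel shuffle
            nn.Conv2d(channels, 3 * 4, 3, padding=1),
        )
        self.shuffle = nn.PixelShuffle(2)

    def forward(self, low_res: torch.Tensor) -> torch.Tensor:
        return self.shuffle(self.body(low_res))

frame = torch.rand(1, 3, 1080, 1920)   # one 1080p RGB frame
print(ToyUpscaler()(frame).shape)      # torch.Size([1, 3, 2160, 3840])
```

The appeal of moving this work to a trained network is that, once tuned, it can recover detail that a fixed spatial filter cannot, which is part of why driver-side optimization and per-title enablement take time.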
Source: David McAfee

251 Comments on AMD is Taking Time with Radeon RX 9000 to Optimize Software and FSR 4

#126
AusWolf
Neo_MorpheusI have the weird suspicion that we might need another PowerVR moment where the industry finds another “work smarter, not harder” way, especially with RT.
I think we hit that point already 2-3 generations ago. AMD even tried with chiplets; it just didn't work as well as intended.
#127
Vayra86
Neo_MorpheusI have the weird suspicion that we might need another PowerVR moment where the industry find another “work smarter, not harder “ way, especially with RT.
This was obvious to me from the outset. We already saw tweaks to reduce the amount of actual calculation being done. We're moving further and further away from truly accurate RT, instead using single rays to 'determine' what else needs to happen. It all looks brutally inefficient nonetheless, with PT being the king of the castle of waste. As if Nvidia doesn't know this approach is never going to be here to stay, lol.
#128
AusWolf
Chrispy_"old games" includes titles like CP2077, God of War Ragnarok, Horizon Forbidden West, Elden Ring (with RT), Ratchet and Clank, Jedi Fallen Order - these are all old titles (up to 6 years old) that will cause a 4070 to stumble at 4K, and fail to qualify as "high-refresh" in 1440p.
I'd subjectively say that's a problem with the expectations. The 4070 isn't a 4K card.
Chrispy_We have very little info on the 9060 models yet, but presumably they'll outsell the 9070 models and need FSR4 even more.
I wouldn't make that assumption just yet. The 7600 didn't outsell the 7800 XT, as far as I know.
#129
freeagent
AusWolfThe 4070 isn't a 4K card
It isn't, but it can still play a good chunk of games at 4K.
#130
Neo_Morpheus
Hecate91It seems like Nvidia doesn't want to take a hit on profit margins to allocate larger dies, or doesn't want to spend the R&D on creating a chiplet architecture for their consumer GPUs. I said it in another thread: if AMD can make a chiplet GPU for consumers, then Nvidia definitely can. It's largely done for cost reasons, but it also allows for more efficient use of die space. If Nvidia wants to keep their profit margins astronomically high, then they need to implement a chiplet architecture to increase raster performance, instead of resorting to graphics trickery with fake frames.
Agreed.
We have great CPUs now thanks to chiplets, and I'll be honest: considering the price and performance of my 7900 XTX, I think it's doable on GPUs.
Chrispy_"old games" includes titles like CP2077, God of War Ragnarok, Horizon Forbidden West, Elden Ring, Ratchet and Clank, Jedi Fallen Order - these are all old titles (up to 6 years old) that will cause a 4070 to stumble at 4K, and fail to qualify as "high-refresh" in 1440p.
Man, you just reminded me of a meme that mentions 20-year-old consoles. In my mind I'm thinking Sega Genesis, but in reality we're looking at PS3s.

Edit: found it!

#131
JustBenching
Vayra86This was obvious to me from the outset. We already saw tweaks to reduce the amount of actual calculation being done. We're moving further and further away from truly accurate RT, instead using single rays to 'determine' what else needs to happen. It all looks brutally inefficient nonetheless, with PT being the king of the castle of waste.
If we assume RT/PT is the target, then upscalers are absolutely needed (and probably so is FG). Even if we get 100% gen-on-gen improvements, that's not enough to keep up with RT demands. I'm not sure people understand how insanely taxing RT is.

If RT/PT isn't the target, then what is? Graphics have pretty much stagnated the traditional way; RDR2 back in 2018 was already good enough, so why would anyone really care about GPUs anymore?
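Some rough arithmetic puts "insanely taxing" in perspective; every number below is an assumption picked purely for illustration, not any vendor's spec.

```python
# Back-of-the-envelope ray budget for native 4K path tracing.
# All inputs are illustrative assumptions, not measured figures.
pixels = 3840 * 2160          # 4K frame
samples_per_pixel = 2         # assumed; offline renderers use hundreds
bounces = 3                   # assumed path depth
fps = 60                      # target frame rate

rays_per_second = pixels * samples_per_pixel * bounces * fps
print(f"{rays_per_second / 1e9:.1f} billion rays/s")  # ~3.0 billion rays/s
```

That's around three billion ray evaluations per second before any shading, denoising, or raster work, which is exactly why upscaling from a lower internal resolution is so attractive.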
#132
remekra
Neo_MorpheusI have the weird suspicion that we might need another PowerVR moment where the industry find another “work smarter, not harder “ way, especially with RT.
That is what we could argue Nvidia is doing now: working smarter rather than harder. Upscale with AI with minimal loss of detail, generate a frame or two instead of rendering them, and also use ReSTIR for GI and DI.
People like to say that path tracing performance is shit without considering that real-time path tracing was a pipe dream 4-5 years ago; it took minutes to render one frame. And ReSTIR is also working smarter, because you don't need to take into account every sample. It also completely removes any trickery when it comes to lighting the game: you place whatever light you like, wherever you want, and you have GI and DI in real time.
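For anyone wondering what "you don't need to take into account every sample" looks like in practice: the core trick in ReSTIR is weighted reservoir sampling, streaming over many candidate lights and keeping just one, chosen in proportion to its contribution. A toy sketch follows; the weights are made up, and real ReSTIR adds temporal and spatial reuse of reservoirs on top of this.

```python
# Weighted reservoir sampling: keep one item from a stream, chosen with
# probability proportional to its weight, in a single pass.
# Toy illustration of the resampling idea behind ReSTIR, not the real thing.
import random

def reservoir_sample(candidates):
    chosen, total_weight = None, 0.0
    for light, weight in candidates:
        total_weight += weight
        # Replace the current pick with probability weight / total_weight.
        if random.random() < weight / total_weight:
            chosen = light
    return chosen

lights = [("sun", 10.0), ("lamp", 2.0), ("neon_sign", 0.5)]
print(reservoir_sample(lights))  # "sun" most of the time
```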
#133
Hecate91
JustBenchingWhy are people in here so salty because other people aren't buying the brand they want them to buy? The last few weeks I've been reading the same stuff over and over and over again. Sheep buy Nvidia because stores tell them not to buy AMD, and whatever other excuses you guys come up with.

Let's for once have a decent AMD card, and then we can look at the sales. I'm pretty confident they will be good.
Because people are tired of the same people always going "but Nvidia" in every AMD thread, or saying the same stuff over again about AMD being screwed. And yes, people do buy Nvidia because a store tells them to, or because friends tell them to buy Nvidia after hearing the outdated claim that the drivers are bad.

If the cards are good, we can wait and see. I don't see the need for a bunch of threads spreading FUD about the launch when AMD never said it would be right after CES.
Vayra86Last I heard, Cyberpunk got a long-overdue FSR update, but it's still not great, and people on TPU wondered what had changed at all.

Read this just now... but that's very simple; Nvidia shows us how that works, right? They introduce features that make games run like you bought a low-end card (create the problem), and then they introduce features that fix that performance while still letting you use the performance-hogging features, so you feel like you didn't actually buy something underspecced for the features on offer.

And the kicker is, Nvidia isn't wrong, because they pull this forward and innovate on it. Because they lead, they get to dictate how the paradigm works. AMD never takes that risk/chance, and a big reason they don't is that they feel they can't = a lack of relevant competence in the company.
I'm not surprised, because Cyberpunk is essentially an Nvidia tech demo. CDPR, like most AAA devs, put little effort into actually implementing FSR well, because it's easier to add DLSS when Nvidia hands companies piles of money to implement the feature first.

As for marketing features, I agree AMD sucks horribly at it. They really could've made sure features worked right before launch and put some ambition behind making sure consumers know about them, and I think AMD should be spreading more awareness of their features, which aren't locked down to their brand.
#134
remekra
Also, AMD's marketing department seems to not have gotten the memo that the launch is in March:

"Play now"
But I do like the looks of the card itself.
#135
JustBenching
Hecate91Because people are tired of the same people always going "but Nvidia" in every AMD thread, or saying the same stuff over again about AMD being screwed. And yes, people do buy Nvidia because a store tells them to, or because friends tell them to buy Nvidia after hearing the outdated claim that the drivers are bad.
But that's not what's happening. It's you and a bunch of other people beating the ngreedia drum in every damn thread, man.
#136
Chrispy_
Vayra86Last I heard, Cyberpunk got a long-overdue FSR update, but it's still not great, and people on TPU wondered what had changed at all.
Just frame-gen, really. FSR 3 with frame-gen was still pretty smeary, but so is DLSS in CP2077, so it's hardly an AMD-only problem.
#137
TheinsanegamerN
Hecate91Because people are tired of the same people always going "but Nvidia" in every AMD thread, or saying the same stuff over again about AMD being screwed. And yes, people do buy Nvidia because a store tells them to, or because friends tell them to buy Nvidia after hearing the outdated claim that the drivers are bad.

If the cards are good, we can wait and see. I don't see the need for a bunch of threads spreading FUD about the launch when AMD never said it would be right after CES.
My favorite thing to see repeated ad nauseam is 'well well well, people just buy Nvidia because they are dumb poopy heads and can't see AMD is clearly better, it's just mindshare bro, AMD just needs marketing bro'.

Like, the concept that people want consistent support, better RT, and better upscaling, and that AMD is generally worse at these things, so they buy Nvidia, is just a totally lost concept; no, the consumer just buys what they are told, they have no free will!
Hecate91I'm not surprised, because Cyberpunk is essentially an Nvidia tech demo. CDPR, like most AAA devs, put little effort into actually implementing FSR well, because it's easier to add DLSS when Nvidia hands companies piles of money to implement the feature first.
Hecate91As for marketing features, I agree AMD sucks horribly at it. They really could've made sure features worked right before launch and put some ambition behind making sure consumers know about them, and I think AMD should be spreading more awareness of their features, which aren't locked down to their brand.
So....... game devs use DLSS and ignore FSR because nGREEDia pays them, and at the same time AMD's features suck? Surely their focus on DLSS has NOTHING to do with AMD's features sucking. Hmmm......
#138
JustBenching
TheinsanegamerNMy favorite thing to see repeated ad nauseam is 'well well well, people just buy Nvidia because they are dumb poopy heads and can't see AMD is clearly better, it's just mindshare bro, AMD just needs marketing bro'.

Like, the concept that people want consistent support, better RT, and better upscaling, so they buy Nvidia, is just a totally lost concept; no, the consumer just buys what they are told, they have no free will!


So....... game devs use DLSS and ignore FSR because nGREEDia pays them, and at the same time AMD's features suck? Surely their focus on DLSS has NOTHING to do with AMD's features sucking. Hmmm......
Just FYI, since I use FSR extensively on my AMD laptop: FSR is present in 95% of Nvidia-sponsored games. There are Nvidia-sponsored games that have FSR and not DLSS, lol. Now, are you ready to know what % of AMD-sponsored games have DLSS? It will be a shock...
#139
Neo_Morpheus
remekraThat is what we could argue Nvidia is doing now: working smarter rather than harder. Upscale with AI with minimal loss of detail, generate a frame or two instead of rendering them, and also use ReSTIR for GI and DI.
People like to say that path tracing performance is shit without considering that real-time path tracing was a pipe dream 4-5 years ago; it took minutes to render one frame. And ReSTIR is also working smarter, because you don't need to take into account every sample. It also completely removes any trickery when it comes to lighting the game: you place whatever light you like, wherever you want, and you have GI and DI in real time.
Maybe, but at the same time, think about how the current options do remove data from the visible field of view (lowering the res, then upscaling) and, worse, insert data (fake frames) that was never there or placed by the game developer.

It's something that can be and will be abused. There are rumors that Ngreedia wants to insert 10 or more frames for each real one, and since the market will be OK with that, AMD will have to do the same.

So what will developers do then? It's crazy to think about what's coming.
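To make the mechanics concrete, here's the crudest possible version of frame generation: blending two real frames to synthesize ones in between. This naive blend is my own illustration only; the real DLSS/FSR frame generation pipelines use motion vectors and optical flow, not a plain average.

```python
# Naive frame interpolation: synthesize n frames between two real ones.
# Illustrative only -- real frame generation is motion-vector driven.
import numpy as np

def interpolate(frame_a: np.ndarray, frame_b: np.ndarray, n: int):
    for i in range(1, n + 1):
        t = i / (n + 1)
        # Linear blend: the generated pixels were never rendered by the game.
        yield ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

a = np.zeros((2160, 3840, 3), dtype=np.float32)  # rendered frame N
b = np.ones_like(a)                              # rendered frame N+1
fakes = list(interpolate(a, b, n=3))             # 3 generated per real pair
print(len(fakes), float(fakes[1].mean()))        # 3 0.5
```

Note that even this toy version needs frame N+1 before it can emit anything, which is where the added input latency comes from.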
#140
Vayra86
JustBenchingIf we assume RT/PT is the target, then upscalers are absolutely needed (and probably so is FG). Even if we get 100% gen-on-gen improvements, that's not enough to keep up with RT demands. I'm not sure people understand how insanely taxing RT is.

If RT/PT isn't the target, then what is? Graphics have pretty much stagnated the traditional way; RDR2 back in 2018 was already good enough, so why would anyone really care about GPUs anymore?
I've always been convinced the RT push is a gamble. There's no conclusion yet on whether it'll be here to stay, especially in its current form. PT is also an innovation. Is it one in the right direction? We don't know.
#141
Neo_Morpheus
Vayra86I've always been convinced the RT push is a gamble. There's no conclusion yet on whether it'll be here to stay, especially in its current form. PT is also an innovation. Is it one in the right direction? We don't know.
Agreed.

I honestly can't justify the performance hit for what is given in return.

Granted, a few games do look better, but very few in my book.
#142
Chrispy_
freeagentIt isn't, but it can still play a good chunk of games at 4K.
Especially with DLSS Q. The 4070 is a 1440p card, but the most common TV resolution has been 4K for the better part of a decade now, and consoles have been targeting 4K for over 8 years with the XB1S/XB1X/PS4 Pro.

I also fall into the "would rather not use DLSS unless I have to" camp, but sometimes the display resolution outpaces the hardware power available, and sometimes the game isn't optimised very well, so you have to do something to get the framerate up to acceptable speeds.
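For scale, upscaler quality modes render internally at a fraction of the output resolution; the factors below are the commonly cited DLSS defaults, so treat them as approximate.

```python
# Internal render resolution per upscaler quality mode.
# Scale factors are the commonly cited DLSS defaults -- approximate values.
SCALE = {"quality": 2 / 3, "balanced": 0.58, "performance": 0.5}

def internal_resolution(width: int, height: int, mode: str):
    s = SCALE[mode]
    return round(width * s), round(height * s)

# A 4K output in Quality mode is rendered near 1440p internally,
# which is exactly the resolution class a 4070 is comfortable at.
print(internal_resolution(3840, 2160, "quality"))  # (2560, 1440)
```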
#143
Dawora
AusWolfI don't disagree, but selling one's sympathy for DLSS as a "must-have feature" rather than a personal opinion is a bit daft, especially when 45% of people on a front page poll say they don't use any upscaling.
And 80% of people with RTX GPUs use an upscaler.
Also, 40% of that 45% are AMD users; FSR is so bad it's better not to use it.
#144
Neo_Morpheus
Chrispy_Just frame-gen, really. FSR 3 with frame-gen was still pretty smeary, but so is DLSS in CP2077, so it's hardly an AMD-only problem.
I haven't bought that game yet, but I'll admit that in YouTube videos it does always look "smeary," as you said, and "grainy." I recall other games, where the devs put in extra effort, in which both DLSS and FSR looked really good.
#145
freeagent
Chrispy_I also fall into the "would rather not use DLSS unless I have to" camp, but sometimes the display resolution outpaces the hardware power available, and sometimes the game isn't optimised very well, so you have to do something to get the framerate up to acceptable speeds.
Yessir. Same boat with the Ti :)
#146
freeagent
Vayra86I've always been convinced the RT push is a gamble. There's no conclusion yet on whether it'll be here to stay, especially in its current form.
Neo_MorpheusAgreed.
RT is here to stay; it will only continue to be developed. AMD jumped on board this time because they support it, unlike last gen, where they had to pretend to support it.
#147
Hecate91
TheinsanegamerNMy favorite thing to see repeated ad nauseam is 'well well well, people just buy Nvidia because they are dumb poopy heads and can't see AMD is clearly better, it's just mindshare bro, AMD just needs marketing bro'.

Like, the concept that people want consistent support, better RT, and better upscaling, and that AMD is generally worse at these things, so they buy Nvidia, is just a totally lost concept; no, the consumer just buys what they are told, they have no free will!
And my favorite thing to see repeated is people ignoring reality while repeating the same "but nvidia" marketing points reviewers always highlight.
TheinsanegamerNSo....... game devs use DLSS and ignore FSR because nGREEDia pays them
Exactly, and everyone whined when FSR got added to a game first, but there was nothing when Indiana Jones only had DLSS.
#148
JustBenching
Hecate91Exactly, and everyone whined when FSR got added to a game first, but there was nothing when Indiana Jones only had DLSS.
That's easily explained. When FSR is missing, nobody complains, since, as you and your comrades keep telling us, upscalers are crap and you don't use them. So who's gonna complain, exactly, and why? Do you want me to complain that you can't use a feature you weren't going to use anyway because you think it's crap?
#149
dyonoctis
Neo_MorpheusI have the weird suspicion that we might need another PowerVR moment where the industry finds another “work smarter, not harder” way, especially with RT.
Vayra86I've always been convinced the RT push is a gamble. There's no conclusion yet on whether it'll be here to stay, especially in its current form. PT is also an innovation. Is it one in the right direction? We don't know.
There's already something being worked on: neural rendering. Not as an after-render filter, but to render the game itself. Some of it looks dodgy (neural faces look like they don't belong in the game; something just seems off), but other parts look promising, like neural texture compression, using machine learning to enable more complex shaders at a lower cost, or using machine learning to lower the computational cost of path tracing. Because Nvidia was the first to heavily market it, people are already doubting the tech, but they aren't the only ones who've been researching it.

Intel Is Working on Real-Time Neural Rendering
Neural Supersampling and Denoising for Real-time Path Tracing - AMD GPUOpen
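To give an idea of what a "neural texture" even is: instead of storing texels, you store a tiny network that maps UV coordinates to color and sample it at shading time. The sketch below is my own illustration; the vendors' actual neural texture compression schemes are more sophisticated.

```python
# Toy implicit neural texture: a small MLP maps (u, v) -> RGB, so the
# "texture" lives in the network weights rather than in a texel grid.
# Illustration only -- not any vendor's actual compression scheme.
import torch
import torch.nn as nn

class NeuralTexture(nn.Module):
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),  # RGB in [0, 1]
        )

    def forward(self, uv: torch.Tensor) -> torch.Tensor:
        return self.net(uv)

tex = NeuralTexture()
uv = torch.rand(1024, 2)   # 1024 (u, v) sample points
print(tex(uv).shape)       # torch.Size([1024, 3]) -- one RGB per sample
```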
#150
Geofrancis
The secret with AMD is to never buy their latest GPU, but to wait until just after the newest generation comes out and buy the older generation at a steep discount. That way you get a decent upgrade for a fraction of the price, with well-optimised drivers.