Thursday, January 23rd 2025

AMD is Taking Time with Radeon RX 9000 to Optimize Software and FSR 4

When AMD announced its upcoming Radeon RX 9000 series of GPUs based on RDNA 4 IP, we expected general availability to follow soon after the CES announcement. However, it turns out that AMD has scheduled its Radeon RX 9000 series availability for March, as the company is allegedly optimizing the software stack and its FidelityFX Super Resolution 4 (FSR 4) for a buttery-smooth user experience. In a response on X to Hardware Unboxed, AMD's David McAfee shared, "I really appreciate the excitement for RDNA 4. We are focused on ensuring we deliver a great set of products with Radeon 9000 series. We are taking a little extra time to optimize the software stack for maximum performance and enable more FSR 4 titles. We also have a wide range of partners launching Radeon 9000 series cards, and while some have started building initial inventory at retailers, you should expect many more partner cards available at launch."

AMD is taking its RDNA 4 launch more cautiously than before, as it now faces a significant challenge from NVIDIA and its vast portfolio of software optimization and AI-enhanced visualization tools. FSR 4 introduces a new machine learning (ML)-based upscaling component to handle Super Resolution. This will be paired with Frame Generation and an updated Anti-Lag 2 to make up the FSR 4 feature set. Optimizing this is the number one priority, and AMD plans to get more games on FSR 4 so gamers get out-of-the-box support.
Source: David McAfee

251 Comments on AMD is Taking Time with Radeon RX 9000 to Optimize Software and FSR 4

#226
Chrispy_
AusWolfI don't think the 5090 and 5080 are that important in this regard. They're both priced way out of reach of people looking for a 5070-level card. We've also learned in the last 2-3 generations that the performance of the halo card has little to no effect on the rest of the product stack. Personally, I won't even read their reviews in their entirety, just the architectural differences, because how much faster, hungrier and more expensive we can go above the 4090, I honestly don't care.
The 5090 is going to be vaporware for anyone outside of the very wealthy willing to pay more than businesses, who will be queueing for stock and are able to justify spending $3,000, $4,000, $5,000+ on a 5090.

The 5080 will be the GPU to try and get, and if you want one, get in the limited-stock queue for the $999 Founders Edition directly from Nvidia, because all of the AIB models are being listed at $1,200+
Posted on Reply
#227
Onasi
AssimilatorBecause if NVIDIA doesn't increase its generation-on-generation performance in every aspect, the reviewers and buying public are going to trash them, and their investors will be mad. So NVIDIA, which literally doesn't care about rasterisation anymore, has to keep delivering linear rasterisation improvements anyway with more of the same old fixed-function rasterisation hardware, at the same time they deliver far greater RT improvements with new fixed-function hardware. At some stage I expect that they will either merge the rasterisation and RT hardware somewhat to prevent so much duplication, or be able to reuse the RT hardware to emulate rasterisation workloads, or possibly some combination of the two. Knowing NVIDIA they're already working hard on this problem.
There’s another reason that’s not exactly gaming related, or at least not fully - a lot of professional software, be it 3D rendering, video editing or compute, is still based around using shader cores for what it does. NV can’t really just come out with an architecture lacking those capabilities, since they’d very much like to sell to that market as well. Tensor Cores and RT cores are a solution for THAT particular problem as much as (well, more so than) for gaming - separating pure compute and RT hardware to make sure they are still present for the most common professional uses down the line, when unified shaders start to get wound down.
Posted on Reply
#228
AusWolf
Assimilatorthey deliver far greater RT improvements with new fixed-function hardware.
Do they really?

AssimilatorAt some stage I expect that they will either merge the rasterisation and RT hardware somewhat to prevent so much duplication, or be able to reuse the RT hardware to emulate rasterisation workloads, or possibly some combination of the two. Knowing NVIDIA they're already working hard on this problem.
It doesn't look like it for now. This is Turing vs Blackwell. I assume that if Nvidia really was working on this problem, then we would see at least some indication by the 4th generation of their RT hardware.

AssimilatorClose your eyes and tell me how much you see. Now reconsider your statement that light is "just a portion" of the world.
I worded it badly. I'm tired after work. :laugh:

What I meant is, there's geometry, textures and physics as well. How do you do all of it without raster?
AssimilatorThere are multiple reasons for this.
  • Rasterisation has become incredibly good at simulating the real world, so good that the basic RT we currently have isn't able to outperform it visually. That's a consequence of literally decades of work on rasterisation, and far less on RT.
Possibly.
Assimilator
  • You've become used to how rasterisation simulates the real world, so games using rasterisation don't look "off" to you, even when their rendering is actually incorrect compared to the real world.
Why would you think that RT is always correct? I've seen it make errors. There was someone posting a screenshot in another thread not long ago of RT casting shadows that it logically shouldn't.

It was a desert scene. I can't remember which thread it was in.

Also, how is correct = good? I mean, photorealism isn't the only way for a game's graphics to be visually pleasing.
Assimilator
  • Conversely your brain gets used to RT quickly, because the latter does such a good job at simulating the real world.
Then why does my brain get used to no RT equally quickly? My conclusion is that both raster and RT do a pretty good job these days, just differently.
Chrispy_The 5080 will be the GPU to try and get, and if you want one, get in the limited-stock queue for the $999 Founders Edition directly from Nvidia, because all of the AIB models are being listed at $1,200+
So much markup for AIB cards! You could buy a whole GPU for that kind of price. Total rip-off by AIBs, total rip-off for consumers.
Posted on Reply
#229
Vayra86
AssimilatorCompletely and utterly wrong. Real-time ray tracing is the future of graphics rendering, despite how many times you and others like you attempt to poo-poo it. Its uptake has simply been delayed by a number of factors:
  • It's a complete paradigm shift compared to rasterisation - you don't just need appropriate tools, but the appropriate mindset. Game developers who were born and raised on rasterisation are going to take time to get to grips with RT, and especially will have to unlearn a vast quantity of the stupid bulls**t hackery required to coerce rasterisation to render things somewhat realistically.
  • Game development is no longer about pushing the boundaries of technology, but making money. Even if developers want to implement RT, their managers aren't necessarily going to let them because of the extra training and development time, and thus cost. This creates inertia.
  • Hardware isn't quite powerful enough to handle it yet. You might say "then it shouldn't have been introduced", but you need to make game developers aware of and comfortable with a technology sooner rather than later.
  • Hardware isn't getting powerful enough at a fast enough rate to handle it. Unfortunately RT was introduced just before we hit the Moore's Law wall, which is particularly important given how hardware-intensive RT is.
RT has been the holy grail of graphics rendering forever. We may not yet be able to hold that grail, but we can at least touch it. If you'd suggested the latter would be a possibility to any computer graphics researcher a decade ago, they'd have laughed you out of the room - and yet here we are.

You don't like RT, we get it, but stop allowing that irrational dislike to blind you to the fact that RT is, in every aspect, the future of realistic graphics rendering that is superior to rasterisation in every conceivable way. In another decade, the only conversation about the latter will be in relation to graphics from before the RT era.
It's the future, but not the way it is pushed today. You've pointed out the issues, but the introduction of RT 'just before we hit the Moore's Law wall' is a business strategy Nvidia deployed knowing there is an insurmountable challenge to be met. It's a fantastic business model: you can never solve this issue in real time; GPUs will never be fast enough to brute force everything. Remarkably similar to AI.

I have no issues with RT. As pointed out, it's already actively being used for lots of games. I hate doing it in real time, on an entire scene, introducing an ungodly amount of latency, and I also hate paying excessive money for it like we see today. The 5090 is a 750mm2 GPU - it hits 29 FPS in PT Cyberpunk. And the cost of that 750mm2 GPU isn't going down either. The gap's just too large, and as long as upscaling isn't perfect (and it's not), it will remain so. We can be all happy about DLSS4 now, but the latency is here to stay regardless.

So far, the overall situation and deal I'm offered just still looks unconvincing and more like an Nvidia clusterfuck than anything else. Not convinced. Not buying into it.

It's a similar thing to me as VR. Sure, there are some niche situations where it really makes a dent (especially if you run into your TV)... but it's not viable economically yet. You require an expensive headset (that's not perfect either), higher FPS and thus more GPU, and a special suite of games. It hasn't taken off, and it won't, with that set of conditions. Now, for RT, you need an expensive GPU (that's not going to last either, and effectively already struggles from day one), you need upscaling to get playable FPS, and you need a special suite of games. See the similarities?

Now, some reflection on the beginning of this circus:

Back when SIGGRAPH happened and Huang told us this was the future, and Turing launched shortly after... a lot of people shared the idea that this could take 2-3 generations before it actually took off and 10 years for the real change. Where are we now? Three generations and six years later... 95%+ of all games are still built entirely on a non-RT framework. So we have four years left for that paradigm shift. I think it's safe to add another six on top.
Posted on Reply
#230
AusWolf
Vayra86It's the future, but not the way it is pushed today. You've pointed out the issues, but the introduction of RT 'just before we hit the Moore's Law wall' is a business strategy Nvidia deployed knowing there is an insurmountable challenge to be met. It's a fantastic business model: you can never solve this issue in real time; GPUs will never be fast enough to brute force everything.
Yet, here we are, brute forcing our way into everything for 4 Nvidia generations straight. The architecture doesn't change much, we just get more parts crammed into a tighter space.

If RT really was the future, then I'd like to see some indication that we're moving towards more RT-oriented architectures. But for now, RT cores are still just an add-on, and not really improved, either.
Posted on Reply
#231
Vayra86
AusWolfYet, here we are, brute forcing our way into everything for 4 Nvidia generations straight. The architecture doesn't change much, we just get more parts crammed into a tighter space.

If RT really was the future, then I'd like to see some indication that we're moving towards more RT-oriented architectures. But for now, RT cores are still just an add-on, and not really improved, either.
Yeah we are... with abysmal performance and artifacting everywhere. Wooptiedoo
Posted on Reply
#232
remekra
That paradigm shift will happen once new consoles are out. No dev studio will make heavy use of RT or PT if the consoles can't run it.
So yeah, we are still a couple of years away from it.
Posted on Reply
#233
Vayra86
Jtuck9It does look awesome in places. I just think you need a director's touch. Reminds me of procedural generation in that aspect.
It does, and hey... isn't that director's touch exactly the same touch you wanted on that archaic lighting on all those awesome games we already have?

That's the point. Shitty devs aren't going to be any less shitty because they can optimize a workflow now. They're just going to have a lower budget to work with. It is the same thing @BSim500 just pointed out, and I did too in another post elsewhere; those hours saved on doing lighting aren't going to be spent elsewhere. They're going to be cut. I have yet to see the numbers for both approaches, too. Is it really faster, really cheaper? Or will you never really master your own engine and game that way and never develop the same efficiency yourself? We're already seeing that happen in front of us with the stream of UE-based games that run like absolute horse manure and don't even look good doing so. The gameplay is often nothing to write home about either. But yeah, they managed to release a game. Yay. They also managed to pass part of their bill on to our GPUs.
Posted on Reply
#234
Wasteland
Vayra86The 5090 is a 750mm2 GPU - it hits 29 FPS in PT Cyberpunk.
Yep. I'm a broken record on this, but I must point out that Cyberpunk's pathtracing is also limited to two rays and two bounces.
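
To put that "two rays and two bounces" budget in concrete terms, here is a minimal, self-contained sketch of a Monte Carlo path-tracing loop with exactly that cap. It is a toy (one diffuse sphere lit by a white sky, with made-up scene data), not CD Projekt's or anyone else's renderer; the point is only where a per-pixel ray count and a bounce limit sit in the algorithm:

```cpp
// Toy illustration of a tight ray/bounce budget in a Monte Carlo path tracer.
// Hypothetical scene (one grey sphere under a white "sky"); not any real renderer.
#include <cmath>
#include <cstdio>
#include <random>

struct Vec { double x, y, z; };
Vec operator+(Vec a, Vec b) { return {a.x + b.x, a.y + b.y, a.z + b.z}; }
Vec operator-(Vec a, Vec b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
Vec operator*(Vec a, double s) { return {a.x * s, a.y * s, a.z * s}; }
double dot(Vec a, Vec b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
Vec norm(Vec a) { return a * (1.0 / std::sqrt(dot(a, a))); }

constexpr int kRaysPerPixel = 2; // the "two rays" part of the budget
constexpr int kMaxBounces   = 2; // the "two bounces" part of the budget

std::mt19937 rng{42};
std::uniform_real_distribution<double> uni(0.0, 1.0);

// Unit sphere at (0,0,-3); returns hit distance along the ray, or -1 for a miss.
double hitSphere(Vec o, Vec d) {
    Vec oc = o - Vec{0, 0, -3};
    double b = 2.0 * dot(oc, d), c = dot(oc, oc) - 1.0;
    double disc = b * b - 4.0 * c;
    if (disc < 0) return -1.0;
    double t = (-b - std::sqrt(disc)) / 2.0;
    return t > 1e-4 ? t : -1.0;
}

// Crude uniform hemisphere sample around normal n (good enough for a sketch).
Vec randomDir(Vec n) {
    Vec d;
    do { d = {uni(rng) * 2 - 1, uni(rng) * 2 - 1, uni(rng) * 2 - 1}; } while (dot(d, d) > 1.0);
    d = norm(d);
    return dot(d, n) < 0 ? d * -1.0 : d;
}

// Radiance along a ray. 'bounce' counts indirect bounces; the camera ray is bounce 0,
// and anything beyond kMaxBounces indirect bounces is simply cut off.
double trace(Vec o, Vec d, int bounce) {
    if (bounce > kMaxBounces) return 0.0;  // bounce budget exhausted: contribute nothing
    double t = hitSphere(o, d);
    if (t < 0) return 1.0;                 // missed the sphere: the sky is the light source
    Vec p = o + d * t;
    Vec n = norm(p - Vec{0, 0, -3});
    return 0.5 * trace(p, randomDir(n), bounce + 1); // 50% grey diffuse bounce
}

int main() {
    const int W = 160, H = 120;
    std::printf("P2\n%d %d\n255\n", W, H); // ASCII greyscale PGM on stdout
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            double sum = 0.0;
            for (int s = 0; s < kRaysPerPixel; ++s) // only two primary samples per pixel
                sum += trace(Vec{0, 0, 0},
                             norm(Vec{(x + uni(rng)) / W - 0.5, 0.5 - (y + uni(rng)) / H, -1.0}),
                             0);
            std::printf("%d ", static_cast<int>(255.0 * sum / kRaysPerPixel));
        }
        std::printf("\n");
    }
}
```

Even in this toy, two samples per pixel produce visibly noisy output, which is part of why real implementations lean so heavily on denoising and upscaling.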

The thing I always come back to is the analogy to movies. The main reason that movies are so much more expensive (and more importantly, the main reason they tend to look so much more expensive) than traditional television is the lighting. In traditional TV, you don't have time to mess around with the lighting; you just film the scene on a set with static lighting and call it done. In movies, by contrast, lighting is meticulously micromanaged, sometimes altered several times in the same scene.

The purpose behind lighting in movies, in other words, is pretty much the opposite of realistic. We could look at art, too. Michelangelo's Pieta famously features a Virgin Mary who is something like twice as tall as the Jesus figure draped over her lap. It's an optical illusion, something the artist understood perhaps better than anyone. The proportions look real to us, but they aren't "realistic."

All of which is to say that there's an aesthetic trade off to "automated" lighting systems like RT, even if we assume a hypothetically perfect RT implementation. In gaming, this aesthetic trade off also has purely utilitarian implications, the most obvious being areas that are mistakenly too dark. I remember watching an early-ish Digital Foundry video gushing over how RT made a side alley pitch black, whereas the raster version of the scene left it unrealistically light. All I could think was, "is ... that really an improvement?"

Of course, all of this is somewhat premature. We don't actually have a perfect RT implementation. What we have now is what you might call the worst of all worlds--huge perf penalties for even relatively light RT, zero-to-low workload benefit for developers (who have to do the hand-placed lighting for raster anyway), and a wild over-emphasis generally on comparatively small graphical-fidelity improvements, gen-on-gen, year-to-year--and all the while, the things that make a game fun to play languish under the oppressive burden of corporate-copy-cat/focus-group design.

I don't actually hate RT. It's a cool tech, and it probably is the future, but as things stand now, I'm forced to regard it in much the same way that we would usually regard a particularly expensive-yet-unimpressive Ultra setting in the menu. No one in his right mind enabled "Volumetric Clouds" in e.g. AC Odyssey. RT's in a similar spot, not always, but most of the time. The main difference is hype.
Posted on Reply
#235
InVasMani
I hope to see a push towards FP2 next generation, and maybe an expansion of swept spheres with a slightly cruder, better-performing alternative; it depends on how that would look in practice. I think lower precision makes a good deal of sense for post-processing, to quantize and layer more together. It's probably going to be really great for generating quantized textures as well, and obviously for AI inference, and much of that will seep into games and game development over time.
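
To make the "lower precision / quantized textures" idea a bit more concrete, here is a small illustrative sketch (toy values and bit widths chosen for the example, not any real FidelityFX, driver, or texture-compression code) of uniform quantization of a float channel, showing how the round-trip error grows as the bit width shrinks:

```cpp
// Minimal sketch of uniform quantization to a low bit width (illustrative values;
// real texture/weight quantization schemes are more involved, e.g. per-block scales).
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

// Quantize floats in [lo, hi] to 'bits'-bit unsigned levels and dequantize again.
std::vector<float> roundTrip(const std::vector<float>& v, int bits, float lo, float hi) {
    const int levels = (1 << bits) - 1;
    std::vector<float> out;
    out.reserve(v.size());
    for (float x : v) {
        float t = std::clamp((x - lo) / (hi - lo), 0.0f, 1.0f);                // normalize to [0,1]
        int q = static_cast<int>(std::lround(t * levels));                     // integer actually stored
        out.push_back(lo + (hi - lo) * (static_cast<float>(q) / levels));      // back to float
    }
    return out;
}

int main() {
    std::vector<float> tex = {0.02f, 0.13f, 0.25f, 0.5f, 0.77f, 0.98f}; // toy channel data
    for (int bits : {8, 4, 2}) {
        auto rt = roundTrip(tex, bits, 0.0f, 1.0f);
        float maxErr = 0.0f;
        for (size_t i = 0; i < tex.size(); ++i)
            maxErr = std::max(maxErr, std::fabs(tex[i] - rt[i]));
        std::printf("%d-bit: max round-trip error %.4f\n", bits, maxErr);
    }
}
```

The basic trade-off is the same whatever the scheme ends up looking like: fewer bits, more error, less bandwidth and storage.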
Posted on Reply
#236
Jtuck9
Vayra86It does, and hey... isn't that director's touch exactly the same touch you wanted on that archaic lighting on all those awesome games we already have?

That's the point. Shitty devs aren't going to be any less shitty because they can optimize a workflow now. They're just going to have a lower budget to work with. It is the same thing @BSim500 just pointed out, and I did too in another post elsewhere; those hours saved on doing lighting aren't going to be spent elsewhere. They're going to be cut. I have yet to see the numbers for both approaches, too. Is it really faster, really cheaper? Or will you never really master your own engine and game that way and never develop the same efficiency yourself? We're already seeing that happen in front of us with the stream of UE-based games that run like absolute horse manure and don't even look good doing so. The gameplay is often nothing to write home about either. But yeah, they managed to release a game. Yay. They also managed to pass part of their bill on to our GPUs.
I don't think Star Citizen uses ray tracing, but I think they blend static and dynamic lighting together well, along with procedural and more tailored environments. They are probably a good example of how unfeasible it is to master your own engine these days, although I did hear mismanagement plays its part. Look at scientific progress, disregarding the lone antivax geniuses. How did Carmack's foray into AI go?! Speaking of which, haven't "they" started to spit out PhD-level theses? DOGE comes to mind. It would certainly be nice if progress translated into gameplay / more interactive environments. I mentioned that Black Parade mod for Thief earlier; compare that to Thief 4. I've also read that Unreal Engine 5.4 is an improvement in many ways, although I'm not sure how much there is beneath the veneer in that Matrix Demo. Apt that it's set in New York.
WastelandThe thing I always come back to is the analogy to movies. The main reason that movies are so much more expensive (and more importantly, the main reason they tend to look so much more expensive) than traditional television is the lighting. In traditional TV, you don't have time to mess around with the lighting; you just film the scene on a set with static lighting and call it done. In movies, by contrast, lighting is meticulously micromanaged, sometimes altered several times in the same scene.
The whole using lights to give the illusion of windows in reflections sprang to mind!
WastelandNo one in his right mind enabled "Volumetric Clouds" in e.g. AC Odyssey. RT's in a similar spot, not always, but most of the time. The main difference is hype.
They sure look pretty in Flight Sims though, and Star Citizen!
Posted on Reply
#237
Wasteland
Jtuck9They sure look pretty in Flight Sims though, and Star Citizen!
Right, and I guess that's part of the point too. Volumetric clouds aren't always bad, but in that particular game, the highest setting incurred something like a 40% perf penalty, IIRC even indoors, and the visual difference, even if you turned your camera up to stare at the sky, was undetectable. Odyssey's Volumetric Clouds are an extreme example, but most AAA games have at least one or two expensive-but-pointless settings, which is why running at stock Ultra is widely considered dumb.
Posted on Reply
#238
Jtuck9
True, although I did see a video the other day of that flying ship bug in one of the Assassin's Creed games.

I'm a graphics whore at heart, at my own expense in many ways...
Posted on Reply
#239
Kratouille
I own an RX 7900 XTX from ASUS (TUF). It is an awesome card. Drivers are great, and the AMD software runs nicely and is highly customisable.
I hope AMD will release a new high-end GPU and launch a technology similar to ray tracing but lighter for the hardware to handle.
Posted on Reply
#240
Gasaraki
AusWolfIf that's the case, then why do we have the same ratio of raster vs RT hardware even on Nvidia GPUs since Turing? If RT is the way to go, then surely we should be seeing some indication of at least Nvidia investing in it more heavily than in raster and/or general performance, right? - This isn't a form of disagreement, more like a genuine question.


RT only simulates lights and shadows as far as I know. They're just a portion of the world around us, not the entirety of it.
Nvidia is investing tons in RT. It's people pooh-poohing it: "it's so hard", "performance sucks", etc., etc. The minute Nvidia doesn't invest as much in raster and general performance (like in the 50xx) and develops new tech to improve performance, people complain that it's "fake frames", blah blah blah. Frame generation was designed so that more people can run RT.
Posted on Reply
#241
chrcoluk
Vayra86It's the future, but not the way it is pushed today. You've pointed out the issues, but the introduction of RT 'just before we hit the Moore's Law wall' is a business strategy Nvidia deployed knowing there is an insurmountable challenge to be met. It's a fantastic business model: you can never solve this issue in real time; GPUs will never be fast enough to brute force everything. Remarkably similar to AI.

I have no issues with RT. As pointed out, it's already actively being used for lots of games. I hate doing it in real time, on an entire scene, introducing an ungodly amount of latency, and I also hate paying excessive money for it like we see today. The 5090 is a 750mm2 GPU - it hits 29 FPS in PT Cyberpunk. And the cost of that 750mm2 GPU isn't going down either. The gap's just too large, and as long as upscaling isn't perfect (and it's not), it will remain so. We can be all happy about DLSS4 now, but the latency is here to stay regardless.

So far, the overall situation and deal I'm offered just still looks unconvincing and more like an Nvidia clusterfuck than anything else. Not convinced. Not buying into it.

It's a similar thing to me as VR. Sure, there are some niche situations where it really makes a dent (especially if you run into your TV)... but it's not viable economically yet. You require an expensive headset (that's not perfect either), higher FPS and thus more GPU, and a special suite of games. It hasn't taken off, and it won't, with that set of conditions. Now, for RT, you need an expensive GPU (that's not going to last either, and effectively already struggles from day one), you need upscaling to get playable FPS, and you need a special suite of games. See the similarities?

Now, some reflection on the beginning of this circus:

Back when SIGGRAPH happened and Huang told us this was the future, and Turing launched shortly after... a lot of people shared the idea that this could take 2-3 generations before it actually took off and 10 years for the real change. Where are we now? Three generations and six years later... 95%+ of all games are still built entirely on a non-RT framework. So we have four years left for that paradigm shift. I think it's safe to add another six on top.
Also, that futuristic Turing generation runs like crap with modern RT as well; in the 5080 review, only the 2080 Ti made it onto the graph (near the bottom).

Also, for me, visuals got good enough during the PS4's lifetime. I don't care if hacks were used to get there; that's not my concern as a consumer. I don't play games for realism, I play them to escape from it.
Posted on Reply
#242
Vayra86
chrcolukAlso, that futuristic Turing generation runs like crap with modern RT as well; in the 5080 review, only the 2080 Ti made it onto the graph (near the bottom).

Also, for me, visuals got good enough during the PS4's lifetime. I don't care if hacks were used to get there; that's not my concern as a consumer. I don't play games for realism, I play them to escape from it.
It's funny how a highly efficient, tried-and-tested approach is now called 'a hack' :) Way to downplay decades of refinement that runs well and looks good on hardware costing a fraction of what you need today.
Posted on Reply
#243
Chrispy_
I've just remembered the biggest reason that AMD are failing hard in the market after watching/reading a bunch of 5080 reviews - it reminded me of a couple of conversations I'd had with people over the last couple of years.

When AMD waits until after Nvidia's launch, the POPULAR, HEAVILY-SEARCHED, IMPORTANT coverage of the new Nvidia cards *doesn't* include AMD's answer.

For the next 18 month product cycle, people googling for "RTX 5080" are going to read or watch today's reviews. That's right, the "$1000" 5080 is better than the 7900XTX. "WTF is a 9070, man? It's not even on the charts!"

Where is the 9070XT? Nowhere in sight:

  • Zero exposure.
  • Zero coverage in the first-impressions review cycle.
  • Zero recommendations.
  • Zero inclusion in the discussion.

The 5080 will feature in 9070XT reviews, but the potentially abysmal price/performance compared to what AMD keep promising they'll deliver won't matter because the people who search for RTX 5080 will never see those reviews.
Posted on Reply
#244
AusWolf
GasarakiNvidia is investing tons in RT. It's people pooh-poohing it: "it's so hard", "performance sucks", etc., etc. The minute Nvidia doesn't invest as much in raster and general performance (like in the 50xx) and develops new tech to improve performance, people complain that it's "fake frames", blah blah blah. Frame generation was designed so that more people can run RT.
But at what cost? Terrible latency and artefacts and other visual glitches? No, thanks.

Frame gen only works at high FPS when you don't need it in the first place.
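
A rough back-of-the-envelope sketch of that point (the numbers are assumptions for illustration, not measurements of DLSS or FSR frame generation): interpolation-style frame generation raises the displayed frame rate, but input latency stays tied to the real render rate, plus roughly one held frame:

```cpp
// Illustrative latency arithmetic for 2x interpolation-style frame generation.
// Assumption: the generator holds roughly one rendered frame so it has two real
// frames to interpolate between; figures below are not vendor measurements.
#include <cstdio>

int main() {
    const double renderFps[] = {30.0, 60.0, 120.0};
    for (double fps : renderFps) {
        double frameMs      = 1000.0 / fps; // time between real rendered frames
        double displayedFps = fps * 2.0;    // what the FPS counter shows with 2x frame gen
        double heldFrameMs  = frameMs;      // ~1 extra frame of delay from interpolation
        std::printf("render %.0f fps (%.1f ms) -> shown %.0f fps, ~%.1f ms added latency\n",
                    fps, frameMs, displayedFps, heldFrameMs);
    }
}
```

At a 120 FPS base the extra delay is a few milliseconds; at a 30 FPS base it is over 30 ms on top of an already sluggish 33 ms frame time, which is the "only works when you don't need it" complaint in numbers.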
Posted on Reply
#245
Jtuck9
AusWolfBut at what cost? Terrible latency and artefacts and other visual glitches? No, thanks.
Hold that thought
Chrispy_I've just remembered the biggest reason that AMD are failing hard in the market after watching/reading a bunch of 5080 reviews - it reminded me of a couple of conversations I'd had with people over the last couple of years.

When AMD waits until after Nvidia's launch, the POPULAR, HEAVILY-SEARCHED, IMPORTANT coverage of the new Nvidia cards *doesn't* include AMD's answer.

For the next 18 month product cycle, people googling for "RTX 5080" are going to read or watch today's reviews. That's right, the "$1000" 5080 is better than the 7900XTX. "WTF is a 9070, man? It's not even on the charts!"

Where is the 9070XT? Nowhere in sight:

  • Zero exposure.
  • Zero coverage in the first-impressions review cycle.
  • Zero recommendations.
  • Zero inclusion in the discussion.

The 5080 will feature in 9070XT reviews, but the potentially abysmal price/performance compared to what AMD keep promising they'll deliver won't matter because the people who search for RTX 5080 will never see those reviews.
There does seem to be a lot of negative sentiment toward Nvidia at the moment.

"You come for the king"...
Posted on Reply
#246
AusWolf
Jtuck9Hold that thought


There does seem to be a lot of negative sentiment toward Nvidia at the moment.

"You come for the king"...
If you geared that comment towards me, then no. I'm not negative against Nvidia. I'm negative against a technology that doesn't work. I hate the AMD and Nvidia version of it equally.
Posted on Reply
#247
Jtuck9
AusWolfIf you geared that comment towards me, then no. I'm not negative against Nvidia. I'm negative against a technology that doesn't work. I hate the AMD and Nvidia version of it equally.
'Hold that thought' as in you might be pleasantly surprised by the improvements in FSR, etc. The negativity aspect was mainly about AMD having to release a decent product across the board; the negativity toward Nvidia at the moment, not necessarily gaming related, might give them an opportunity to win people over. I'm sure people have argued that it'll have to be better than decent given perception (like renewable uptake, if we are going on tangents).
Posted on Reply
#248
AusWolf
Jtuck9'Hold that thought' as in you might be pleasantly surprised by the improvements in FSR, etc. The negativity aspect was mainly about AMD having to release a decent product across the board; the negativity toward Nvidia at the moment, not necessarily gaming related, might give them an opportunity to win people over. I'm sure people have argued that it'll have to be better than decent given perception (like renewable uptake, if we are going on tangents).
Well, AMD f*ed up their marketing (surprise-surprise), while Nvidia just launched the 40-series again under a new name with an overpriced power hog on top. Let's see which blunder ends up being bigger.

Negativity is through the roof in both camps, that's for sure.
Posted on Reply
#249
Jtuck9
AusWolfWell, AMD f*ed up their marketing (surprise-surprise), while Nvidia just launched the 40-series again under a new name with an overpriced power hog on top. Let's see which blunder ends up being bigger.

Negativity is through the roof in both camps, that's for sure.
Easy to say as we pick one up off the shelf (or click a button). If people could see germs...
Posted on Reply
#250
AusWolf
Jtuck9Easy to say as we pick one up off the shelf (or click a button). If people could see germs...
It's an easy pick - whichever one serves your needs better and gives you the most bang for your buck. Problems arise when you start to see the world in red or green.
Posted on Reply