Thursday, January 23rd 2025
AMD is Taking Time with Radeon RX 9000 to Optimize Software and FSR 4
When AMD announced its upcoming Radeon RX 9000 series of GPUs based on RDNA 4 IP, we expected general availability to follow soon after the CES announcement. However, it turns out that AMD has scheduled its Radeon RX 9000 series availability for March, as the company is allegedly optimizing the software stack and its FidelityFX Super Resolution 4 (FSR 4) for a butter-smooth user experience. In a response on X to Hardware Unboxed, AMD's David McAfee shared, "I really appreciate the excitement for RDNA 4. We are focused on ensuring we deliver a great set of products with Radeon 9000 series. We are taking a little extra time to optimize the software stack for maximum performance and enable more FSR 4 titles. We also have a wide range of partners launching Radeon 9000 series cards, and while some have started building initial inventory at retailers, you should expect many more partner cards available at launch."
AMD is taking its RDNA 4 launch more cautiously than before, as it now faces a significant challenge from NVIDIA and its vast portfolio of software optimization and AI-enhanced visualization tools. FSR 4 introduces a new machine learning (ML) based upscaling component to handle Super Resolution. This will be paired with Frame Generation and an updated Anti-Lag 2 to make up the FSR 4 feature set. Optimizing this is the number one priority, and AMD plans to get more games on FSR 4 so gamers experience out-of-the-box support.
Source:
David McAfee
251 Comments on AMD is Taking Time with Radeon RX 9000 to Optimize Software and FSR 4
The 5080 will be the GPU to try and get, and if you want one, get in the limited-stock queue for the $999 Founders Edition directly from Nvidia, because all of the AIB models are being listed at $1,200+.
It doesn't look like it for now. This is Turing vs Blackwell. I assume that if Nvidia really was working on this problem, then we would see at least some indication by the 4th generation of their RT hardware.
I worded it badly. I'm tired after work. :laugh:
What I meant is, there's geometry, textures and physics as well. How do you do all of it without raster? Possibly. Why would you think that RT is always correct? I've seen it make errors. There was someone posting a screenshot in another thread not long ago of RT casting shadows that it logically shouldn't.
It was a desert scene. I can't remember which thread it was in.
Also, how is correct = good? I mean, photorealism isn't the only way for a game's graphics to be visually pleasing. Then why does my brain get used to no RT equally quickly? My conclusion is that both raster and RT do a pretty good job these days, just differently. So much markup for AIB cards! You could buy a whole GPU for that price. Total ripoff by AIBs, total ripoff for consumers.
I have no issues with RT. As pointed out, it's already actively being used in lots of games. I hate doing it in real time, on an entire scene, introducing an ungodly amount of latency, and I also hate paying excessive money for it like we see today. The 5090 is a 750 mm² GPU - it hits 29 FPS in PT Cyberpunk. And the cost of that 750 mm² GPU isn't going down either. The gap's just too large, and as long as upscaling isn't perfect (and it's not), it will remain so. We can be all happy about DLSS 4 now, but the latency is here to stay regardless.
So far, the overall situation and deal I'm offered just still looks unconvincing and more like an Nvidia clusterfuck than anything else. Not convinced. Not buying into it.
It's a similar thing to me as VR. Sure, there are some niche situations where it really makes a dent (especially if you run into your TV)... but it's not viable economically yet. You require an expensive headset (that's not perfect either), higher FPS thus more GPU, and a special suite of games. It hasn't taken off, and it won't, with that set of conditions. Now, for RT, you need an expensive GPU (that's not going to last either, and effectively already struggles from day one), you need upscaling to get playable FPS, and you need a special suite of games. See the similarities?
Now, some reflection on the beginning of this circus:
Back when SIGGRAPH happened and Huang told us this was the future, and Turing launched shortly after... a lot of people shared the idea this could take 2-3 generations before it actually took off and 10 years for the real change. Where are we now? Three generations, six years ahead... 95%+ of all games are still built entirely on a non-RT framework. So we have four years left for that paradigm shift. I think it's safe to add another six on top.
If RT really was the future, then I'd like to see some indication that we're moving towards more RT-oriented architectures. But for now, RT cores are still just an add-on, and not really improved, either.
So yeah, we are still a couple of years away from it.
That's the point. Shitty devs aren't going to be any less shitty because they can optimize a workflow now. They're just going to have a lower budget to work with. It is the same thing @BSim500 just pointed out, and I did too in another post elsewhere; those hours saved on doing lighting aren't going to be spent elsewhere. They're going to be cut. I have yet to see the numbers for both approaches, too. Is it really faster, really cheaper? Or will you never really master your own engine and game that way and develop the same efficiency yourself? We're already seeing that happen in front of us with the stream of UE-based games that run like absolute horse manure and don't even look good doing so. The gameplay is often nothing to write home about either. But yeah, they managed to release a game. Yay. They also managed to pass part of their bill on to our GPUs.
The thing I always come back to is the analogy to movies. The main reason that movies are so much more expensive (and more importantly, the main reason they tend to look so much more expensive) than traditional television is the lighting. In traditional TV, you don't have time to mess around with the lighting; you just film the scene on a set with static lighting and call it done. In movies, by contrast, lighting is meticulously micromanaged, sometimes altered several times in the same scene.
The purpose behind lighting in movies, in other words, is pretty much the opposite of realistic. We could look at art, too. Michelangelo's Pieta famously features a Virgin Mary who is something like twice as tall as the Jesus figure draped over her lap. It's an optical illusion, something the artist understood perhaps better than anyone. The proportions look real to us, but they aren't "realistic."
All of which is to say that there's an aesthetic trade off to "automated" lighting systems like RT, even if we assume a hypothetically perfect RT implementation. In gaming, this aesthetic trade off also has purely utilitarian implications, the most obvious being areas that are mistakenly too dark. I remember watching an early-ish Digital Foundry video gushing over how RT made a side alley pitch black, whereas the raster version of the scene left it unrealistically light. All I could think was, "is ... that really an improvement?"
Of course, all of this is somewhat premature. We don't actually have a perfect RT implementation. What we have now is what you might call the worst of all worlds--huge perf penalties for even relatively light RT, zero-to-low workload benefit for developers (who have to do the hand-placed lighting for raster anyway), and a wild over-emphasis generally on comparatively small graphical-fidelity improvements, gen-on-gen, year-to-year--and all the while, the things that make a game fun to play languish under the oppressive burden of corporate-copy-cat/focus-group design.
I don't actually hate RT. It's a cool tech, and it probably is the future, but as things stand now, I'm forced to regard it in much the same way that we would usually regard a particularly expensive-yet-unimpressive Ultra setting in the menu. No one in his right mind enabled "Volumetric Clouds" in e.g. AC Odyssey. RT's in a similar spot, not always, but most of the time. The main difference is hype.
I'm a graphics whore at heart, at my own expense in many ways...
I hope AMD will release a new high-end GPU and launch a technology similar to ray tracing but lighter on the hardware.
Also, for me, visuals got good enough during the PS4's lifetime. I don't care if hacks were used to get there; that's not my concern as a consumer. I don't play games for realism, I play them to escape from it.
When AMD waits until after Nvidia's launch, the POPULAR, HEAVILY-SEARCHED, IMPORTANT coverage of the new Nvidia cards *doesn't* include AMD's answer.
For the next 18-month product cycle, people googling for "RTX 5080" are going to read or watch today's reviews. That's right, the "$1000" 5080 is better than the 7900XTX. "WTF is a 9070, man? It's not even on the charts!"
Where is the 9070XT? Nowhere in sight:
Zero exposure.
Zero coverage in the first-impressions review cycle.
Zero recommendations.
Zero inclusion in the discussion.
The 5080 will feature in 9070XT reviews, but the potentially abysmal price/performance compared to what AMD keep promising they'll deliver won't matter because the people who search for RTX 5080 will never see those reviews.
Frame gen only works at high FPS when you don't need it in the first place.
"You come for the king"...
Negativity is through the roof in both camps, that's for sure.