Monday, January 6th 2025

AMD Debuts Radeon RX 9070 XT and RX 9070 Powered by RDNA 4, and FSR 4

AMD at the 2025 International CES announced the Radeon RX 9070 XT and Radeon RX 9070 desktop performance-segment graphics cards. These will be the face of AMD's next generation of gaming graphics products, powered by the new RDNA 4 graphics architecture. AMD hopes to launch both cards within Q1 2025. AMD changed the nomenclature of its gaming GPUs mainly because it has made a tactical retreat from the enthusiast graphics segment; its fastest products will now compete in the performance segment. Judging by the way AMD arranged the Radeon RX 9070 series and RX 9060 series product stack against the backdrop of the Radeon RX 7000 series, the GeForce RTX 4000 series, and the anticipated GeForce RTX 5000 series, the RX 9070 XT will offer raster performance roughly comparable to the Radeon RX 7900 XT, with the RX 9070 slightly faster than the RX 7800 XT. The RX 9060 XT will beat the RX 7700 XT, while the RX 9060 beats the RX 7600 XT.

With RDNA 4, AMD claims a generational SIMD performance increase for the RDNA 4 compute units. The 2nd Gen AI accelerators also promise a generational performance increase, and AMD will debut a locally accelerated generative AI application down the line, called AMD Adrenalin AI, which can generate images, summarize documents, perform some linguistic/grammar tasks (such as rewriting), and serve as a chatbot for answering AMD-related queries. This is essentially AMD's answer to NVIDIA's Chat RTX. AMD's 3rd Gen Ray accelerator is expected to reduce the performance cost of ray tracing by putting more of the ray tracing workload through dedicated hardware, offloading the SIMD engine. Lastly, AMD is expected to significantly upgrade the media acceleration and display I/O of its GPUs.
AMD also announced FidelityFX Super Resolution 4 (FSR 4), which has been developed for RDNA 4 (it is not yet clear whether it will work on older Radeon generations). It introduces a new machine learning (ML) based upscaling component to handle Super Resolution. This will be paired with Frame Generation and an updated Anti-Lag 2 to make up the FSR 4 feature set. Call of Duty: Black Ops 6 is confirmed to be one of the first titles to utilize FSR 4.
Nearly all AMD add-in board partners (AIBs) are ready with Radeon RX 9070 series graphics cards, including Acer, ASRock, ASUS, GIGABYTE, Sapphire, PowerColor, XFX, Vastarmor, and Yeston. MSI appears to have stopped being an AMD AIB.

We also got our first peek at what the "Navi 48" GPU powering the Radeon RX 9070 series looks like—it features an unusual rectangular die with a 2:1 aspect ratio, which seems to lend plausibility to the popular theory that "Navi 48" is two "Navi 44" dies joined at the hip with full cache coherency. The GPU is rumored to feature a 256-bit GDDR6 memory interface and 64 compute units (4,096 stream processors). "Navi 44," on the other hand, is exactly half of this (128-bit GDDR6, 32 CU). AMD is building "Navi 48" and "Navi 44" on the TSMC N4P (4 nm EUV) foundry node, on which it is building pretty much its entire current generation of products, from mobile processors to CPU chiplets.
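For what it's worth, the rumored figures are internally consistent: RDNA-family compute units have carried 64 stream processors each, so 64 CUs work out to exactly 4,096 SPs, and "Navi 44" lands at half on both the CU count and the bus width. A quick sketch of that arithmetic (the per-CU figure is assumed from prior RDNA generations, not confirmed for RDNA 4):

```python
# Rumored "Navi 48" specs from above; SP-per-CU assumed from prior RDNA parts.
SP_PER_CU = 64  # each RDNA compute unit carries two 32-wide SIMDs

navi48_cu, navi48_bus = 64, 256   # compute units, memory bus width in bits
navi44_cu, navi44_bus = navi48_cu // 2, navi48_bus // 2  # exactly half

print(navi48_cu * SP_PER_CU)      # 4096 stream processors
print(navi44_cu, navi44_bus)      # 32 128 -> 32 CU on a 128-bit bus
```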

318 Comments on AMD Debuts Radeon RX 9070 XT and RX 9070 Powered by RDNA 4, and FSR 4

#201
freeagent
kapone32something we did not ask for but because of the narrative are given.
Looks great though, not seeing the problem :confused:
Posted on Reply
#202
oxrufiioxo
kapone32How was RDNA3 fumbled? The fact that they launched it in Germany?
Right on cue.....

Posted on Reply
#203
Darmok N Jalad
AusWolfIn other words: people are stupid. It's not logical, but it's fact.
Maybe. Keep in mind how many consumer choices we make all the time, on top of all the other life decisions that actually matter. It’s not impossible to make the most informed choice on any purchase, but to do it with every choice we make is incredibly time consuming. Make good decisions on the stuff that really matters. Choosing the right GPU for the ideal price for recreational gaming just isn’t critical for survival, unless you are spending beyond your means.
Posted on Reply
#204
kapone32
freeagentLooks great though, not seeing the problem :confused:
It was used as a Bludgeon to make a negative point for AMD. How many people complained about RT on AMD that were not AMD users?
Posted on Reply
#205
Dr. Dro
freeagentDoes it though? If you have the infrastructure laid, supply chain at the ready, they really don't cost much. Heck sell X amount and it pays for the hardware that built it, for the guys on the software end, and with a little profit on top, or a lot of profit depending on how efficient your operation is..
R&D certainly does cost though... that's the bulk of the spending. 2024 CapEx for NV turned out to be $2.4B (USD); this capital expenditure is higher than pretty much any other technology company's except for Intel's. In a sense it is quite remarkable that AMD went as far as they did with their relatively low CapEx, but they need to start investing some of that back to advance their products. More importantly, they need to pay their engineers better than NV to make them want to deal with the state of affairs IMHO.
kapone32It was used as a Bludgeon to make a negative point for AMD. How many people complained about RT on AMD that were not AMD users?
You don't need to drink spoiled milk to know that it's spoiled, let alone complain about it being spoiled. Your point being?
Posted on Reply
#206
kapone32
oxrufiioxoRight on cue.....

I just want you to expand on your position?
Posted on Reply
#207
freeagent
kapone32It was used as a Bludgeon to make a negative point for AMD.
What? No... it is a natural progression.. looks good to me. Looks good to AMD too since they are embracing it..
Dr. DroR&D certainly does cost though...
I thought they were using their AI to plot the course :confused:
Posted on Reply
#208
kapone32
Dr. DroR&D certainly does cost though... that's the bulk of the spending. 2024 CapEx for NV turned out to $2.4B (USD), this capital expenditure is actually higher than pretty much any other technology company except for Intel



You don't need to drink spoiled milk to know that it's spoiled, let alone complain about it being spoiled. Your point being?
It will always be the same people bashing AMD and calling me out of touch with a PC I use everyday. Thanks for confirming why RT is such a focus. How is 7000 spoiled?
Posted on Reply
#209
Dr. Dro
freeagentI thought they were using their AI to plot the course :confused:
Not yet, but I fear that might as well be the case someday. Engineering is one of the tasks that will eventually be largely relegated to machine learning models, with humans used only for relative correction of the floorplan overall. Will be any day now...
kapone32It will always be the same people bashing AMD and calling me out of touch with a PC I use everyday. Thanks for confirming why RT is such a focus. How is 7000 spoiled?
I did not say it was spoiled, I proposed a counter to your universal dismissal of inconveniences by claiming that people are complaining without having experienced the product themselves. People complaining about low RT performance on Radeon are not doing so to "bash", and they don't need to have one to attest to the fact that it is so. I mean, that's what reviews are for. If anything, a product's deficiencies justify choosing an alternative which does not suffer from them.
Posted on Reply
#210
AusWolf
oxrufiioxoEveryone but the most diehard amd fanboy knows they fumbled RDNA3 big time and they were coming off some momentum with RDNA2 and squandered it.

Hopefully the 9070XT is good right out of the gate and everyone goes damn that is what I am talking about.
What did they fumble? What did they squander? I see solid products in both cases, paired with terrible marketing.
TheinsanegamerNWell, ask yourself that question:

If AMD is slower in RT AND in raster compared to nvidia, why is it "not logical" for AMD to be sold significantly cheaper then nvidia? You already agreed that if the card is slower it should be cheaper, so.....
It is cheaper than Nvidia at basically any price point.
Darmok N JaladMaybe. Keep in mind how many consumer choices we make all the time, on top of all the other life decisions that actually matter. It’s not impossible to make the most informed choice on any purchase, but to do it with every choice we make is incredibly time consuming. Make good decisions on the stuff that really matters. Choosing the right GPU for the ideal price for recreational gaming just isn’t critical for survival, unless you are spending beyond your means.
That is a fair assumption. Although, I would do my best to be as informed as possible if I was about to spend hundreds on something, especially if it's my hobby, therefore an important part of my life.
Posted on Reply
#211
Wasteland
Dr. DroDon't spin this around, if it's anti-consumer when Nvidia does it, warranting years of rhetoric and scorn, it's anti-consumer when AMD does it. Where is all the outrage of AMD going back on their word and "betraying the trust of the loyal Radeon customer"? It's simple, really, it's pure hypocrisy. You people never once cared for it being closed source software, you only ever cared that it was better and you couldn't use it.
Look, anyone who thinks that AMD is "pro-consumer" for altruistic reasons needs his head examined. AMD is a multi-billion-dollar globocorp, not your friend or mine. But it isn't incorrect to say that a disadvantaged market position frequently forces AMD to adopt a pro-consumer posture, at least relative to the competition. As far as the consumer is concerned, this might even seem like a distinction without a difference. That is, until AMD gains a market advantage.

All else being equal, open standards are better. There's nothing wrong with having rooted for FSR on that basis. And in fact, historically, open standards tend to win. VHS won out over Betamax. The CD won out over the Mini-Disc. The PC defeated the Macintosh. If we're discussing the GPU wars, Freesync was ultimately vindicated, and PhysX dwindled to obscurity.

(I still loved my Betamax, though.)

In this case, it just so happens that AMD's open standard couldn't compete with Nvidia's closed standard on image quality. But it got pretty close, and I'm glad they tried. Whatever else you want to say about the tech itself or the motivations behind its design, FSR represents a significant value add for consumers of all shapes or sizes. I wish AMD luck on their new proprietary tech, but they're going to need more than "we've got DLSS too" to move the needle substantially on their position in the GPU market.
Dr. DroThose cards were just under the heat of competition. And they had their merits which made them successful products.

And the CPU question... yes, for a while, until Intel chips lost momentum and they started charging $300 for a 6 core Ryzen 5. Then the 12400F happened and suddenly even motherboards that had "technical limitations" supported it all right quick alongside a nice price cut.
Exactly right.
Posted on Reply
#212
Outback Bronze
oxrufiioxo



Yeah.... You Nvidia loving troll.... Even though your platform has been AMD for a while now.....
Hey, you forgot me in that picture!

Posted on Reply
#213
oxrufiioxo
AusWolfWhat did they fumble? What did they squander? I see solid products in both cases, paired with terrible marketing.
1. They showed performance that was well beyond what the actual products benchmarked at during reviews at launch. They had been historically pretty good with this, but then squandered it.
2. They priced multiple SKUs too high, leading to negative feedback and then their prices tanking. People can view this however they want, but even AMD admits this much.

Those are the two main things they did poorly, and AMD said so themselves: multiple products they released, specifically the 7600/7700 XT/7900 XT, were poorly received, and they would like to fix that. Which is a good thing.

Performance has also regressed over time: the best AMD card is now 50% slower in RT vs the 4080, and it loses to a card that was 200 USD cheaper and has since been retired because even Nvidia wanted something better at its price point, which isn't a good look either. FSR hasn't really improved over RDNA3's lifetime.

And now they are in the awkward position of not improving raster for at least another generation and hopefully catching last-generation Nvidia products in the same tier in RT, while doing something their most diehard fans hate: potentially locking FSR4 to RDNA4, although I think if they can make it work on RDNA3 and older, they will backtrack on this.
Outback BronzeHey, you forgot me in that picture!

@freeagent seems like he gets the most heat from my observation but what I wouldn't do to be a fly on the wall of your pms from users lmao especially the reports.... Well the mods in general not just you lol.
Posted on Reply
#214
Cheeseball
Not a Potato
CheeseballI will also eat my used shorts (no skidmarks here folks) if the RTX 5090 debuts at $1799. That would (technically) be fair pricing for a 32 GB GDDR7 monster.

RTX 5080 is hard to guess since the previous Super did come out at $1000 (so NVIDIA made themselves look "generous" :laugh:) but the predecessor was $1200. I would be surprised as hell if they do $1000 again.
Yeah I'm surprised (about the RTX 5080, not the RTX 5090):



Actually I'm surprised about the 5070 Ti and 5070 in that they were lowered by $50, although it would've been nice to be at $700 or $500, but meh. At least the RX 9070 XT would be around maybe $600 now.
Posted on Reply
#215
AusWolf
oxrufiioxo1. They showed performance that was well beyond what the actual products benchmarked at during reviews at launch. They had been historically pretty good with this, but then squandered it.
2. They priced multiple SKUs too high, leading to negative feedback and then their prices tanking. People can view this however they want, but even AMD admits this much.

Those are the two main things they did poorly, and AMD said so themselves: multiple products they released, specifically the 7600/7700 XT/7900 XT, were poorly received, and they would like to fix that. Which is a good thing.

Performance has also regressed over time: the best AMD card is now 50% slower in RT vs the 4080, and it loses to a card that was 200 USD cheaper and has since been retired because even Nvidia wanted something better at its price point, which isn't a good look either. FSR hasn't really improved over RDNA3's lifetime.

And now they are in the awkward position of not improving raster for at least another generation and hopefully catching last-generation Nvidia products in the same tier in RT, while doing something their most diehard fans hate: potentially locking FSR4 to RDNA4, although I think if they can make it work on RDNA3 and older, they will backtrack on this.
Yeah, the 7600, 7700 XT and 7900 XT were badly priced at launch. But the 7800 XT and 7900 XTX were great.

Showing false benchmark results at launch is really poor, but people should know better than to believe marketing hype, and look for real reviews before buying. That doesn't affect the end product in any way, imo.

RT didn't improve because they basically used RDNA 2's RT engine. They spent all their R&D on chiplets and improving raster, neither of which paid off really well. But that still didn't make the end products bad, just not as much better than RDNA 2 as we'd hoped. Now, they're doing the opposite: they're trying to improve RT and the video engine while only doing minor fixes on raster, and backtracking on the chiplet design. I'm curious whether it'll pay off; this is why I'm disappointed that we didn't get more detail.
CheeseballYeah I'm surprised (about the RTX 5080, not the RTX 5090):



Actually I'm surprised about the 5070 Ti and 5070 so they were lowered $50, although it would've been nice to be at a $700 or $500, but meh. At least the RX 9070 XT would be around maybe $600 now.
Oh so they've been announced now? What a shame for AMD... The 5070 is coming and the 9070 XT is nowhere in sight.

Also, AI TOPS? That's how we measure GPU performance now? *Facepalm*
Posted on Reply
#216
oxrufiioxo
AusWolfYeah, the 7600, 7700 XT and 7900 XT were badly priced at launch. But the 7800 XT and 7900 XTX were great.

Showing false benchmark results at launch is really poor, but people should know better than to believe marketing hype, and look for real reviews before buying. That doesn't affect the end product in any way, imo.

RT didn't improve because they basically used RDNA 2's RT engine. They spent all their R&D on chiplets and improving raster, neither of which paid off really well. But that still didn't make the end products bad, just not as much better than RDNA 2 as we'd hoped. Now, they're doing the opposite: they're trying to improve RT and the video engine while only doing minor fixes on raster, and backtracking on the chiplet design. I'm curious if it'll pay off, this is why I'm disappointed that we didn't get more detail.
I forgot to add that the chiplet thing didn't work out... Honestly, the only reason the 7800XT/7900XTX looked OK was because the competing cards were massively hiked in price, especially the 4080, and even then neither was better than the alternative, just cheaper, and both took a massive hit to RT performance that has grown over time.
Posted on Reply
#217
AusWolf
oxrufiioxoI forgot to add that the chiplet thing didn't work out....Honestly the only reason the 7800XT/7900XTX looked ok was because the competing cards were massively hiked in price especially the 4080 and even then neither was better than the alternative just cheaper and both took a massive hit to RT performance that has grown over time.
Cheaper at the same performance level means better in my books. And RT... Like I said, I don't care as long as I have to spend a grand to use it properly (which I won't).
Posted on Reply
#218
freeagent
AusWolfAlso, AI TOPS? That's how we measure GPU performance now? *Facepalm*
Are they really GPUs, or can they just play videogames as a byproduct :D
Posted on Reply
#219
Onasi
@freeagent
…welcome to 2007 when CUDA was released, I heard Crysis is gonna be mighty impressive and I seriously can’t wait for The Orange Box.
No, seriously, this isn’t new. The writing was on the wall for almost two decades.
Posted on Reply
#220
oxrufiioxo
AusWolfCheaper at the same performance level means better in my books. And RT... Like I said, I don't care as long as I have to spend a grand to use it properly (which I won't).
Regardless, my favorite RDNA3 card is the 7900GRE when factoring in the US launch price. Hopefully the 9070 is a much better version of that, because the 5070 and 5070 Ti are cheaper than expected....
Posted on Reply
#221
freeagent
Great, but can it run Indiana Jones at Supreme settings??

:D
Posted on Reply
#222
AusWolf
oxrufiioxoRegardless my favorite RDNA3 card is the 7900GRE when factoring in the US launch price hopefully the 9070 is a much better version of that because the 5070 and 5070ti are cheaper than expected....
Yet, we won't know their true performance because Nvidia is sidetalking with some FG 4x bullshit that makes the cards look better than they actually are.

Just when I thought it couldn't get worse than the AMD keynote. *Sigh*
Posted on Reply
#223
Zazigalka
Where are performance numbers for 9070xt ? What a waste of time.
Also, FSR4 being hardware locked to rx9000 means no more excuses, it really has to step up in image quality.
Posted on Reply
#224
Visible Noise
Now we know why AMD bailed on presenting RDNA 4 this afternoon. It’s bottom market trash compared to what Nvidia is showing now.

This Nvidia announcement isn’t a stomping, it’s a shredding. A year from now AMD will have 5% market share from their loyalists and that’s it.

Radeon is dead. Long live Radeon.
Posted on Reply
#225
AusWolf
Visible NoiseNow we know why AMD bailed on presenting RDNA 4 this afternoon. It’s bottom market trash compared to what Nvidia is showing now.

This Nvidia announcement isn’t a stomping, it’s a shredding. A year from now AMD will have 5% market share from their loyalists and that’s it.

Radeon is dead. Long live Radeon.
Wait for it. Frame generation enabled fake data doesn't mean anything.
Posted on Reply