Wednesday, February 26th 2025

Complete Specifications of AMD Radeon RX 9070 and RX 9070 XT Leaked

VideoCardz has obtained AMD Radeon RX 9070 series specifications, which appear to be the official final configurations of the upcoming RDNA 4 GPUs. As previously expected, the lineup consists of two models based on the Navi 48 GPU, which integrates 53.9 billion transistors on a 357 mm² die built on a 4 nm TSMC process. Both the RX 9070 XT and RX 9070 use identical memory configurations: 16 GB of GDDR6 memory running at 20 Gbps across a 256-bit bus, delivering 640 GB/s of bandwidth. Each card implements 64 MB of 3rd Generation Infinity Cache and supports a PCIe 5.0 x16 interface. The RX 9070 XT features 64 RDNA 4 Compute Units, equating to 4096 Stream Processors, 64 Ray Accelerators, and 128 AI Accelerators. It operates at a 2400 MHz game clock and a 2970 MHz boost clock, providing 48.7 TFLOPS of single-precision FP32 compute performance.
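Both headline figures can be sanity-checked from the listed specs. A minimal sketch (variable names are ours; the FP32 figure assumes RDNA's dual-issue FP32, i.e. 2 FLOPs per FMA times 2 instructions per clock):

```python
# Sanity check of the leaked RX 9070 XT figures.

# Memory bandwidth: per-pin data rate (Gbps) x bus width (bits) / 8 bits per byte.
data_rate_gbps = 20
bus_width_bits = 256
bandwidth_gbs = data_rate_gbps * bus_width_bits / 8
print(f"Bandwidth: {bandwidth_gbs:.0f} GB/s")  # 640 GB/s, matching the leak

# FP32 throughput: SPs x 2 FLOPs per FMA x 2 (dual-issue) x boost clock (GHz) / 1000.
stream_processors = 4096
boost_clock_ghz = 2.97
tflops = stream_processors * 2 * 2 * boost_clock_ghz / 1000
print(f"FP32: {tflops:.1f} TFLOPS")  # ~48.7 TFLOPS, matching the leak
```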

Power requirements include a 304 W TBP and a recommended 750 W power supply. The standard RX 9070 trims the configuration to 56 Compute Units (3584 Stream Processors), 56 Ray Accelerators, and 112 AI Accelerators. Clock speeds decrease to a 2070 MHz game clock and a 2540 MHz boost clock, with correspondingly lower power requirements: a 220 W TBP and a recommended 650 W power supply. Since both SKUs use the same Navi 48 die, the separation between them likely comes down to binning, with better-quality dies going to the XT and lower bins ending up in the non-XT model. Both models support HDMI 2.1b and DisplayPort 2.1a (UHBR13.5) outputs. AMD has confirmed that the cards will launch exclusively through board partners, with no reference designs planned, and that the official unveiling will take place in March. Earlier rumors suggested a $699 price tag for the Radeon RX 9070 XT SKU, which would put its expected price/performance near NVIDIA's GeForce RTX 5070 Ti. AMD notes that 85% of gamers buy cards below $700, a segment the RDNA 4 series will focus on.
Source: VideoCardz

54 Comments on Complete Specifications of AMD Radeon RX 9070 and RX 9070 XT Leaked

#26
TumbleGeorge
sethmatrix7: So the 9070 XT could be roughly 7900 XTX rasterization
No!
Posted on Reply
#27
sethmatrix7
TumbleGeorge: No!
We’ll see. The leaks from VideoCardz would’ve put it very close.
Posted on Reply
#28
Visible Noise
I’ve been saying it all along (check my posts).

- This is Vega 2.0

AMD is incapable of learning from their mistakes.
Posted on Reply
#29
LastDudeALive
BMfan80: Can you show me where you saw the date?
Every news article I have seen that shows the AMD slideshow shows the slide that says Q1 2025; no official date.
They never officially had a Jan launch date, but that's clearly what the plan was and it got pushed back at the last minute. B&H was going to open pre-orders on Jan 23 (www.tomshardware.com/pc-components/gpus/radeon-rx-9070-gpu-preorders-are-seemingly-scheduled-for-january-23-asus-rtx-9070-and-rx-9070-xt-show-up-at-us-retailer-but-pricing-remains-unknown), and cards arrived in store warehouses in Jan as well. But AMD pushed it back a month, so the GPUs have just been sitting in warehouses burning money for retailers.

That's why we're getting all these leaks and early benchmark runs. Everyone (AIBs, retailers) involved with the GPUs has had far more hands-on time with them than originally planned. There were even AIB models at CES that attendees could handle. Everyone was ready to launch them until AMD pumped the brakes.
Posted on Reply
#30
JohH
It's intriguing. It seems they delayed the launch in order to raise prices. I guess AMD knew Nvidia wasn't planning to ship a great volume of gamer Blackwell in Q1 2025.
Posted on Reply
#31
Visible Noise
7600 XT X2.

Even I’m disappointed, and I wasn’t on the hype train.
Posted on Reply
#32
gridracedriver
Only 64 ROPs, only 64 CUs, only 64 RT cores; it's incredible how much of an optimization RDNA 4 is over the last RDNA generation. Great job by AMD: even the 7800 XT had 96 ROPs and still gets steamrolled. RDNA 4 matches top-of-the-range RDNA 3 in most cases (on average it will be at most 2-3% below, not much) and clearly surpasses it in ray tracing, where the throughput of the RT cores has evidently been doubled, as Mark Cerny had already anticipated in Sony's PS5 Pro presentation.
In my opinion, AMD has fixed dual-issue, with efficiency (IPC) almost 40% higher per CU.
64 vs 96 CUs, 64 vs 192 ROPs, 64 vs 96 RT cores, and only +500 MHz clock... incredible.
On prices, hopefully the 9070 lands at the 7900 GRE's and the 9070 XT at the 7900 XT's current street prices.
P.S. The 9070 at 220 W has sensational efficiency, +35% vs the 7900 XT at 300 W... absurd.
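Back-of-the-envelope on that per-CU figure, taking "matches the top RDNA 3 part" as the premise (9070 XT clocks from the leak; the 7900 XTX's 2.3 GHz game / 2.5 GHz boost clocks are AMD's published specs):

```python
# Implied per-CU, per-clock gain if a 64-CU 9070 XT matches a 96-CU 7900 XTX overall.
def implied_per_cu_gain(cu_old, clk_old, cu_new, clk_new):
    """Throughput ratio per CU per clock needed for equal total performance."""
    return (cu_old * clk_old) / (cu_new * clk_new)

print(f"{implied_per_cu_gain(96, 2.5, 64, 2.97) - 1:+.0%}")  # at boost clocks: ~+26%
print(f"{implied_per_cu_gain(96, 2.3, 64, 2.40) - 1:+.0%}")  # at game clocks: ~+44%
```

So the ~40% estimate only falls out if you compare at game clocks; at boost clocks the implied per-CU gain is closer to +26%.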
Posted on Reply
#33
Vayra86
BMfan80: So if I understand correctly, looking at the charts of a 5070 Ti review:
the 5070 Ti idles at 21 W, a 7900 GRE at 15 W.
On a multi-monitor setup, the 5070 Ti draws 23 W and the GRE 31 W.
Somehow that is considered high power consumption.

The video playback seems okay at 60 W, if only because playing a 720p video on my 4090 uses between 46 and 60 W.
Yes, it's a non-issue at this point; the issue was fixed about 9 months ago, I believe. There are minute differences, but they've always been there, depending on chip size combined with display setup. This is really all about power-state management.
gridracedriver: Only 64 ROPs, only 64 CUs, only 64 RT cores; it's incredible how much of an optimization RDNA 4 is over the last RDNA generation. Great job by AMD: even the 7800 XT had 96 ROPs and still gets steamrolled. RDNA 4 matches top-of-the-range RDNA 3 in most cases (on average it will be at most 2-3% below, not much) and clearly surpasses it in ray tracing, where the throughput of the RT cores has evidently been doubled, as Mark Cerny had already anticipated in Sony's PS5 Pro presentation.
In my opinion, AMD has fixed dual-issue, with efficiency (IPC) almost 40% higher per CU.
64 vs 96 CUs, 64 vs 192 ROPs, 64 vs 96 RT cores, and only +500 MHz clock... incredible.
On prices, hopefully the 9070 lands at the 7900 GRE's and the 9070 XT at the 7900 XT's current street prices.
P.S. The 9070 at 220 W has sensational efficiency, +35% vs the 7900 XT at 300 W... absurd.
Isn't it strange, though, for such a technically great-looking architecture, that AMD deploys it so weirdly? You're not wrong...
Posted on Reply
#34
Yashyyyk
Dual-fan model? AMD canceled the reference one :/
Posted on Reply
#35
BMfan80
LastDudeALive: They never officially had a Jan launch date, but that's clearly what the plan was and it got pushed back at the last minute. B&H was going to open pre-orders on Jan 23 (www.tomshardware.com/pc-components/gpus/radeon-rx-9070-gpu-preorders-are-seemingly-scheduled-for-january-23-asus-rtx-9070-and-rx-9070-xt-show-up-at-us-retailer-but-pricing-remains-unknown), and cards arrived in store warehouses in Jan as well. But AMD pushed it back a month, so the GPUs have just been sitting in warehouses burning money for retailers.

That's why we're getting all these leaks and early benchmark runs. Everyone (AIBs, retailers) involved with the GPUs has had far more hands-on time with them than originally planned. There were even AIB models at CES that attendees could handle. Everyone was ready to launch them until AMD pumped the brakes.
Every GPU launch has leaks before release; that is nothing new.

Tom's article ended with: "If AMD does plan to launch the RX 9070 series this month, or at least open pre-orders, expect a briefing on specifications and pricing in a few days."
Even Tom's didn't jump to a conclusion on when it will launch.
Vayra86: Yes, it's a non-issue at this point; the issue was fixed about 9 months ago, I believe. There are minute differences, but they've always been there, depending on chip size combined with display setup. This is really all about power-state management.
I know; I'm just tired of people complaining about the extra power usage every time AMD's name comes up, when checking the graphs shows it's a few watts either side.
Posted on Reply
#36
Scattergrunt
LastDudeALive: DOA for sure. The -$50 strategy just doesn't work. And it doesn't matter if they made huge improvements in their RT cores; the memory bandwidth will severely handicap it in most games and at 4K compared to the 5070s. The 5070 has more memory bandwidth than these cards.
DOA? Maybe not, but I do agree the '-$50 now, cut it later' strategy sucks. Most of AMD's bulk sales should happen ON LAUNCH, but instead AMD seems content to release a GPU that's barely cheaper than NVIDIA's, without the features people want (even if they don't realize they don't really need them). And when they realize, for the how-many-eth time, that this strategy isn't working, they finally cut the price to where it should have been in the first place; but by then it's already too late, and they've lost most of their sales opportunities.

You'd think after AMD has done this TWICE, they would realize this strategy doesn't work. But they still do it anyway...
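On the quoted bandwidth point, the raw numbers for reference (RX 9070 figures are from the leak; the RTX 50 figures are NVIDIA's published specs):

```python
# Memory bandwidth = per-pin rate (Gbps) x bus width (bits) / 8 bits per byte.
cards = {
    "RX 9070 / 9070 XT (20 Gbps GDDR6, 256-bit)": (20, 256),
    "RTX 5070 (28 Gbps GDDR7, 192-bit)":          (28, 192),
    "RTX 5070 Ti (28 Gbps GDDR7, 256-bit)":       (28, 256),
}
for name, (gbps, bits) in cards.items():
    print(f"{name}: {gbps * bits / 8:.0f} GB/s")  # 640 / 672 / 896 GB/s
```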
LastDudeALive: AMD cards have been priced a "tier" down many times before. The 6700 XT offered "4060-Ti performance for 4060 price" for a long time. 7800 XT had "4070 performance for 4060-Ti (16GB) price."
I wouldn't really use these examples, as they're very much after the fact; but I do agree. The 6700 XT was a standout GPU from RDNA 2, offering great value, but only after it dropped in price; compared to the 3070, anyway.
LastDudeALive: The fundamental issue is the general public doesn't see AMD cards as simply worse than Nvidia (and thus needing to be discounted), they actually see them as lacking features. It's like if you tried to sell a car without airbags. Even if you sold it at a 50% discount compared to other cars with airbags, your potential market is still tiny. A few people will see it as a "good deal" and buy it, but most people will say "I will never buy a car without airbags, no matter how cheap it is."
If you were to ask me, their fundamental problem is that AMD failed to market to gamers back when gaming was hitting its peak (the early/mid 2010s), and now they're paying the price for it. Every gamer, every parent of that gamer, etc. knows what NVIDIA is, but hardly anyone outside the tech sphere knows what AMD is. And while that doesn't seem like it hurts, it does, especially for gaming GPUs, something sold to the general public. NVIDIA has made a name for itself by being the GAMER brand (while simultaneously straying further from gamers every single year), whereas AMD is treated like an awkward middle child who people acknowledge exists but don't associate with gaming, or with the general public at all.

I think AMD's next step should be investing a lot more into marketing, as it's pretty clear AMD isn't cutting it in that department. Make people associate AMD GPUs with GAMING, for god's sake, instead of 'oh, my grandma bought me a cheap prebuilt with an RX 6600... yay...'

Of course, we're just talking about their gaming GPUs here. Everywhere else, AMD is fine (IMO).
LastDudeALive: And remember, most people don't buy discrete GPUs. They buy prebuilts and laptops, which are far more sensitive to marketing and public perception. That's why lots of prebuilts are still sold with i7s and i9s, even though AMD is the clear choice for most tech enthusiasts. The marketing power of "Core i9" is strong.
Marketing is a big part of prebuilts (which is why you still see many prebuilts rocking the new Intel chips, despite the fact that many people do not like them), so I definitely agree with that part. But in that regard, AMD's Ryzen chips also still sell well in prebuilts. As for laptops... yeah, Intel dominates there. There's an elaborate story about the ins and outs of why AMD is seemingly behind in laptops, but it's a lot to get into; probably worth a whole separate discussion, honestly.
Posted on Reply
#37
JohH
If accurate, it seems AMD is within spitting distance of Nvidia in both performance per area and performance per watt, and both are on 4 nm-class manufacturing processes.
I really didn't expect that.
Posted on Reply
#38
Scircura
Too bad these new Radeon parts are UHBR13.5 instead of UHBR20 like the RTX 50 series.
Posted on Reply
#39
GodisanAtheist
Looks like everyone has staked their claim on this launch already. For some it's going to suck, for others it's going to reshape the market.

All that is left is an actual public announcement from AMD and an actual launch.

This horse has been beaten into atomized dust at this point :D
Posted on Reply
#40
Frick
Fishfaced Nincompoop
TumbleGeorge: $549 max or nothing.
Those times will never come back.
Posted on Reply
#41
LabRat 891
The RX 9070 XT features 64 RDNA 4 Compute Units
The RX 9070 features 56 RDNA 4 Compute Units
Welcome back, Vega. :p
Posted on Reply
#42
Scattergrunt
GodisanAtheist: Looks like everyone has staked their claim on this launch already. For some it's going to suck, for others it's going to reshape the market.
I'm in the middle... I'm excited, but I know not to get my hopes up. AMD will likely shoot itself in the foot again, but at least it's more new GPUs on the market for me.
GodisanAtheist: All that is left is an actual public announcement from AMD and an actual launch.
This horse has been beaten into atomized dust at this point :D
Yeah, we're just speculating at this point. Until then, I'll leave my opinion on AMD at what I said here...
Posted on Reply
#43
Jtuck9
GodisanAtheist: This horse has been beaten into atomized dust at this point :D
Posted on Reply
#44
GodisanAtheist
Jtuck9
- Gonna give yourself a bad case of the horse lung...
Posted on Reply
#45
TheinsanegamerN
gridracedriver: Only 64 ROPs, only 64 CUs, only 64 RT cores; it's incredible how much of an optimization RDNA 4 is over the last RDNA generation. Great job by AMD: even the 7800 XT had 96 ROPs and still gets steamrolled. RDNA 4 matches top-of-the-range RDNA 3 in most cases (on average it will be at most 2-3% below, not much) and clearly surpasses it in ray tracing, where the throughput of the RT cores has evidently been doubled, as Mark Cerny had already anticipated in Sony's PS5 Pro presentation.
In my opinion, AMD has fixed dual-issue, with efficiency (IPC) almost 40% higher per CU.
64 vs 96 CUs, 64 vs 192 ROPs, 64 vs 96 RT cores, and only +500 MHz clock... incredible.
On prices, hopefully the 9070 lands at the 7900 GRE's and the 9070 XT at the 7900 XT's current street prices.
P.S. The 9070 at 220 W has sensational efficiency, +35% vs the 7900 XT at 300 W... absurd.
If it's so good, why is AMD playing coy with official performance numbers? Why did it not show the cards off in January? Why drop out of the high end when you have such a slam dunk on your hands?

Something doesn't add up here. A 40% increase per CU would be absolutely jaw-dropping; I don't believe ANY generation, even going back to the R300 days, has achieved that.

If it were me, and I knew I had such a great arch on my hands, I'd be parading the 9070 XT out there, doing my best to derail the Nvidia hype train. I'd be talking about the 9060s and the huge changes coming to the lower end and midrange. And I'd be planning a 9080 XT to knock off the 5090's crown. It's not often you get such a slam dunk on a larger competitor.
Posted on Reply
#46
LastDudeALive
Scattergrunt: If you were to ask me, their fundamental problem is that AMD failed to market to gamers back when gaming was hitting its peak (the early/mid 2010s), and now they're paying the price for it. Every gamer, every parent of that gamer, etc. knows what NVIDIA is, but hardly anyone outside the tech sphere knows what AMD is. And while that doesn't seem like it hurts, it does, especially for gaming GPUs, something sold to the general public. NVIDIA has made a name for itself by being the GAMER brand (while simultaneously straying further from gamers every single year), whereas AMD is treated like an awkward middle child who people acknowledge exists but don't associate with gaming, or with the general public at all.
A fair point, and in the interest of honesty, I wasn't involved in the PC gaming scene at that time, so I'll defer to your judgement on that.
Scattergrunt: Marketing is a big part of prebuilts (which is why you still see many prebuilts rocking the new Intel chips, despite the fact that many people do not like them), so I definitely agree with that part. But in that regard, AMD's Ryzen chips also still sell well in prebuilts. As for laptops... yeah, Intel dominates there. There's an elaborate story about the ins and outs of why AMD is seemingly behind in laptops, but it's a lot to get into; probably worth a whole separate discussion, honestly.
However, I think the success of AMD in the CPU market vs their lackluster performance in the GPU market shows that a comeback was certainly possible, even if they were at an obvious disadvantage from the early 2010s. AMD came back in the CPU market with actual innovation. Zen and the chiplet architecture allowed them to ramp up the core counts with Ryzen and Threadripper, providing an actual, tangible advantage over Intel. That alone was massive for their success, and then the 3D cache (another actual innovation) put them on top in gaming. Intel responded with e-cores and Core Ultra, which are more of a mixed bag overall, but at least both sides are furiously innovating in the CPU market.

But in the GPU market, they haven't innovated. Nvidia has been first to literally every single new technology/feature, with AMD lagging 2-5 years behind: AI cores, RT cores, upscaling, Reflex, frame generation, etc. AMD has done a good job improving the performance of RDNA, I'll give them that. But they haven't come up with a single NEW technology or feature for at least a decade. There's just no excuse for that. It's like nobody at Radeon has the ability to think, "what would be cool, but doesn't exist yet?"
Posted on Reply
#47
Cheeseball
Not a Potato
sethmatrix7: So the 9070 XT could be roughly 7900 XTX rasterization with better RT, and possibly better features (FSR4), at roughly 35% lower MSRP.

If that proves true I can’t see how it’s not a win for everyone involved.
Probably around 7900 GRE/XT rasterization, since it's only 256 TMUs and 4096 cores spec-wise. But I would like to eat my words if it turns out better than that.
Posted on Reply
#48
3x0
Cheeseball: Probably around 7900 GRE/XT rasterization, since it's only 256 TMUs and 4096 cores spec-wise. But I would like to eat my words if it turns out better than that.
The leaked AMD slides have it 30+% above the GRE in raster
Posted on Reply
#49
Cheeseball
Not a Potato
3x0: The leaked AMD slides have it 30+% above the GRE in raster
Is it this one?

[leaked AMD performance slide]

Those are very ambitious numbers, considering the 7900 XT and XTX are around ~18% and ~37% better than the GRE according to TPU's Relative Performance chart, with the GRE as the baseline.

If the numbers are true, great. If not, then well, shit.
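For scale, here's where a +30% uplift over the GRE would land against those chart numbers (simple ratios; the 1.18 and 1.37 multipliers are the TPU figures quoted above):

```python
# Relative performance with the 7900 GRE as the 1.00 baseline.
xt, xtx = 1.18, 1.37   # 7900 XT / 7900 XTX vs. GRE (TPU chart, as quoted above)
leak = 1.30            # leaked slide's claim for the 9070 XT
print(f"vs 7900 XT:  {leak / xt - 1:+.1%}")   # ~+10%: comfortably above the XT
print(f"vs 7900 XTX: {leak / xtx - 1:+.1%}")  # ~-5%: just shy of the XTX
```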
Posted on Reply
#50
3x0
Cheeseball: Is it this one?
Yup, that's the one. I mean, Maxwell had fewer cores than Kepler and smoked it in performance.
Posted on Reply