Friday, January 10th 2025

AMD Radeon RX 9070 XT Pricing Leak: More Affordable Than RTX 5070?

As we reported yesterday, the Radeon RX 9070 XT appears to be all set to disrupt the mid-range gaming GPU segment, offering performance that looks truly enticing, at least if the leaked synthetic benchmarks are anything to go by. The highest-end RDNA 4 GPU is expected to handily outperform the RTX 4080 Super despite costing half as much, while a comparison with its primary competitor, the RTX 5070, is yet to be made.

Now, a fresh leak has seemingly hinted at how heavy the RDNA 4 GPU is going to be on its buyers' pockets. Also sourced from Chiphell, the leak suggests the Radeon RX 9070 XT will command a price tag between $479 for AMD's reference card and roughly $549 for an AIB unit, varying based on which exact product one opts for. At that price, the Radeon RX 9070 XT easily undercuts the RTX 5070, which will start from $549, while offering 16 GB of VRAM, albeit of the older GDDR6 spec. There is hardly any doubt that the RTX GPU will come out ahead in ray tracing performance, as we already witnessed yesterday, although traditional rasterization performance will be more interesting to compare.
In a recent interview, AMD Radeon's Frank Azor stated that the RDNA 4 cards will be priced as "not a $300 card, but also not a $1,000 card", which frankly does not reveal much at all. He also stated that the RDNA 4 cards will aim for a mix of performance and price, similar to the RX 7800 XT and the RX 7900 GRE. All that remains now is to wait and see whether AMD's claims hold water.
Source: HXL (@9550pro)

106 Comments on AMD Radeon RX 9070 XT Pricing Leak: More Affordable Than RTX 5070?

#76
Chrispy_
Macro DeviceYear 5 of NV releasing absolutely nothing spectacular. AMD: "Herp derp..."
And that's why AMD's GPU marketshare is pitiful ;)

If they're serious about staying in the GPU race, they need to stop farting around!

AMD must make something so good that even Nvidia fanboys will be hard-pressed to ignore the AMD alternative. RDNA4 isn't going to do it on technical prowess, so they need to absolutely murder Nvidia in the price/performance metric even if it's a loss-leader for them this generation. When I say murder, I mean like 40%+ better performance/$ than Nvidia.

If it's only 25% or something like that, the Nvidia buyers will just cite what they always cite; "but DLSS, but drivers, but Raytracing" etc - regardless of whether that's still even true in 2025.
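As a rough sketch of how such a performance-per-dollar gap would be computed, here is a minimal Python example with entirely made-up FPS and price figures (assumptions for illustration only, not benchmark results):

```python
# Hypothetical FPS and price figures, purely for illustration (not benchmark results).
def perf_per_dollar(avg_fps: float, price_usd: float) -> float:
    """Average frames per second delivered per dollar spent."""
    return avg_fps / price_usd

amd_ppd = perf_per_dollar(avg_fps=95.0, price_usd=479.0)   # assumed Radeon figures
nv_ppd = perf_per_dollar(avg_fps=100.0, price_usd=549.0)   # assumed GeForce figures

advantage_pct = (amd_ppd / nv_ppd - 1) * 100
print(f"AMD perf/$ advantage: {advantage_pct:.0f}%")
# Prints roughly 9% for these made-up numbers, far short of the 40%+ gap argued for above.
```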
Posted on Reply
#77
Macro Device
Chrispy_If they're serious about staying in the GPU race
We both know they are not. We'd have seen an X3D of the GPU market otherwise. Imagine an 8 GB GPU with 3D V-Cache so advanced it doesn't fall off in VRAM hogs! As snappy as a heavily overclocked 4060 Ti. For $300 or lower. Eh.
Posted on Reply
#78
Chrispy_
Macro DeviceWe both know they are not. We'd have seen an X3D of the GPU market otherwise. Imagine an 8 GB GPU with 3D V-Cache so advanced it doesn't fall off in VRAM hogs! As snappy as a heavily overclocked 4060 Ti. For $300 or lower. Eh.
Pretty sure more cache doesn't really work for GPUs the same way as it does for CPUs, both in terms of how much performance it adds, and also in how difficult it is to manage power and clocks with a vCache layer in the way.

Do you remember the discussions at the start of the 7000-series launch where AMD released numbers for their internal testing with even more cache? The cache hit rate for the 7900XTX isn't great, and it's just one reason why lesser tiers just don't have as much.
Posted on Reply
#79
Macro Device
Chrispy_more cache doesn't really work for GPUs the same way as it does for CPUs
I didn't say, "more," I said, "advanced."

Welp, literally anything that makes nVidia GPUs look terrible in comparison is welcome by me. The market is however nowhere near healthy.
Posted on Reply
#80
Zach_01
It’s true… AMD spent too much time, effort (whatever that was), and R&D changing course and architectural direction.

Someone in there stupidly believed that two different architectures would work then and in the future…
Little did they know… And I’m very surprised by that, because for their CPU line they did the exact opposite, just like nVidia did with their own GPU line.
Posted on Reply
#81
Chrispy_
Macro DeviceI didn't say, "more," I said, "advanced."

Welp, literally anything that makes nVidia GPUs look terrible in comparison is welcome by me. The market is however nowhere near healthy.
Yeah, the market is not healthy right now - that's for sure.

It's crazy that Nvidia has such a massive gaming market dominance when AMD have had a console monopoly and been a strong (if not the primary) target for game developers since 2013!

Nvidia are killing it on the marketing and execution front, and I'm saying that with my "consumer gaming hat" on, where I have to consider the cost and value of GPUs for gaming only, not my "system integrator hat" where I get free hardware and need to worry about CUDA API support and performance in professional productivity for profit.
Posted on Reply
#82
3valatzy
Chrispy_Yeah, the market is not healthy right now - that's for sure.
It's crazy that Nvidia has such a massive gaming market dominance when AMD have had a console monopoly and been a strong (if not the primary) target for game developers since 2013!
Nvidia are killing it on the marketing and execution front
It's because Nvidia has the halo. Had AMD launched a card that is within 15% of RTX 4090 for $650, things would have been much different.
Remember the RX 6800 XT? It performed much like the RX 6900 XT, but the 6900 XT was $1,000 while the 6800 XT was $650!

It's all about the top performance.



Posted on Reply
#83
Redwoodz
Chrispy_And that's why AMD's GPU marketshare is pitiful ;)

If they're serious about staying in the GPU race, they need to stop farting around!

AMD must make something so good that even Nvidia fanboys will be hard-pressed to ignore the AMD alternative. RDNA4 isn't going to do it on technical prowess, so they need to absolutely murder Nvidia in the price/performance metric even if it's a loss-leader for them this generation. When I say murder, I mean like 40%+ better performance/$ than Nvidia.

If it's only 25% or something like that, the Nvidia buyers will just cite what they always cite; "but DLSS, but drivers, but Raytracing" etc - regardless of whether that's still even true in 2025.
But you see, the deficit AMD faces is not in raster... it's in software. They have no means of breaking the CUDA wall. Until people start voting with their wallets they have no chance.
Posted on Reply
#84
kapone32
3valatzyIt's because Nvidia has the halo. Had AMD launched a card that is within 15% of RTX 4090 for $650, things would have been much different.
Remember the RX 6800 XT? It performed much like the RX 6900 XT, but the 6900 XT was $1,000 while the 6800 XT was $650!

It's all about the top performance.



It is always the same people and the same graphs. Look at your relative performance chart and realize why you are wrong. Let's look at that through an objective lens. Halo parts are for people with more money than sense, and if you think more people buy Halo cards than that, you would not understand why the 3060 laptop was the best-selling gaming device during Covid. Neither do you appreciate that 21% does not translate into a difference in performance that is measurable without an FPS counter. That is to say, if you are getting 300 FPS with the 4090, then 260 FPS with the 7900XTX would not feel noticeably slower. At 4K we are just starting to get high-refresh-rate monitors, but the absolute fastest are still 1080p TN panels. Do esports players use RT? I know all the live streams use Nvidia; they have the best penetration there.
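Putting that 300 vs. 260 FPS example into frame-time terms, here is a minimal sketch using the comment's hypothetical numbers (not measured data):

```python
# Convert the hypothetical frame rates above into per-frame render times.
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent on one frame at a given average frame rate."""
    return 1000.0 / fps

fast = frame_time_ms(300.0)  # about 3.33 ms per frame (hypothetical 4090 figure)
slow = frame_time_ms(260.0)  # about 3.85 ms per frame (hypothetical 7900 XTX figure)

print(f"{fast:.2f} ms vs {slow:.2f} ms, a difference of {slow - fast:.2f} ms per frame")
# Roughly half a millisecond per frame, the kind of gap the comment argues is
# hard to notice without an FPS counter.
```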

www.newegg.ca/asus-geforce-rtx-4090-tuf-rtx4090-24g-og-gaming/p/N82E16814126671?Item=N82E16814126671

www.newegg.ca/sapphire-pulse-11322-02-20g-amd-radeon-rx-7900-xtx-24gb-gddr6/p/N82E16814202429?Item=N82E16814202429

If you add price to the equation, the whole narrative changes. I could buy two 7900XTXs for the price of one 4090. Is RT at the 3090 Ti level bad? It is not like the TUF is a high-end 4090 either; those go as high as $4,000+.

If you are a gamer, it makes no sense to pay double just so CP2077 can look prettier when you end up using the katana anyway, since it is the easiest way to dispatch enemies. The best thing about CP2077 is not the visuals but the story, and the secret is that even though Nvidia features are available, playing the game natively is not a bad experience on either card.

What AMD contends with is the bombastic negativity from HUB, Paul's Hardware asking if AMD GPUs have DP 2.1 or are still on 1.4, Robbeytech getting triggered when asked why he doesn't use AMD GPUs, and Kitguru saying that their Nvidia partners are very good to them. At least they used an AMD GPU in their latest build video. Then there is what I talk about that people like you choose to ignore: the China effect. Was the Chinese government not buying every single 4090 Nvidia made? Did Nvidia not make the 4090D when the government told them to stop? Are those sales separated from the ones sold at Canada Computers in the numbers? It is small wonder that AMD did not let the ravens pick at their bones as the narrative suggests. Meanwhile Nvidia are trying their hardest to get into the handheld revolution. I expect the usual "the Switch has Nvidia" argument.

So be happy being an Nvidia fan, to the point where you make negative comments at every opportunity in AMD-focused threads.
Posted on Reply
#85
Chrispy_
RedwoodzBut you see, the deficit AMD faces is not in raster... it's in software. They have no means of breaking the CUDA wall. Until people start voting with their wallets they have no chance.
It's a little bit of both, I think. Perhaps a chicken-or-egg situation:

ROCm isn't getting much developer attention because AMD GPUs aren't a good choice for AI TOPS and cannot compete with NV's Tensor core performance for LLMs.

IF UDNA solves the hardware performance shortfall, then ROCm needs to be a viable alternative to CUDA. If that ever happens, I'm sure there will be CUDA emulators and translators to bridge the transition away from a full-on CUDA monopoly.
Posted on Reply
#86
Kaleid
3valatzy15% of RTX 4090 for $650
Oh crap, somehow AMD has to sell at a much lower price while Nvidia shouldn't have to cut prices. It's OK for them to overcharge the customers. And be cheap with VRAM.
Posted on Reply
#87
mechtech
TheinsanegamerNEvery time AMD has made an actual competitive card, they have sold out. Evergreen pushed AMD to 49% marketshare. The 290/X were going for almost double MSRP and were unobtainable for almost a year after launch; even with the flood of used cards there were still new sales. The 6800/XT/6900 XT were completely unobtainable for over a year after launch; what cards were made sold immediately, often mere seconds after coming in stock.

Literally only ONCE in the last decade have they had a superior product, the 290x. Even then, Nvidia rushed out the 780/ti to counter it. The 980ti was uncontested. Vega sucked. Fury/X were failed experiments. Polaris stopped at 1060 level. rDNA was missing features like mesh shader support or RT and was limited to mid range performance.
ya kind of my point, been 10 years now, not sure how many nvidia owners would go back. In an ideal market it would be nice if all three had about the same market share/qty in sales.
SRSConsidering that AMD has shown that it can make a competitive GPU, let's look back at how things looked when Intel released Skylake (or even Broadwell C) and AMD released Piledriver.

The difference between AMD's shocking success in CPUs and its "Polaris forever" sandbagging in GPUs isn't due to Nvidia's perfection as a monopolist of enthusiast-grade GPU tech. It's due to duopoly, which, despite the commonness of the assumption in consumer tech circles, is not the same thing as adequate competition.

The transformation of AMD from a nearly dead company with a very shabby track record (note that not even Phenom I and Phenom II were particularly exciting), up against a corporation that had been killing it since Core Duo, should put to rest all of these claims about Nvidia's unbreakable dominance, particularly given how much stronger AMD's financials are now, and given that Nvidia is fabless. Intel could have continued to use its fabs as a source of dominance against AMD's CPU hopes if it hadn't messed its nodes up. Nvidia has no such advantage.

It simply is more in AMD's interest to let Nvidia set prices for the stack by ceding the enthusiast-tier (aka higher-end) GPU space. What is good business for AMD is not, in this case, good business for consumers. It's an absurd situation that one can get a reasonably affordable CPU (9800X3D) that is rather overkill but must pay through the nose for GPU performance. That's not a healthy product ecosystem. It's monopolization in action.
Yes. Even though the RX 480, etc. were competitive in the perf/$ range, the market share gap keeps getting larger. Ideally the market share/qty sold would be about equal for Intel/AMD/Nvidia; that would probably be best for the consumer.
Posted on Reply
#88
3valatzy
kapone32if you think more people buy Halo cards than that, you would not understand why the 3060 laptop was the best-selling gaming device during Covid
What does marketing mean to you?
Those graphs are the marketing!
Posted on Reply
#89
TechBuyingHavoc
OnasiOh how times have changed now that people are “excited” and “hyped” to pay what used to be near-flagship prices for mid as hell cards. Meh. Snark aside, the pricing undercuts nothing - it’s $550. The $480 reference will be unobtainable in most parts of the world and even the simplest AIB models will start from $550. So the entire value proposition will be based on just how much faster it will be in raster than the 5070. I say “in raster” since it sure as hell will not be faster in anything else.
I am hopeful that in a year the price drops to sub-$400 levels, but that is just that: hope. Get the price below $350 and we have something to be excited about.
Posted on Reply
#90
TheinsanegamerN
mechtechya kind of my point, been 10 years now, not sure how many nvidia owners would go back. In an ideal market it would be nice if all three had about the same market share/qty in sales.
I think they would. Look at the excitement over the B580. Sure, there are plenty of Nvidia hardliners, but there are also plenty who buy based on the performance they want. If AMD can match that at a lower price, why wouldn't they jump ship?

IF AMD can offer a card that actually competes with Nvidia, without the asterisks, it'll sell. The 9070, IF the 3DMark scores are actually accurate to in-game performance, could do that. We won't know until it launches.
mechtechYes. Even though the RX 480, etc. were competitive in the perf/$ range, the market share gap keeps getting larger. Ideally the market share/qty sold would be about equal for Intel/AMD/Nvidia; that would probably be best for the consumer.
Polaris was a huge misfire, largely because AMD severely underestimated the size of the market over $200, thinking it was still 2009. That was a severe blow to AMD's potential market as the GTX 1070 sat there eating up sales uncontested for 18 months, as did the 1080, and later the 1070ti. Vega was a wet fart that was too slow, expensive, and power hungry to make a difference.
Posted on Reply
#91
Random_User
Someone, please, explain to me how $549 for a 9070 XT with rumoured 7900 GRE/4070 Ti performance is anywhere more affordable than a 5070 for the same money?
I wrote $549 because MBA models are going to be unobtainium as usual, so only overpriced AIB custom-design cards will be available.
This is really complete idiocy. The 5070 perhaps has the performance of a 4080, maybe even the Super version, whereas the 9070, even the XT, does not, as was clearly stated.

Obviously, these are only rumours. But it is also obvious that AMD will do everything to keep the price as high as possible (especially while using the rumours to test the waters). $550, though, is not a good price unless the 9070 XT beats the 5070 non-Ti by at least 5-10%, or even 15%, performance- and feature-wise. But that won't happen.

And there are still the "new" unsold old-stock 7000-series cards, which theoretically should have gotten a price cut, as should the used market. The only reasons to buy a 9070 over a 7900 are potential availability, a fresh warranty, and potentially better encoding/decoding performance. We still have to wait for the actual reviews.
Chrispy_That was the impression I got from AMD when they said they were focusing on the midrange in 2008.

From the 4850 TPU review:

"AMD is determined to claim price/performance leadership in the $199 and $299 segments, that's where those cards are positioned. A R700 card called HD 4870 X2 will appear later this year and is supposed to fight for the performance crown."

They didn't make a monolithic flagship, instead opting to just jam two midrange GPUs together on the same board. The result of designing and optimising for the midrange market first and foremost paid off in spades, because the 4000-series knocked it out of the park!
Been there. I made the biggest mistake by investing in R700. I should have just bought a single 4870 and called it a day.

But the thing is, back then, ATi beat the sh*t out of the 8800GTX/9800GTX and was rivaling the GTX280. This time, AMD is barely challenging their own almost-three-year-old mid-range GPUs at their original launch prices. This is not even stagnation. This is pure gouging, just because they can get away with it.
Posted on Reply
#92
Kaleid
Random_UserSomeone, please, explain to me how $549 for a 9070 XT with rumoured 7900 GRE/4070 Ti performance is anywhere more affordable than a 5070 for the same money?
Well, I for one wouldn't want a 12GB card. Seems like normal Nvidia behavior.
Posted on Reply
#93
Redwoodz
Random_UserSomeone, please, explain to me how $549 for a 9070 XT with rumoured 7900 GRE/4070 Ti performance is anywhere more affordable than a 5070 for the same money?
I wrote $549 because MBA models are going to be unobtainium as usual, so only overpriced AIB custom-design cards will be available.
This is really complete idiocy. The 5070 perhaps has the performance of a 4080, maybe even the Super version, whereas the 9070, even the XT, does not, as was clearly stated.

Obviously, these are only rumours. But it is also obvious that AMD will do everything to keep the price as high as possible (especially while using the rumours to test the waters). $550, though, is not a good price unless the 9070 XT beats the 5070 non-Ti by at least 5-10%, or even 15%, performance- and feature-wise. But that won't happen.

And there are still the "new" unsold old-stock 7000-series cards, which theoretically should have gotten a price cut, as should the used market. The only reasons to buy a 9070 over a 7900 are potential availability, a fresh warranty, and potentially better encoding/decoding performance. We still have to wait for the actual reviews.

Been there. I made the biggest mistake by investing in R700. I should have just bought a single 4870 and called it a day.

But the thing is, back then, ATi beat the sh*t out of the 8800GTX/9800GTX and was rivaling the GTX280. This time, AMD is barely challenging their own almost-three-year-old mid-range GPUs at their original launch prices. This is not even stagnation. This is pure gouging, just because they can get away with it.
Nvidia is setting market prices... or have you not heard lately? Just go line up and buy whatever Jensen's going to let you buy. Gouging by AMD... I have really heard it all now.
Posted on Reply
#94
Chrispy_
Random_UserSomeone, please, explain to me how $549 for a 9070 XT with rumoured 7900 GRE/4070 Ti performance is anywhere more affordable than a 5070 for the same money? I wrote $549 because MBA models are going to be unobtainium as usual, so only overpriced AIB custom-design cards will be available.
What region are you in?

I've never had any issues finding AMD graphics cards at the MSRP, so if the rumours of $479 are true, I expect to see £450 9070XTs in the UK in the spring. The MBA cards will be fine but I'll be watching for reviews of any AIB designs because for the 7000-series, the Sapphire pulse (MSRP) and XFX speedster (MSRP) were quieter, better-cooled models.

The 7800XT's launch was certainly popular enough that specific SKUs regularly went out of stock in a few shops but I don't recall ever being unable to find at least one MSRP 7800XT in stock somewhere. The GRE was less available, as only Sapphire and XFX models appeared over here on e-tailer listings. Asrock/Asus/Gigabyte/Powercolor and MBA models were notably absent, yet availability of the Sapphire and XFX 7900GRE remained okay.
Posted on Reply
#95
Rightness_1
I guarantee that AMD is very hard at work right now on a new and wonderful way of messing this opportunity up. Half the new NV cards are not even 10% faster than the previous series cards... Outside of fake frames.
Posted on Reply
#96
Zach_01
I just hope they are not planning to bake in a last-minute MultiFG to match nVidia...
For real, no one saw that coming from the green team and everyone was caught by surprise.

Maybe they are just waiting to see "simple" raster performance from the 5000 series so they know how to place their prices.
But nVidia played the game really well. No raster figures until the actual release...
Posted on Reply
#97
Jtuck9
Zach_01I just hope they are not planning to bake in a last-minute MultiFG to match nVidia...
For real, no one saw that coming from the green team and everyone was caught by surprise.

Maybe they are just waiting to see "simple" raster performance from the 5000 series so they know how to place their prices.
But nVidia played the game really well. No raster figures until the actual release...
Frame generation is part of the package no? Good enough is probably good enough for me until they believe they can compete at the high end.
Posted on Reply
#98
LittleBro
Chrispy_Pretty sure more cache doesn't really work for GPUs the same way as it does for CPUs, both in terms of how much performance it adds, and also in how difficult it is to manage power and clocks with a vCache layer in the way.
It does. That's the reason why the RX 6000 series was able to catch up with the RTX 3000 series.

With the RX 7000 series, AMD nerfed (halved) the L3 cache and that was a bad move IMHO. The RX 7000s could have been better with more L3 cache. My RX 7800 XT has about 18% fewer compute units than the RX 6800 XT and half the L3 cache. IPC improvement (higher clocks, "dual-issue" stream processor) of RDNA3 was able to partially compensate for the lack of those units, but it still sucks when the 7800 XT is beaten by the 6800 XT in some games even today, while in others it loses by only a single-digit %. I'd even dare to say that the RX 7800 XT is not a real successor to the RX 6800 XT.
Posted on Reply
#99
AnotherReader
LittleBroIt does. That's the reason why the RX 6000 series was able to catch up with the RTX 3000 series.

With the RX 7000 series, AMD nerfed (halved) the L3 cache and that was a bad move IMHO. The RX 7000s could have been better with more L3 cache. My RX 7800 XT has about 18% fewer compute units than the RX 6800 XT and half the L3 cache. IPC improvement (higher clocks, "dual-issue" stream processor) of RDNA3 was able to partially compensate for the lack of those units, but it still sucks when the 7800 XT is beaten by the 6800 XT in some games even today, while in others it loses by only a single-digit %. I'd even dare to say that the RX 7800 XT is not a real successor to the RX 6800 XT.
It isn't the real successor to the 6800 XT; the name is a red herring. When you account for TDP, die size, and MSRP, it's an obvious successor to the 6700 XT. As for the L3 cache, it seems to be fine as performance relative to the 6800 XT doesn't decrease at 4K compared to lower resolutions.
Posted on Reply
#100
Chrispy_
LittleBroIt does. That's the reason why the RX 6000 series was able to catch up with the RTX 3000 series.

With the RX 7000 series, AMD nerfed (halved) the L3 cache and that was a bad move IMHO. The RX 7000s could have been better with more L3 cache. My RX 7800 XT has about 18% fewer compute units than the RX 6800 XT and half the L3 cache. IPC improvement (higher clocks, "dual-issue" stream processor) of RDNA3 was able to partially compensate for the lack of those units, but it still sucks when the 7800 XT is beaten by the 6800 XT in some games even today, while in others it loses by only a single-digit %. I'd even dare to say that the RX 7800 XT is not a real successor to the RX 6800 XT.
That's not a cache problem; that's because the 6800XT has 20% more compute units (72 vs. 60).
IPC improvements are basically zero between RDNA2 and RDNA3, as proved quite conclusively by the 7600 having near-identical performance to the 6650XT when clocked at the same speed.

What you're seeing is the 7800XT with half the cache of the 6800XT making up the compute unit deficit with clockspeed.
72 CU x 2.2 GHz boost clock = 158 'CU GHz'
60 CU x 2.6 GHz boost clock = 156 'CU GHz'

i.e., if both were identical in architecture and IPC, the 6800XT would be only 1-2% faster than the 7800XT, which is often the case in real games, regardless of the cache sizes being dramatically different. It's also worth noting that you cannot call higher clocks an IPC gain like you cited:
LittleBroIPC improvement (higher clocks, "dual-issue" stream processor) of RDNA3
IPC literally means instructions per clock, so it's independent of clockspeed.
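A minimal sketch of that 'CU GHz' back-of-the-envelope comparison, assuming the boost clocks quoted above (a throughput proxy only, not a performance model):

```python
# Back-of-the-envelope throughput proxy: compute units multiplied by boost clock.
# This ignores IPC, cache, and memory bandwidth differences; it is not a performance model.
def cu_ghz(compute_units: int, boost_clock_ghz: float) -> float:
    return compute_units * boost_clock_ghz

rx_6800_xt = cu_ghz(72, 2.2)  # about 158 'CU GHz'
rx_7800_xt = cu_ghz(60, 2.6)  # about 156 'CU GHz'

gap_pct = (rx_6800_xt / rx_7800_xt - 1) * 100
print(f"6800 XT ahead by ~{gap_pct:.1f}% on this proxy")  # ~1.5%, matching the 1-2% above
```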
AnotherReaderIt isn't the real successor to the 6800 XT; the name is a red herring. When you account for TDP, die size, and MSRP, it's an obvious successor to the 6700 XT.
To me the 7800XT is the obvious successor to the vanilla 6800, in that it's the same rough price, same bus width, same VRAM amount, and same core config - just clocked a solid 25% faster ;)
Posted on Reply