Friday, January 10th 2025

AMD Radeon RX 9070 XT Pricing Leak: More Affordable Than RTX 5070?
As we reported yesterday, the Radeon RX 9070 XT appears all set to disrupt the mid-range gaming GPU segment, offering performance that looks truly enticing, at least if the leaked synthetic benchmarks are anything to go by. The highest-end RDNA 4 GPU is expected to handily outperform the RTX 4080 Super despite costing half as much, though a comparison with its primary competitor, the RTX 5070, is yet to be made.
Now, a fresh leak has seemingly hinted at how heavy the RDNA 4 GPU will be on buyers' pockets. Also sourced from Chiphell, the Radeon RX 9070 XT is expected to command a price tag between $479 for AMD's reference card and roughly $549 for an AIB unit, depending on the exact product one opts for. At that price, the Radeon RX 9070 XT easily undercuts the RTX 5070, which will start from $549, while offering 16 GB of VRAM, albeit of the older GDDR6 spec. There is hardly any doubt that the RTX GPU will come out ahead in ray tracing performance, as we already witnessed yesterday, although traditional rasterization performance will be more interesting to compare.

In a recent interview, AMD Radeon's Frank Azor stated that the RDNA 4 cards will be priced as "not a $300 card, but also not a $1,000 card", which frankly does not reveal much at all. He did also state that the RDNA 4 cards will attempt a mix of performance and price, similar to the RX 7800 XT and the RX 7900 GRE. All that remains now is to wait and see whether AMD's claims hold water.
Source:
HXL (@9550pro)
106 Comments on AMD Radeon RX 9070 XT Pricing Leak: More Affordable Than RTX 5070?
If they're serious about staying in the GPU race, they need to stop farting around!
AMD must make something so good that even Nvidia fanboys will be hard-pressed to ignore the AMD alternative. RDNA4 isn't going to do it on technical prowess, so they need to absolutely murder Nvidia on the price/performance metric, even if it's a loss-leader for them this generation. When I say murder, I mean like 40%+ better performance/$ than Nvidia.
If it's only 25% or something like that, the Nvidia buyers will just cite what they always cite: "but DLSS, but drivers, but ray tracing", etc., regardless of whether that's still even true in 2025.
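To put a rough number on that argument, here is a back-of-the-envelope perf/$ sketch in Python, using the leaked $479 reference price and the RTX 5070's $549 from the article. The raster-parity performance index is purely an illustrative assumption, not a benchmark result.

```python
# Back-of-the-envelope perf-per-dollar comparison. Prices are the leaked
# figures from the article; the performance index is a placeholder
# assumption (raster parity), NOT a measured result.
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Performance points per dollar spent."""
    return relative_perf / price_usd

rtx_5070 = perf_per_dollar(100.0, 549.0)   # baseline: 100% at $549
rx_9070xt = perf_per_dollar(100.0, 479.0)  # assumed parity at $479

advantage = (rx_9070xt / rtx_5070 - 1) * 100
print(f"RX 9070 XT perf/$ advantage at parity: {advantage:.0f}%")  # ~15%
```

Even at full raster parity, the $479 price only buys a ~15% value edge over a $549 RTX 5070, well short of the 40%+ the poster asks for, and the gap closes entirely for $549 AIB cards.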
Do you remember the discussions around the 7000-series launch, where AMD released numbers from internal testing of configurations with even more cache? The cache hit rate for the 7900 XTX isn't great, and that's just one reason why the lesser tiers don't have as much.
Welp, literally anything that makes nVidia GPUs look terrible in comparison is welcome in my book. The market is, however, nowhere near healthy.
Someone in there stupidly believed that two different architectures would work then and into the future…
Little did they know… And I'm very surprised by that, because for their CPU line they did the exact opposite, just like nVidia did with their own GPU line.
It's crazy that Nvidia has such massive gaming market dominance when AMD has had a console monopoly and been a strong (if not the primary) target for game developers since 2013!
Nvidia are killing it on the marketing and execution front, and I'm saying that with my "consumer gaming hat" on, where I have to consider the cost and value of GPUs for gaming only, not my "system integrator hat" where I get free hardware and need to worry about CUDA API support and performance in professional productivity for profit.
Remember the RX 6800 XT? It performed much like the RX 6900 XT, but the 6900 XT was $1,000 while the 6800 XT was $650!
It's all about the top performance.
www.newegg.ca/asus-geforce-rtx-4090-tuf-rtx4090-24g-og-gaming/p/N82E16814126671?Item=N82E16814126671
www.newegg.ca/sapphire-pulse-11322-02-20g-amd-radeon-rx-7900-xtx-24gb-gddr6/p/N82E16814202429?Item=N82E16814202429
If you add price to the equation, the whole narrative changes. I could buy two 7900 XTXs for the price of one 4090. Is RT at the 3090 Ti level bad? It's not like the TUF is a high-end 4090 either; those go as high as $4,000+.
If you are a gamer, it makes no sense to pay double for CP2077 to look prettier when the katana is the easiest way to dispatch enemies anyway. The best thing about CP2077 is not the visuals but the story, and the secret is that even though Nvidia features are available, playing the game natively is not a bad experience on either card.
What AMD contends with is the bombastic negativity from HUB, Paul's Hardware asking if AMD GPUs have DP 2.1 or are still on 1.4, Robbeytech getting triggered when asked why he doesn't use AMD GPUs, and KitGuru saying that their Nvidia partners are very good to them. At least they used an AMD GPU in their latest build video. Then there is what I talk about that people like you choose to ignore: the China effect. Was the Chinese government not buying every single 4090 Nvidia made? Did Nvidia not make the 4090D when the government told them to stop? Are those numbers separated from the ones sold at Canada Computers? It is small wonder that AMD did not let the ravens pick at their bones as the narrative suggests. Meanwhile, Nvidia are trying their hardest to get into the handheld revolution. I expect the usual "the Switch has Nvidia" argument.
So be happy being an Nvidia fan, to the point where you make negative comments at every opportunity in AMD-focused threads.
ROCm isn't getting much developer attention because AMD GPUs aren't a good choice for AI TOPS and cannot compete with NV's Tensor core performance for LLMs.
IF UDNA solves the hardware performance shortfall, then ROCm needs to be a viable alternative to CUDA. If that ever happens, I'm sure there will be CUDA emulators and translators to bridge the transition away from a full-on CUDA monopoly.
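One concrete example of that kind of bridging already exists: PyTorch's ROCm builds expose AMD GPUs through the same torch.cuda namespace that CUDA code targets, so most scripts run unmodified. A minimal sketch, assuming a ROCm (or CUDA) build of PyTorch is installed:

```python
# Minimal check of PyTorch's CUDA-compatible ROCm backend. On ROCm builds,
# torch.cuda.is_available() reports AMD GPUs and torch.version.hip is set;
# on CUDA builds torch.version.hip is None.
import torch

if torch.cuda.is_available():
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    print(f"Backend: {backend}, device: {torch.cuda.get_device_name(0)}")

    # The same "cuda" device string works on both vendors' hardware.
    x = torch.randn(1024, 1024, device="cuda")
    y = x @ x  # dispatches to rocBLAS on AMD, cuBLAS on NVIDIA
    print(f"Checksum: {y.sum().item():.2f}")
else:
    print("No supported GPU found")
```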
Those graphs are the marketing!
IF AMD can offer a card that actually competes with Nvidia, without the asterisks, it'll sell. The 9070, IF the 3DMark scores are actually accurate to in-game performance, could do that. We won't know until it launches. Polaris was a huge misfire, largely because AMD severely underestimated the size of the market over $200, thinking it was still 2009. That was a severe blow to AMD's potential market, as the GTX 1070 sat there eating up sales uncontested for 18 months, as did the 1080, and later the 1070 Ti. Vega was a wet fart that was too slow, too expensive, and too power-hungry to make a difference.
Obviously, these are only rumours. But it's also obvious that AMD will do everything to keep the price as high as possible (especially while using the rumours to test the waters). $550, though, is not a good price unless the 9070 XT beats the non-Ti 5070 by at least 5-10%, or even 15%, performance- and feature-wise. But that won't happen.
But the thing is, back then ATI beat the sh*t out of the 8800 GTX/9800 GTX and was rivalling the GTX 280. This time, AMD is barely challenging its own almost-three-year-old mid-range GPUs at their original launch prices. This is not even stagnation. This is pure gouging, just because they can get away with it.
I've never had any issues finding AMD graphics cards at MSRP, so if the rumours of $479 are true, I expect to see £450 9070 XTs in the UK in the spring. The MBA cards will be fine, but I'll be watching for reviews of the AIB designs, because for the 7000 series, the Sapphire Pulse (MSRP) and XFX Speedster (MSRP) were quieter, better-cooled models.
The 7800XT's launch was certainly popular enough that specific SKUs regularly went out of stock in a few shops but I don't recall ever being unable to find at least one MSRP 7800XT in stock somewhere. The GRE was less available, as only Sapphire and XFX models appeared over here on e-tailer listings. Asrock/Asus/Gigabyte/Powercolor and MBA models were notably absent, yet availability of the Sapphire and XFX 7900GRE remained okay.
For real, no one saw that coming from the green team and everyone was caught by surprise.
Maybe they are just waiting to see "simple" raster performance from the 5000 series so they know how to place their prices.
But nVidia played the game really well. No raster figures until the actual release...
With the RX 7000 series, AMD nerfed (halved) the L3 cache, and that was a bad move IMHO. The RX 7000s could have been better with more L3 cache. My RX 7800 XT has about 17% fewer compute units than the RX 6800 XT and half the L3 cache. The IPC improvements of RDNA3 (higher clocks, "dual-issue" stream processors) were able to partially compensate for the missing units, but it still sucks that the 7800 XT is beaten by the 6800 XT in some games even today, while in others it loses by only a single-digit %. I'd even dare to say that the RX 7800 XT is not a real successor to the RX 6800 XT.
IPC improvements are basically zero between RDNA2 and RDNA3, proved quite conclusively by the 7600 having near-identical performance to the 6650XT when clocked at the same speed.
What you're seeing is the 7800XT with half the cache of the 6800XT making up the compute unit deficit with clockspeed.
    72 CU x 2.2 GHz boost clock = 158 'CU GHz'
    60 CU x 2.6 GHz boost clock = 156 'CU GHz'
i.e., if both were of identical architecture and IPC, the 6800 XT would be only 1-2% faster than the 7800 XT, which is often the case in real games, regardless of the cache sizes being dramatically different. It's also worth noting that you cannot call higher clocks an IPC gain as you did: IPC literally means instructions per clock, so it's independent of clock speed. To me, the 7800 XT is the obvious successor to the vanilla 6800, in that it has the same rough price, same bus width, same VRAM amount, and same core config - just clocked a solid 25% faster ;)
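For reference, here is the 'CU GHz' arithmetic above in runnable form, using the CU counts and boost clocks exactly as quoted by the poster:

```python
# Reproduce the 'CU x GHz' throughput estimate from the post above.
# CU counts and boost clocks are the figures quoted by the poster.
cards = {
    "RX 6800 XT": {"cus": 72, "boost_ghz": 2.2},
    "RX 7800 XT": {"cus": 60, "boost_ghz": 2.6},
}

for name, spec in cards.items():
    spec["cu_ghz"] = spec["cus"] * spec["boost_ghz"]
    print(f"{name}: {spec['cus']} CU x {spec['boost_ghz']} GHz "
          f"= {spec['cu_ghz']:.1f} 'CU GHz'")

ratio = cards["RX 6800 XT"]["cu_ghz"] / cards["RX 7800 XT"]["cu_ghz"]
print(f"6800 XT advantage at equal IPC: {(ratio - 1) * 100:.1f}%")  # ~1.5%
```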