Friday, January 10th 2025

AMD Radeon RX 9070 XT Pricing Leak: More Affordable Than RTX 5070?

As we reported yesterday, the Radeon RX 9070 XT appears all set to disrupt the mid-range gaming GPU segment, offering performance that looks truly enticing, at least if the leaked synthetic benchmarks are anything to go by. The highest-end RDNA 4 GPU is expected to handily outperform the RTX 4080 Super despite costing half as much, although a comparison with its primary competitor, the RTX 5070, is yet to be made.

Now, a fresh leak has hinted at how heavy the RDNA 4 GPU is going to be on buyers' pockets. Also sourced from Chiphell, the Radeon RX 9070 XT is expected to command a price tag between $479 for AMD's reference card and roughly $549 for an AIB unit, depending on the exact product one opts for. At that price, the Radeon RX 9070 XT easily undercuts the RTX 5070, which will start at $549, while offering 16 GB of VRAM, albeit of the older GDDR6 spec. There is hardly any doubt that the RTX GPU will come out ahead in ray tracing performance, as we already witnessed yesterday, although traditional rasterization performance will be more interesting to compare.
In a recent interview, AMD Radeon's Frank Azor stated that the RDNA 4 cards will be priced as "not a $300 card, but also not a $1,000 card", which frankly does not reveal much at all. He also stated that the RDNA 4 cards will attempt a mix of performance and price similar to the RX 7800 XT and the RX 7900 GRE. All that remains now is to wait and see whether AMD's claims hold water.
Source: HXL (@9550pro)

89 Comments on AMD Radeon RX 9070 XT Pricing Leak: More Affordable Than RTX 5070?

#51
mechtech
SRS: AMD has the means to compete with Nvidia and chooses not to, because it's in AMD's interest to let Nvidia artificially inflate prices with its monopoly position. AMD grants Nvidia a pure monopoly in higher-end consumer GPUs, and we're expected to be pleased when the price of AMD GPUs is adjusted a bit lower than Nvidia's?

This is what happens when people don't understand the corrosive effects of monopolization.

I understand that AMD stockholders want to cheerlead for whatever improves AMD's stock value. However, for those who want video gaming to get closer to the goal (a holodeck-quality experience), the monopolist stagnation plus price inflation is not a recipe for cheers. And even non-gamers have other goals, like consumer-grade AI hardware that is priced more fairly.

Someone in this thread claimed that it's silly to offer higher-end GPUs because they don't sell. That's not reality at all. They sell. The 4090's price, in fact, was inflated for a long time precisely because they were selling so well. AMD may make more money by concentrating on its enterprise business, in terms of how its wafers are allocated, as well as keeping consoles relevant by helping Nvidia set prices from the top of the stack down. That may be smart business for AMD, Nvidia, Sony, and MS. It's not smart for consumers, unless they like a slower pace of hardware sophistication at higher prices; ironically enough, that's the very thing people complain about as pseudo-communism. The problem with monopolization is that it's not capitalism; it's corporate socialism.
Nah, it doesn't, because it would probably be a waste. If Nvidia has, say, 80%+ of the gamer card market share, and AMD had an equal or even superior product to match the 5070, 5080, and 5090 at an equal price or even 10% less, I would wager at least 90% of Nvidia owners would not jump over.

But I could be wrong............
#52
Zach_01
We have to make up our minds.
Do we want 7900 XT+ performance (with better other features, too) for $500 and move forward like that, or do we want our pricey old GPUs to hold their value?
I choose the former. It's better for later.
#53
SRS
mechtech: Nah, it doesn't, because it would probably be a waste. If Nvidia has, say, 80%+ of the gamer card market share, and AMD had an equal or even superior product to match the 5070, 5080, and 5090 at an equal price or even 10% less, I would wager at least 90% of Nvidia owners would not jump over.

But I could be wrong............
Considering that AMD has shown that it can make a competitive GPU, let's look back at how things looked when Intel released Skylake (or even Broadwell C) and AMD released Piledriver.

The difference between AMD's shocking success in CPUs and its "Polaris forever" sandbagging in GPUs isn't due to Nvidia's perfection as a monopolist of enthusiast-grade GPU tech. It's due to duopoly, which, despite the commonness of the assumption in consumer tech circles, is not the same thing as adequate competition.

The transformation of AMD from a nearly dead company with a very shabby track record (note that not even Phenom I and Phenom II were particularly exciting) into a genuine threat to a corporation that had been killing it since Core Duo should put to rest all of these claims about Nvidia's unbreakable dominance, particularly given how much stronger AMD's financials now are and the fact that Nvidia is fabless. Intel could have continued to use its fabs as a source of dominance against AMD's CPU hopes if it hadn't messed up its nodes. Nvidia has no such advantage.

It is simply more in AMD's interest to let Nvidia set prices for the stack by ceding the enthusiast-tier (aka higher-end) GPU space. What is good business for AMD is not, in this case, good business for consumers. It's an absurd situation that one can get a reasonably affordable CPU (the 9800X3D) that is rather overkill, yet must pay through the nose for GPU performance. That's not a healthy product ecosystem. It's monopolization in action.
#54
Scircura
CosmicWanderer: Even on the green side you'll end up missing features eventually. I have an RTX 3090, a flagship card with 24 GB of VRAM, and Nvidia decided that it should not support DLSS Frame Generation, so I haven't been able to use that feature in most games that support it. And now, people with a 4090 won't even get DLSS 4 Multi Frame Gen.

At least on the AMD side everything is open-source and their tech has wide adoption. FSR4 will be exclusive to the new cards initially, but at least AMD confirmed that they are working on bringing it to older cards. Good luck getting a similar commitment from Nvidia.
It's not a feature you'd miss, I don't think. I have a similar monitor to yours, and I found frame gen to be essentially unusable. I've tried with FSR-enabled games and also with DLSS-FG-to-FSR3 DLLs.

If your doubled frame rate exceeds 100 Hz, then you get either tearing or awful lag, depending on whether VSync is enabled. (Digital Foundry explains why.) But under 100 Hz doubled, you have a sub-50 Hz base frame rate, and the game is not very responsive.

I honestly have no idea what good multi frame gen can do. There exists no base frame rate that feels responsive but can also be quadrupled and still be displayed on today's monitors. (500 Hz 1440p monitors arrive later this year, though.) Maybe it will help with elaborate in-game cutscenes? Maybe some cool demoscene demos?
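The refresh-rate arithmetic above can be sketched as a toy check. To be clear, the function and the 50 fps responsiveness cutoff here are just illustrative stand-ins for the reasoning in this post, not anyone's real tool:

```python
# Toy arithmetic for frame generation vs. monitor refresh, mirroring the
# trade-off described above. Numbers and thresholds are illustrative only.

def framegen_outcome(base_fps: float, multiplier: int, refresh_hz: float) -> str:
    """Classify what happens when base_fps is multiplied by a frame-gen factor."""
    presented = base_fps * multiplier
    if presented > refresh_hz:
        # Exceeding the refresh rate forces a choice: tearing (VSync off)
        # or added latency (VSync on).
        return "tearing or lag"
    if base_fps < 50:
        # Input latency follows the base frame rate, not the presented one.
        return "sluggish input"
    return "ok"

print(framegen_outcome(80, 2, 144))   # 160 fps > 144 Hz -> "tearing or lag"
print(framegen_outcome(45, 2, 144))   # fits, but 45 fps base -> "sluggish input"
print(framegen_outcome(60, 4, 240))   # 240 fps on a 240 Hz panel -> "ok"
```

On this framing, 4x multi frame gen only lands in the "ok" bucket with a roughly 60 fps base and a 240 Hz+ panel, which is the gap the post is pointing at.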

Edit: also Nvidia confirmed the transformer model enhancements will be supported by 20 series, so they deserve at least a tiny bit of credit for that.
#55
Dawora
Makaveli: The fact that the 5070 is a 12GB card means I would avoid it.

Not buying a 12GB gpu in 2025.
There will be other options; it's not only GPUs from Nvidia.
Maybe a 5070 Ti or a 5060 Ti 16GB.

12GB is OK for something like a 5050/5060,
but for the xx70 series it should be 16GB.

But I don't see big problems if a GPU is 12GB; it's okay for the next couple of years for most of us.
#56
kapone32
Dawora: There will be other options; it's not only GPUs from Nvidia.
Maybe a 5070 Ti or a 5060 Ti 16GB.

12GB is OK for something like a 5050/5060,
but for the xx70 series it should be 16GB.

But I don't see big problems if a GPU is 12GB; it's okay for the next couple of years for most of us.
Indiana Jones
#57
Darmok N Jalad
I’m still waiting on the price of the x700 series to come down. The B580 launch hasn't done much to build up a good $250 tier.
#58
uftfa
GhostRyder: If true, that price will disrupt the market. I like the idea of the reference design being under $500, as that would be a good value (at least by today's standards) based on the performance leaks. The other thing is, the used market will be majorly disrupted, as the old cards should lose most of their value.
The 7800 XT was already $499 at launch and can be found for less than that now. If AMD is going to bring a 10-15% performance improvement at a 4% lower launch price, it's still an underwhelming generational step, IMO.

$479 and $499 are the "same" price for all practical purposes, but AMD really needs to outperform the 7800 XT by over 25% in raster and 40-50% in RT for it to be a "good" product.
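As a quick sanity check on the numbers above, the generational perf-per-dollar math works out like this (the prices and uplift are the rumoured figures from this thread, and the helper function is purely illustrative):

```python
# Rough perf-per-dollar comparison using the figures above. The uplift and
# prices are leaked/assumed numbers, and the function is just an illustration.

def perf_per_dollar_gain(old_price: float, new_price: float, perf_uplift: float) -> float:
    """Relative change in performance per dollar versus the previous card."""
    return ((1 + perf_uplift) / new_price) / (1 / old_price) - 1

# 7800 XT at $499 vs. a rumoured 9070 XT at $479 with a +12.5% raster uplift:
gain = perf_per_dollar_gain(499, 479, 0.125)
print(f"{gain:.1%}")  # roughly 17% better perf/$ -- modest for a full generation
```

A 25% uplift at the same $499 would instead give a 25% perf/$ gain, which is closer to what past generational jumps delivered.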
#59
Cheeseball
Not a Potato
Heiro78: Thanks for the further input. MBA stands for?

So we could actually see multiple AIBs making the $479 version of the 9070 XT. Was your reference 7900 XTX branded by ASRock, priced at $999, and did it look like the AMD card manufactured by Sapphire?


The last time I believed PC hardware hype was Alphacool and their apex stealth metal fans... It was mostly cuz of the "tests" performed by Igor's Lab's Pascal
MBA means "Made by AMD"; it's a semi-official term for the original AMD-made video cards, like Founder's Edition is for NVIDIA cards.

#60
Octavean
I predict supply shortages, a plague of scalpers and the Nile river will run PowerColor Red Devil red,……….I have spoken!!!
#61
Macro Device
Cheeseball: Founder's Edition
There is no apostrophe there...

For those living in insane places where you pay long dollar for electricity, the 9070 XT is definitely not on the affordable side. The 5070 seems more cost efficient. Unless we're in for a miracle and the 9070 XT offers >20% more speed than said 5070. Then it's okay. Not excellent, but okay.
#62
TheinsanegamerN
mechtech: Nah, it doesn't, because it would probably be a waste. If Nvidia has, say, 80%+ of the gamer card market share, and AMD had an equal or even superior product to match the 5070, 5080, and 5090 at an equal price or even 10% less, I would wager at least 90% of Nvidia owners would not jump over.

But I could be wrong............
Every time AMD has made an actually competitive card, it has sold out. Evergreen pushed AMD to 49% market share. The 290/X were going for almost double MSRP and were unobtainable for almost a year after launch; even with the flood of used cards, there were still new sales. The 6800/XT/6900 XT were completely unobtainable for over a year after launch; what cards were made sold immediately, often mere seconds after coming into stock.

Literally only ONCE in the last decade have they had a superior product: the 290X. Even then, Nvidia rushed out the 780/Ti to counter it. The 980 Ti was uncontested. Vega sucked. Fury/X were failed experiments. Polaris stopped at 1060 level. RDNA was missing features like mesh shader support and RT, and was limited to mid-range performance.
Sound_Card: The price leak, the performance hints, the FSR improvement - the usual green doom-and-gloomers and whataboutisms have been strangely less active these past 24 hours.
Because any post with more realistic expectations has been drowned out by people who still haven't learned their lesson about "leaks", ESPECIALLY with GPUs.

When the 9070 comes out and isn't the second coming of ATi, the forums will echo with wailing about how AMD has betrayed them.
Marcus L: C'mon, this is TPU, you can't believe this, can you? :laugh: There will always be whataboutism when it comes to AMD, no matter how good their software stack is or the price/perf :rolleyes::D
LOLNO. "Whataboutism" is used to shield AMD from criticism just as often as "nGREEDIA mindshare11!!!1!" is. And just like crypto FUD, it's totally ridiculous.
#63
Macro Device
TheinsanegamerN: The 6800/XT/6900 XT were completely unobtainable for over a year after launch; what cards were made sold immediately, often mere seconds after coming into stock.
Purely due to the mining hysteria. Without it and the other real/artificial shortages, these GPUs would've been heavily discounted. The same would've happened to some Ampere cards, too.
#64
tfdsaf
If the leaks are true and it is indeed faster than the 4080 Super in raster performance and comes in at $500, that would be sick; it's an insta-buy from me. Not to mention that it's supposedly very good at RT as well, beating the 4070 Ti Super. Then we have FSR4, which from all the videos about it seems to be amazing, much better than DLSS and supposedly even better than native.

I just hope there are enough of these to go around; otherwise, prices are likely to skyrocket to $600-650 if we can't find them, which would defeat the purpose of it being mid-tier and supposedly for the mainstream masses.

If the 9070 (non-XT) is, say, 4070 Ti Super level and comes in at $400, that would be even bigger. I think that is going to be the real killer, because a lot more people can afford GPUs in the $300 to $400 price range.
#65
Visible Noise
Frank Azor did an interview with PC World; it's on YouTube. He specifically states it's called the 9070 because it is a 70-class card in price and performance. His words.

It’s not going to be a 4080, let alone a 4080S class card.
#66
Jtuck9
Chrispy_: As far as I understand the AMD situation, the MBA reference design cards are all identical and are built by PC Partner, the parent company behind the Zotac, Inno3D, and Manli brands on the Nvidia side. These MBA cards will be sold to Asus/Sapphire/PowerColor/XFX etc. to be rebranded and resold under those brands. If they do anything at all, those AIB partners will put their logo stickers on the fan hubs and change the brand on the retail packaging at most.

Then you have reference designs made by Asus/Asrock/Gigabyte/PowerColor/Sapphire/XFX etc. that use the MBA PCB layout with their brand-specific coolers adapted to it.

Then you have the MSRP base models, usually the AIB vendor's own PCB design, built as cheaply as they think they can get away with and sold at the same MSRP as the MBA models. In most cases the AIB will try to launch with these, or at least switch to their own cheaper designs as soon as possible, since they make more profit on cards they manufacture themselves than on rebranded PC Partner MBA ones.

After that you have all the various premium variants with overbuilt power delivery and bigger coolers. I generally never find much value in these but they do seem to be popular because people seem to want large cards with quieter coolers, even if they're 15% more expensive for 2% more performance. I'm not judging - just saying what I see.


12GB is 2025's minimum VRAM quantity, IMO.

I think 12GB today is like buying an 8GB card in 2021: fine at the time, but only suitable for 1080p these last couple of years. I don't see a problem with 12GB cards as long as they're entry-level models aimed at the $200-350 price tier right now. By the time they run out of VRAM, they won't have enough raw performance for higher settings and resolutions anyway.


Fair enough. I've done that with a few GPUs in the past and refused to even throw away my MGA Millennium, Voodoo2, 9700 Pro, and GTX 980 reference card once they'd become obsolete.

Tariffs could be a huge bummer for those in the US. I'm unsure whether that's going to impact the ROW market, since both AMD and Nvidia are headquartered in the US, despite all manufacturing being done in Taiwan/SG/HK/China.
Does the quality differ over time? I had a quick nose at the reference design of the 7900 XTX on here, and vendors like Sapphire and Hellhound are using 14-layer PCBs, 2 oz copper, DrMOS, etc. on 7700 XT / 7800 XT / 7900 GRE boards, also reviewed on here. Looking forward to comparing and contrasting anywho!
#67
mb194dc
Hxx: If it's priced at $500, it's going to perform like a $500 card. Don't assume it will be some amazing deal at that price.
Pretty much; AMD aren't a charity. Obviously it's not faster than a 4080 Super if it's at that price.

That Time Spy Extreme run will be from a cherry-picked card with extreme cooling, the same way you can push a 7900 XT massively as well.

I'll be surprised if the 9070 XT is faster than the 7900 XT, or even close to it.
#68
motov8
It was the same with the RTX 30x0 series, but that was the crypto boom. Prices were much higher than MSRP, and availability was nonexistent.
Don't sell your RTX 40x0 series; it's still superb.
#69
k0vasz
In other words, nothing new: they just set a price tag $50-100 lower than their Nvidia counterpart.

And based on this, their "high-end" card will be on par with the RTX 5070, which is pathetic.
#70
BlaezaLite
k0vasz: In other words, nothing new: they just set a price tag $50-100 lower than their Nvidia counterpart.

And based on this, their "high-end" card will be on par with the RTX 5070, which is pathetic.
Not everyone needs a 5090, you know. Tell someone who is using a GT 1030 that performance is low.
#71
Chrispy_
Jtuck9: Does the quality differ over time? I had a quick nose at the reference design of the 7900 XTX on here, and vendors like Sapphire and Hellhound are using 14-layer PCBs, 2 oz copper, DrMOS, etc. on 7700 XT / 7800 XT / 7900 GRE boards, also reviewed on here. Looking forward to comparing and contrasting anywho!
I don't know, but I doubt it; there are added costs to a redesign. The PowerColor Fighter, Gigabyte Eagle, Asus Dual, and MSI Armor/Ventus are examples of MSRP base models from AIBs, and they are hit and miss on quality and value, but I'm not aware of specific SKUs being switched out for lower-quality variants during a generation. Asus does seem to spit out a huge number of variants, often hundreds of SKUs per generation, so the problem isn't really that the quality of a single SKU declines over time; rather, additional cheaper, lower-quality models are likely released over the duration of a GPU's generational cycle, and these newer models probably silently replace the original models for which there are reviews.

Typically the only insight we get into quality is launch-day reviews that include cooler teardowns and PCB inspections (like W1zzard's), which are usually a mix of custom AIB and reference/MBA variants; premium models tend to be reviewed a little later if they miss the launch cycle. The MSRP AIB base models are sometimes reviewed during the launch window, but they rarely get the attention that flagship-tier custom AIB models get once that window has passed, so it's hard to say.

I won't buy an MSRP AIB base model unless I've seen a review of that specific SKU and deemed the cooler to be quiet enough and the build quality to be adequate. I'm buying in the EU so everything gets a 2-year minimum warranty, meaning that build quality is less of an issue. The true "disposable" SKUs from lesser Asian brands that really are the minimum viable product at the lowest possible price rarely even make it over here unless it's via a third-party marketplace like eBay or AliExpress - so quality isn't really an issue in Europe, I don't think.
SRS: AMD has the means to compete with Nvidia and chooses not to, because it's in AMD's interest to let Nvidia artificially inflate prices with its monopoly position. AMD grants Nvidia a pure monopoly in higher-end consumer GPUs, and we're expected to be pleased when the price of AMD GPUs is adjusted a bit lower than Nvidia's?

This is what happens when people don't understand the corrosive effects of monopolization.

I understand that AMD stockholders want to cheerlead for whatever improves AMD's stock value. However, for those who want video gaming to get closer to the goal (a holodeck-quality experience), the monopolist stagnation plus price inflation is not a recipe for cheers. And even non-gamers have other goals, like consumer-grade AI hardware that is priced more fairly.

Someone in this thread claimed that it's silly to offer higher-end GPUs because they don't sell. That's not reality at all. They sell. The 4090's price, in fact, was inflated for a long time precisely because they were selling so well. AMD may make more money by concentrating on its enterprise business, in terms of how its wafers are allocated, as well as keeping consoles relevant by helping Nvidia set prices from the top of the stack down. That may be smart business for AMD, Nvidia, Sony, and MS. It's not smart for consumers, unless they like a slower pace of hardware sophistication at higher prices; ironically enough, that's the very thing people complain about as pseudo-communism. The problem with monopolization is that it's not capitalism; it's corporate socialism.
It's easy to get distracted by marketing, review coverage, and influencers. If you look at the Steam Hardware Survey you will see that the vast majority of people are gaming on entry-level or midrange GPUs, and many of them are really quite old, too.

Yes, the 4090 sells like hotcakes, but the overwhelming majority of those do not sell to gamers. Their price is high because of a supply-demand imbalance, and gamers are not the demand.

If the flagship cards like the 4090 were being snapped up by gamers, the 4080 should have been even more popular. Outside of the brief instant when you could buy them at MSRP, it has always offered better performance/$ than the 4090. A $1300 4080 gets more than 60% of the performance of a $2200 4090 while being only 60% of the cost, and arguably a 4080 is still very, very fast, even for 4K gaming.

In reality, the 4080 sat on shelves for most of its sales window. Nobody bought them. Meanwhile, the "disappointing", "VRAM-limited", "not powerful enough for ray tracing" 4060 sold like hotcakes, and the 4070 (and 4070S) were flying off shelves to gamers and gamers only.

There is money in AI/compute GPUs, but there isn't anywhere near as much money in flagship gaming GPUs. Remember that RDNA4 is still AMD's gaming architecture; they're consolidating their gaming and datacenter architectures back into UDNA next generation. But right now, a 7900 XTX makes zero sense for productivity/AI/compute, and that SKU commands <0.1% of the gaming market, which means it's not relevant for AMD in terms of sales, nor in terms of developer support, because there aren't enough people with one to make game developers target it as a platform.
#72
Macro Device
BlaezaLite: Tell someone who is using a GT 1030 that performance is low.
No need to own a Ford Model T to have the right to declare a Daewoo Lanos not a sports car.

The 9070 XT only provides access to FSR4 and, likely but not definitely, better AI performance compared to existing AMD flagships (with the 7900 XTX likely brute-forcing past it anyway, since the XTX has a 1.5x better VRAM situation).

Priced at $550-ish, it's about 44% cheaper than the 7900 XT at launch (inflation considered). Nice, but we shouldn't forget that $900 was a terrible price in the first place ($1,000 for the XTX was significantly fairer), and AMD won't achieve anything by just making a good GPU. They need a killer one.

It's also not confirmed that it's capable of outperforming the 7900 XT on a regular basis. Who knows, there could be some major architectural bug tanking performance in select titles. It could be a NetBurst situation, with clocks going up and IPC going down.

It would've been impressive if it:
• Had released LAST YEAR, BEFORE NV made a move, ultimately forcing NV to react after the fact;
• Beat whatever competing GPU, from both NV and AMD, in raster perf per $ by at least 25 percent;
• Had FSR4 readily available, working at least no worse than DLSS 3;
• Had enough guts to outclass any Ampere GPU in RT (unless it's under $400, in which case losing to the 3080 Ti and up isn't a big deal).

All at the same time. Not the case: all we've got is questionable benchmark rumours, YT videos showing FSR4 in action with as much compression as Google allows, and speculation on pricing and whatnot. AMD mumbling and being secretive doesn't really help the situation. Even if the GPU itself ends up being worth its money (doubtful, but who knows), its release is already a mess.
#73
Zach_01
k0vasz: In other words, nothing new: they just set a price tag $50-100 lower than their Nvidia counterpart.

And based on this, their "high-end" card will be on par with the RTX 5070, which is pathetic.
Which one is pathetic?
That AMD is not competing past x070, or that it will be ~$50 lower in price?
#74
Chrispy_
AnotherReader: It will take some doing to even approach the sensation that the 4000 series was. Nvidia's best was only 13% faster at more than twice the die area and price.
Well, the 4090 vs XTX isn't a fair comparison since the 4090 is a compute/AI/professional card like previous Titans that happened to also be great for gaming, its price was driven by non-gaming sales. I think the highest-tier SKUs that actually competed in the gaming sector would have been the XTX vs the 4080, or the 4070Ti vs the 7900XT, and none of those cards sold particularly well.

At the real mainstream level, the 7800XT and 7900GRE flew off shelves and went out of stock frequently over a prolonged stretch of months. The 4070S those models competed against also flew off shelves, but I feel AMD's issue was that there simply weren't enough 7800XT and 7900GRE models being sold to really make a dent in the incumbent 4070/4070S juggernaut.

I think the other problem is that the 40-series launched with stronger upscaling than FSR at the time, working frame generation, CUDA, and of course a functional marketing department at Nvidia, which shouldn't be underestimated when it comes to the ignorance of the masses. This time around, it looks like we have viable FSR to combat DLSS, and AMD's frame gen has (in my experience) looked better and been better supported than Nvidia's. MFG will probably swing that back in Nvidia's favour, but with the downside of making it look like all you're paying Nvidia for is "fake frames" (the memes are already flying around the web, and the 50-series announcement is only a few days old!). ROCm isn't CUDA, but I think the gaming demographic that cares about CUDA is relatively small.
Macro Device: AMD won't achieve anything by just making a good GPU. They need a killer one.
QFT.
Macro Device: AMD mumbling and being secretive doesn't really help the situation. Even if the GPU itself ends up being worth its money (doubtful, but who knows), its release is already a mess.
I feel that AMD just aren't ready yet, which is why they didn't release or make a point of the 9000-series at CES.

It seems like Nvidia have had Blackwell ready for a while but are happy to keep selling 40-series inventory for as long as they can. The fact that they have the 5090 ready for the lucrative AI market is why they're not delaying any longer, but we're not going to get 50-series laptops for another 6 months, and the mainstream GPUs like the 5070 and below are likely going to be "launched" at the very end of Q1 and not actually buyable until April. Gotta clear that 40-series inventory at full price first!

Meanwhile, AMD are working on UDNA for the AI bubble that has had a huge growth spurt in the last year or so. RDNA4 is a lower-priority stopgap to take advantage of TSMC 4 nm availability. We know RDNA is being discontinued; it's not AMD's (RTG's) focus, and I doubt it's fully baked yet.
#75
Macro Device
Chrispy_: I feel that AMD just aren't ready yet.
Year 5 of NV releasing absolutely nothing spectacular. AMD: "Herp derp..."