
AMD's Radeon RX 9070 XT Shatters Sales Records, Outperforming Previous Generations by 10X

AleksandarK

News Editor
Staff member
In a recent interview with ASUS China Manager Tony Yu on Bilibili, AMD CEO Dr. Lisa Su revealed that the company's latest RDNA 4-based Radeon RX 9070 XT GPU has achieved remarkable success, with first-week sales surpassing previous generations by tenfold. "I'm very proud of the team on RDNA 4," said Dr. Su, adding, "When we design a new architecture, we plan its core features years in advance. For RDNA 4, our goal was to deliver top-tier gaming performance at an accessible price, allowing more gamers to experience this technology. The 9070 XT has been a fantastic success—it's the No. 1 selling AMD Radeon GPU in its first week, with sales 10 times higher than past generations."

While rumors suggested AMD sold 200,000 Radeon RX 9000 GPUs at launch, the company clarified that this figure was never shared, effectively dismantling the rumor. If and when we get official, concrete sales numbers, they will show just how significant a milestone Team Red has achieved. Notably, in some specific markets like Japan, AMD has captured nearly 50% of the market share with the RX 9070 series, a first for the company. Dr. Su confirmed that AMD is ramping up production to meet surging demand. "We are very excited and increasing manufacturing to ensure more gamers can access our GPUs," she said. This move is expected to stabilize supply and pricing. Additionally, AMD hinted at more RDNA 4 releases, likely including the Radeon RX 9060 series, which will come in 16 GB and 8 GB variants later this year. With competitive pricing and strong performance, AMD's latest GPUs are well received by the gaming community, despite having no high-end competitor for NVIDIA's top SKUs.



 
> "It's the No. 1 selling AMD Radeon GPU in its first week, with sales 10 times higher than past generations."

This is just the same thing as Nvidia's "double the amount" claim: comparing your mid-tier launches to your previous flagship launches. And frankly, 10x the first-week sales of the 7900 XT and XTX is like taking candy from a baby; nobody wanted those things compared to the 4080 and 4090.
 
At MSRP, these are very decent cards, given how expensive GPUs have gotten in the last 5 years. Still, I don't find them that exciting if you already own a decent GPU from 2020 or later; the performance gains aren't massive. Just better price/performance than what's been available recently...
 
It's easy to jump "tenfold" when your past record is at the bottom. Not to mention when the main competitor is focusing on a different market and the market is dying for new products. That being said, I hope AMD continues strong, because we can't expect a company with $7 billion in revenue per quarter to keep fighting one that's closing in on $40 billion per quarter.

The problem for the average consumer is that NVIDIA primarily, and AMD secondarily, proved that the average consumer today is willing to pay 2-4 times more for a GPU than 5-10 years ago. That's simply bad. Maybe if Intel fixes its manufacturing, it will be in a position to flood the market with cheap 18A graphics cards in the near future. For now, let's see what the xx60 models will have to offer and at what price points. They'll also need even higher GPU volumes to cover the demand for the xx60 models, so let's see.
 
Hopefully AMD management doesn't take the wrong lesson from this (abandon high end/Halo) and understands that they're selling because they have a compelling product *in the absence of Nvidia*.

If AMD puts their best foot forward every gen, then they'll be there to pick up the slack when Nvidia retreats from the market, but don't just turn into a shrinking violet when Nvidia does show up to play.
 
> Hopefully AMD management doesn't take the wrong lesson from this (abandon high end/Halo) and understands that they're selling because they have a compelling product *in the absence of Nvidia*.
>
> If AMD puts their best foot forward every gen, then they'll be there to pick up the slack when Nvidia retreats from the market, but don't just turn into a shrinking violet when Nvidia does show up to play.
I don't think so. Polaris and RDNA 1 were upper mid-range, yet AMD still put high-end/enthusiast cards in the next generation.
 
While it's not the same as a native implementation, using OptiScaler is easy, the results can be truly transformative, and there are numerous walkthroughs online explaining how to use it.

Optiscaler is great. Looking forward to Vulkan support so I can use it in No Man's Sky.
 
I WANT to believe, but we'll see the numbers eventually!
They have every reason to do better than previous generations, though. Not having ROCm on day 1 is embarrassing when most of the world outside gaming has its eyes on each company's AI readiness.
But I'm sure they'll get ROCm support, and when they do, it'll sell even more.
 
I wonder if I should sell my 7900XTX while it still has decent value. I don't need the absolute performance if 9070 XT is almost just as fast (and faster in RT which might be relevant in the card's lifetime), but newer and can be made more power efficient.
What do you think?
 
And when we are two months late with software and force retailers to sit on two months of inventory, guess what? We get to blab about how great we are doing when all that inventory finally moves.

In all seriousness, good products sell. I’m glad AMD has a fairly good product portfolio now.
 
If the 9070 or 9070 XT had been ROCm-compatible from the first minute, gamers would have had a problem: a higher price premium due to lack of stock. It will become ROCm-compatible in a few months. For now, it's easier to buy this GPU precisely for that reason.
 
> I wonder if I should sell my 7900XTX while it still has decent value. I don't need the absolute performance if 9070 XT is almost just as fast (and faster in RT which might be relevant in the card's lifetime), but newer and can be made more power efficient.
> What do you think?
I don't see the urgency. It's not like next gen cards come out that frequently, and with scalping, it's not like things are losing value quickly either.
You should be able to wait longer without losing value.
 
Great, good to see the consumers have spoken and 1,000 euros is now the midrange.
 
> If the 9070 or 9070 XT had been ROCm-compatible from the first minute, gamers would have had a problem: a higher price premium due to lack of stock. It will become ROCm-compatible in a few months. For now, it's easier to buy this GPU precisely for that reason.
Those who really care about ROCm lean towards 7900XTX and 7900XT because of memory bandwidth and more VRAM.
 
Here's a toast to... you know, great products. Welcome back into the game.

Next step: fire marketing and poach some people from Nvidia.

> Those who really care about ROCm lean towards 7900XTX and 7900XT because of memory bandwidth and more VRAM.

Everyone should care about ROCm, regardless of the hardware they have. CUDA made GeForce the beast it is. Radeon has nothing on that, and is entirely missing that slice of the pie. AMD is aware and to the best of my knowledge, working on a solution.
 
> It's easy to jump "tenfold" when your past record is at the bottom.

No it's not.

The R9 290X had an oversupply issue, and AMD lost money at a critical time. All companies have to deal with balancing production lines, ensuring that not too much stock is made, but enough stock to prevent scalping/hoarding.

Too little and consumers are screwed. Too much and the company is screwed.
 
> Everyone should care about ROCm, regardless of the hardware they have. CUDA made GeForce the beast it is. Radeon has nothing on that, and is entirely missing that slice of the pie. AMD is aware and to the best of my knowledge, working on a solution.
From an inference point of view, both CUDA and ROCm get capped by hardware memory bandwidth. The 7900XT and 7900XTX offer superior numbers here (besides actual ROCm support).
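The bandwidth-capping point can be put in napkin-math form: batch-1 LLM decoding has to stream the full weight set from VRAM for every generated token, so spec-sheet bandwidth divided by model size gives a hard ceiling on tokens per second. A rough sketch (the bandwidth/VRAM figures are approximate spec-sheet numbers, and the 13 GB model size is an arbitrary example):

```python
# Napkin math: batch-1 LLM token generation is memory-bandwidth bound,
# since every generated token reads all model weights from VRAM once.

def tokens_per_second(model_gb: float, bandwidth_gbs: float) -> float:
    """Upper bound on decode speed: bandwidth / bytes read per token."""
    return bandwidth_gbs / model_gb

# Approximate spec-sheet numbers (GB/s, GB); all three use 20 Gbps GDDR6.
cards = {
    "RX 7900 XTX": {"bandwidth": 960.0, "vram": 24},  # 384-bit bus
    "RX 7900 XT":  {"bandwidth": 800.0, "vram": 20},  # 320-bit bus
    "RX 9070 XT":  {"bandwidth": 640.0, "vram": 16},  # 256-bit bus
}

model_gb = 13.0  # example: a ~13 GB quantized model that fits on all three

for name, spec in cards.items():
    if model_gb <= spec["vram"]:
        ceiling = tokens_per_second(model_gb, spec["bandwidth"])
        print(f"{name}: <= {ceiling:.0f} tok/s ceiling")
```

Real throughput lands below these ceilings, but the ranking holds: the 7900 XTX's wider bus gives it roughly a 50% higher ceiling than the 9070 XT, and its extra VRAM admits larger models in the first place.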
 
CUDA is great and all, but virtually all video game code is DirectX, DirectX Raytracing or DirectCompute.

The exceptions are Vulkan, Vulkan Raytracing and Vulkan compute.

CUDA (and HIP) is for AI and Prosumer (ex: DaVinci Resolve, Blender Cycles) applications. Not video games.
 
> That being said, I hope AMD continues strong, because we can't expect a company with $7 billion in revenue per quarter to keep fighting one that's closing in on $40 billion per quarter.
>
> The problem for the average consumer is that NVIDIA primarily, and AMD secondarily, proved that the average consumer today is willing to pay 2-4 times more for a GPU than 5-10 years ago. That's simply bad. Maybe if Intel fixes its manufacturing, it will be in a position to flood the market with cheap 18A graphics cards in the near future. For now, let's see what the xx60 models will have to offer and at what price points. They'll also need even higher GPU volumes to cover the demand for the xx60 models, so let's see.

They've been fighting Intel from this position (relatively small-scale, esp. wrt R&D/marketing budget) their literal whole existence. Similar w/ GPG vs nVIDIA for a very, very long time; almost 20 years.
nVIDIA has gotten larger, but was always operating from an advantage. By advantage, I mean those budgets; not necessarily engineering prowess, especially wrt customer needs.
In fact, sometimes despite the latter (arguably because of the former), which has been successful for them in chaining along customers based on marketing. AMD operates on their tech speaking for itself.
That isn't even hyperbole. AMD is primarily an engineering company (as was ATi before it); nVIDIA is a marketing company first (in both advertising and engineering goals) and a technology company second.

We already know how '60' GPUs will perform: like a 9070 xt but with half the units. It's less than half the size of N48, as it's meant for 'sweet-spot' clocks w/ 20 Gbps RAM, whereas N48 can scale higher.
Probably for a currently unannounced product (which I don't know will ever see the light of day, but probably). Obviously their goal is a 1440pRT GPU to compete with a 5080 w/ 24GB (maybe w/ 32GB).
The 'problem' is that using a chip that small (and scaling clocks) will likely use a lot of power. OTOH, it will likely be relatively inexpensive; we'll have to see how that works out in mind/market share.
Both wrt N48 scaling higher and N44 being 'good enough' for what people want (which I would argue it is not; I think the 9070 xt is the baseline you should buy for RT, or just buy something cheaper/better for raster).
Similarly, even if a faster N48 is 'ok' for 1440p native RT, it will be questionable wrt 4k upscaling. It may do those things 'okay', but IMO (just like with the 5080) it's worth waiting until 3nm if you didn't buy a 4090.

Prices have adjusted some, but wrt AMD I don't think it's that bad. I'll always be the guy saying the 9070 xt should have been $550, but asking an extra $50 from early RT/FSR4 adopters at this budget right now isn't drastic.
They've slowly been trying to jump from the $300 market to the $400 market at the bottom, and in some ways have to, as going from 8GB to 16GB adds extra cost.
I could imagine N44 being $300-350 at the top, but the thing that really screws N44 is Intel (B570/580).
Intel undercut that market by taking almost zero margin, plus an adequate bus/RAM; it's pretty tough to directly compete on price/perf in the 1080p raster market with a 128-bit bus as more games require >8GB.

Similarly, they want to catch the areas the 7900xt and 7800xt/7900gre sold in (~$600-650 and ~$470-550), but I think the market will reject those prices for N48 long-term. Right now people are thirsting for 16GB RT cards that aren't $1000.
AMD just doesn't want, for instance, the 9070 to freefall in price like the 7700xt did, as that overlaps their market with a lower chip. Instead, they need to build products toward each of those markets, and likely are.
And likely have, clearly hoping the 9070 doesn't eventually exhibit that drop from ~7800xt pricing (until it is replaced). Whether that succeeds is TBD; it likely depends on what nVIDIA does wrt 5070 pricing.

Even if you extrapolate to the future, where perhaps their low-end may be $350-400 for a full chiplet stack (and perhaps $50-100 less with units disabled), that isn't too bad a hike compared to nVIDIA at $400/500.
Similarly, I expect their stack to scale, and while it may not be exactly equal in MSRP (although it could be), it's still conceivable it will be in similar markets (not whatever they can charge, unlike a '90').
With AMD likely undercutting what nVIDIA wants to sell for: formerly $1200, now $1000 (because that market rejected that price long-term, after early adopters buy and/or supply settles).
Looking back at AMD, we have clear bellwethers at consistent markets. Not only where the 9070 xt is selling, but the 7900xt/xtx (~$600/800 or so). Furthermore, attempted markets (like the 6900xt/6950xt/7900xtx MSRPs).

Essentially, AMD has to make products for what the 7700xt actually sold for (~$350), and not much lower. This is likely where they will aim UDNA as a base. Cut-down chips and cheaper, sure, but that's the meat.
They also need to hit where the 7800xt, GRE, 7900xt, and 7900xtx all sold. The 9070 series is currently covering the GRE/79XT markets, and eventually the first two. UDNA will likely also target these markets and their accepted pricing.
So, say something like one chiplet stack at ~$350 or so (whatever the ASP for the 7700xt was), a disabled two-chiplet stack for ~$500, another for ~$600 or so, and a full double stack for around $800ish.

nVIDIA price-checks the market constantly and adjusts supply, whereas AMD very much builds TO the market and what it will pay. This is very fact-checkable, and something they mention consistently.
My UDNA prices are speculation, but it truly does all make sense (especially if you figure adjusting RAM amounts on a 256-bit bus to 16/24/32GB etc.; maybe disabling chiplets from 2048 to 1792/1536).

IMO, this strategy is going to absolutely slaughter nVIDIA if the latter keeps doing what they do, which is overprice and underspec, especially when AMD builds not only to accepted market pricing, but also to a spec.
Like 1080p RT (or 1080p RT scaled to 1440p). Or 1440pRT native. Or 1080pRT up-scaled to 4k. Or 1440pRT to 4k scaled. Or 4k native. Unlike nVIDIA with their unbalanced specs and/or planned obsolescence...
ALL CARDS YOU CAN EXPECT AMD TO MAKE AND DO THOSE THINGS W/O COMPROMISE. ALL AT PRICES THE MARKET ACCEPTS. THIS IS WHAT THEY DO. No shenanigans, unlike their competitor.

> I wonder if I should sell my 7900XTX while it still has decent value. I don't need the absolute performance if 9070 XT is almost just as fast (and faster in RT which might be relevant in the card's lifetime), but newer and can be made more power efficient.
> What do you think?
IMHO, I would just overclock the 7900xtx to wherever it's stable until you're ready for a real upgrade. The 7900xtx is a better overall card (although not by a ton), excluding FSR4.
It all comes down to whether you're a 4k (raster) gamer who plays at 4k and/or 1440p->4k. If you are, keep the 7900xtx. If you're a 1080p/1440p gamer, the 9070 xt will run 1440p raster and upscale RT better.
The only real downside to the 7900xtx is that RT has to upscale from 1080p->4k, and FSR3 looks... well... not great. The expected market of the 9070 xt is not 4k; it's 1440p. I would argue you could also run 'quality' upscaling of RT at 1440p on a 7900xtx, but that's the value in the 9070 xt (and FSR4 will look better, although I don't think FSR3 in that scenario is unusable, unlike 1080p->4k). The 7900xtx gets a bad rap.
Its only true failing is that it has to scale RT from 1080p->4k (its intended market) and FSR3 looks bad scaling that much. If they implemented FSR4 (and improve it), and with an OC, it would be a great card.
AMD truly did put those users in a pickle, and it's clearly because they want to start fresh, with everyone on the same page of their new arch goals: one level lower raster (or one level higher RT) per market.
Right now that market is 1080p/1440p. 4k users' only real alternative (imho) is a 4090, until either company makes something similar (and cheaper), which they didn't this gen.
Again, I have speculated it's because it would cost ~$1200, and neither wanted to attempt that market long-term given the 4080 was unsuccessful at that price (and AMD didn't attempt it after the 4080 price cuts).
It's also possible they believed anyone who wanted that spec and was willing to pay over $1000 bought a 4090. Both of those explanations, perhaps in conjunction, seem most feasible.
As I've said before, that leaves users like me waiting on cheaper used 4090s and/or another generation. Similar for 7900xtx users, if not just some kind of FSR4 port as a stop-gap.
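For reference on the "1080p->4k" scaling above: FSR's published per-axis scale factors (Quality 1.5x, Balanced 1.7x, Performance 2.0x, Ultra Performance 3.0x) determine the internal render resolution, so 4K output in Performance mode renders at exactly 1080p. A quick sketch of that arithmetic:

```python
# FSR's documented per-axis scale factors (FSR 2 and later).
FSR_MODES = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal resolution the GPU actually renders before upscaling."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

# "1080p -> 4k" is 4K output in Performance mode:
print(render_resolution(3840, 2160, "Performance"))  # (1920, 1080)
# 1440p output in Quality mode renders at roughly 1707x960:
print(render_resolution(2560, 1440, "Quality"))  # (1707, 960)
```

This is why 1080p->4k is the harshest case for FSR3: it reconstructs 4x the pixel count, whereas 1440p Quality mode only reconstructs 2.25x.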
 
> I wonder if I should sell my 7900XTX while it still has decent value. I don't need the absolute performance if 9070 XT is almost just as fast (and faster in RT which might be relevant in the card's lifetime), but newer and can be made more power efficient.
> What do you think?

I'd be inclined to sell it sooner rather than later; as stock levels of the 9070 XT pick up, its value is likely to drop quicker than Yevgeny Prigozhin.
 
> I'd be inclined to sell it sooner rather than later; as stock levels of the 9070 XT pick up, its value is likely to drop quicker than Yevgeny Prigozhin.
I think so too.
I have only recently upgraded to 1440p and don't EVER plan to go 4k, so I don't need more performance. I actually run the 7900XTX undervolted and severely frequency-limited (capped at 2300MHz) because the noise and heat are unbearable even with the quiet BIOS (yes, even on the card in my specs).
So, if the 9070XT can run a little more efficiently at more or less the same performance level, I feel like it might be a sound replacement (not an upgrade), if it's also much better at ray tracing (which might be relevant to me in the coming years) like they say it is.
 