Tuesday, February 11th 2025
![AMD Radeon Graphics](https://tpucdn.com/images/news/amdradeon-v1739213381396.png)
AMD Plans Aggressive Price Competition with Radeon RX 9000 Series
According to ITHome, AMD is preparing to disrupt its competition with aggressive pricing for its upcoming RX 9000 series. The RX 9070 XT, built on the RDNA 4 architecture, is expected to launch at $599, positioning it directly against NVIDIA's RTX 5070 Ti, which carries a $749 price tag. With this competitive pricing, AMD aims to revitalize its market position following lower-than-expected RX 7000 series sales that cost it market share. The upcoming RX 9070 XT features the Navi 48 core running at 2.97 GHz, complemented by 16 GB of GDDR6 memory across a 256-bit bus. The architecture's enhanced AI upscaling capabilities, already demonstrated in the PlayStation 5 Pro, could offer compelling performance advantages over current-generation cards. The base RX 9070 model is anticipated to debut at $499, creating a focused attack on multiple market segments, including NVIDIA's RTX 5070, priced at $549.
AMD reportedly plans to accelerate the end-of-life timeline for its RX 7800 XT, currently priced at $479. Sources from ITHome suggest production ceased as early as January, months ahead of the initially planned third-quarter 2025 termination. This accelerated timeline suggests AMD's confidence in the RX 9000 series' ability to deliver superior price-to-performance metrics. The March 2025 launch window for the RX 9000 series arrives at a critical point in the GPU market, as NVIDIA rolls out its Blackwell-based RTX 50 series. AMD's aggressive pricing strategy and the architectural improvements in RDNA 4 position the company to challenge NVIDIA's market dominance, at least in the $500-$600 price range. This competitive positioning could trigger NVIDIA price adjustments, potentially benefiting consumers who have faced consistently high GPU prices in recent years.
Sources:
ITHome, via PK Insight
103 Comments on AMD Plans Aggressive Price Competition with Radeon RX 9000 Series
Since it and the 7900xtx will probably clock similarly, both are probably good for low-RT 1080p in Hogwarts. Yes, I know the 5070ti is cheaper, but the xtx also has 24GB of RAM and a ton more raster for 4k w/o RT.
So, that's how that goes. I wouldn't buy anything below a 4080 for RT, personally. But I also think there's way better value to be had not buying into it right now and judging $/perf on raster. JMO.
Spoiler alert: I think they are both garbage, too.
I wonder what those will cost? Probably not $300, but probably not $500 either.
Gaming is 7% of AMD's total revenue. I've been saying it for a long time: AMD does not care about gaming. RDNA was cut for a reason. They don't want to devote resources to gaming. They have razor-thin margins and it's a tiny portion of their portfolio.
The 7800XT, for example, still has 64MB of cache; its GCD+MCDs total 346mm^2. The Radeon 6700XT has 96MB of Infinity Cache on a single monolithic 335mm^2 die. If you don't count the MCDs, then let's not count the Infinity Cache die space on the monolithic chips either.
In the real world though, MSRPs aren't real. Neither is stock, apparently. If both MSRPs are real, people will get the one that's in stock now, not the one with a 3-month ETA. If both have stock, people will get the one at MSRP, not the one at +30%. We won't know until both are out and a week has passed for the initial pre-tariff stock to run out. Which is another fun thing to factor in.
Because of the RAM, the 7900xtx both clocks higher AND has more units (not RT; 6 vs 7) at that higher clock. It's not a contest... it's a bloodbath.
It's really a choice between 1080p high RT (5080) and having to overclock for 1080p low RT (7900xtx). I choose to wait, because the 5080 has too little raster/RAM and the 7900xtx too little RT.
The 5070ti, I'm sure, will be okay in raster for 1440p. It will just suffer in RT; it'll be similar to a 7900xtx.
5070 is a joke. But so are N48 stock clocks, so IDK. Both are (probably) 4 engines. Both will suck at RT. Depending on where they end up clocking, one will suck less. N48 will destroy 5070 in raster.
I want to see N48 running at 50% scale with 4x FG so I can call it a 5090 to 5070's 4090. Not really, but I hope someone runs that test for shits and giggles.
As for the RX 9070, $499 is too much. If rumors about the 9070 XT's performance are true, no one will go for the 9070 at this price. $449 or $469 suits the 9070 much better.
They can keep FSR and FG to themselves, I've never asked for this shit.
So 4 × 36.6 × 0.777 + 200 ≈ 314mm^2 for a hypothetical 5nm monolithic N32. It would likely be smaller than N22. This is likely why AMD is discontinuing N32.
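Spelling that estimate out as a quick sketch (all inputs are the thread's figures, not official die measurements; I'm reading 0.777 as an assumed 6nm-to-5nm area shrink for the MCDs):

```python
# Back-of-envelope area estimate for a hypothetical monolithic 5nm N32.
# All inputs are the thread's figures, not official numbers; 0.777 is read
# here as an assumed 6nm -> 5nm area scale factor for the memory-cache dies.

MCD_COUNT = 4
MCD_AREA = 36.6           # mm^2 per 6nm MCD (assumed)
GCD_AREA = 200.0          # mm^2 for the 5nm GCD (assumed)
SHRINK_6N_TO_5N = 0.777   # assumed area scaling if the MCDs moved to 5nm

N22_AREA = 335.0          # mm^2, monolithic Navi 22 (6700XT), for reference

as_shipped = MCD_COUNT * MCD_AREA + GCD_AREA
monolithic_5nm = MCD_COUNT * MCD_AREA * SHRINK_6N_TO_5N + GCD_AREA

print(f"N32 as shipped (GCD + 4 MCDs): {as_shipped:.1f} mm^2")      # ~346.4
print(f"Hypothetical monolithic 5nm:   {monolithic_5nm:.1f} mm^2")  # ~313.8
print(f"Smaller than N22 ({N22_AREA:.0f} mm^2)? {monolithic_5nm < N22_AREA}")
```

Note the as-shipped total (~346mm^2) matches the GCD+MCD figure quoted earlier in the thread.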
It was a VERY good chip, and probably relatively cheap to make (even with assembly, given the lower wafer price on the older node)...
... but they don't want people to keep buying it like they did N22 (which dropped quickly below $400 after launch, and well under $300 after it was replaced, making it a very good deal)...
... I don't think they want people out there buying 7800xt's for as low as $300, up to $400, because people would probably not buy anything else (unless extremely high-end).
Who tf is buying this shit? AMD needs to undercut nvidia's MSRP to create a bloodbath or NO SALE.
Guys, that's your non-AMD alternative. Have fun.
I wonder how much of that was required by the reviewer's guide? Give me the over/under on 50%.
Those are averages, and those are stock clocks (which the later product can always adjust to beat the former AT STOCK).
Heavy reality distortion field, and why I like to give W1z hell. What would you say that average lead is in minimums, the thing that actually matters for consistent, non-stuttering performance? Don't read ahead; guess.
The 5080's problem is the lows, which are what matter: in many cases it will not hold 60fps once a game exceeds the buffer, and those cases will only increase. This is how nVIDIA fools people. I'm not shilling. I'm telling you the truth.
16GB is going to become the MINIMUM for 1440p. 18GB the average. 24-32GB the high-end for 4k. You understand that or you don't. IDGAF.
It is important to notice that much of W1zzard's suite is not games that exceed a 16GB buffer... and even in his case he says the difference is only 12-13%. Did you guess right based on that chart?
No? MAYBE THAT'S WHY I GIVE HIM SHIT. Crazy how NOT ACCURATE that graph looks, isn't it? CRAZY HOW THE RAM AND THE MINIMUMS ARE THE PROBLEM... LIKE I'VE SAID OVER AND OVER.
He literally complains when a game uses VRAM, and doesn't include those games after initial tests for exactly this reason. This is why he thought 8GB was enough, and maybe now thinks 12GB is. Understand the joke.
Not only can that difference be made up by overclocking an XTX (usually ~15%), it will actually retain good scaling and minimums through that range because of the RAM. Quick sanity check below.
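A minimal sketch of that claim, assuming the ~13% average gap quoted above and ~15% of OC headroom (both ballpark figures from this thread, not benchmark data):

```python
# Does a ~15% XTX overclock cover a ~12-13% average deficit to the 5080?
# Both inputs are ballpark figures from this thread, not measured results.

stock_gap = 0.13      # assumed: 5080 leads a stock XTX by ~13% on average
oc_headroom = 0.15    # assumed: typical XTX overclocking gain

ratio = (1.0 + oc_headroom) / (1.0 + stock_gap)
print(f"OC'd XTX vs stock 5080 average: {ratio:.3f}x")  # ~1.018x, gap nominally closed
```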
Wonder how many games the 5080 will have ~56FPS minimums in, with an OC not fixing it? I don't know, but that's literally THE ONLY GAME THAT'S BEEN TESTED POST-5080 and it's already choking REGARDLESS.
Upscaling 1440p RT to 4k? WHOOPS!
I don't care what you buy, guy. Just remember: JUST LIKE I SHAT ON the 4070Ti bc 12GB for 1440p raster/1080p RT, which is now true,
THIS IS THE EXACT SAME THING THAT WILL HAPPEN for 4K WHEN RUBIN LAUNCHES, if not sooner with a 24GB 5080. In fact it already is, just not in games in W1zzard's suite. This is why nVIDIA reviews suck.
If not for 4k raster/1440p RT over the 7900xtx, then what is the purpose of the 5080? The 7900xtx can run 4k raster (and has the buffer not to stutter) and 1080p RT.
Same for the 4070ti/5070 over literally any cheap card that's a 6800xt or better: 1440p raster? 1080p RT? Whoops. Does not compute. Error. Marketing bullshit detected. Outdated.
Certainly not <45TF or 12GB being the problem. Certainly not nVIDIA capping it under those VERY OBVIOUS requirements. Certainly not N48 being exactly what it is precisely to avoid those problems.
Certainly not the same thing happening with the 5080 and ~60TF/16GB (the 4080 a problem for one, the 5080 for the other), which is its own threshold for a TON of games at 1440p RT (etc.) or 4K raster.
Please understand logic. This is what nVIDIA does. You see it or you don't. If you can't, that's fine. I can't help you, but I tried. IYKYK.
But don't cry about it when it happens as soon as literally ANYTHING that could surpass it from them launches, BC THAT'S WHAT THEY DO. ON PURPOSE. That's my point.
They don't have to limit things this way, but they do. Because of that you literally cannot be safe with ANYTHING other than a '90'-class card to last through any length of time at a given rez.
6800xt lots of people hated on versus the 3080? Still freakin' going, guy. 7800xt even better. Cheap. As. Hell. Still kickin'. Fine-ass-not-ripping-you-off-Wine.
The 3080 people wet their pants over; how's that holding up? Glad you asked. LOOK. Totally not planned obsolescence. Overclock it? Good luck! OH WAIT. IT'S ALMOST LIKE THEY DON'T WANT IT TO BE 60FPS.
I could do this all day. BC it is literally that apparent. Some people noticed it with the 4060 Ti 8GB. Some noticed it with the 4070Ti 12GB. Some people the 3080 10GB. Now 5080 16GB.
At some point it's not a fluke, but a trend. Or rather their GD BUSINESS MODEL. AND IT'S FUCKING EVIL BC THERE IS NO REASON FOR IT OTHER THAN $. They control EVERYTHING and this happens.
And that's why AMD can't even START. Because they don't understand that nVIDIA has literally the whole market rigged. They have to compete on value and common sense. That's the only way. Intel too, btw.
If these are going to be the prices I don't see the point, even with better performance. Just seeing that you can get team green for the same price is enough for most people to not flip. They don't research benchmarks or really care about RT, it's all about price and perception.
Truth: it didn't. All models are alive, well, lurking around, and available for sale (for the few who would buy them). This is a different category of chip: the ultra-high-end enthusiast part, with 24 GB of VRAM, the highest shader count, and the biggest, most complicated, fastest GPU from Radeon to date.
On the other hand, Navi 48 is just a low-end to mid-range mid-cycle refresh, backported from 2nm or 3nm, because Moore's law is dead and because TSMC charges insane amounts for its wafers, which no one except Apple can afford to pay.
If it wipes the floor with the 5070Ti for $50 less, who cares? Everyone who wants it priced at $500 for 25% greater performance than nvidia needs to get their thoughts checked and corrected. AMD isn't a charity case.
This is straight from AMD employees I've spoken to: they don't even know how much money they will make from MI300 chips in 2025. Guess how much they thought they would make in 2024? Initially they claimed 1 billion, then revised it to 2, then 4, then 5+, which is what it amounted to. Guess what their customers are saying? That they are super professional to work with, and their deployments are stable and do the job. OpenAI said they are quicker and more efficient than nvidia for inference, which is why they deployed 100,000+ of those instead of nvidia, and this year they're focusing head-on on training instead. The same Epyc guys who have a stellar reputation in the field are the ones branching out and doing these deployments. THAT is what AMD's focus is, because their rate of growth (for NV too) has been astounding. But AMD is literally doing this from scratch, because unlike nvidia, their Instinct sales before 2024 were minuscule. They went from 100 million to 5+ billion in less than 2 years. That takes a lot of resources to ensure it's done right.
If there was no CUDA monopoly, MI300 would cause a world of trouble for nvidia's latest, greatest, highest-end chip. So yes, AMD is perfectly capable of releasing hardware that's competitive with nvidia at the highest end with a fraction of the budget, just like they did for decades against intel. But no, twerps want them to treat gamers like a charity case and give them 4080-performing chips for $300 while nvidia releases bullshit successors for a grand performing 10% greater than their predecessors, like it's a failed CPU launch. It just isn't happening.
Edit: This twerp wouldn't mind a 4080 competitor for $550 though... make it happen!
Matching the VRAM of the 3090/4090 on next-wave GDDR7 at any price would be absolutely cutthroat.
The only possible team green response would be 32GB halo cards they can't even offer to consumers. The current panic over the 40/50 series says there's no room for doubt. Who is buying?
Who the fuck is buying last gen's lineup at 2-3x MSRP? Who is it? Isn't me. This one could be a breaking point in the very near future.
There have been experiments with it before and they work great.
AMD is still cooking.