Thursday, January 30th 2025
AMD Radeon 9070 XT Rumored to Outpace RTX 5070 Ti by Almost 15%
It would be fair to say that the GeForce RTX 5080 has been quite disappointing, coming in roughly 16% faster in gaming than the RTX 4080 Super. Unsurprisingly, this gives AMD a lot of opportunity to offer excellent price-to-performance with its upcoming RDNA 4 GPUs, considering that the RTX 5070 and RTX 5070 Ti aren't really expected to pull off any miracles. According to a recent tidbit shared by the renowned leaker Moore's Law is Dead, the Radeon RX 9070 XT is expected to be around 3% faster than the RTX 4080, if AMD's internal performance goals are anything to go by. MLID also notes that RDNA 4's performance is improving by roughly 1% each month, which makes it quite likely that the RDNA 4 cards will exceed those targets.
If it does turn out that way, the Radeon RX 9070 XT, according to MLID, should be roughly 15% faster than its competitor from the Green Camp, the RTX 5070 Ti, and roughly match the RTX 4080 Super in gaming performance. The Radeon RX 9070, on the other hand, is expected to be around 12% faster than the RTX 5070. Of course, these performance improvements are limited to rasterization, and when ray tracing enters the scene, the gains are expected to be substantially more modest, as per tradition. Citing our Cyberpunk 2077 4K with RT data, MLID stated that his sources indicate the RX 9070 XT falls somewhere between the RTX 4070 Ti Super and RTX 3090 Ti, whereas the RX 9070 should trade blows with the RTX 4070 Super. Considering AMD's track record with ray tracing, this sure does sound quite enticing.

Of course, it will all boil down to pricing once the RDNA 4 cards hit the scene. If AMD does manage to undercut its competitors from NVIDIA by a reasonable margin, there is no doubt that RDNA 4 will be the better choice for most people. However, NVIDIA's undeniable lead in ray tracing, paired with DLSS 4, will presumably make things more complicated than ever before. It is also unclear what AMD has up its sleeve with FSR 4. Recent rumors do point to pretty good compatibility, but as with all rumors, be sure to take any pre-release whispers with a grain of salt.
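For readers who want to see how the quoted figures fit together, below is a minimal back-of-the-envelope sketch in Python; the percentages are simply the rumored numbers from the paragraphs above, indexed against the RTX 5070 Ti and RTX 5070, and none of them are measured results.

```python
# Rough composition of the rumored relative-performance figures quoted above.
# These are rumor/target numbers, not benchmark results.

rtx_5070_ti = 100.0                 # index the RTX 5070 Ti at 100
rx_9070_xt = rtx_5070_ti * 1.15     # rumored ~15% ahead of the RTX 5070 Ti
rtx_4080_super = rx_9070_xt         # rumored to roughly match the RX 9070 XT

rtx_5070 = 100.0                    # separate index for the lower tier
rx_9070 = rtx_5070 * 1.12           # rumored ~12% ahead of the RTX 5070

# Implied corollary of the two upper-tier claims: the RTX 5070 Ti would sit
# at roughly 1 / 1.15 ~= 87% of the RTX 4080 Super in rasterization.
print(f"RX 9070 XT index vs RTX 5070 Ti: {rx_9070_xt:.0f}")
print(f"RX 9070 index vs RTX 5070: {rx_9070:.0f}")
print(f"Implied RTX 5070 Ti share of RTX 4080 Super: {rtx_5070_ti / rtx_4080_super:.0%}")
```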
Source:
MLID via YouTube
74 Comments on AMD Radeon 9070 XT Rumored to Outpace RTX 5070 Ti by Almost 15%
We always need to wait for the reviews to get the final numbers, and it's clear nobody wants to sell nonsense like NVIDIA did with the 5070 = 4090 claim ;)
By the way, have you seen the benchmarks of the XTX with the DeepSeek R1 model compared to the 4090? As soon as you step out of the NVIDIA "ecosystem", which has become as closed and dangerous for evolution as Apple, and move to a more open system, the numbers start to look different. I'd love to see what would happen if game developers started catering to NVIDIA a bit less. It's logical: NVIDIA has a much larger market share and has been leveraging it for years, but there's life beyond it as well.
9070xt will definitely be very competitive against the 5070Ti. By the way, it’s not new to see the 9070xt performing close to the XTX in raster. There’s always been talk of achieving performance near the 4080 or XTX in raster. And we’ve known for months that ray tracing has improved significantly, especially since the PS5 Pro with RDNA3.5 (at least +50%) was introduced, which has been somewhat of a development platform for RDNA4.
The 5080 was never its target, the focus remains on the 5070Ti. But considering how little improvement NVIDIA has brought, it will still be exciting to see the 9070xt in the high-end range of the charts!
The other thing I noticed is that in the NVIDIA 5xxx reviews, TPU had that strange, unexpected and frankly unnecessary line about not being sure if AMD will be in the GPU space in a couple of years. Didn't really expect it from them, as it was... idk, something the trashy rumor sites would post, and I'll stop at that. Anyway, what people fail to realize are the development cycles and the company's position at the time.
1) GCN was being developed around 2008-2009, when AMD inherited the arch during its infancy from ATI, who were doing quite well. It was a banger, and even though they ran out of money right after launch, it served them well for a decade.
2) RDNA was developed around 2016-2017 when AMD were deep in debt and putting all their money, hopes and dreams on Ryzen. It turned out okay, but nothing close to what GCN achieved.
3) UDNA is being developed now, when AMD have money, resources, time and a bunch of clowns in their marketing department. Speaking to people at AMD, they're putting a lot of resources into that thing and rightly so - their whole AI money pit depends on it. There's every possibility it's going to be another banger, but let's wait and see. I just can't see it being worse than RDNA on the 'relative to competition' basis.
It's supposed to launch around the same time TPU claims AMD discrete GPU division might not be around so erm..let's wait and see I suppose.
I didn't even know about the janky comment on "AMD not being in the business later". I stopped reading/watching conclusions ages ago because they are way too swayed by the reviewer's personal taste, and that's not just TPU, it's everywhere. Anyway, why would AMD quit the business when they own basically the entire console APU market? That comment is just stupid.
Caution: The above post contains twisted humor, sarcasm and cannot be used as grounds for legal action.
Gotta sell it later with 24GB of ram as 5080 Super when 3GB prices go down. (Probably Micron but ignore that part!)
Whoops, doesn't have 8 clusters so it doesn't need 24GB of ram, better sell it again as 6080 with a slightly higher clockspeed when we can make it cheaper on 3nm.
I ain't even lying. 5080 16GB is ridiculous; always was. That's why these cards will get so close bc they're well-matched in terms of compute/vram.
As I've said before N31 was equalized to 2720mhz with 20gbps ram. If you figure the same cache and it's essentially 2/3 N31, it would be the same.
The difference is it's most-likely used in a 7511.111/64 ROP use-case, hence you get the clockspeed of 2.97ghz. Strangely, just above where a 7800xt can overclock...WEIRD!
To me, these rumors look correct in terms of absolute performance. Tom is a reliable guy that shares credible information (at the time he receives it) and they fit with my thesis of their likely capability.
That said, again, to me, this looks like OC/absolute performance... unless AMD is adjusting clockspeeds to where I think they should have always been all along and there's some cache/faster ram shenanigans we aren't privy to.
I come here for the charts; voltage/clock ranges etc. To deduce bandwidth/arch limitations through them, etc.
There is great info here, but it ain't in the conclusion.
Also, I have only once accused W1zard of payola from PNY/nVIDIA. Only once. Because I don't want to get banned. That said, he created some amazing tools/charts that these companies use to create their product stacks because many consumers see them as the gold standard, so he deserves some kickbacks... I mean, respect for that wonderful continued work over many years. Remember: 8GB is enough.
I listen to DF to scream out loud about their bias, mostly (which I think hurts/confuses many consumers). They certainly have ins at nVIDIA for info, which, if you parse through the regurgitated marketing (they perhaps have to say to keep them), is actually interesting. It's a shame many in the general populace get brainwashed by it though.
Remember that time a guy revealed to them that frame-gen was done on tensor? That was pretty hilarious (whoops!). Guess we can't sell the Optical Flow snake oil anymore; put it in the back with the G-sync module. I'll be looking forward to DF never mentioning it again so FG doesn't have to be back-ported.
Again, I go to them for their frame-time/rate vids, image comparisons, mentioning the resolution scale, analyzing clocks etc...and they're very good at that. They deserve massive respect for pioneering/updating those tests et al for more consumers to see. Also, sometimes they zoom-crop FFVIIR bikinis, sometimes they don't. I feel that. I do.
But I'll never forget the time they had to be over-nighted a card to test an AMD feature, because they don't even keep them around. That was pretty telling. Speak for the average/balanced/long-term consumer wrt products they do not. Proudly, I guess. If they could do it with a little bit less propaganda though, that would be cool.
With efficiency swaying so much from game to game, there needs to be an average of sorts, I suppose. CP also comparatively shows AMD worse off.
There's a power draw average for games, and a performance summary as well in the review itself. Pretty sure an equation can be put in there because the data for the averages already exists. But I haven't put much thought into it; it's only something that crossed my mind rn. Because when I glanced over that data, it seems the 5080 consumes around 10% more power for 12% more performance. Certainly not 11% more efficient overall, not even close.
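A quick way to sanity-check that claim is to divide the performance ratio by the power ratio rather than subtracting percentages; here is a minimal sketch using the rough figures above (the 12% and 10% numbers are the ballpark estimates quoted in this post, not review data).

```python
# Rough efficiency check using the ballpark figures quoted above
# (~12% more performance for ~10% more power); not measured data.

perf_ratio = 1.12    # 5080 vs. comparison card, performance summary
power_ratio = 1.10   # 5080 vs. comparison card, average gaming power draw

# Efficiency is performance per watt, so the relative efficiency is the
# ratio of the two ratios, not the difference between the percentages.
efficiency_ratio = perf_ratio / power_ratio
print(f"Relative efficiency: {efficiency_ratio:.3f} (~{(efficiency_ratio - 1) * 100:.1f}% better)")
# -> roughly 1.018, i.e. about 2% more efficient, nowhere near 11%
```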
The poetry writes itself, it's like lady luck loves AMD. lol.
Watch AMD release a 32gb 9070xt just to mind f'k the market with AI.
I get very saddened by people buying into nVIDIA's savvy crap that does not benefit them long-term but that they think does (until they complain about it later), and reviewers aren't helping. I won't get into it.
Some people kinda/sorta already did, but it goes beyond that in ways I don't want to get into. I don't want to start a fight with any Youtube math teachers that want allocation.
I look to AMD for solace, it isn't there, and I get mad. It doesn't mean they don't and/or can't make good-value products, but they used to LEAD in very important aspects, and they used to make their strengths known.
I know it comes out in my posts, and I apologize for that. There's just something about the culture changing from nerds to normies who think they understand, but don't; really, it's mostly nVIDIA marketing.
...and AMD's marketing is awful to boot, which doesn't help. The whole 9070 series thing is a gigantic clusterfuck the likes of which I have never seen before, and they should be ashamed.
I get that they want time to catch up on features, but how they went about exposing this series and then trying to shove it back in the closet is beyond ridiculous. The price/placement uncertainty...it's bad form.
Whoever they have now in marketing is no Dave Baumann. Hell, whoever they have now is no Scott Herkelman. Very obvious things are in disarray now, perhaps because of layoffs. Higher than that, my friend. Overclock a 7800xt. It's clocked in the toilet at stock for marketing purposes of this very card.
It gains around 19% performance in many cases. Look at W1zard's reviews.
The best a 7800 can clock is 2936mhz, but a 7700 can hit 3133mhz (oddly similar across multiple cards). That doesn't make any sense other than obv the 7800xt was going to be clock-limited but instead got PL-limited.
Oddly, 7900xtx can also hit around 3165-3200mhz on the same arch before going power bananas?
It's stinky. Reeks of artificial product segmentation (granted RDNA4 has 8-bit ops; tensor cores). He's not wrong (for the most part).
The question truly is how high it will clock. If it's only ~3.3-3.4ghz max, that's bad considering die size is ~15-20% larger than it should be.
Not necessarily bad for those products, but for the chip overall if it can't be binned higher (at >375w).
If it's 3.5-3.7ghz on 3x8-pin (and release a >20gbps ram card), then we're talking an actual improvement wrt chip design and not just marketing tactics.
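To put those clocks in perspective, here is a minimal sketch of the relative headroom; every value below is just a figure quoted in this thread (observed max overclocks and rumored targets), not an official spec, and the RDNA4 entries use the midpoints of the quoted ranges.

```python
# Relative headroom between the clock figures quoted in this thread.
# All values are forum/rumor figures, not official specs.

clocks_mhz = {
    "7800 XT max OC": 2936,
    "7700 XT max OC": 3133,
    "7900 XTX max OC": 3180,      # midpoint of the quoted 3165-3200 MHz range
    "RDNA4 'meh' case": 3350,     # midpoint of the quoted ~3.3-3.4 GHz
    "RDNA4 'good' case": 3600,    # midpoint of the quoted ~3.5-3.7 GHz
}

baseline = clocks_mhz["7800 XT max OC"]
for name, clk in clocks_mhz.items():
    delta_pct = (clk / baseline - 1) * 100
    print(f"{name}: {clk} MHz ({delta_pct:+.1f}% vs 7800 XT max OC)")
```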
There have been times in the past where I couldn't really understand why reviewers were taking up the angle they were. AMD had better architectures a few times going up against nVidia, but it wasn't really reflected in reviews and (partly) consequently sales. Take the 290X - it was architecturally much superior to the GTX 780, and that was reflected in its longevity. Guess what the reviewers and consequently general people thought of it? It's hot, loud and sucks power. Sales? Pfft. To those who weren't there: no, there were no proprietary features that were worth their salt, nothing. Sure, nvidia kept trying one thing after the other to lock consumers into their ecosystem, but they all failed till then (they certainly learned from their failures though - see today). AMD's brand perception of being the 'cheaper Intel' didn't help, nor did their decade of pulling GCN along when they were broke. I don't want to really get into it either, but I do hope things change (as a whole) in the next decade.
Then there's Mantle. AMD literally fixed a whole clusterfuck of issues under the hood, and it paved the way for the DX12/Vulkan we all enjoy now. Most people are under the impression it's only Nvidia who released all the great new features throughout the past couple of decades. Sure they did, they released a whole heap of features and a few of them turned out to be great successes today, but I won't go into detail about the ones they bought and killed off, or the pissfight they had with tessellation. There were times it felt like (and it turned out to be true) they were only trying to increase their performance margin at the expense of consumers. I won't get into the fact that it's still happening. Not a word from peeps though, it's okay. But let's not forget that the other camp did a lot for your GPUs as well.
Lisa needs to talk to a few ex-ATI people in marketing. And fire Azor and a couple of the other clowns today. This stupidity really needs to stop. I thought they got their marketing shit together with the 6xxx launch, but then they decided to go ahead and fuck up the other two, somehow one of them before they even launched it. That's a new low tbh.
AMD/ATI has come back from way behind a few times. The 9700 Pro, 4870, and to a lesser extent the 3870 and 7970, are the ones that come to mind. Hell, the 6900 XT was the same and it wasn't that long ago. No one's lead is insurmountable, but proprietary features are hard to crack, and I doubt I'll see much change in the competitive landscape anytime soon. One can hope.
Everyone thought AMD delayed the 9070 XT because they found out that Nvidia's cards are too good so the price had to be adjusted down.
What if they actually found out that Nvidia doesn't offer anything on top of last gen in the midrange, so the price on the 9070 XT actually has to be adjusted up?
So it's not like "hey look, the 5070 Ti is only $750, so we can't sell the 9070 XT for $900", but instead "look at these pieces of crap, we really shouldn't be selling the 9070 XT for $500, how about $700 instead".