Monday, February 24th 2025

AMD Radeon RX 9070 and 9070 XT Official Performance Metrics Leaked, +42% 4K Performance Over Radeon RX 7900 GRE
AMD's internal benchmarks of its upcoming RDNA 4-based RX 9070 series graphics cards have been leaked, thanks to VideoCardz. The flagship RX 9070 XT delivers up to 42% better performance than the Radeon RX 7900 GRE at 4K resolution across a test suite of over 30 games, with the standard RX 9070 showing a 21% improvement in the same scenario. The performance data, encompassing raster and ray-traced titles at ultra settings, positions the RX 9070 series as a direct competitor to NVIDIA's RTX 4080 and RTX 5070 Ti. Notably, AMD's testing methodology focused on native rendering and ray tracing capabilities rather than upscaling technologies like FSR. The RX 9070 XT demonstrated large gains at 4K resolution, achieving a 51% performance uplift over the two-generation-older RX 6900 XT. Meanwhile, the base RX 9070 model showed a 38% improvement over the RX 6800 XT at 4K with maximum settings enabled.
While AMD confirms its new cards are designed to compete with NVIDIA's RTX 50 series, specific comparative benchmarks against the RTX 5070 Ti were absent from the presentation. AMD acknowledges it has yet to acquire the competitor's hardware for testing. The company is expected to provide a comprehensive performance overview, potentially including additional GPU comparisons, during its official announcement on February 28. Both RX 9070 series cards will feature 16 GB of VRAM, matching the memory configuration of the RX 7900 GRE used as a primary comparison point. By the official launch date, AMD will have time to push final driver tweaks for optimal performance. Nonetheless, more information will surface as we near the official release date.
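As a quick sanity check, the leaked percentages are internally consistent: the +42% (vs. the 7900 GRE) and +51% (vs. the 6900 XT) claims together imply the GRE runs roughly 6% ahead of the 6900 XT in AMD's suite, which is plausible. A minimal worked example; the 60 FPS baseline is a made-up illustration, not a measured number:

gre_fps = 60.0                 # hypothetical RX 7900 GRE 4K result (made up)
xt_fps = gre_fps * 1.42        # RX 9070 XT: +42% over the GRE (from the leak)
vanilla_fps = gre_fps * 1.21   # RX 9070: +21% over the GRE (from the leak)

# The +51% (9070 XT vs 6900 XT) and +42% (vs GRE) claims together imply
# the GRE leads the 6900 XT by roughly 1.51 / 1.42 - 1, about 6%.
implied_gre_vs_6900xt = 1.51 / 1.42

print(f"RX 9070 XT: ~{xt_fps:.0f} FPS, RX 9070: ~{vanilla_fps:.0f} FPS")
print(f"Implied GRE lead over 6900 XT: +{(implied_gre_vs_6900xt - 1) * 100:.1f}%")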
Source: VideoCardz
134 Comments on AMD Radeon RX 9070 and 9070 XT Official Performance Metrics Leaked, +42% 4K Performance Over Radeon RX 7900 GRE
I honestly don't understand why this BS is still so prevalent.
Very, very few games are worth the insane performance hit that RT demands in return for its so-called benefits.
I have not seen one game yet that makes me think, man, I cannot play this without RT.
But it's the first thing that today's dumb consumers out there will say: "oH bUt MaH RT yo!"
I just hate the PC market, and that company starting with N especially.
The $250 B580 is more like €325 over here, and it's the best value option. I am done with it.
How long will it take for something similar to the 7800 XT to get below £400?
I think it will take at least another six months.
For now, for around £400 you can get a 7700 XT 12 GB, which is OK-ish.
By the end of the year, I hope 7800 XT performance with 16 GB will be well below £400.
I mean, I already set my budget and am now waiting for the performance to meet it, instead of falling for the upsell craze. Time will tell, but I need to consider quitting all that craziness.
I don't get it, I just want to have fun when I play a game, not worship this RT nonsense.
On a more serious note:
I personally don't really care about a graphics card's overclocking headroom; I'd be more impressed by its undervolting capacity. Seeing how much wattage I can shave off while keeping performance in an acceptable bracket, close enough to full performance, sounds better to me.
Four days after launch and they still don't have a 5070 Ti: either it's true and it's just hilarious, or they did receive a card directly from Nshittia but claimed otherwise as a subtle jab, lmao. The duality of the human mind.
I really hope we won't get many more of those "RT mandatory" games in the future, because while some do have improved looks, it doesn't mean all of them look good with RT, and the performance hit *sucks ass* BIG TIME. I'm much more interested in applications like RT-driven audio (where it's not actual ray tracing, but the RT cores are used for audio reflections in a room/environment). Multiplayer games where players likely won't turn on RT because they need the frames, but where audio matters, like CoD or, especially, Tarkov, could make use of the idle RT cores.
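For what that RT-for-audio idea could look like in practice, here is a toy sketch of the occlusion half of it. The geometry, ray budget, and attenuation curve are all invented for illustration; a real engine would trace against the actual scene BVH on the RT cores instead:

import math

def segments_intersect(p, q, a, b):
    # True if segment p-q strictly crosses segment a-b (orientation test).
    def cross(o, u, v):
        return (u[0]-o[0])*(v[1]-o[1]) - (u[1]-o[1])*(v[0]-o[0])
    d1, d2 = cross(a, b, p), cross(a, b, q)
    d3, d4 = cross(p, q, a), cross(p, q, b)
    return (d1*d2 < 0) and (d3*d4 < 0)

def occlusion(listener, source, walls, n_rays=8, jitter=0.5):
    # Fan a few slightly rotated rays toward the source and count
    # how many are blocked by wall segments.
    blocked = 0
    dx, dy = source[0]-listener[0], source[1]-listener[1]
    for i in range(n_rays):
        angle = (i / n_rays - 0.5) * jitter
        rx = dx*math.cos(angle) - dy*math.sin(angle)
        ry = dx*math.sin(angle) + dy*math.cos(angle)
        target = (listener[0]+rx, listener[1]+ry)
        if any(segments_intersect(listener, target, a, b) for a, b in walls):
            blocked += 1
    return blocked / n_rays

walls = [((3, -2), (3, 2))]                  # one wall between them (made up)
vol = 1.0 - 0.8 * occlusion((0, 0), (6, 0), walls)
print(f"effective volume: {vol:.2f}")        # muffled, not silent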
Sadly enough, I can't even say that none of this matters on the grounds that computer gaming is just a luxury. Nvidia controls so much tech and wealth that they can actually affect society outside computer gaming.
Oh well, time to escape from reality and go back to Valheim. A truly enjoyable game that doesn’t need RT to be fun.
CG artists working in games are CG artists first and (maybe) gamers second. RTRT is something that many of them wanted to achieve for years, but GPU makers weren't about it for a while. No hardware optimised for it meant that no progress could be made on the software side. Now that the hardware and software are out there for everyone to experiment with, progress can be made on figuring out how RTRT can be further optimised, both software- and hardware-side.
One thing to note, though: those optimisations might mean that future games will use heavier RT/PT effects, rather than doing the same thing they are doing now, but faster. That's also how raster graphics evolved, after all.
I've said it before, but the CG industry's interests won't always align with gamers' interests.
I'm in the market; my 3080 Ti died three days ago, and I'm prepared to buy a new FreeSync-compatible monitor if the price and performance of the 9070 XT are right (G-Sync only at the minute). If it looks like AMD are in the game, it'll be worth it just to have a viable Nvidia competitor. My last Radeon was an HD 6570 for an HTPC (remember those?), and I'm using that right now :-(
I'd have preferred a much slower rollout of RT-related tech, maybe something hybrid at first, like RT-guided conventional lighting? I'll try to explain the idea as best my mere mortal mind can picture it: instead of producing so many rays, why not cast just a few and use the data from those to enhance the quality of the conventional lighting, giving it the information it needs to mimic actual RT? (See the sketch after this comment.)
I am no dev, physicist, or mathematician, but the idea made sense to me, so... I got the concept from PirateSoftware's Heartbound, whose developer explained he ran some RT methods on the CPU for some shadows, and the game could still run on an iGPU, so... eh, why not? As it is (imho), full RT lighting/reflections are little more than a marketing platform on which Nshittia boosted itself, trying to convince players that they absolutely needed that tech and that no competitor could provide this pHony Grail of graphics.
That's not to say it doesn't look amazing in the experimental benchmarks/technological demonstrators I've seen of it. I definitely want to see more of this in future games, just not used as a shitty prop for the $3T company...
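For what it's worth, that "few rays guiding conventional lighting" idea is roughly what irradiance-probe and low-ray-count GI approaches already do. A toy sketch of the probe flavor, with the scene, probe grid, and ray budget all invented for illustration:

import math, random

random.seed(1)
BLOCKERS = [(3.0, 0.0, 5.0, 4.0)]   # one axis-aligned box: x0, y0, x1, y1

def ray_blocked(x, y, angle, step=0.25, max_t=40):
    # March a ray from (x, y); blocked if it enters any box.
    dx, dy = math.cos(angle), math.sin(angle)
    for t in range(1, max_t):
        px, py = x + dx*step*t, y + dy*step*t
        if any(x0 <= px <= x1 and y0 <= py <= y1
               for x0, y0, x1, y1 in BLOCKERS):
            return True
    return False

def probe_sky_factor(x, n_rays=4):
    # The whole point: only a handful of rays per probe, not per pixel.
    open_rays = sum(
        not ray_blocked(x, 0.0, math.pi * random.uniform(0.1, 0.9))
        for _ in range(n_rays))
    return open_rays / n_rays

# Bake a coarse probe line along the ground, then shade by interpolating.
probes = {px: probe_sky_factor(px) for px in range(0, 11, 2)}

def ambient_at(x, base_ambient=0.3):
    lo = min(max((x // 2) * 2, 0), 8)
    hi = lo + 2
    t = (x - lo) / 2
    sky = probes[lo] * (1 - t) + probes[hi] * t
    return base_ambient * sky        # the rasterizer's flat ambient, RT-guided

for x in (1, 4, 8):
    print(f"x={x}: ambient {ambient_at(x):.2f}")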
Using the percent chart from VideoCardz in Excel, this video from Testing Games from about two months ago, and assuming their settings are the same, I worked out the FPS values in STALKER 2, Starfield, Cyberpunk 2077 (non-RT and RT), God of War Ragnarok, and Black Myth: Wukong below.
Not bad performance. But as always, price will be the big determining factor. A 9070 XT around the rumored $550 would be good. The file is attached for whoever wants to pick it apart.
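For anyone who wants to redo the math without the spreadsheet, the computation is just the measured RX 7900 GRE frame rate scaled by the per-game percentage from the chart. The values below are placeholders, not the actual chart or video numbers:

gre_fps_4k = {               # hypothetical Testing Games measurements
    "Cyberpunk 2077":    48.0,
    "STALKER 2":         42.0,
    "Black Myth Wukong": 35.0,
}
uplift_9070xt = {            # hypothetical per-game chart percentages
    "Cyberpunk 2077":    0.45,
    "STALKER 2":         0.40,
    "Black Myth Wukong": 0.38,
}

for game, fps in gre_fps_4k.items():
    projected = fps * (1.0 + uplift_9070xt[game])
    print(f"{game}: {fps:.0f} -> ~{projected:.0f} FPS (projected 9070 XT)")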
They confirmed they wouldn't produce a Big Navi SKU on RDNA 4 (for the others: NO, they didn't say they'd stop making high-end GPUs, they just said they'd focus on the mid/low-mid SKUs for RDNA 4 while reorganizing their R&D effort for UDNA).
Oh, I see... I'm guessing that's why there's so little difference between RT on/off?
Ah yes, ML/AI. Of course, they have to promote what *datacenters* are after and insist on the one feature that caused them to make Blackwell 2.0 (because 1.0 was *that bad*, I'm guessing; see reports of cracked dies from heat in DC racks from big customers) with little raster improvement but SO MANY AI accelerators jam-packed onto those dies...
Of all the things I wanted AI to be used for, graphics wasn't one of them... Imagine how neat it would have been to run games' NPCs on AI from the GPUs! Now that would have been epic. Maybe in games like Stellaris, Cities: Skylines or, idk, CoD campaign enemies?
Also, in reference to CG artists: you must mean that they enjoy real-time ray tracing versus building lower-end scenes that they then render offline for the full effect. Or are you saying that having the RT cores allows full-scene rendering to be done faster?