Monday, February 24th 2025

AMD Radeon RX 9070 and 9070 XT Official Performance Metrics Leaked, +42% 4K Performance Over Radeon RX 7900 GRE

AMD's internal benchmarks of its upcoming RDNA 4-based RX 9070 series graphics cards have been leaked, thanks to VideoCardz. The flagship RX 9070 XT delivers up to 42% better performance than the Radeon RX 7900 GRE at 4K resolution across a test suite of over 30 games, with the standard RX 9070 showing a 21% improvement in the same scenario. The performance data, encompassing raster and ray-traced titles at ultra settings, positions the RX 9070 series as a direct competitor to NVIDIA's RTX 4080 and RTX 5070 Ti. Notably, AMD's testing methodology focused on native rendering and ray tracing capabilities rather than upscaling technologies like FSR. The RX 9070 XT demonstrated large gains at 4K resolution, achieving a 51% performance uplift compared to the two-generations older RX 6900 XT. Meanwhile, the base RX 9070 model showed a 38% improvement over the RX 6800 XT at 4K with maximum settings enabled.

While AMD confirms its new cards are designed to compete with NVIDIA's RTX 50 series, specific comparative benchmarks against the RTX 5070 Ti were absent from the presentation. AMD acknowledges it has yet to acquire the competitor's hardware for testing. The company is expected to provide a comprehensive performance overview, potentially including additional GPU comparisons, during its official announcement on February 28. Both RX 9070 series cards will feature 16 GB of VRAM, matching the memory configuration of the RX 7900 GRE used as a primary comparison point. By the official launch date, AMD will also have had time to push final driver tweaks for optimal performance, and more details should surface as the release date approaches.
Source: VideoCardz

134 Comments on AMD Radeon RX 9070 and 9070 XT Official Performance Metrics Leaked, +42% 4K Performance Over Radeon RX 7900 GRE

#51
forman313
HyderzWay too low, AMD needs to make money too
You have to spend money to make money. AMD needs buyers to gain market share. With everything that is happening (and not happening) with the RTX 5000 series, AMD has two choices: short-term gain by squeezing every single penny out of buyers, or long-term gain by providing so much perf/$ that a lot of the nVidia faithful are won over. If they really have big plans for UDNA, this is the perfect way to get a head start on future sales.
Posted on Reply
#52
PerfectWave
Hope AMD buys a 5070 Ti with one ROP less so their benchmarks will look amazing :roll:
Posted on Reply
#53
Neo_Morpheus
DavenI hate the fact that Nvidia Ngreedia has bribed the influencers formerly known as reviewers to convince its dumb customers that anti-consumer GPU lock-in tech features are more important than fun
Fixed a bit for you. ;)

I honestly don't understand why this BS is still so prevalent.

Very, very few games are worth the insane performance hit that RT demands in return for its so-called benefits.

I have not seen one game yet that makes me think, man, I cannot play this without RT.

But the first thing today's dumb consumers out there will say is "oH bUt MaH RT yo!"
Posted on Reply
#54
oddrobert
Jermelescu€600 for the XT and €500 for the other one, easy +20% market share on desktop.
I already sold my Greedvidia card and am waiting for AMD to deliver. If prices won't come down, I'll make myself scarce, sell all the gaming gear I've got, move on, and read books instead.
I just hate this PC market, and the N-word company especially.
The $250 B580 is more like €325 over here, and it's the best value option. I'm done with it.
How long will it take for something similar to the 7800 XT to drop below £400?
I think it will take at least another 6 months.
For now, for around £400 you can get a 7700 XT 12 GB, which is OK-ish.
By the end of the year, I hope 7800 XT-level performance with 16 GB will be well below £400.

I mean, I already set my budget and am now waiting for performance to meet it.
Rather than fall for the upsell craze. Time will tell, but I need to consider quitting all this craziness.
Posted on Reply
#55
ThomasK
Neo_Morpheus"oH bUt MaH RT yo!"
TPU's own poll proved most people don't care about RT, yet those are the same people whining about lacking RT performance on AMD cards.
Posted on Reply
#56
Neo_Morpheus
ThomasKTPU's own poll proved most people don't care about RT, yet those are the same people whining about lacking RT performance on AMD cards.
Same everywhere else, but the first crap you hear is always "AMD sucks because of poor RT performance."

I don't get it, I just want to have fun when I play a game, not worship this RT nonsense.
Posted on Reply
#57
TheGeekn°72
King MustardThey have always been fine at raster.

I want to know if their ray tracing cores are still abysmal, as I will want to play a lot of RT games over the next three years.
Non-RT lighting methods are extremely mature, as there are two decades of technical advancements behind them. RT is but a gimmick Nshittia convinced game studios to implement so they can sell more cards, and even then it still doesn't look good (see HUB's video about garbled, noisy RT with no object permanence for entities out of FOV).
SL2Why do people think that AMD will give these a good price?

Not that I know the price, but I know their launch day price history.
There's the launch-day price and there's the actual selling price, which has historically been 100-200 below MSRP. I have more trouble finding XTXes at MSRP (I've always found them between 750 and 900) than RTX cards a cent below MSRP.
tugrul_SIMDWhat if the 9070 XT overclocks from 3.15 GHz to 3.5 GHz?
What if the sky was orange and the sun purple?

On a more serious note:
I personally don't really care about a graphics card's overclocking headroom; I'd be more impressed by its undervolting capacity. Seeing how much wattage I can shave while retaining performance in an acceptable bracket close enough to full performance sounds better to me.
PerfectWaveHope AMD buys a 5070 Ti with one ROP less so their benchmarks will look amazing :roll:
4 days after launch and they still don't have a 5070 Ti. Either it's true and it's just hilarious, or they did receive a card directly from Nshittia but still claimed otherwise as a subtle jab lmao
ThomasKTPU's own poll proved most people don't care about RT, yet those are the same people whining about lacking RT performance on AMD cards.
The duality of the human mind. I really hope we won't get many more of those "RT mandatory" games in the future, because while some do have improved looks, it doesn't mean all of them look good with RT, and the performance hit *sucks ass* BIG TIME. I'm much more interested in applications like RT-driven audio (where it's not actual ray tracing of light, but uses the RT cores for audio reflections in a room/environment). Multiplayer games where players likely won't turn on RT because they need the frames, but where audio matters, like CoD or especially Tarkov, could make use of the unused RT cores.
Posted on Reply
#58
Daven
Neo_MorpheusSame everywhere else, but the first crap you hear is always "AMD sucks because of poor RT performance."

I don't get it, I just want to have fun when I play a game, not worship this RT nonsense.
I get your frustration. I really do. GPU customers are trapped right now. A huge majority can't bring themselves to spend money on a non-Nvidia GPU no matter the arguments for and against. No matter the failures by Nvidia (missing ROPs, black screens, burning components, etc.). And AMD and Intel can't do much because both are bound to make mistakes too (and have already), and those mistakes are turned up to 11 undeservedly in some cases so customers can feel good about choosing Nvidia.

Sadly enough, I can’t even say that none of this matters as computer gaming is just a luxury. Nvidia controls so much tech and wealth that they can actually affect society outside computer gaming.

Oh well, time to escape from reality and go back to Valheim. A truly enjoyable game that doesn’t need RT to be fun.
Posted on Reply
#59
Darc Requiem
Bomby569I doubt both cards will be priced similar to the GRE, no one would buy the non-XT :kookoo:
That makes no sense
The 7900 GRE had a $550 MSRP when it launched globally, so based on these comparisons the 9070 series will likely be priced around it. Something like $600 for the 9070 XT and $500 for the 9070.
Posted on Reply
#60
dyonoctis
TheGeekn°72Non-RT lighting methods are extremely mature, as there are two decades of technical advancements behind them. RT is but a gimmick Nshittia convinced game studios to implement so they can sell more cards, and even then it still doesn't look good (see HUB's video about garbled, noisy RT with no object permanence for entities out of FOV).

There's the launch-day price and there's the actual selling price, which has historically been 100-200 below MSRP. I have more trouble finding XTXes at MSRP (I've always found them between 750 and 900) than RTX cards a cent below MSRP.

What if the sky was orange and the sun purple?

4 days after launch and they still don't have a 5070 Ti. Either it's true and it's just hilarious, or they did receive a card directly from Nshittia but still claimed otherwise as a subtle jab lmao

The duality of the human mind. I really hope we won't get many more of those "RT mandatory" games in the future, because while some do have improved looks, it doesn't mean all of them look good with RT, and the performance hit *sucks ass* BIG TIME. I'm much more interested in applications like RT-driven audio (where it's not actual ray tracing of light, but uses the RT cores for audio reflections in a room/environment). Multiplayer games where players likely won't turn on RT because they need the frames, but where audio matters, like CoD or especially Tarkov, could make use of the unused RT cores.
Extremely mature and efficient, but not without their own challenges and limitations. The narrative that RTRT came only from Nvidia needs to die, because it's factually false: SIGGRAPH has been holding conferences on RTRT since the early 2000s, long before Nvidia took an interest in it, and CG researchers were writing papers about RTRT, without Nvidia's involvement, before Turing was released.

CG artists working in games are CG artists first, and (maybe) gamers second. RTRT is something many of them wanted to achieve for years, but GPU makers weren't about it for a while. No hardware optimised for it meant no progress could be made on the software side. Now that the hardware and software are out there for everyone to experiment with, progress can be made on figuring out how RTRT can be further optimised on both the software and hardware side.

One thing to note though: those optimisations might mean that future games will use heavier RT/PT effects, rather than doing the same thing they do now, but faster. That's also how raster graphics evolved, after all.

I've said it before, but the CG industry's interests won't always align with gamers' interests.
Posted on Reply
#61
SlappingBob
If they don't price for market share, our only hope is Intel...

I'm in the market, my 3080 Ti died three days ago, and I'm prepared to buy a new FreeSync-compatible monitor if the price and performance of the 9070 XT are right (G-Sync only at the minute). If it looks like AMD are in the game, it'll be worth it just to have a viable Nvidia competitor. My last Radeon was an HD 6570 for an HTPC (remember those?) and I'm using that right now :-(
Posted on Reply
#62
CosmicWanderer
BroudkaWhy the hell compare them to the GRE and the XT or XTX?
Because they want to keep selling the 7900 XT and XTX as higher-end options.
Posted on Reply
#63
TheGeekn°72
dyonoctisExtremely mature and efficient, but not without their own challenges and limitations. The narrative that RTRT came only from Nvidia needs to die, because it's factually false: SIGGRAPH has been holding conferences on RTRT since the early 2000s, long before Nvidia took an interest in it, and CG researchers were writing papers about RTRT, without Nvidia's involvement, before Turing was released.

CG artists working in games are CG artists first, and (maybe) gamers second. RTRT is something many of them wanted to achieve for years, but GPU makers weren't about it for a while. No hardware optimised for it meant no progress could be made on the software side. Now that the hardware and software are out there for everyone to experiment with, progress can be made on figuring out how RTRT can be further optimised on both the software and hardware side.

One thing to note though: those optimisations might mean that future games will use heavier RT/PT effects, rather than doing the same thing they do now, but faster. That's also how raster graphics evolved, after all.

I've said it before, but the CG industry's interests won't always align with gamers' interests.
Oh my bad, I didn't mean to imply Nshittia was the sole "perpetrator" behind the massive push for RT, but I do think they did it because their R&D dept. stumbled across an arch that was "good enough" to be commercialisable and decided to cash in on it while they had the lead (i.e. the Morally Wrong Reason to push for a new tech).

I'd have preferred a much slower implementation of RT-related tech, maybe something hybrid at first, like RT-guided conventional lighting? I'll try to explain the idea as best my mere mortal mind can picture it: instead of producing so many rays, why not cast just a few and use the data from those to enhance the quality of the conventional lighting, giving it the information it needs to mimic actual RT?

I am no dev, physicist or mathematician, but I figured the idea made sense to me so... I got the concept from PirateSoftware's Heartbound, whose dev explained he ran some RT methods on the CPU for some shadows and the game could still run on an iGPU, so... eh, why not? As it is (imho), full RT lighting/reflections are but a marketing platform on which Nshittia boosted itself, trying to convince players that they absolutely needed that tech and that any and all competition couldn't provide this pHony Grail of graphics.

That said, it actually looks amazing in the experimental benchmarks/technological demonstrators I've seen of it; I definitely want to see more of this in future games, just not used as a shitty prop for the $3T company...
Posted on Reply
#64
tugrul_SIMD
CosmicWandererBecause they want to keep selling the 7900 XT and XTX as higher-end options.
Then why do they produce the 9070 XT?
Posted on Reply
#65
TheGeekn°72
CosmicWandererBecause they want to keep selling the 7900 XT and XTX as higher-end options.
They stopped production of the 7000 series a good while ago; all that's left is remaining stock and it basically sells itself (RTX 50 is the best RX 7000 salesman of the entire previous generation!)
Posted on Reply
#66
bug
TheGeekn°72Oh my bad, I didn't mean to imply Nshittia was the sole "perpetrator" behind the massive push for RT, but I do think they did it because their R&D dept. stumbled across an arch that was "good enough" to be commercialisable and decided to cash in on it while they had the lead (i.e. the Morally Wrong Reason to push for a new tech).

I'd have preferred a much slower implementation of RT-related tech, maybe something hybrid at first, like RT-guided conventional lighting? I'll try to explain the idea as best my mere mortal mind can picture it: instead of producing so many rays, why not cast just a few and use the data from those to enhance the quality of the conventional lighting, giving it the information it needs to mimic actual RT?
I am no dev, physicist or mathematician, but I figured the idea made sense to me so... I got the concept from PirateSoftware's Heartbound, whose dev explained he ran some RT methods on the CPU for some shadows and the game could still run on an iGPU, so... eh, why not? As it is (imho), full RT lighting/reflections are but a marketing platform on which Nshittia boosted itself, trying to convince players that they absolutely needed that tech and that any and all competition couldn't provide this pHony Grail of graphics.

That said, it actually looks amazing in the experimental benchmarks/technological demonstrators I've seen of it; I definitely want to see more of this in future games, just not used as a shitty prop for the $3T company...
Actually, that is close to what is happening atm: only a few rays are actually traced, because the hardware is weak. The rest of the data is extrapolated. It's just not fed back into a rasterized illumination engine, because that doesn't make much sense (you'd have to do traditional illumination on the side, which would be even more computationally intensive).
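To make the "trace a few rays, reconstruct the rest" idea concrete, here is a minimal sketch of one common reconstruction trick, temporal accumulation: one noisy ray-traced sample per pixel per frame gets blended into a running history buffer. This is only a toy illustration of the general concept, not any particular engine's or vendor's denoiser, and every number in it is a made-up placeholder.

```python
import numpy as np

H, W = 4, 4                       # tiny "framebuffer" just for the example
history = np.zeros((H, W, 3))     # accumulated (reconstructed) lighting
alpha = 0.1                       # weight given to each new noisy frame

def trace_one_sample_per_pixel(rng):
    # Stand-in for tracing a single ray per pixel: the true signal plus heavy noise.
    true_lighting = np.full((H, W, 3), 0.5)
    return true_lighting + rng.normal(0.0, 0.3, size=(H, W, 3))

rng = np.random.default_rng(0)
for frame in range(100):
    noisy = trace_one_sample_per_pixel(rng)
    # Exponential moving average: cheap temporal accumulation of sparse samples.
    history = (1 - alpha) * history + alpha * noisy

print(history.mean())  # approaches the true 0.5 despite only 1 noisy sample per frame
```

Real engines combine this with motion vectors and spatial denoisers, but the principle is the same: very few rays per pixel, with the missing information filled in afterwards.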
Posted on Reply
#67
dyonoctis
TheGeekn°72I'd have preferred a much slower implementation of RT-related tech, maybe something hybrid at first, like RT-guided conventional lighting? I'll try to explain the idea as best my mere mortal mind can picture it: instead of producing so many rays, why not cast just a few and use the data from those to enhance the quality of the conventional lighting, giving it the information it needs to mimic actual RT?
There's something similar in the works, but it will make use of machine learning: instead of computing each light bounce, you compute a few and use ML to infer how the others will bounce. IIRC the RTX demo of Half-Life 2 already makes use of that.
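For anyone curious what "compute a few bounces and let ML infer the rest" can look like, below is a toy sketch in the spirit of a neural radiance cache: a tiny MLP is trained on a handful of fully traced samples and then queried in place of tracing further bounces. It is a made-up illustration of the concept, not Nvidia's (or anyone's) actual implementation; the network size, inputs, and data here are all arbitrary placeholders.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RadianceCache(nn.Module):
    """Tiny MLP that predicts outgoing RGB radiance at a surface point."""
    def __init__(self, hidden=64):
        super().__init__()
        # Inputs: hit position (3) + surface normal (3) + outgoing direction (3)
        self.net = nn.Sequential(
            nn.Linear(9, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, pos, normal, wo):
        return self.net(torch.cat([pos, normal, wo], dim=-1))

cache = RadianceCache()
opt = torch.optim.Adam(cache.parameters(), lr=1e-3)

# "Training" batch: in a real renderer these would be a few fully path-traced
# samples per frame; here they are random placeholders.
pos    = torch.rand(256, 3)
normal = F.normalize(torch.randn(256, 3), dim=-1)
wo     = F.normalize(torch.randn(256, 3), dim=-1)
radiance_gt = torch.rand(256, 3)   # stands in for fully traced ground truth

loss = F.mse_loss(cache(pos, normal, wo), radiance_gt)
opt.zero_grad(); loss.backward(); opt.step()

# Render time: terminate most paths early by querying the cache instead of
# tracing the remaining bounces.
with torch.no_grad():
    approx_indirect = cache(pos[:1], normal[:1], wo[:1])
```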
Posted on Reply
#68
Heiro78
BroudkaWhy the hell compare them to the GRE and the XT or XTX?
Agreed, here's my attempt at getting FPS values.

Using the percent chart from VideoCardz in Excel, this video from Testing Games from ~2 months ago, and assuming their settings are the same.

The FPS values in STALKER 2, Starfield, Cyberpunk 2077 non-RT and RT, God of War Ragnarok, and Black Myth: Wukong are in the attached file.

Not bad performance. But as always, price will be the big determining factor. A 9070 XT around the rumored $550 would be good. The file is attached for whoever wants to pick it apart.
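As a rough sketch of how that kind of projection works, one can simply scale a baseline FPS figure for the RX 7900 GRE by the leaked percentage uplift. The values below are placeholders, not the numbers from the attached spreadsheet or the Testing Games video.

```python
baseline_gre_fps = {              # hypothetical 4K ultra results for the RX 7900 GRE
    "Cyberpunk 2077": 45.0,
    "STALKER 2": 40.0,
    "Black Myth: Wukong": 35.0,
}
leaked_uplift = {                 # hypothetical per-game uplift vs. the GRE from the leaked chart
    "Cyberpunk 2077": 0.40,
    "STALKER 2": 0.45,
    "Black Myth: Wukong": 0.38,
}

for game, fps in baseline_gre_fps.items():
    projected = fps * (1 + leaked_uplift[game])
    print(f"{game}: {fps:.0f} FPS (7900 GRE) -> ~{projected:.0f} FPS (projected 9070 XT)")
```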

Posted on Reply
#69
TheGeekn°72
tugrul_SIMDThen why do they produce the 9070 XT?
Before the name change, it was supposed to be the 8800 XT; by definition, the -800 (XT) model is AMD's mid-tier SKU.
They confirmed they wouldn't produce a Big Navi SKU on RDNA 4 (for the others: NO, they didn't say they'd stop making high-end GPUs, they just said they'd focus on mid/low-mid SKUs for RDNA 4 while they reorganize their R&D effort for UDNA).
bugActually, that is close to what is happening atm: only a few rays are actually traced, because the hardware is weak. The rest of the data is extrapolated. It's just not fed back into a rasterized illumination engine, because that doesn't make much sense (you'd have to do traditional illumination on the side, which would be even more computationally intensive).
Oh I see... I'm guessing that's why there's so little difference between RT on/off?
dyonoctisThere's something similar in the works, but it will make use of machine learning: instead of computing each light bounce, you compute a few and use ML to infer how the others will bounce. IIRC the RTX demo of Half-Life 2 already makes use of that.
Ah yes, ML/AI. Of course, they have to promote what *datacenters* are after and insist on the one feature that caused them to make Blackwell 2.0 (because 1.0 was *that bad*, I'm guessing; see reports of cracked dies from heat in DC racks from big customers) with little raster improvement but SO MANY AI accelerators jam-packed on those dies...

Of all the things I wanted AI to be used for, graphics wasn't one of them... Imagine how neat it would have been to run games' NPCs on AI from the GPU! Now that would have been epic. Maybe in games like Stellaris, Cities: Skylines or, idk, CoD campaign enemies?
Posted on Reply
#70
kapone32
If this is true, the GPU in the 9070 is very fast. The GRE, XT, and XTX share the same GPU. I live in a world where the distributor determines how far above MSRP the retailer prices hardware. Funnily enough, the 7900 cards sat at $799 and $999 for a year and were cheaper from Dec-Feb; I saw an ASRock 7900 XT for $864 Canadian. Now an XTX is $1,699.
Posted on Reply
#71
Heiro78
dyonoctisExtremely mature and efficient, but not without their own challenges and limitations. The narrative that RTRT came only from Nvidia needs to die, because it's factually false: SIGGRAPH has been holding conferences on RTRT since the early 2000s, long before Nvidia took an interest in it, and CG researchers were writing papers about RTRT, without Nvidia's involvement, before Turing was released.

CG artists working in games are CG artists first, and (maybe) gamers second. RTRT is something many of them wanted to achieve for years, but GPU makers weren't about it for a while. No hardware optimised for it meant no progress could be made on the software side. Now that the hardware and software are out there for everyone to experiment with, progress can be made on figuring out how RTRT can be further optimised on both the software and hardware side.

One thing to note though: those optimisations might mean that future games will use heavier RT/PT effects, rather than doing the same thing they do now, but faster. That's also how raster graphics evolved, after all.

I've said it before, but the CG industry's interests won't always align with gamers' interests.
Why did you say RTRT so many times in the beginning? Is this real-time ray tracing?

Also, in reference to CG artists: you must mean that they enjoy real-time ray tracing versus having lower-end scenes that they then render for the full effect. Or are you saying that having the RT cores allows full-scene rendering to be done faster?
Posted on Reply
#72
Daven
CosmicWandererBecause they want to keep selling the 7900 XT and XTX as higher-end options.
I believe AMD has stopped production of RDNA 3.
Posted on Reply
#73
rv8000
Bomby569Not sure what they were doing, as RT performance seems incredibly disappointing, didn't they promise the opposite?
Do they state the RT core/unit count? If it's still 1 per workgroup, the 9070 XT will have 2/3 the hardware RT units of a 7900 XTX, so matching its RT performance would mean a ~50% improvement per RT unit.
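A quick back-of-the-envelope check of that ratio, assuming one RT unit per CU and the commonly cited CU counts (64 for the 9070 XT vs. 96 for the 7900 XTX); these counts are assumptions here, not something stated in the leak.

```python
# Assumed, not from the leak: one RT unit per CU, 64 CUs (9070 XT) vs. 96 CUs (7900 XTX).
rt_units_9070xt = 64
rt_units_7900xtx = 96

hardware_ratio = rt_units_9070xt / rt_units_7900xtx           # ~0.67, i.e. the 2/3 mentioned above
required_per_unit_gain = rt_units_7900xtx / rt_units_9070xt   # 1.5, i.e. ~50% faster per unit

print(f"{hardware_ratio:.2f}x the RT units -> each unit must be "
      f"{(required_per_unit_gain - 1) * 100:.0f}% faster just to match the XTX")
```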
Posted on Reply
#74
ymdhis
Not bad for a midrange stopgap, shame it'll be priced way too high as usual.
Posted on Reply
#75
Heiro78
bugActually, that is close to what is happening atm: only a few rays are actually traced, because the hardware is weak. The rest of the data is extrapolated. It's just not fed back into a rasterized illumination engine, because that doesn't make much sense (you'd have to do traditional illumination on the side, which would be even more computationally intensive).
Your signature linking to that 6-plus-year-old thread is hilarious and prescient.
Posted on Reply