Thursday, January 9th 2025

AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

Although it has only been a few days since the RDNA 4-based GPUs from Team Red hit the scene, it appears that we have already been granted a first look at the 3DMark performance of the highest-end Radeon RX 9070 XT GPU, and to be perfectly honest, the scores largely live up to our expectations, albeit with disappointing ray tracing performance. Unsurprisingly, the thread has since been erased over at Chiphell, but folks managed to take screenshots in the nick of time.

The specifics reveal that the Radeon RX 9070 XT will arrive with a hefty TBP of around 330 watts, as revealed by a FurMark screenshot, which is substantially higher than previous estimates. With 16 GB of GDDR6 memory, along with base and boost clocks of 2520 MHz and 3060 MHz, the Radeon RX 9070 XT managed to rake in an impressive 14,591 points in Time Spy Extreme, and around 6,345 points in Speed Way. Needless to say, the drivers are likely far from mature, so it is not outlandish to expect a few more points to be squeezed out of the RDNA 4 GPU.
Regarding the scores we currently have, the Radeon RX 9070 XT fails to match the Radeon RX 7900 XTX in either test, although it easily exceeds the GeForce RTX 4080 Super in the non-ray-traced Time Spy Extreme test; considering that it is expected to cost less than half the price of the RTX 4080 Super, this is no small feat. In the Speed Way test, which is a ray-traced benchmark, the RX 9070 XT falls noticeably short of the RTX 4080 Super. Interestingly, an admin at Chiphell commented that those planning on grabbing an RTX 50 card should wait, further hinting that the GPU world has "completely changed". Considering the lack of context, the interpretation of that statement is debatable, but it does seem RDNA 4 might pack impressive price-to-performance that could give mid-range Blackwell a run for its money.
Sources: Chiphell, @0x22h

95 Comments on AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

#51
rv8000
Chrispy_The 9060 series worries me with its 8GB of VRAM.

A friend of mine bought a 4060 on discount recently, not because it was a good card, but because he has an extremely cramped mITX system with only a 350W PSU, and I figured the 115W, tiny 4060 was his best bet. I could be wrong, but it seemed to me to be the smallest, most power-efficient card around.

He's running a 2560x1080 display, so not even 1440p and yet two of the three games (Indiana Jones, DA: Veilguard, Space Marine II) he upgraded for require him to turn things down to avoid stuttering because of VRAM shortages.

So yeah, the 8GB 9060 cards need to be 20% cheaper than a 12GB B580. 8GB wasn't enough for more than 1080p in 2022, it sure as hell isn't any better in 2025. I'm just as worried about the 5060 but Nvidia will sell like hotcakes because people will probably just believe Jensen when he says something like "a 5060 for $349 matches a 4080. The more you buy the more you save" or some other hand-wavy nonsense. My experience of DLSS FG and RT on 40-series cards is that all those fake-framed RT effects need a boatload more VRAM to work than just the raster codepath, and if the 5060 only gets 8GB, that's not going to go down so well.
Where has there been any info on the 9060/9060xt vram capacity? I’d expect 12GB on a 192-bit bus for the 9060/9060xt and the 9050/9050xt to be cut down to 128-bit and have pci-e lanes shaved off again (>_>).

*It’s also confirmed from Paul at Paulshardware that the 9070 retains a 16gb config based on the powercolor product label on one of their two slot designs.

I fully expect both the 9060xt and 9060 to sport 12gb of vram at minimum.
#52
Marcus L
Chrispy_The 9060 series worries me with its 8GB of VRAM.
is 8GB confirmed for the 9060? 10GB/12GB would be a better fit; save 8GB for the entry-level 9050 if they release a lower SKU, and don't gimp its PCIe lanes like the 6400/6500
#53
Chrispy_
QuietBobActually, all cards in the 3DMark Top 100 are heavily OC'd and use liquid cooling or better.

Those Top 100 scores are in no way representative of a particular model. They merely show best OCing samples.
Exactly. 3DMark is a mess for this stupid reason and that's why I asked, because honestly the numbers are totally meaningless without stock results to compare to.

Does any website test GPUs at stock settings with 3DMark and publish scores of GPUs that are representative of what people actually own?
#54
csendesmark
docnorthMaybe a comparison to 7900 XT would be more useful :confused:.
Yeah,
Normal people aren't running someMark3D every day, or at all.
Those numbers are like Monopoly money at the cashier...
Mid-range Nvidia and AMD card benchmark numbers would make this a meaningful article.
#55
HairyLobsters
Drivers aren't even out, so performance isn't completely accurate.
#56
Chrispy_
Marcus Lis 8GB confirmed for the 9060? 10GB/12GB would be a better fit; save 8GB for the entry-level 9050 if they release a lower SKU, and don't gimp its PCIe lanes like the 6400/6500
8GB for the 9060 is not confirmed, no. The 9070 series has been announced, with unofficial tests/specs of the 9070XT that have leaked out before the review embargo confirming that it's a 256-bit, 16GB card like the 7800XT, and has 4096 cores that boost to around 3GHz.

There's precious little noise about the 9060-series, just rumours and speculation that they are coming in March 2025. Most of the rumours suggest that the Navi 44 powering the 9060 and 9060XT is exactly half of the Navi 48 powering the 9070 and 9070XT - that would make it a 128-bit, 8GB card, most likely with 2048 cores and cut down to 8 lanes of PCIe too. I would guess that if these rumours are accurate, the 9060XT will be using expensive double-density GDDR6 modules to boost the puny GPU to an acceptable quantity of VRAM, but that would also push its price into a higher, unacceptable bracket for the performance it offers.

Hopefully the rumours are wrong and we do get a 192-bit 12GB 9060XT, as that's probably what the sub-$300 market needs more than anything else right now. B580 kind of nailed it there!
#57
csendesmark
k0vasz4090 is a different story - that's the absolute best card on the market, and for those, Nvidia can ask (almost) whatever money they want, as there'll always be people who want the best, no matter what (also, it provides 23% better performance, so it's not like you pay twice as much to get the same performance)

but if you compare two cards from the same tier, you'll get the sameish pricing:

When it comes to games, the 7900XT fares great, and is still cheaper

The 7900XTX even beats the 4080 Super while being cheaper than the 4070 Ti
I sorted for the cheapest on newegg


So yeah, the 4090 is king, but if you shoot lower than the top,
AMD was simply better in 2024.
And also yeah, in some specific games you may do better with Nvidia; I was speaking in a general way.
And if you focus on AI stuff, Nvidia might be generally better, but that is not the average use case
#58
QuietBob
Chrispy_Does any website test GPUs at stock settings with 3DMark and publish scores of GPUs that are representative of what people actually own?
Guru3D tests in Fire Strike Ultra, Time Spy, Steel Nomad, Port Royal, and the full RT feature test.
#59
Guwapo77
rv8000That's what every generation's 60- and 70-class GPUs have been like from both brands since… forever. It's not some new concept.
Man...thanks for enlightening me. /bow
#60
rv8000
Guwapo77Man...thanks for enlightening me. /bow
Thanks for the sarcasm and adding nothing to the conversation *shrug
#61
Guwapo77
rv8000Thanks for the sarcasm and adding nothing to the conversation *shrug
Thank you for sarcasm. This product is underwhelming on all counts. Just like Intel's recent GPU. *golf clap
#62
Vya Domus
AnarchoPrimitivI've jokingly hypothesized that the over-promising leaks that seem to precede every single Radeon release are just misinformation from Nvidia fanboys
You think so ? Nah, couldn't be.

#63
debido666
Chrispy_The 9060 series worries me with its 8GB of VRAM.

A friend of mine bought a 4060 on discount recently, not because it was a good card, but because he has an extremely cramped mITX system with only a 350W PSU, and I figured the 115W, tiny 4060 was his best bet. I could be wrong, but it seemed to me to be the smallest, most power-efficient card around.

He's running a 2560x1080 display, so not even 1440p and yet two of the three games (Indiana Jones, DA: Veilguard, Space Marine II) he upgraded for require him to turn things down to avoid stuttering because of VRAM shortages.

So yeah, the 8GB 9060 cards need to be 20% cheaper than a 12GB B580. 8GB wasn't enough for more than 1080p in 2022, it sure as hell isn't any better in 2025. I'm just as worried about the 5060 but Nvidia will sell like hotcakes because people will probably just believe Jensen when he says something like "a 5060 for $349 matches a 4080. The more you buy the more you save" or some other hand-wavy nonsense. My experience of DLSS FG and RT on 40-series cards is that all those fake-framed RT effects need a boatload more VRAM to work than just the raster codepath, and if the 5060 only gets 8GB, that's not going to go down so well.
Could you send me a link where AMD says 9060 series will be 8GB?
#64
AusWolf
Neo_MorpheusThis is beyond tiresome.

It's supposed to be a mid-tier card, and as others have said, even Ngreedia's 90-class GPUs have issues with RT even after all kinds of trickery.

And as others have said, it's a performance killer that provides nothing to gameplay.

All AMD needs to do is price this damned thing right and stop trying to be greedy.
Yeah, a $500 card can't match a $1,000 one in RT, only in raster. What an utter piece of...!
#65
capdauntless
Interesting performance numbers. I am in the middle of putting together my newest gaming PC, and it will have a flavour of this card in it (even with the stupid name). It'll be a decent upgrade over what I have now. I do wish we'd see more focus on power efficiency with GPUs and CPUs.
#66
The Shield
Neo_MorpheusAll AMD needs to do is price this damned thing right and stop trying to be greedy.
We both know this is not gonna end well...
#67
remekra
FSR4 also seems to be looking great:

Seems like they really wanted to focus on the two points everybody was complaining about: FSR and RT. Now there is only one thing left: price.
#68
Zach_01
Did you notice that the 3DMark test was done with a 285K?
If that makes any difference...

5900X + 7900XTX (TBP 366W, GPU clock 2500~2530MHz, VRAM 2600MHz)

#69
Marcus L
remekraSeems like they really wanted to focus on the two points everybody was complaining about: FSR and RT. Now there is only one thing left: price.
And once everything aligns with the stars and planets under those 3 things, Nvidia can lower their prices and we can buy cheaper Nvidia cards!!!!!!!!!!! F yeaaaaaaaaa, screw AMD loooosers! :laugh:
#70
Zach_01
5900X + 7900XTX (TBP 366+10%=402W, GPU clock 2620~2670MHz, VRAM 2600MHz)

#71
dyonoctis
lilhasselhofferI...can only say the following...TressFX.

Before you ask why I cite the two above comments, it's because ray tracing is about as stupid as TressFX was. It's "better" than the results you get from the other guy doing the same calculations...but completely forgets that 99.9% of games that exist now were made before ray tracing was adopted. You're more than welcome to claim you think it looks more realistic...and someone else is more than able to call it crap. Those are not debatable points, only opinions. The truth is that it's a computationally intensive process that doesn't result in linear or better improvements...and thus will be relegated to the dustbin of history exactly like TressFX. The only difference is that Nvidia has clung to their dead horse for longer because AMD has not competed with them, and thus it's always something they win at. It's always easiest to be the best when nobody else competes.

The only fact is the cost-to-performance numbers that this card will eventually have after a proper review...and hopefully it will be priced competitively. Yesterday's performance in RT, today's performance in raster, and yesterday's pricing would be a great boon. That's especially true when today's pricing is highway robbery, and yesterday's hardware provides enough performance for most people today.
You seem to assume that TressFX merely died and games just kept using the same old hair dynamics. That's not the case; the principles behind the tech got natively implemented in game engines. The same thing happened with GameWorks.
Hair Physics in Unreal Engine - Overview | Unreal Engine 5.5 Documentation | Epic Developer Community
here you can see a video of the tech in action:
Enabling Physics Simulation on Grooms in Unreal Engine | Unreal Engine 5.5 Documentation | Epic Developer Community

AMD is working on a loooot of technologies available for game devs who might not have an equivalent alternative in their engines just yet.
#72
Chrispy_
debido666Could you send me a link where AMD says 9060 series will be 8GB?
Nope, because AMD haven't said that yet to the best of my knowledge.

This is what I said:
Chrispy_There's precious little noise about the 9060-series, just rumours and speculation that they are coming in March 2025. Most of the rumours seem to show that Navi 44 powering the 9060 and 9060XT is exactly half of the Navi48 powering the 9070 and 9070XT
Here's a few examples of the speculation so far:
videocardz.com/newz/amds-next-gen-navi-44-gpu-package-said-to-be-29x29mm-in-size-smaller-than-navi-23-33
wccftech.com/amd-navi-44-package-size-29-x-29-mm-smaller-than-navi-33-bigger-than-navi-24/
www.techspot.com/news/105144-amd-navi-44-rdna-4-gpu-package-size.html

One of the youtubers (might have been one of the Steves) speculated that the package size decrease from Navi 48 was too much for Navi 44 to be a 3072-core/192-bit/12GB config, and that it was even smaller than the 2048/128-bit/8GB config of the RX 7600 and 6600 series. There has been a node shrink, but if Navi 44 is even smaller than Navi 33 or 23, it's "likely to be a 2048 quad-channel solution" (GDDR6 channels are 32-bits wide, so presumably that means a 128-bit card).

Some redditors speculated that Navi48 was just two Navi44 dies glued together but that's been debunked now that we have actual die shots of Navi48.
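The channel arithmetic above can be sketched quickly. A minimal Python sketch, assuming the 32-bit GDDR6 channel width quoted in the comment and a standard 2 GB (16 Gbit) module per channel; the 4 GB double-density figure is the speculated option mentioned earlier, not a confirmed spec:

```python
# GDDR6 channels are 32 bits wide (per the comment above);
# a common module density is 16 Gbit, i.e. 2 GB per channel (assumed here).
CHANNEL_WIDTH_BITS = 32
STANDARD_MODULE_GB = 2

def memory_config(channels: int, density_gb: int = STANDARD_MODULE_GB):
    """Return (bus width in bits, total VRAM in GB) for a given channel count."""
    return channels * CHANNEL_WIDTH_BITS, channels * density_gb

# A quad-channel Navi 44 would be:
print(memory_config(4))                 # (128, 8)  -> 128-bit bus, 8 GB
# With speculated double-density (32 Gbit) modules:
print(memory_config(4, density_gb=4))   # (128, 16) -> still 128-bit, 16 GB
```

Four channels gives the rumoured 128-bit/8 GB config; doubling the module density raises capacity without widening the bus, which is why the earlier comment expected a pricier 16 GB variant rather than a wider one.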
#73
Marcus L
Chrispy_Nope, because AMD haven't said that yet to the best of my knowledge.

This is what I said:


Here's a few examples of the speculation so far:
videocardz.com/newz/amds-next-gen-navi-44-gpu-package-said-to-be-29x29mm-in-size-smaller-than-navi-23-33
wccftech.com/amd-navi-44-package-size-29-x-29-mm-smaller-than-navi-33-bigger-than-navi-24/
www.techspot.com/news/105144-amd-navi-44-rdna-4-gpu-package-size.html

One of the youtubers (might have been one of the Steves) speculated that the package size decrease from Navi 48 was too much for Navi 44 to be a 3072-core/192-bit/12GB config, and that it was even smaller than the 2048/128-bit/8GB config of the RX 7600 and 6600 series. There has been a node shrink, but if Navi 44 is even smaller than Navi 33 or 23, it's "likely to be a 2048 quad-channel solution" (GDDR6 channels are 32-bits wide, so presumably that means a 128-bit card).

Some redditors speculated that Navi48 was just two Navi44 dies glued together but that's been debunked now that we have actual die shots of Navi48.
You literally said:
Chrispy_The 9060 series worries me with its 8GB of VRAM.
So which is it?
#74
ModEl4
This is very good news; if true, it points to only a 5% performance difference in raster vs the 4080S (+13% vs the 4070 Ti Super), while real-world RT performance will (in the best-case scenario) be close to, but slower than, the 4070 Ti Super (which is still good news, imo).
The real question is whether the reference model stays at 265W (that's what all the leakers were saying) or the board power was increased in a last-minute decision by AMD.
Also, if this is an OC model, is the 3060 MHz core speed a factory OC or a manual OC? (Because, if you remember, back at the RX 7900 XTX launch, in W1zzard's review of the ASUS TUF (the leaked RX 9070 XT is also probably an ASUS TUF, though it could be a Prime), the average clock of the reference card was 2630 MHz, and the ASUS model could hit 3200 MHz with a manual OC for a +15% real-world performance increase vs reference...)
But the most important thing is MSRP; with the right price, any performance will be acceptable!
#75
AusWolf
Zach_01Did you notice that the 3DMark test was done with a 285K?
If that makes any difference...
In Speed Way? No, it makes absolutely no difference.