Thursday, January 9th 2025

AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

Although it has only been a few days since the RDNA 4-based GPUs from Team Red hit the scene, it appears that we have already been granted a first look at the 3D Mark performance of the highest-end Radeon RX 9070 XT GPU, and to be perfectly honest, the scores seemingly live up to our expectations, albeit with disappointing ray tracing performance. Unsurprisingly, the thread has since been deleted over at Chiphell, but folks managed to take screenshots in the nick of time.

The specifics reveal that the Radeon RX 9070 XT will arrive with a massive TBP in the range of 330 watts, as revealed by a FurMark snap, which is substantially higher than previous estimates. With 16 GB of GDDR6 memory, along with base and boost clocks of 2520 and 3060 MHz, the Radeon RX 9070 XT managed to rake in an impressive 14,591 points in Time Spy Extreme, and around 6,345 points in Speed Way. Needless to say, the drivers are likely far from mature, so it is not outlandish to expect a few more points to get squeezed out of the RDNA 4 GPU.
Regarding the scores we currently have, it appears that the Radeon RX 9070 XT falls just short of the Radeon RX 7900 XTX in both tests, although it easily exceeds the GeForce RTX 4080 Super in the non-ray-traced Time Spy Extreme test; considering that it costs less than half the price of the RTX 4080 Super, this is no small feat. In the ray-traced Speed Way test, however, the RX 9070 XT fails to match the RTX 4080 Super, falling noticeably short. Interestingly, an admin at Chiphell commented that those planning on grabbing an RTX 50 card should wait, further hinting that the GPU world has "completely changed". Considering the lack of context, the interpretation of that statement is debatable, but it does seem RDNA 4 might pack impressive price-to-performance that could give mid-range Blackwell a run for its money.
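For readers who want to put the leaked numbers in perspective, here is a minimal sketch of the percentage math. The 14,591-point Time Spy Extreme figure is from the leak above; the 14,000-point reference is a placeholder standing in for a rival card's score and should be swapped for a real review number:

```python
# Hypothetical helper to put leaked 3DMark numbers in context.
def percent_diff(score: float, reference: float) -> float:
    """Relative difference of `score` vs. `reference`, in percent."""
    return (score - reference) / reference * 100.0

leaked_ts_extreme = 14591   # RX 9070 XT score from the leak above
reference_score = 14000     # placeholder rival score; swap in a real review number

print(f"{percent_diff(leaked_ts_extreme, reference_score):+.1f}%")  # → +4.2%
```

A low-single-digit delta like this is well within run-to-run and driver-revision variance, which is why some commenters below treat the two cards as effectively matched.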
Sources: Chiphell, @0x22h

146 Comments on AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

#126
remekra
AusWolf16 GB on a 192-bit bus is not possible. It's either 8/16 GB on 128-bit, or 12 GB on 192-bit. A site like Digital Trends should know this.


No you don't. Have you tried Alan Wake 2 without RT? It's basically indistinguishable from using RT, I'd say.

It's more prominent in Cyberpunk, but even there, all I see right away without pixel-peeping is shiny puddles. Not exactly a revolution in gaming tech.


Good stuff. :) 14k vs 14.5k is already a match, if I dare to say (within a difference undetectable to the naked eye).

In other news: We might get something - a full release, or maybe just some more info, on 22 Jan.
videocardz.com/newz/radeon-rx-9070-xt-announcement-expected-on-january-22-review-samples-shipping-already
It's still using RT even when you have Path Tracing disabled, just not per-triangle and less accurately. They probably could have used hardware acceleration to make it look better without PT, but then Nvidia came along and probably offered them a deal.
Posted on Reply
#127
Chrispy_
AusWolf16 GB on a 192-bit bus is not possible. It's either 8/16 GB on 128-bit, or 12 GB on 192-bit. A site like Digital Trends should know this.
I think you're misinterpreting that article:
[INDENT]"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 16GB, but the XT model might now get a wider, 192-bit interface."[/INDENT]

So the first part of that sentence is referencing the 'if the above leak is true' and refers to a 128-bit Navi44 in 8GB and 16GB flavours for the 9060 and 9060XT respectively.

The second part of that sentence "but the XT model might now get a wider, 192-bit interface" is referring to newer information that Navi44 might actually be a 192-bit die, overruling the "if the above leak is true" first part of that sentence.

I don't read that sentence and interpret it as "16GB 192-bit 9060XT";
I interpret it as "leak says 8+16GB 128-bit cards, new rumours hint that it's actually a 192-bit design".

If Navi44 is actually a 192-bit design, then that doesn't rule out a die-harvested, cut-down version being sold as the vanilla 9060 with only 128-bits enabled for likely 8GB configs, but also possibly a 16GB config - if not for gamers then potentially for a Radeon Pro workstation variant.
Posted on Reply
#128
AusWolf
Chrispy_I think you're misinterpreting that article:
[INDENT]"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 16GB, but the XT model might now get a wider, 192-bit interface."[/INDENT]

So the first part of that sentence is referencing the 'if the above leak is true' and refers to a 128-bit Navi44 in 8GB and 16GB flavours for the 9060 and 9060XT respectively.
It clearly says "and an RX 9060 XT with 16GB", then it speculates about a 192-bit bus, there's no way around it. You know it's wrong, I know it's wrong, but for the masses, it's just misinformation.

The sentence should have gone like
"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 12 GB on a 192-bit interface."
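The bus-width arithmetic behind this argument can be sketched in a few lines. This assumes today's 32-bit-wide GDDR6 devices in 1 GB (8 Gb) and 2 GB (16 Gb) densities, with clamshell mode hanging two devices off each 32-bit channel:

```python
# Why 16 GB doesn't fit a 192-bit bus: GDDR6 devices present a 32-bit
# interface and commonly ship in 1 GB (8 Gb) or 2 GB (16 Gb) densities;
# clamshell mode puts two devices on each 32-bit channel.
# (Densities and widths here are assumptions reflecting common parts.)

DEVICE_BUS_WIDTH = 32  # bits per GDDR6 device

def possible_capacities(bus_width_bits: int) -> set:
    """All VRAM sizes (GB) reachable on a given bus width."""
    channels = bus_width_bits // DEVICE_BUS_WIDTH
    sizes = set()
    for density_gb in (1, 2):        # per-device capacity
        for per_channel in (1, 2):   # normal vs. clamshell
            sizes.add(channels * per_channel * density_gb)
    return sizes

print(sorted(possible_capacities(128)))  # [4, 8, 16]
print(sorted(possible_capacities(192)))  # [6, 12, 24]
```

So a 16 GB card implies either a 128-bit clamshell design (as on the RX 7600 XT) or a 256-bit bus; a 192-bit bus with these parts yields 6, 12, or 24 GB, never 16.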
Posted on Reply
#129
Zach_01
AusWolf16 GB on a 192-bit bus is not possible. It's either 8/16 GB on 128-bit, or 12 GB on 192-bit. A site like Digital Trends should know this.


No you don't. Have you tried Alan Wake 2 without RT? It's basically indistinguishable from using RT, I'd say.

It's more prominent in Cyberpunk, but even there, all I see right away without pixel-peeping is shiny puddles. Not exactly a revolution in gaming tech.


Good stuff. :) 14k vs 14.5k is already a match, if I dare to say (within a difference undetectable to the naked eye).

In other news: We might get something - a full release, or maybe just some more info, on 22 Jan.
videocardz.com/newz/radeon-rx-9070-xt-announcement-expected-on-january-22-review-samples-shipping-already
Just don't forget the CPU score.
Unlike Speed Way, the CPU affects the final score in Time Spy.
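This point can be made concrete. As a rough sketch, the Time Spy overall score is a weighted harmonic mean of the graphics and CPU sub-scores; the 0.85/0.15 weights are an assumption based on UL's published 3DMark scoring formula, so treat them as illustrative:

```python
# Rough sketch of Time Spy's overall score: a weighted harmonic mean of
# the graphics and CPU sub-scores. The 0.85/0.15 weights are assumptions
# drawn from UL's 3DMark technical guide; Speed Way has no CPU test, so
# its score reflects the GPU alone.

def time_spy_overall(graphics_score: float, cpu_score: float,
                     gfx_weight: float = 0.85, cpu_weight: float = 0.15) -> float:
    return (gfx_weight + cpu_weight) / (
        gfx_weight / graphics_score + cpu_weight / cpu_score
    )

# The same GPU result paired with a faster CPU yields a higher overall score:
slow_cpu = time_spy_overall(14591, 9000)
fast_cpu = time_spy_overall(14591, 12000)
print(round(slow_cpu), round(fast_cpu))
```

This is why comparing overall Time Spy numbers across leaks with unknown CPUs can mislead, while Speed Way comparisons isolate the GPU.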
Posted on Reply
#130
GodisanAtheist
I mean, we still don't really know jack about squat, and a lot of the die size rumors regarding N48 appear to have been wrong (lots of leaked guesses of a sub-300 mm² die).

So who's to say N44 isn't a smidge larger than initially guessed and ends up with a 192-bit bus?

Honestly wouldn't surprise me to see N33 get rebranded and crammed in down at the bottom as a 9050, that card can't cost more than a buck to make at this point.
Posted on Reply
#131
Guwapo77
debido666Can't wait to see the 9060. Should be a good card for people that don't spend half their rent or more on GPU's.
Bro, I wish this was half my rent!!
Posted on Reply
#132
Super XP
I don't like the name change. They should have stuck with 9700 XT. Anyhow, hopefully the 9070 XT performs near or at 7900 XT levels for a decent price tag.
It's too bad Nvidia is pushing gimmicks like frame generation & charging a high cost for this turd.
Posted on Reply
#133
Chrispy_
AusWolfIt clearly says "and an RX 9060 XT with 16GB", then it speculates about a 192-bit bus, there's no way around it. You know it's wrong, I know it's wrong, but for the masses, it's just misinformation.

The sentence should have gone like
"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 12 GB on a 192-bit interface."
It's ambiguous. I'm not defending that, and I hate ambiguous wording, but I'm just the messenger; it's their article, not mine. If they hadn't inserted the word "now" (bolded below) I'd be inclined to agree with your interpretation of what they wrote, but the word "now" completely changes the meaning, and makes little sense in the sentence otherwise.

"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 16GB, but the XT model might now get a wider, 192-bit interface."

To my understanding of English grammar, the ", but" and "now" are two strong indicators that the part of the sentence talking about a wider, 192-bit interface contradicts the first part of the sentence talking about the 128-bit bus.

I agree that it should be worded more clearly, but DigitalTrends aren't clueless. I'm fairly certain they know you can't have 16GB on a 192-bit bus, based on the level of technical depth some of their other articles go into, so I'm willing to chalk this one up to ambiguous wording rather than technical ignorance.

Anyway, it is what it is. Hopefully the smaller Navi44 die is a 192-bit product, because that would make the entry level an absolute banger for 2025 :)
Posted on Reply
#134
flyingdoc
It would be nice to stick with AMD, but I've had so many issues with my 7900XTX. So many TDRs. I've followed everything out there to find some sort of stability. Honestly, I cannot stick with them anymore. The only thing I found that somewhat stabilizes the card from crashing all the time is... to set the max frequency to 90%. So basically take 10% off the card's performance.
Posted on Reply
#135
Chrispy_
flyingdocIt would be nice to stick with AMD, but I've had so many issues with my 7900XTX. So many TDRs. I've followed everything out there to find some sort of stability. Honestly, I cannot stick with them anymore. The only thing I found that somewhat stabilizes the card from crashing all the time is... to set the max frequency to 90%. So basically take 10% off the card's performance.
Sounds like you have a defective GPU, really.

I have both Nvidia and AMD at home, and build mostly Nvidia machines at work these days, and it's not any better on the Nvidia side. Faulty hardware is faulty; it doesn't matter what brand logo is on it.
Posted on Reply
#136
Zach_01
Chrispy_Sounds like you have a defective GPU really.
Or insufficient power delivery (I don't mean PSU wattage only), or some other system issue…

0 issues with the 7900XTX for more than a year now.
Posted on Reply
#138
flyingdoc
1200 watt PSU. Replaced from an older 1200 watt unit.
Posted on Reply
#139
ModEl4

If you check AMD's slide, it places the top RX 9060 model (XT) at performance higher than the RX 7700 XT, which means either it will be a cut-down Navi 48 variant with a 192-bit bus,
or the other option would be Navi 44 with 64 RBs and 2560 shaders, which would probably match/slightly exceed the 7700 XT in FHD (and maybe match it at QHD) if it has high enough clocks (at least 3040 MHz turbo for the reference model) paired with a 128-bit bus and 16GB of memory (at this performance level, and price range accordingly, 8GB would be a detriment).
If the package size of Navi 44 is just 29 x 29 mm, we are probably talking about a max ~162 mm² die size, so it's nearly impossible to house a 192-bit bus design at these dimensions.
Posted on Reply
#140
Zach_01
flyingdocReplaced twice! Same issues.
Then it's not the card, unless you are the most unlucky buyer of the year.
flyingdoc1200 watt PSU. Replaced from an older 1200 watt unit.
Wattage doesn't tell the whole story, unfortunately. Although 1.2 kW sounds like enough for 2x 7900XTX, to be honest.
Sometimes other factors of PSUs play a role too.

What variant of 7900XTX was it?

After 2x GPU replacements, that equals 3 GPUs (assuming different ones and not the same one sent back). I would be convinced that the GPU(s) had issues (by design) only if they were tested on a different PC and the issue remained.
Sorry, but I've been dealing with PCs for 25 years and have had too many GPUs, and these kinds of issues could be anything.
Posted on Reply
#141
3valatzy
ModEl4
If you check AMD slide
Where are the RTX 5090 and RTX 4090 in this slide?
It places the RX 9070 series around RTX 5050 - RTX 5060 performance, while the RX 9060 series sits around RTX 5030 - 5010 performance levels!
Posted on Reply
#142
flyingdoc
Zach_01Then it's not the card, unless you are the most unlucky buyer of the year.
Wattage doesn't tell the whole story, unfortunately. Although 1.2 kW sounds like enough for 2x 7900XTX, to be honest.
Sometimes other factors of PSUs play a role too.
What variant of 7900XTX was it?
After 2x GPU replacements, that equals 3 GPUs (assuming different ones and not the same one sent back). I would be convinced that the GPU(s) had issues (by design) only if they were tested on a different PC and the issue remained.
Sorry, but I've been dealing with PCs for 25 years and have had too many GPUs, and these kinds of issues could be anything.
Exactly, it is not the card. It is the drivers.

1.) Two different rigs tested with different X670E motherboards.
2.) DDU and AMD Clean Utility.
3.) Windows 10 and 11 tested
4.) Memory tested with no overclock.
5.) BIOS updated.
6.) 12 months of AMD drivers tested.
7.) Two power supplies. With new cables tested.
8.) Multiple replaced GPUs.
9.) If you google around the net, there are many reported cases of TDRs not being fixed at all. This is an AMD driver issue with certain games and it will never be resolved.
Posted on Reply
#143
ModEl4
3valatzyWhere are the RTX 5090 and RTX 4090 in this slide?
It places RX 9070 series around RTX 5050 - RTX 5060 performance, while RX 9060 series around RTX 5030 - 5010 performance levels !
With this slide, AMD is probably showing which competing models Nvidia offers in relation to its own products based on price, so it doesn't correspond to the performance of the Nvidia models, but to their price.
Posted on Reply
#144
Zach_01
flyingdocExactly, it is not the card. It is the drivers.

1.) Two different rigs tested with different X670E motherboards.
2.) DDU and AMD Clean Utility.
3.) Windows 10 and 11 tested
4.) Memory tested with no overclock.
5.) BIOS updated.
6.) 12 months of AMD drivers tested.
7.) Two power supplies. With new cables tested.
8.) Multiple replaced GPUs.
9.) If you google around the net, there are many reported cases of TDRs not being fixed at all. This is an AMD driver issue with certain games and it will never be resolved.
Yes, those bad drivers... always.
Any news on what model of 7900XTX it was?
And in what games were the drivers "crashing"?
flyingdocIt would be nice to stick with AMD, but I've had so many issues with my 7900XTX. So many TDRs. I've followed everything out there to find some sort of stability. Honestly, I cannot stick with them anymore. The only thing I found that somewhat stabilizes the card from crashing all the time is... to set the max frequency to 90%. So basically take 10% off the card's performance.
BTW, when you want to reduce clocks, you just limit the power from -1% up to -10% in Adrenalin, and when that is not enough for whatever reason, then you reduce the clock limit.
It's common practice among AMD GPU users when they want to reduce clocks/power (for whatever reason).

I'm starting to doubt whether any of this is real. Did you just join TPU today to make your anti-AMD statement?
You can't help getting that idea when you don't have a single issue with your games, even the most demanding ones, for over a year now...
Posted on Reply
#145
r.h.p
The ShieldThis thing always leaves me astonished. How is it possible that such a useless and, at the same time, very very very expensive feature (from both a hardware and a price standpoint) has reached such a prominent role in EVERY GPU discussion among users?
Why do people ALWAYS pop up with "eh, but the Ray Tracing..."?
Yeah, seems to be the case. I was at the PC store the other day, and a random talk with the tech/returns guy about it led to another random guy jumping in to say how great ray tracing was on his 4700 playing Portal...
Posted on Reply
#146
Col_Panic
TheinsanegamerNYou're right, it IS beyond tiresome how AMD, now on it's 3rd RT generation, cannot meaningfully improve their RT performance, to the point that half a dozen Nvidia cards place above it.
It is not that easy. Almost all "RT" effects implemented in games now are done via CUDA. Since CUDA is proprietary, it's not just a matter of adding more accelerators; they also have to figure out how to make them work correctly. It looks like there may be a "bypass"/"workaround" soon that should narrow the RT gap, if not close it, though.

I'm actually not a fan of the way RT is used most of the time, though. It's usually just slathered over everything, making old, dry, dusty concrete/brick look like it was sprayed with glossy epoxy and buffed to a high-shine finish. Piles of dirt should probably not reflect light like they were hosed down with baby oil, ya know? I'm also pretty bummed that it is pretty much the only type of advanced lighting effect used anymore. I would like to see something else, or at least see it used more intentionally and not just applied to all things, reflective or not.

I get the feeling that the people who spend the most on PC hardware (or have their parents do it) are less worried about playing a good-looking, interesting game than they are about reciting the names of the parts they have and the numbers they can get. I like hot rodding PCs too, but just going out and buying a Ferrari is not "hot rodding" imo. Getting an older system to perform much better than it should is what I enjoy doing.

Although, since all games are pretty much developed for XB/PS and constrained by their many limitations, I see little point in blowing a ton of money on a PC anymore... for a while now, actually. The best a PC can do is look better, but considering it takes everything a PS/XB has just to look as good as it can, there is almost nothing left for much else. That is why basically the same games just keep getting regurgitated with only slightly better graphics. Their scope/scale and depth stay the same, if they are not outright reduced (this often happens towards the end of a generation as developers try to provide better visuals than previous games, which requires them to dumb the underlying game down even more). That's why PC games from like 18-20 years ago have better enemy/NPC AI and so many more options than new games.

I don't care if people want to play games on a console, but since MS and Sony demand that the PC version of a game be the same as their console version other than visuals, PC games are limited by things that would not be a problem on PC. STALKER 2 is a perfect example. The A-Life (AI and offline NPC tracking) system is not anything like it should be. The spawning and de-spawning is almost as bad as Far Cry 5! It's not because they can't do it (they did it almost 20 years ago); the reason is that the Xbox cannot run that in the background. When my PC is maxed out on visuals, I still have like 8+ threads and like 15GB of RAM doing nothing. The Xbox uses EVERY last bit of power just to run the game at medium-high settings at 30fps. They would have to really lower the visual quality to get it to work, and since an Xbox can't do it, they can't let my PC do it. Modders will probably have to fix it, since the licence terms prevent GSC Game World from doing so. It's all so lame.
end /commrant/digressed\;
Posted on Reply