Thursday, January 9th 2025

AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

Although it has only been a few days since the RDNA 4-based GPUs from Team Red hit the scene, it appears that we have already been granted a first look at the 3D Mark performance of the highest-end Radeon RX 9070 XT GPU, and to be perfectly honest, the scores seemingly live up to our expectations, albeit with disappointing ray tracing performance. Unsurprisingly, the thread has been erased over at Chiphell, but folks managed to take screenshots in the nick of time.

The specifics reveal that the Radeon RX 9070 XT will arrive with a massive TBP in the range of 330 watts, as revealed by a FurMark snap, which is substantially higher than previous estimates. With 16 GB of GDDR6 memory, along with base and boost clocks of 2520 and 3060 MHz, the Radeon RX 9070 XT managed to rake in an impressive 14,591 points in Time Spy Extreme, and around 6,345 points in Speed Way. Needless to say, the drivers are likely far from mature, so it is not outlandish to expect a few more points to get squeezed out of the RDNA 4 GPU.
Regarding the scores we currently have, it appears that the Radeon RX 9070 XT fails to match the Radeon RX 7900 XTX in both tests, although it easily exceeds the GeForce RTX 4080 Super in the non-ray-traced TS Extreme test - no small feat, considering that it is expected to cost less than half the price of the RTX 4080 Super. In the Speed Way test, which is a ray-traced benchmark, the RX 9070 XT fails to match the RTX 4080 Super, falling noticeably short. Interestingly, an admin at Chiphell commented that those planning on grabbing an RTX 50 card should wait, further hinting that the GPU world has "completely changed". Considering the lack of context, the interpretation of the statement is debatable, but it does seem RDNA 4 might pack impressive price-to-performance that could give mid-range Blackwell a run for its money.
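For readers who want to express such gaps as percentages, a trivial sketch follows. The 14,591 figure is the leaked Time Spy Extreme score from this article; the comparison score is a hypothetical placeholder for illustration only, not a measured result for any particular card.

```python
# Percentage gap between two 3DMark scores.
# leak_tse comes from the leaked screenshots; rival_tse is a
# hypothetical placeholder value, not real benchmark data.

def score_gap(score_a, score_b):
    """How much faster score_a is than score_b, as a fraction."""
    return score_a / score_b - 1

leak_tse = 14591    # leaked RX 9070 XT Time Spy Extreme score
rival_tse = 13500   # hypothetical comparison score

print(f"Gap: {score_gap(leak_tse, rival_tse):+.1%}")
```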
Sources: Chiphell, @0x22h

121 Comments on AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

#101
ThomasK
_roman_Are you trolling?

You get 150% of the performance and much better software for Windows operating systems with an NVIDIA 4090.
But you want to pay the same price as a slow Radeon 7900 XTX?

Please pick graphics cards with similar performance for price comparisons. I think an NVIDIA 4070 or 4070 Ti, or if it exists, a 4070 Ti Super, is the proper counterpart. (I do not really care about the lower NVIDIA parts.)
Judging by your comments, you are the one trolling.

Check your facts and come back afterwards.
#102
Hakker
Also, FSR4 seems to be really good. Hardware Unboxed had a YT video of it, and it's a huge leap compared to FSR 3.1 - like night and day, and I mean that positively.
#103
Chrispy_
HakkerAlso, FSR4 seems to be really good. Hardware Unboxed had a YT video of it, and it's a huge leap compared to FSR 3.1 - like night and day, and I mean that positively.
I'm quite excited to try out FSR4 and DLSS4.

I really dislike both FSR3 and DLSS3, but if they have genuinely fixed the smearing, ghosting, and temporal blur, that might be enough to convert me. It's so hard to judge via captured, encoded, compressed YouTube footage, though, so the only verdict I will trust is my own eyes - but I do agree that the DF and HUB close-up footage of FSR4 seems very promising from here.

I'm going to grab the cheapest 16GB Nvidia and AMD cards (probably the 5070Ti and vanilla 9070) as soon as they're available for work reasons anyway, so I'll try both and see which one I prefer.
#104
Hakker
Chrispy_I'm quite excited to try out FSR4 and DLSS4.

I really dislike both FSR3 and DLSS3, but if they have genuinely fixed the smearing, ghosting, and temporal blur, that might be enough to convert me. It's so hard to judge via captured, encoded, compressed YouTube footage, though, so the only verdict I will trust is my own eyes - but I do agree that the DF and HUB close-up footage of FSR4 seems very promising from here.

I'm going to grab the cheapest 16GB Nvidia and AMD cards (probably the 5070Ti and vanilla 9070) as soon as they're available for work reasons anyway, so I'll try both and see which one I prefer.
You can see the difference here. Perfect? Probably not, but a massive improvement? Sure as can be.
#105
Chrispy_
Hakker
You can see the difference here. Perfect? Probably not, but a massive improvement? Sure as can be.
Yeah, that's the HUB video I mentioned. The other one is DF:
#106
Macro Device
ThomasKyou are the one trolling.
Well, 4070 Ti Super is pretty accurate. Sure, it's slower in most rasterised scenarios by a noticeable but modest margin (around 10 percent) and has way less VRAM, but it wins at RT, especially if we talk heavy RT, and has more features. It also boasts better energy efficiency. That's why I deem it a fair GPU to call similar to the XTX.

4070 and 4070S surely are trolling. These are meant to compete with 7800 XT / 7900 GRE respectively.
Chrispy_I'm quite excited to try out FSR4 and DLSS4.
Highly doubt it's gonna be better enough to convert you. DLAA on top of a 1080p display (DSR / 1440p display if you can afford a xx70 Ti+ class GPU) still strikes me as THE way to play vidya. Upscaling is a great way to enjoy 4K gaming but idk, GPUs strong enough for DLSS Q / XeSS UQ / FSR Q are unobtanium for most gamers. And at Balanced and lower, it's usually less exciting than plain 1440p. And at 1440p and lower, upscaling is just one last resort. Good to have it as a fall-back option but bad if you need it.
#107
Chrispy_
Macro DeviceHighly doubt it's gonna be better enough to convert you. DLAA on top of a 1080p display (DSR / 1440p display if you can afford a xx70 Ti+ class GPU) still strikes me as THE way to play vidya. Upscaling is a great way to enjoy 4K gaming but idk, GPUs strong enough for DLSS Q / XeSS UQ / FSR Q are unobtanium for most gamers. And at Balanced and lower, it's usually less exciting than plain 1440p. And at 1440p and lower, upscaling is just one last resort. Good to have it as a fall-back option but bad if you need it.
I game at 4K120 in one room (DLSS FG) and 1440p240 in the other (currently FSR FG with as little upscaling as possible, depending on whether the 7800XT has the grunt to run the game fast enough without FG assistance).

Since neither of my GPUs can do modern AAA games at high-refresh native res, I'm usually dropping to 1080p120 on the GeForce or 1440p120 (with strobing) on the Radeon.

I don't want to use DLSS3 or FSR3, but FG is the carrot that keeps me willing to try them occasionally, because FG is a great way to get high refresh rates, so long as the base framerate (half the FG'd framerate) is still over about 80 fps.
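A tiny sketch of that rule of thumb, assuming 2x frame generation (so the latency-driving base framerate is half the displayed framerate); the ~80 fps threshold is Chrispy_'s own comfort bar, not a universal figure.

```python
# Rule-of-thumb check: with 2x frame generation, input latency tracks
# the base (pre-FG) framerate, which is half the displayed framerate.

def base_fps(displayed_fps, fg_factor=2):
    """Pre-FG framerate that actually drives input latency."""
    return displayed_fps / fg_factor

def fg_feels_ok(displayed_fps, threshold=80):
    """True if the underlying framerate clears the ~80 fps comfort bar."""
    return base_fps(displayed_fps) >= threshold

print(fg_feels_ok(120))  # 60 fps base -> False
print(fg_feels_ok(180))  # 90 fps base -> True
```

By this rule, a 4K120 target with 2x FG sits at a 60 fps base, just under the bar, which matches the reluctance expressed above.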
#108
Macro Device
Chrispy_
I game at 4K120 in one room (DLSS FG) and 1440p240 in the other

Maybe you sell one room and buy a 5090 so you don't need upscaling..?
Chrispy_strobing
This one is interesting. I have only seen it on YT videos and not in the real life. My displays don't support this feature.
Chrispy_FG is a great way to get the high-refresh
I can clearly see that half the frames are fake, and this bugs me every time I enable it. Much worse on AMD. I can live with upscaling artifacts, but not with this chicanery. No.

Still, I am interested to see if the 9070 XT makes any difference. Feels like it's about two years too late to the party.
#109
W3RN3R
Regarding upscaling, I don't know why FSR was ever compared to DLSS since its release; you can't compare software-based to hardware-based upscaling, so it was always an unfair comparison. This is AMD's first attempt at machine-learning-based upscaling, and I for one think it's a great start.
#110
freeagent
I ran Speedway yesterday, and I scored 200 points less than the screenshot.

Bah. Now I need a new GPU :banghead:
#111
ThomasK
Macro DeviceWell, 4070 Ti Super is pretty accurate. Sure, it's slower in most rasterised scenarios by a significant yet low margin (around 10 percent) and has way less VRAM but it wins at RT, especially if we talk heavy RT, and has more features. Also boasts more energy efficiency. That's why I deem it a fair GPU to call similar to the XTX.
My opinion about RT was given on the first page of this thread. Feel free to go and have a look.
#112
QuietBob
Okay, so I re-tested Speedway on my reference 7900XTX with stock clocks. This result is 6% (4 fps) higher than the old one, probably due to driver or app optimizations:



That would make the 9070XT possibly equal to the 7900XTX here, rather than faster (as was my previous approximation based on the old result). Accounting for the differences in alleged boost clocks and Ray Accelerator count, RDNA4 would show close to 25% improvement over the previous gen in this particular benchmark.

Naturally, these are still speculations on my part. Speedway is a hybrid RT implementation, and it's likely that in RT heavy games RDNA4 could show bigger improvement.
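The normalization described above can be sketched as follows. Note the inputs are a mix of spec and leak: the RX 7900 XTX's 96 Ray Accelerators and ~2500 MHz boost are its official figures, while the RX 9070 XT's 64 RAs and 3060 MHz boost come from the leaks in this thread, so the result is speculative.

```python
# Hedged sketch of QuietBob's per-unit comparison: if the two cards
# land on roughly equal Speed Way scores, normalize by Ray Accelerator
# count and boost clock to estimate the per-unit generational gain.

def per_unit_throughput(score, ray_accelerators, clock_mhz):
    """Score normalized per Ray Accelerator per MHz."""
    return score / (ray_accelerators * clock_mhz)

score = 6345  # assume roughly equal scores, per the re-test above

# RX 7900 XTX: 96 RAs, ~2500 MHz official boost.
xtx = per_unit_throughput(score, 96, 2500)

# RX 9070 XT: 64 RAs and 3060 MHz boost, per the leak.
n48 = per_unit_throughput(score, 64, 3060)

print(f"Per-RA, per-MHz improvement: {n48 / xtx - 1:.1%}")
```

With these inputs the sketch lands around 22-23%, consistent with the "close to 25%" estimate above.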
#113
Chrispy_
Macro DeviceMaybe you sell one room and buy a 5090 so you don't need upscaling..?
I can have any hardware I want. I don't pay for it and even if I did, money is not a problem.

I took a 3090 out to put a 7800XT in because it was too hot and noisy, and I found all of the RT titles at the time underwhelming anyway, because I hate temporal blur and most RT implementations add a lot of it.

The other PC is running a 16GB 4060Ti because it performs fine when I choke it down to 150W. I'm hoping there's something faster at 150W in this coming round of GPUs from AMD and Nvidia.
#114
Macro Device
Chrispy_I don't pay for it
I want the same perk...
Chrispy_I'm hoping there's something faster at 150W in this coming round of GPUs from AMD and Nvidia.
I'd place my bet on 5070 no doubt and 9070 seems capable of that, too.
#115
Zach_01
QuietBobSpeedway is a hybrid RT implementation, and it's likely that in RT heavy games RDNA4 could show bigger improvement.
I believe so too

What was the GPU clock during your test?
From the graph in the screenshot, I would say around 2200-2300MHz?
And what was the TBP limit?
Macro DeviceI'd place my bet on 5070 no doubt and 9070 seems capable of that, too.
5070 non-Ti and 9070 non-XT to be around 150W?
Highly unlikely.

Both of them will be in the 250~300W range.

NVIDIA has already shown the 5070 reference at 250W, and the 9070 can't be that far behind the 9070 XT; most likely the latter's reference card will start around 330W.

www.techpowerup.com/gpu-specs/geforce-rtx-5070.c4218

The 5060 non-Ti maybe will be around 150W.
#116
springs113
MrDweezil+1 to this, the raw hardware specs of the card don't seem like it should be able to hit the performance levels AMD is claiming.
Where did AMD claim these levels? This is the problem, because I specifically remember a publication saying AMD themselves stated the leaks are incorrect. Nowhere did they say their number was higher or lower. People make false claims way too often.
#117
Macro Device
Zach_015070nonTi and 9070nonXT to be around 150W?
Highly unlikely.

Both of them will be at 250~300W range.
I know, but you can manually tune them to eat no more than 150W, and I'm sure as hell they both will destroy the 4060 Ti at this power limit.
#118
QuietBob
Zach_01What was the GPU clock during your test?
2275 - 2333 MHz with default TBP of 355 W:

#119
Zach_01
Macro DeviceI know, but you can manually tune them to eat no more than 150W, and I'm sure as hell they both will destroy the 4060 Ti at this power limit.
Sure, take (almost) any GPU, cut power by 40-45%, and you lose "only" 20~25% performance.

For an OC version of the same GPU, a 40% decrease in power could mean an even smaller performance drop, like 10-15%.
#120
freeagent
Just posting a Speedway result from a 4070 Ti for reference, in case inquiring minds want to know... also, it is overclocked...



Pathetic :(
#121
Chrispy_
Zach_015070 non-Ti and 9070 non-XT to be around 150W?
Highly unlikely.

Both of them will be in the 250~300W range.

NVIDIA has already shown the 5070 reference at 250W, and the 9070 can't be that far behind the 9070 XT; most likely the latter's reference card will start around 330W.

www.techpowerup.com/gpu-specs/geforce-rtx-5070.c4218

The 5060 non-Ti maybe will be around 150W.
50-series is on the same process node as 40-series, so I would expect most models to increase in TDP. My hope is that the 5060Ti isn't quite as hamstrung on memory bandwidth as the 4060Ti.

On the AMD side, there's a node shrink from 5+6nm to 4nm, so there's potential for some excellent undervolting there. The TDPs we're seeing at the moment seem to be factory OC models pushing >3GHz, which is always going to be so far beyond the efficiency sweet spot that it's ridiculous. We don't know the configs of the 9060-series yet, nor the vanilla 9070, but I'm hoping that the 9070-series can be run at, say, ~2.4GHz at somewhere between 150 and 200W. I'm usually seeing 50% power draw for 80% clock speeds with RDNA2 and RDNA3.
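That 50%-power-for-80%-clocks observation lines up with a common rule of thumb: dynamic power scales roughly with frequency times voltage squared, and voltage tracks frequency near the top of the V/F curve, giving power proportional to clock cubed. A toy sketch of that model, not measured data for any specific card:

```python
# Toy cubic model for GPU power vs. clock: P ~ f^3 near the top of the
# voltage/frequency curve. Purely illustrative; real cards have fixed
# overheads (memory, fans, VRM losses) that flatten the curve.

def power_fraction(clock_fraction):
    """Relative power at a relative clock, under the cubic model."""
    return clock_fraction ** 3

# The observation above: ~80% clocks should need about half the power.
print(f"80% clocks -> {power_fraction(0.80):.0%} power")

# Earlier in the thread: a 40-45% power cut costing ~20-25% performance
# sits in the same ballpark under this model.
print(f"75% clocks -> {power_fraction(0.75):.0%} power")
```

Under this model, 0.8 cubed is about 0.51, which is why the 50%/80% rule keeps showing up in undervolting results.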