Thursday, January 9th 2025
AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way
Although it has only been a few days since the RDNA 4-based GPUs from Team Red hit the scene, it appears that we have already been granted a first look at the 3D Mark performance of the highest-end Radeon RX 9070 XT GPU, and to be perfectly honest, the scores seemingly live up to our expectations - although with disappointing ray tracing performance. Unsurprisingly, the thread has been erased over at Chiphell, but folks have managed to take screenshots in the nick of time.
The specifics reveal that the Radeon RX 9070 XT will arrive with a massive TBP in the range of 330 watts, as revealed by a FurMark snap, which is substantially higher than previous estimates. With 16 GB of GDDR6 memory, along with base and boost clocks of 2520 and 3060 MHz, the Radeon RX 9070 XT managed to rake in an impressive 14,591 points in Time Spy Extreme, and around 6,345 points in Speed Way. Needless to say, the drivers are likely far from mature, so it is not outlandish to expect a few more points to get squeezed out of the RDNA 4 GPU.
Regarding the scores we currently have, it appears that the Radeon RX 9070 XT fails to match the Radeon RX 7900 XTX in either test, although it easily exceeds the GeForce RTX 4080 Super in the non-ray-traced Time Spy Extreme run - no small feat, considering that it is expected to cost less than half the price of the RTX 4080 Super. In the Speed Way test, which is a ray-traced benchmark, the RX 9070 XT falls noticeably short of the RTX 4080 Super. Interestingly, an admin at Chiphell commented that those planning on grabbing an RTX 50 card should wait, further hinting that the GPU world has "completely changed". Considering the lack of context, the interpretation of that statement is debatable, but it does seem RDNA 4 might pack impressive price-to-performance that could give mid-range Blackwell a run for its money.
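For readers who like to sanity-check leaks themselves, here is a minimal sketch (Python) of the back-of-the-envelope comparison these numbers invite. Only the two RX 9070 XT figures come from the leaked screenshots; the reference scores in the snippet are hypothetical placeholders for illustration, not measured results.

```python
# Minimal sketch: turning raw 3D Mark graphics scores into percentage deltas.
# Only the two RX 9070 XT numbers come from the leaked screenshots; the
# reference entries below are HYPOTHETICAL placeholders, not measured results.

LEAKED_RX_9070_XT = {
    "Time Spy Extreme": 14_591,
    "Speed Way": 6_345,
}

# Placeholder reference scores - substitute independently reviewed numbers
# for the cards you care about before drawing any conclusions.
HYPOTHETICAL_REFERENCE = {
    "Time Spy Extreme": 14_000,
    "Speed Way": 7_000,
}

def delta_percent(score: int, reference: int) -> float:
    """How far `score` sits above (+) or below (-) `reference`, in percent."""
    return (score - reference) / reference * 100.0

for test, score in LEAKED_RX_9070_XT.items():
    ref = HYPOTHETICAL_REFERENCE[test]
    print(f"{test}: {score} vs {ref} -> {delta_percent(score, ref):+.1f}%")
```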
Sources:
Chiphell, @0x22h
95 Comments on AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way
- The die size of the 9070 XT is bigger than the 4080's, but it has 2k fewer shaders than the 7900 XT despite being on a smaller process node.
- AMD has explicitly stated in public that the “performance figures” are all wrong.
- Rumors that they have deliberately sent out gimped drivers to throw off leaks.
Something else is under the hood of the 9070 that AMD is not sharing with us, something that is taking up valuable real estate - perhaps a fruition of their purchase of Xilinx, or some new type of programming unit. If the 9700 Pro was "Isildur" slicing the fingers off Nvidia's hand, the 9070 XT might be the return of the King.
I do hope AMD is allowing disappointing leaks out, and that the final product is much more impressive, but... Even AMD themselves are positioning the RX 9070 (XT) as a replacement for the 7900 GRE - XT 'range'.
^That slide alone is what had me pull the trigger on a $900+ 24GB XTX, right after CES. AMD has no replacement tier card.
Funny enough, AMD seems to be ending the separation of RDNA 'Graphics' and CDNA 'Compute' the exact same way it started:
The RX 7900 XTX becomes the 'Radeon VII of Today'
and the
RX 9070 (XT) becomes the 'RX 5700 (XT) of Today'
www.pcgamer.com/hardware/graphics-cards/from-the-developers-standpoint-they-love-this-strategyamds-plan-to-merge-its-rdna-and-cdna-gpu-architectures-to-a-unified-system-called-udna/
Extraordinarily wishful thinking.
The situation has changed considerably since then. nVidia is a monster of a company today - resources and all.
-------------------------------------------------------------------------------------------------------------
Personally, this is shaping up to be ever more appetising to me as my upgrade path. If it really is:
- A raster match~ish for a 7900XTX or 4080/S
- RT performance that is generationally ahead of Ampere
- FSR4 (or what they directly called a research project) is as good as what we saw in their booth across all or at least most games (and not just fine-tuned for one title), and is either widely adopted or able to be substituted in place of FSR 3.1, as has been rumoured
- Some AIB cards have 2x HDMI 2.1
- And of course, priced to party...
Well then I'm going to have a hard time justifying to myself paying a bare minimum of $1519 AUD for a 5070 Ti or better. Bring on the release and reviews.
And it's not like I had the lowest RT setting turned on - it was a 4090 with everything cranked. Some scenes look cool, but then you turn off RT and realize they're just as cool with it off, but now you also get 150 FPS.
ATi at the time, with their X800 series, was just running Pixel Shader 2.0. I'm going to leave an example here of the difference with HDR on/off from back in the old days with The Elder Scrolls IV: Oblivion.
With HDR:
Without HDR:
I used to think HDR was the shizz nizz back in the day, but as I've been playing over the years, I've noticed that the HDR colours were very bright and that Bloom was a more proper representation of real-life colours.
You could also run 8x AA with Bloom but not with HDR, which is what I've done in the example screenshots shown above.
Which one do you think is better?
Great screen shots of your RT implementation btw. Nice Work!
Wow, there's hardly any difference! :rolleyes:
The 8x AA looks 'better', but the crisp no-AA look with the oversaturated colors kind of has that Oblivion mood. When I got my mitts on the 8800 GT you could do HDR with CSAA and a 60 FPS vsync lock, which back then was like the pinnacle of gaming graphics for me.
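For anyone who never fiddled with those old settings: Bloom worked on the already-clamped low-dynamic-range image (blur the bright areas and add them back), whereas the HDR option kept floating-point brightness through the pipeline and tone-mapped it down at the end, which is why it was so much heavier and clashed with MSAA on the hardware of the day. Below is a purely illustrative Python sketch of the two ideas; the filter and tone-mapping curve here are assumptions for demonstration, not Oblivion's actual shaders.

```python
# Purely illustrative sketch of the two Oblivion-era options - not the game's
# actual shader code. Operates on a 1D list of linear scene luminances.

def bloom_ldr(scene, threshold=0.8, strength=0.5):
    """LDR 'Bloom': clamp the scene to [0, 1] first, then add back a blurred
    copy of the pixels that exceed a brightness threshold."""
    clamped = [min(pixel, 1.0) for pixel in scene]
    bright = [max(pixel - threshold, 0.0) for pixel in clamped]
    # crude 3-tap box blur standing in for the real blur passes
    blurred = [
        (bright[max(i - 1, 0)] + bright[i] + bright[min(i + 1, len(bright) - 1)]) / 3.0
        for i in range(len(bright))
    ]
    return [min(c + strength * b, 1.0) for c, b in zip(clamped, blurred)]

def hdr_tonemap(scene, exposure=1.0):
    """'HDR': keep the full floating-point range through the pipeline and only
    compress it to a displayable [0, 1] at the end (simple Reinhard curve)."""
    exposed = [pixel * exposure for pixel in scene]
    return [x / (1.0 + x) for x in exposed]

if __name__ == "__main__":
    scene = [0.1, 0.4, 1.5, 6.0, 0.7]  # values above 1.0 are "brighter than white"
    print("bloom:", [round(v, 3) for v in bloom_ldr(scene)])
    print("hdr:  ", [round(v, 3) for v in hdr_tonemap(scene)])
```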
I haven't read the whole thread guys, but the 9070 XT doesn't look too bad if they price it competitively.
Anybody got any idea of these cards pricing atm?
And we have people here saying that I don't care about features. Of course I do when they're good. The problem with features these days is that they either make your game run like a slideshow (RT) or make it a blurry mess (upscaling), not to mention manufacturers use them as an excuse to charge more for cards that have no business being in the price range they're in, which I find disgusting.
Pity for me, much like CP2077, it's just not quite my kind of game from an actual gameplay perspective.
For RT, clearly I'm an enjoyer but that doesn't mean I vouch for universally turning it on in every game, every situation and so on. But boy I've had times it has absolutely added to the visual immersion and to an extent - blown me away.
AMD's talk and posturing would seem to suggest they are focusing on it too, seeing its merit in customer attraction and a more rounded, capable product. They already have a fairly dedicated crowd of people that are all about their cards; their bigger issue is getting the ones that don't currently use them, and perhaps haven't for a few generations, to come back or jump on.
STOP RIGHT THERE, CRIMINAL SCUM. Nobody plays Oblivion with the nasty bloom shader on my watch. I'm confiscating your stolen goods. Now pay your fine, or it's off to jail.
I remember when I first turned on HDR vs Bloom with the GeForce cards - geez, it chewed the card. Kinda why I'm referencing it a bit against what RT does this day and age.
Thread/