Thursday, January 9th 2025

AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

Although it has only been a few days since the RDNA 4-based GPUs from Team Red hit the scene, it appears that we have already been granted a first look at the 3D Mark performance of the highest-end Radeon RX 9070 XT GPU, and to be perfectly honest, the scores seemingly live up to our expectations, although the ray tracing performance is disappointing. Unsurprisingly, the thread has since been erased over at Chiphell, but folks managed to take screenshots in the nick of time.

The specifics reveal that the Radeon RX 9070 XT will arrive with a massive TBP in the range of 330 watts, as revealed by a FurMark snap, which is substantially higher than previous estimates. With 16 GB of GDDR6 memory, along with base and boost clocks of 2520 and 3060 MHz, the Radeon RX 9070 XT managed to rake in an impressive 14,591 points in Time Spy Extreme and around 6,345 points in Speed Way. Needless to say, the drivers are likely far from mature, so it is not outlandish to expect a few more points to be squeezed out of the RDNA 4 GPU.
Going by the scores we currently have, the Radeon RX 9070 XT fails to match the Radeon RX 7900 XTX in both tests, although it easily exceeds the GeForce RTX 4080 Super in the non-ray-traced Time Spy Extreme test, which is no small feat considering that it is expected to cost less than half as much. In the ray-traced Speed Way test, however, the RX 9070 XT falls noticeably short of the RTX 4080 Super. Interestingly, an admin at Chiphell commented that those planning on grabbing an RTX 50 card should wait, further hinting that the GPU world has "completely changed". Given the lack of context, the interpretation of the statement is debatable, but it does suggest RDNA 4 might pack price-to-performance impressive enough to give mid-range Blackwell a run for its money.
Sources: Chiphell, @0x22h

95 Comments on AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way

#76
Acuity
Chrispy_Exactly. 3DMark is a mess for this stupid reason and that's why I asked, because honestly the numbers are totally meaningless without stock results to compare to.

Does any website test GPUs at stock settings with 3DMark and publish scores of GPUs that are representative of what people actually own?
On the 3DMark website's search section, you can enter the stock GPU and memory clock values of the 7900 XT or XTX and compare against results that haven't been overclocked.
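In script form, the idea is something like the sketch below (purely illustrative: the clock values and scores are made up, and the 3DMark site does this filtering for you - no official API is being used here):

```python
from dataclasses import dataclass

@dataclass
class Submission:
    card: str
    gpu_clock_mhz: int    # reported average core clock
    mem_clock_mhz: int    # reported memory clock
    graphics_score: int

# Made-up entries purely for illustration, not real 3DMark submissions.
submissions = [
    Submission("RX 7900 XTX", 2500, 2500, 14000),
    Submission("RX 7900 XTX", 2670, 2600, 15500),  # overclocked run
    Submission("RX 7900 XT",  2400, 2500, 12000),
]

def stock_results(entries, card, stock_gpu, stock_mem, tolerance=25):
    """Keep only runs whose reported clocks sit within a small tolerance of stock."""
    return [s for s in entries
            if s.card == card
            and abs(s.gpu_clock_mhz - stock_gpu) <= tolerance
            and abs(s.mem_clock_mhz - stock_mem) <= tolerance]

for s in stock_results(submissions, "RX 7900 XTX", stock_gpu=2500, stock_mem=2500):
    print(s)
```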
Zach_015900X + 7900XTX (TBP 366+10%=402W, GPU clock 2620~2670MHz, VRAM 2600MHz)

What brand is your GPU? Sapphire?
Posted on Reply
#77
Sound_Card
Something is very off about all of this if you think about it.
  • The die size of the 9070xt is bigger than the 4080, but it has 2k fewer shaders than the 7900xt despite being on a smaller process node.
  • AMD has explicitly stated in public that the “performance figures” are all wrong.
  • Rumors that they have deliberately sent out gimped drivers to throw off leaks.
Something else is under the hood of the 9070 that AMD is not sharing with us that is taking up valuable real estate. Something that could be the fruit of their purchase of Xilinx or some new type of programming unit.

If the 9700pro was "Isildur" slicing the fingers off of Nvidia's hand, the 9070xt might be the return of the King.
Posted on Reply
#78
LabRat 891
Sound_CardSomething is very off about all of this if you think about it.
  • The die size of the 9070xt is bigger than the 4080, but it has 2k fewer shaders than the 7900xt despite being on a smaller process node.
  • AMD has explicitly stated in public that the “performance figures” are all wrong.
  • Rumors that they have deliberately sent out gimped drivers to throw off leaks.
Something else is under the hood of the 9070 that AMD is not sharing with us that is taking up valuable real estate. Something that could be the fruit of their purchase of Xilinx or some new type of programming unit.
Perhaps. I've wondered where the die real estate went to, myself.
I do hope AMD is allowing disappointing leaks out, and that the final product is much more impressive, but... Even AMD themselves are positioning the RX 9070 (XT) as a replacement for the 7900 GRE - XT 'range'.

^That slide alone is what had me pull the trigger on a $900+ 24 GB XTX right after CES. AMD has no replacement-tier card.

Funny enough, AMD seems to be ending the separation of RDNA 'Graphics' and CDNA 'Compute' the exact same way it started:
The RX 7900 XTX becomes the 'Radeon VII of Today'
and the
RX 9070 (XT) becomes the 'RX 5700 (XT) of Today'
www.pcgamer.com/hardware/graphics-cards/from-the-developers-standpoint-they-love-this-strategyamds-plan-to-merge-its-rdna-and-cdna-gpu-architectures-to-a-unified-system-called-udna/
Sound_CardIf the 9700pro was "Isildur" slicing the fingers off of Nvidia's hand, the 9070xt might be the return of the King.
Extraordinarily wishful thinking.
The situation has changed considerably since then. nVidia is a monster of a company today - resources and all.
Posted on Reply
#79
AusWolf
Sound_CardSomething is very off about all of this if you think about it.
  • The die size of the 9070xt is bigger than the 4080, but it has 2k fewer shaders than the 7900xt despite being on a smaller process node.
Perhaps by "improved RT" they meant they're giving us more RT cores? Or maybe the AI that does FSR 4 is taking up space? It could explain why some models need so much power. Personally, as long as it's a fine card for a decent price, I don't care.
Sound_Card
  • AMD has explicitly stated in public that the “performance figures” are all wrong.
Where?
Sound_Card
  • Rumors that they have deliberately sent out gimped drivers to throw off leaks.
Why would they have done that?
Sound_CardSomething else is under the hood of the 9070 that AMD is not sharing with us that is taking up valuable real estate. Something that could be the fruit of their purchase of Xilinx or some new type of programming unit.

If the 9700pro was "Isildur" slicing the fingers off of Nvidia's hand, the 9070xt might be the return of the King.
If the price is right, it very well may be.
Posted on Reply
#80
Zach_01
AcuityWhat brand is your GPU? Sapphire?
Yes, the Nitro+
Posted on Reply
#81
mkppo
Honestly, I couldn't give a rat's tit about RT performance, but if raster comes in around the same as the 7900 XT, it would be pretty decent. I'm more interested in figuring out what they've done die-wise, because it looks strangely similar to two 9060 XTs side by side. Whether they're up to some sort of modular architecture, I'm not sure, but I want that die annotation.
Posted on Reply
#82
wolf
Better Than Native
clopeziHave you checked Cyberpunk, Indiana Jones or Alan Wake II with Full RT? It tanks performance hugely, yes, but it's beautiful.
I wouldn't expect the usual suspects here to admit that even if they had seen it, tbh. It would be easy enough to showcase some gorgeous differences and cherry-pick those screenshots or video segments (much like what was done to show how little difference it can make - which I don't deny, depending on the game or scene). But I certainly wouldn't expect to convince those so vocally against it anyway; they've already made up their minds and appear to enjoy patting themselves on the back for it.

-------------------------------------------------------------------------------------------------------------

Personally, this is shaping up to be ever more appetising to me as my upgrade path. If it really is:
  • A raster match~ish for a 7900XTX or 4080/S
  • RT performance that is generationally ahead of Ampere
  • FSR4 (or what they directly called a research project) is as good as what we saw in their booth for all, or at least most, games (and not just fine-tuned for one title), and is widely adopted or can be substituted in place of 3.1 as has been rumoured
  • Some AIB cards have 2x HDMI 2.1
  • And of course, priced to party...
Well then I'm going to have a hard time justifying to myself paying a bare minimum of $1519 AUD for a 5070Ti or better.

Bring on the release and reviews.
Posted on Reply
#83
r.h.p
phanbueyI'm talking about Cyberpunk specifically. I've poured 600+ hours into that game -- the RT is one of the best implementations I've seen, and it still looks like crap (IMO).




It's grainier, blurrier:



Pick the RT shot -- it's the one on the left.
Maybe my eyes are bad, but it's hardly any different IMO
Posted on Reply
#84
phanbuey
r.h.pMaybe my eyes are bad, but it's hardly any different IMO
Right - it's almost the same, with the RT being slightly blurrier... for -60% FPS.

And it's not like I had the lowest RT setting turned on - it was a 4090 with everything cranked. Some scenes look cool, but then you turn off RT and realize they're just as cool with it off, but now you also get 150 FPS.
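To put that "-60% FPS" into frame-time terms, here's a quick back-of-the-envelope calculation using only the numbers quoted above (illustrative arithmetic, not a measurement):

```python
# Numbers taken from the post above: ~150 FPS with RT off, roughly a 60% drop with RT on.
fps_raster = 150
rt_cost = 0.60                           # fraction of frame rate lost when RT is enabled
fps_rt = fps_raster * (1 - rt_cost)      # 60 FPS

frametime_raster_ms = 1000 / fps_raster  # ~6.7 ms per frame
frametime_rt_ms = 1000 / fps_rt          # ~16.7 ms per frame

print(f"{fps_rt:.0f} FPS with RT on")
print(f"{frametime_raster_ms:.1f} ms -> {frametime_rt_ms:.1f} ms per frame")
```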
Posted on Reply
#85
Outback Bronze
phanbueyI'm talking about Cyberpunk specifically. I've poured 600+ hours into that game -- the RT is one of the best implementations I've seen, and it still looks like crap (IMO).




It's grainier, blurrier:



Pick the RT shot -- it's the one on the left.
RT reminds me of what Nvidia did several moons ago with HDR - Pixel Shader 3.0. If my memory serves me correctly, it was the 6xxx series that brought HDR, which was their sales pitch.

ATi at the time with their X800 series was just running Pixel Shader 2.0. I'm going to leave an example here of the difference between HDR on and off from back in the old days with The Elder Scrolls IV: Oblivion.

With HDR:


Without HDR:

I used to think HDR was the shizz nizz back in the day, but as I've kept playing over the years, I've noticed that the HDR colours were very bright and that bloom was a more accurate representation of real-life colours.

You could also run 8x AA with bloom but not with HDR, which is what I've done in the example screenshots shown above.

Which one do you think is better?

Great screenshots of your RT implementation btw. Nice work!
Posted on Reply
#86
AusWolf
phanbueyRight - it's almost the same, with the RT being slightly blurrier... for -60% FPS.

And it's not like I had the lowest RT setting turned on - it was a 4090 with everything cranked. Some scenes look cool, but then you turn off RT and realize they're just as cool with it off, but now you also get 150 FPS.
Exactly. No one denies that RT is nice. The problem is the performance cost even on Nvidia, and the fact that it isn't really a night and day difference, just a little icing on the cake. If you turn it on, you see it's nice. But then you turn it off and still enjoy your game just the same. After 5-10 minutes, you don't even care.
Posted on Reply
#87
wolf
Better Than Native
r.h.pMaybe my eyes are bad, but it's hardly any different IMO
Someone posts screenshots chosen to demonstrate no/little difference;

Wow, there's hardly any difference! :rolleyes:
Posted on Reply
#88
phanbuey
Outback BronzeRT reminds me of what Nvidia did several moons ago with HDR - Pixel Shader 3.0. If my memory serves me correctly, it was the 6xxx series that brought HDR, which was their sales pitch.

ATi at the time with their X800 series was just running Pixel Shader 2.0. I'm going to leave an example here of the difference between HDR on and off from back in the old days with The Elder Scrolls IV: Oblivion.

With HDR:


Without HDR:

I used to think HDR was the shizz nizz back in the day, but as I've kept playing over the years, I've noticed that the HDR colours were very bright and that bloom was a more accurate representation of real-life colours.

You could also run 8x AA with bloom but not with HDR, which is what I've done in the example screenshots shown above.

Which one do you think is better?

Great screenshots of your RT implementation btw. Nice work!
That game was amazing -- I am partial to the HDR oversaturated mushroom trip version - especially with the expansions. Bethesda at their peak.

The 8xAA looks 'better', but the crisp no-AA look with the oversaturated colors kind of has that Oblivion mood. When I got my mitts on the 8800GT, you could do HDR with CSAA and a 60 fps vsync lock, which back then was like the pinnacle of gaming graphics for me.
Posted on Reply
#89
Outback Bronze
phanbueyWhen I got my mitts on the 8800GT
I was running an X800 XT PE at the time of Oblivion, so only Pixel Shader 2.0 when I first started playing it. Then I got a 7800 GT, which allowed me to run Pixel Shader 3.0. Yes, they were great graphics for that era, but it wasn't until I was running Crysis in DX10 on the 8800 GTS 640 MB that I thought I'd seen the pinnacle of graphics, and it held that title for some time, I might add.

I haven't read the whole thread guys, but the 9070 XT doesn't look too bad if they price it competitively.

Anybody got any idea of these cards' pricing atm?
Posted on Reply
#90
AusWolf
phanbueyThat game was amazing -- I am partial to the HDR oversaturated mushroom trip version - especially with the expansions. Bethesda at their peak.

The 8xAA looks 'better', but the crisp no-AA look with the oversaturated colors kind of has that Oblivion mood. When I got my mitts on the 8800GT, you could do HDR with CSAA and a 60 fps vsync lock, which back then was like the pinnacle of gaming graphics for me.
I agree. That game made me swap my amazing ATi X800 XT for an overheating, loud mess of a card known as the 7800 GS AGP, just to be able to play it with HDR. Good old times! :)

And we have people here saying that I don't care about features. Of course I do, when they're good. The problem with features these days is that they either make your game run like a slideshow (RT) or make it a blurry mess (upscaling), not to mention that manufacturers use them as an excuse to charge more for cards that have no business being in the price range they're in, which I find disgusting.
Posted on Reply
#91
wolf
Better Than Native
Outback BronzeWith HDR:
Honestly love this one, with the colour saturation, sky highlight and seemingly deeper contrast. The 6800 Ultra was the first top-end GPU I ever bought, and it was great to taste those visuals.

Pity for me, much like CP2077, it's just not quite my kind of game from an actual gameplay perspective.

For RT, clearly I'm an enjoyer but that doesn't mean I vouch for universally turning it on in every game, every situation and so on. But boy I've had times it has absolutely added to the visual immersion and to an extent - blown me away.

AMD's talk and posturing would seem to suggest they are focusing on it too, seeing its merit in attracting customers and making a more rounded, capable product. They already have a fairly dedicated crowd of people who are all about their cards; their bigger issue is getting the ones who don't currently use them, and perhaps haven't for a few generations, to come back/jump on.
Posted on Reply
#92
Dr. Dro
Outback BronzeRT reminds me of what Nvidia did several moons ago with HDR - Pixel Shader 3.0. If my memory serves me correctly, it was the 6xxx series that brought HDR, which was their sales pitch.

ATi at the time with their X800 series was just running Pixel Shader 2.0. I'm going to leave an example here of the difference between HDR on and off from back in the old days with The Elder Scrolls IV: Oblivion.

With HDR:


Without HDR:

I used to think HDR was the shizz nizz back in the day, but as I've kept playing over the years, I've noticed that the HDR colours were very bright and that bloom was a more accurate representation of real-life colours.

You could also run 8x AA with bloom but not with HDR, which is what I've done in the example screenshots shown above.

Which one do you think is better?

Great screenshots of your RT implementation btw. Nice work!
Ah, that was DirectX 9.0c. Shader Model 3.0 brought parity between DirectX on Windows and the Xbox 360's graphics capabilities; Oblivion's bloom shader was a fallback path for older DirectX 9.0b cards like the GeForce FX series, which were about three years old when it came out. Oblivion really stretched the pre-unified-shader GPUs to the max, and IMO it still looks stunning to this day. No official pricing info on the new Radeon cards either; I reckon it's coming soon. That being said...

STOP RIGHT THERE, CRIMINAL SCUM. Nobody plays Oblivion with the nasty bloom shader on my watch. I'm confiscating your stolen goods. Now pay your fine, or it's off to jail.
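On the bloom-versus-HDR point the thread keeps coming back to, here is a rough toy sketch of the two ideas (this is not Oblivion's actual shader code, and the thresholds and values are made up): the SM2.0 bloom path works on clamped low-dynamic-range colour and just adds back a blurred halo of the brightest pixels, while the HDR path keeps floating-point scene intensities above 1.0 and tone-maps them down to the display range.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ldr_bloom(img_u8, threshold=200, strength=0.6):
    """LDR 'bloom': isolate bright pixels of an 8-bit image, blur them, add back, clamp.
    Everything stays clamped to 0-255, so very bright areas simply wash out."""
    img = img_u8.astype(np.float32)
    bright = np.where(img > threshold, img, 0.0)
    halo = gaussian_filter(bright, sigma=(4, 4, 0))   # blur each colour channel
    return np.clip(img + strength * halo, 0, 255).astype(np.uint8)

def hdr_tonemap(img_hdr, exposure=1.0):
    """HDR path: scene values are floating point and may exceed 1.0 (e.g. the sun);
    a Reinhard-style tone map compresses them into the displayable range."""
    x = np.maximum(img_hdr, 0.0) * exposure
    return np.clip((x / (1.0 + x)) * 255.0, 0, 255).astype(np.uint8)

# Tiny synthetic "scene": a dim area next to a highlight far brighter than white.
scene_hdr = np.zeros((64, 64, 3), np.float32) + 0.2
scene_hdr[24:40, 24:40] = 8.0
scene_ldr = np.clip(scene_hdr * 255, 0, 255).astype(np.uint8)

bloomed = ldr_bloom(scene_ldr)
tonemapped = hdr_tonemap(scene_hdr)
print(bloomed[32, 32], tonemapped[32, 32])   # clamped white vs. tone-mapped highlight
```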
Posted on Reply
#93
Outback Bronze
Dr. DroNobody plays Oblivion with the nasty bloom shader on my watch.
It's funny, you know - I used to not touch bloom when I was running GeForce cards, then one day I tried it and, pow, I was hooked. Not sure if it was the 8x AA helping through the forests or what.

I remember when I first turned on HDR versus bloom with the GeForce cards - geez, it chewed through the card. Kinda why I'm comparing it a bit to what RT does in this day and age.

Thread/
Posted on Reply
#94
Dr. Dro
Outback BronzeIt's funny, you know - I used to not touch bloom when I was running GeForce cards, then one day I tried it and, pow, I was hooked. Not sure if it was the 8x AA helping through the forests or what.

I remember when I first turned on HDR versus bloom with the GeForce cards - geez, it chewed through the card. Kinda why I'm comparing it a bit to what RT does in this day and age.

Thread/
It's true, though. In the beginning it was hardware transform and lighting, then HDR rendering effects, then GPU-accelerated physics, tessellation, instancing, and now ray tracing... each generation of games has brought its own challenges to hardware and graphics drivers. RT is one of the most complex graphics techniques ever and is widely considered the holy grail of computer graphics, as it enables truly photorealistic scene generation; the problem is the simply ginormous amount of compute required to pull it off. NV uses AI as a crutch to achieve that goal, but true RT is probably within the next 5 GPU generations IMO.
Posted on Reply
#95
AusWolf
Outback BronzeIt's funny, you know - I used to not touch bloom when I was running GeForce cards, then one day I tried it and, pow, I was hooked. Not sure if it was the 8x AA helping through the forests or what.

I remember when I first turned on HDR versus bloom with the GeForce cards - geez, it chewed through the card. Kinda why I'm comparing it a bit to what RT does in this day and age.

Thread/
My favourite workaround to gain more performance was enabling HDR while disabling grass. Grass ate your GPU even harder than HDR, I'd say. Not to mention you could find your missed arrows a lot easier without it. :laugh:
Posted on Reply