Tuesday, December 24th 2024

AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance

Recent benchmark leaks have revealed that AMD's upcoming Radeon RX 9070 XT graphics card may not deliver the groundbreaking performance initially hoped for by enthusiasts. According to leaked 3DMark Time Spy results shared by hardware leaker @All_The_Watts, the RDNA 4-based GPU achieved a graphics score of 22,894 points. The benchmark results indicate that the RX 9070 XT performs only marginally better than AMD's current RX 7900 GRE, showing a mere 2% improvement. It falls significantly behind the RX 7900 XT, which maintains almost a 17% performance advantage over the new card. These findings contradict earlier speculation that suggested the RX 9070 XT would compete directly with NVIDIA's RTX 4080.
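For context, the implied Time Spy scores of the comparison cards can be back-calculated from the quoted percentages. A rough sketch follows; the RX 7900 GRE and RX 7900 XT figures below are estimates derived from the ~2% and ~17% claims above, not separately leaked numbers:

```python
# Back-of-the-envelope check: derive the implied 3DMark Time Spy graphics
# scores of the comparison cards from the percentages quoted above. These
# are estimates based on the ~2% and ~17% figures, not leaked numbers.

rx_9070_xt = 22_894                  # leaked Time Spy graphics score
rx_7900_gre = rx_9070_xt / 1.02      # 9070 XT leads the GRE by ~2%
rx_7900_xt = rx_9070_xt * 1.17       # 7900 XT leads the 9070 XT by ~17%

print(f"Implied RX 7900 GRE score: ~{rx_7900_gre:,.0f}")   # ~22,445
print(f"Implied RX 7900 XT score:  ~{rx_7900_xt:,.0f}")    # ~26,786
```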

However, synthetic benchmarks tell only part of the story. The GPU's real-world gaming performance remains to be seen, and rumors indicate that the RX 9070 XT may offer significantly improved ray tracing capabilities compared to its RX 7000 series predecessors. This could be crucial for market competitiveness, particularly given the strong ray tracing performance of NVIDIA's RTX 40 series and the upcoming RTX 50 series cards. The success of the RX 9070 XT depends on how well it can differentiate itself through features like ray tracing while maintaining an attractive price-to-performance ratio in an increasingly competitive GPU market. We don't expect these scores to be the final word in the AMD RDNA 4 story; we will have to wait and see what AMD delivers at CES, and third-party reviews and benchmarks will deliver the final verdict once RDNA 4 reaches the market.
Sources: @All_The_Watts, @GawroskiT

204 Comments on AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance

#201
njshah
eidairaman1: Delusional much
Nvidia sales say otherwise
Posted on Reply
#202
Speedyblupi
kapone32: Using A Path Tracing Benchmark? What AMD card supports Path Tracing?
They all do. They're just not very good at it.
Path tracing isn't magic, it's mathematics, just like any other method of generating computer graphics. You can ray trace or path trace on a GPU with no RT cores, or even on a CPU. It won't run very well, but you can do it. The indie game "Teardown" is fully ray-traced (not rasterised - its use of voxels allows ray tracing to work at low ray count without looking like complete ass), doesn't use RT cores, and is playable (albeit only at relatively low resolutions and frame rates) on old GPUs like the RX 580 and GTX 1060. Nvidia's RT cores are just much better at path tracing than AMD's, and RDNA4 will hopefully change that.
Am*: I hate to be that guy, but literally nobody cares about path tracing when it comes with that much of a performance penalty. Not even the most die hard Nvidia zealots running 4090s. Ask anyone running one if they'd rather run this game at 1080p 60FPS path traced or 4K 60FPS with RT and DLSS Quality on their 4K monitors when actually playing the game and not benchmarking.
You don't need to hate anything, I completely agree.
I used Cyberpunk 2077 with PT as an example because I wanted a situation that is as close to a measure of pure ray/path tracing performance as possible. The overwhelming majority of games which use RT or PT are primarily rasterised, and only layer the tracing on for reflections, lighting, and shadows as an additional effect or embellishment of the rasterised image.
My choice of example was intended to show a situation where tracing performance is the primary factor in the performance result, and rasterisation isn't significant.
I fully understand that this isn't representative of the difference in performance in realistic gaming scenarios, and I apologise if my choice of example was misleading.

In a more realistic situation, of a primarily rasterised game which uses some traced effects, an RX 7900 GRE is much closer to the performance of an RTX 4070 Ti, and an RX 7900 XTX is often faster overall. The point I was trying to make is that the Nvidia GPU will lose much less performance when ray/path tracing is enabled compared to when it's disabled, and that RDNA4 having 3x the tracing performance of RDNA3 would allow them to close this gap. For example, rasterisation might be 85% of the frame time for an RTX 4070 Ti, with the remaining 15% being ray tracing, while an RX 7900 XTX might need to spend 45% of its frame time on ray tracing; so even though its rasterisation performance is much higher, it might not be much faster overall in games that use ray tracing.

And also, over the next few years, more games will make use of more intensive ray/path-traced effects, so tracing performance will become even more important over time. Even so, I don't expect that examples as extreme as Cyberpunk 2077 with PT will be directly relevant to the average gamer any time soon, but it is still indirectly relevant, as an indication of ray/path-tracing performance as a component of total gaming performance.

I was trying to highlight the point that RDNA4 having 3 times the ray tracing performance of RDNA3 would neither be impossible, nor would it give AMD a performance lead over competing Nvidia GPUs with similar rasterisation performance. It would merely be AMD catching up with Nvidia. 3 times the ray tracing performance is not equivalent to 3 times the performance in every game that uses ray tracing.
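To put rough numbers on that (the 85%/15% and 45% splits above are illustrative, not measurements), here's a quick sketch of how a 3x speedup in only the ray tracing portion of a frame affects total frame time:

```python
# Rough illustration of why "3x ray tracing performance" does not mean
# "3x frame rate" in a mostly-rasterised game. The frame-time splits
# below are the illustrative figures from the post, not measurements.

def new_frame_time(raster_ms, rt_ms, rt_speedup):
    """Total frame time after only the RT portion is accelerated."""
    return raster_ms + rt_ms / rt_speedup

frame = 16.6  # hypothetical 16.6 ms frame (60 FPS)
for label, rt_share in [("RTX 4070 Ti-like", 0.15), ("RX 7900 XTX-like", 0.45)]:
    raster = frame * (1 - rt_share)
    rt = frame * rt_share
    faster = new_frame_time(raster, rt, 3.0)
    print(f"{label}: {frame:.1f} ms -> {faster:.1f} ms "
          f"({frame / faster:.2f}x overall from 3x RT)")
```

With those illustrative splits, a 3x RT speedup works out to roughly a 1.1x overall gain for the card spending 15% of its frame time on tracing, and roughly 1.4x for the one spending 45%, which is exactly the gap-closing effect described above.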
Posted on Reply
#203
Am*
Speedyblupi: I was trying to highlight the point that RDNA4 having 3 times the ray tracing performance of RDNA3 would neither be impossible, nor would it give AMD a performance lead over competing Nvidia GPUs with similar rasterisation performance. It would merely be AMD catching up with Nvidia. 3 times the ray tracing performance is not equivalent to 3 times the performance in every game that uses ray tracing.
I do agree -- but in my opinion, ray tracing is not where AMD's priority should be, but rather feature parity with CUDA, DLSS, RTX Video and RTX HDR. Those are the only features I've actually missed moving from my RTX 3060 to the 7800 XT -- and by tackling these features, they'll get a 10x better return on their money than by investing in anything related to ray tracing, since it will also finally be a showcase of what their cards can do in AI workloads for professionals, and that is still where the current investor gold rush is. Even if AMD beats Nvidia in ray tracing at every equivalent SKU by 20%, the market will still pick Nvidia over AMD's GPUs for DLSS or CUDA alone.

So long as AMD are in both of the higher-end (relative to Nintendo, at least) consoles from Microsoft and Sony, ray tracing is going nowhere for mainstream gaming and will remain an afterthought, at least until next-gen consoles launch. That is more than evident now, considering we've had GPUs with this capability for almost 7 years with barely any progress (compared to past generations like the GTX 400 series, for example, where mass feature adoption happened in about 3-5 years from launch and almost everyone got upgraded to hardware capable of the latest feature set, like decent tessellation performance).

By fragmenting the market with multiple different versions of DLSS, selling non-ray-tracing-capable SKUs like the GTX 16-series cards, the scalping/unavailability of GPUs for about 2 years, and the lack of VRAM progression, Nvidia have been their own worst enemy in slowing down mass adoption of ray tracing capable GPUs. Add to this the huge number of people running old integrated graphics or GPUs several generations old with no ray tracing capability -- which is money no game developer is willing to turn down voluntarily, especially when so many games are being re-released/ported from last-gen consoles with not much else besides some minor visual improvements.
Posted on Reply
#204
Slackaveli
eidairaman1: Known this for about a year, the big move in later 2025 into 26 is UDNA
IF they launch UDNA late this year, I'd be shocked. I don't expect it until Fall of the following year.
That said, I HOPE I am wrong and that UDNA is imminent. It is so weird how they just plan to release a 70-tier card and call it 'good' for an entire GPU generation.
That shitz Whack AF, y'all.
Posted on Reply