Tuesday, December 24th 2024
AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance
Recent benchmark leaks have revealed that AMD's upcoming Radeon RX 9070 XT graphics card may not deliver the groundbreaking performance initially hoped for by enthusiasts. According to leaked 3DMark Time Spy results shared by hardware leaker @All_The_Watts, the RDNA 4-based GPU achieved a graphics score of 22,894 points. The benchmark results indicate that the RX 9070 XT performs only marginally better than AMD's current RX 7900 GRE, showing a mere 2% improvement. It falls significantly behind the RX 7900 XT, which maintains almost a 17% performance advantage over the new card. These findings contradict earlier speculation that suggested the RX 9070 XT would compete directly with NVIDIA's RTX 4080.
However, synthetic benchmarks tell only part of the story. The GPU's real-world gaming performance remains to be seen, and rumors indicate that the RX 9070 XT may offer significantly improved ray tracing capabilities compared to its RX 7000 series predecessors. This could be crucial for market competitiveness, particularly given the strong ray tracing performance of NVIDIA's RTX 40 and the upcoming RTX 50 series cards. The success of the RX 9070 XT depends on how well it can differentiate itself through features like ray tracing while maintaining an attractive price-to-performance ratio in an increasingly competitive GPU market. We don't expect these scores to be the final word in the AMD RDNA 4 story; we must wait and see what AMD delivers at CES. Third-party reviews and benchmarks will deliver the final verdict on the RDNA 4 market launch.
Sources:
@All_The_Watts, @GawroskiT
204 Comments on AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance
Path tracing isn't magic, it's mathematics, just like any other method of generating computer graphics. You can ray trace or path trace on a GPU with no RT cores, or even on a CPU. It won't run very well, but you can do it. The indie game "Teardown" is fully ray-traced (not rasterised - its use of voxels allows ray tracing to work at low ray count without looking like complete ass), doesn't use RT cores, and is playable (albeit only at relatively low resolutions and frame rates) on old GPUs like the RX 580 and GTX 1060. Nvidia's RT cores are just much better at path tracing than AMD's, and RDNA4 will hopefully change that. You don't need to hate anything, I completely agree.
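The point that ray tracing is "just mathematics" runnable on any processor can be shown concretely. Below is a minimal, purely illustrative CPU ray-sphere intersection in Python (the function name and scene values are my own, not from any game or engine): this quadratic solve is the core arithmetic that RT cores accelerate in hardware.

```python
import math

# Minimal CPU ray-sphere intersection: the core arithmetic of ray
# tracing, runnable on any hardware with no RT cores at all.

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t, or None on a miss.
    `direction` is assumed to be a unit vector."""
    # Offset the ray origin by the sphere center, then solve
    # |origin + t*direction|^2 = radius^2 for t (a quadratic in t).
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    b = 2 * (ox * direction[0] + oy * direction[1] + oz * direction[2])
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * c  # the quadratic's 'a' term is 1 for a unit direction
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2  # nearer of the two intersection points
    return t if t > 0 else None

# A ray fired from the origin down +z toward a sphere 5 units away:
print(ray_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1))  # 4.0
```

A renderer like Teardown's fires millions of these tests per frame; dedicated RT cores just do the same math far faster.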
I used Cyberpunk 2077 with PT as an example, because I wanted to find a situation which was as close to a performance of pure ray/path tracing performance as possible. The overwhelming majority of games which use RT or PT, are primarily rasterised, and only overlay the tracing on top for reflections, lighting, and shadows as an additional effect or embellishment on top of the rasterised image.
My choice of example was intended to show a situation where tracing performance is the primary factor in the performance result, and rasterisation isn't significant.
I fully understand that this isn't representative of the difference in performance in realistic gaming scenarios, and I apologise if my choice of example was misleading.
In a more realistic situation, of a primarily rasterised game which uses some traced effects, an RX 7900 GRE is much closer to the performance of an RTX 4070 Ti, and an RX 7900 XTX is often faster overall. The point I was trying to make is that the Nvidia GPU will lose much less performance when ray/path tracing is enabled compared to when it's disabled, and that RDNA4 having 3x the tracing performance of RDNA3 would allow them to close this gap. For example, rasterisation might be 85% of the frame time for an RTX 4070 Ti, with the remaining 15% being ray tracing, while an RX 7900 XTX might need to spend 45% of its frame time on ray tracing; so even though its rasterisation performance is much higher, it might not be much faster overall in games that use ray tracing.
And also, over the next few years, more games will make use of more intensive ray/path-traced effects, so tracing performance will become even more important over time. Even so, I don't expect that examples as extreme as Cyberpunk 2077 with PT will be directly relevant to the average gamer any time soon, but it is still indirectly relevant, as an indication of ray/path-tracing performance as a component of total gaming performance.
I was trying to highlight the point that RDNA4 having 3 times the ray tracing performance of RDNA3 would neither be impossible, nor would it give AMD a performance lead over competing Nvidia GPUs with similar rasterisation performance. It would merely be AMD catching up with Nvidia. 3 times the ray tracing performance is not equivalent to 3 times the performance in every game that uses ray tracing.
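The frame-time argument above is essentially Amdahl's law: speeding up only the ray-tracing portion of a frame caps the overall gain by however much of the frame is rasterisation. A quick sketch, using the hypothetical 45% RT share from the comment (these are illustrative figures, not measurements):

```python
# Amdahl's-law-style estimate: overall speedup when only the ray-tracing
# share of frame time gets faster. Figures are hypothetical, per the
# discussion above, not benchmark data.

def overall_speedup(rt_share: float, rt_speedup: float) -> float:
    """Total frame-rate gain when only the RT fraction of the frame
    time is accelerated by rt_speedup."""
    new_frame_time = (1 - rt_share) + rt_share / rt_speedup
    return 1 / new_frame_time

# Hypothetical RX 7900 XTX-like split: 45% of frame time on ray tracing,
# RT hardware made 3x faster.
print(round(overall_speedup(0.45, 3.0), 2))  # 1.43
```

So even a genuine 3x ray-tracing uplift yields roughly a 1.4x overall gain in this scenario, which is exactly why 3x the RT performance would mean catching up to Nvidia rather than leapfrogging it.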
So long as AMD are in both of the higher-end (relative to Nintendo, at least) consoles from Microsoft and Sony, ray tracing is going nowhere for mainstream gaming and will remain an afterthought, at least until next-gen consoles launch. That much is evident now, considering we've had GPUs with this capability for almost 7 years with barely any progress (compared to past generations like the GTX 400 series, for example, where mass feature adoption happened in about 3-5 years from launch and almost everyone got upgraded to hardware capable of the latest feature set, like decent tessellation performance). Between fragmenting the market with multiple different versions of DLSS, selling non-ray-tracing-capable SKUs like the GTX 16-series cards, the scalping/unavailability of GPUs for about 2 years, and the lack of VRAM progression, Nvidia have been their own worst enemy in slowing down mass adoption of ray tracing capable GPUs. And on top of that, there's the huge number of people running old integrated graphics or several-generations-old GPUs with no ray tracing capability, which is money no game developer is willing to turn down voluntarily, especially when so many games are being re-released/ported from last-gen consoles with not much else besides some minor visual improvements.
That said, I HOPE I am wrong and UDNA is imminent. It is so weird that they just plan to release a 70-tier card and call it 'good' for an entire GPU generation.
That shitz Whack AF, y'all.