Tuesday, December 24th 2024
AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance
Recent benchmark leaks have revealed that AMD's upcoming Radeon RX 9070 XT graphics card may not deliver the groundbreaking performance initially hoped for by enthusiasts. According to leaked 3DMark Time Spy results shared by hardware leaker @All_The_Watts, the RDNA 4-based GPU achieved a graphics score of 22,894 points. The benchmark results indicate that the RX 9070 XT performs only marginally better than AMD's current RX 7900 GRE, showing a mere 2% improvement. It falls significantly behind the RX 7900 XT, which maintains almost a 17% performance advantage over the new card. These findings contradict earlier speculation that suggested the RX 9070 XT would compete directly with NVIDIA's RTX 4080.
However, synthetic benchmarks tell only part of the story. The GPU's real-world gaming performance remains to be seen, and rumors indicate that the RX 9070 XT may offer significantly improved ray tracing capabilities compared to its RX 7000 series predecessors. This could be crucial for market competitiveness, particularly given the strong ray tracing performance of NVIDIA's RTX 40 and upcoming RTX 50 series cards. The success of the RX 9070 XT depends on how well it can differentiate itself through features like ray tracing while maintaining an attractive price-to-performance ratio in an increasingly competitive GPU market. These scores are unlikely to be the final word in the AMD RDNA 4 story; we will have to wait and see what AMD delivers at CES, and third-party reviews and benchmarks will give the final verdict on the RDNA 4 market launch.
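For context, the quoted percentage deltas can be turned into implied Time Spy graphics scores for the comparison cards. Note that only the 22,894 figure comes from the leak; the RX 7900 GRE and RX 7900 XT scores below are back-calculated from the article's 2% and 17% figures, so they are rough estimates rather than leaked numbers:

```python
# Implied 3DMark Time Spy graphics scores, derived from the article's figures.
# Only the RX 9070 XT score (22,894) is from the leak; the other two are
# back-calculated from the quoted percentage deltas and are approximations.

RX_9070_XT = 22894

# A "2% improvement" over the RX 7900 GRE implies a GRE score of:
rx_7900_gre = round(RX_9070_XT / 1.02)

# "Almost a 17% performance advantage" for the RX 7900 XT implies:
rx_7900_xt = round(RX_9070_XT * 1.17)

def delta(a, b):
    """Percent advantage of score a over score b."""
    return (a / b - 1) * 100

print(f"Implied RX 7900 GRE score: {rx_7900_gre}")
print(f"Implied RX 7900 XT score:  {rx_7900_xt}")
print(f"9070 XT vs 7900 GRE: {delta(RX_9070_XT, rx_7900_gre):+.1f}%")
print(f"7900 XT vs 9070 XT:  {delta(rx_7900_xt, RX_9070_XT):+.1f}%")
```

This puts the implied GRE score around 22,445 and the implied 7900 XT score around 26,786, which is consistent with the card landing just above the GRE but well behind the 7900 XT.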
Sources:
@All_The_Watts, @GawroskiT
130 Comments on AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance
That being said, I would love to only have native options, with the game's engine just throwing ALL the pixels into their correct positions instead of trying to guess what to show next.
"Please AMD, build something good and cheap, so Intel and Nvidia drop their prices, so I could go and buy cheaper Intel and Nvidia hartdware".
The performance hit from RT for what amounts to slightly better shadows and reflections isn't worth it at all.
Upscaling is another step backwards, reducing image quality and introducing shimmering and other artifacting.
AMD shouldn't bother with either, and should say why they're skipping them. They'll never get anywhere copying Nvidia's sales tactics while executing them much worse.
What about the real buyers who want to buy a Radeon at a discount? :confused: It will offer around 35% higher performance than the RX 7700 XT, around 15% higher than the RX 7800 XT, and around 90% higher than the RX 7600.
If AMD thinks progress, it has to market it as an RX 7600 successor, and charge no more than $299 for it.
In this way, the reviewers, readers and buyers would get impressions that it actually brings the long-awaited performance progress.
This card at $500 is way too high; it should be $350, no more.
It's not trolling. It's something that's been happening for over a decade. AMD carrying the stigma of the value-for-money brand makes many people avoid it, even when it offers better value and/or performance. Many would rather buy the "premium" sticker than accept anything with an AMD logo on it. It's something AMD needs to deal with, and it's the reason Qualcomm came out with only expensive Windows laptops instead of offering cheap options to gain market share: they were scared of being seen as a budget option. You would never put such a low price on an equivalent Nvidia model.

While Intel's B580 is promising and 12 GB for a $250 MSRP is a step in the right direction, the B580 only looks good compared to two- and four-year-old models from AMD and Nvidia. We haven't seen the RTX 5050 and RX 9060 (or whatever AMD names it) to know how the B580 positions against them. But yes, for now Intel looks like the one player trying to gain market share through better pricing.

With AMD, things are more complicated. They usually try to avoid provoking a price war with Nvidia while at the same time taking advantage of Nvidia's high pricing. Now the market is different, and while they might start again with high MSRPs, if Intel's solutions start gaining consumer attention and support, they could be forced to lower pricing. What I fear is that even if AMD lowers pricing, people will keep ignoring AMD's offerings. So the monopoly will remain intact for years to come, with consumers expecting the other premium brand, Intel, to save them.
Star Wars Outlaws
Cyberpunk 2077 Overdrive
Indiana Jones and The Great Circle
Black Myth Wukong
Desordre
Alan Wake 2
Minecraft RTX
Quake 2 RTX
Portal RTX
Portal Prelude RTX
Call of Duty Modern Warfare 3
Call of Duty Warzone
Dragon's Dogma 2 (mod)
Resident Evil 2 Remake (mod)
Resident Evil 3 Remake (mod)
Resident Evil 4 Remake (mod)
Doom 1 (mod)
Doom 2 (mod)
Quake 1 (mod)
Half Life 1 (mod)
Serious Sam First Encounter (mod)
Lisa doesn't care about discrete GPUs. It's pretty obvious given how she's run AMD all these years. Love her for the CPU work, if you like, but don't sugarcoat the complete failure she's overseen in the discrete market. Who goes four generations without a hardware-based upscaling solution and proper ray tracing support and expects success? Radeon is something she uses to advertise APUs. One day soon she's going to ditch the discrete GPU space entirely, once APUs start making people say they don't "really" need a discrete card unless they're rich. "There's always upscaling." Intel will have gone belly up by then and been carved up for its IP. And Jensen will finally be able to do whatever he likes with what's left of the discrete market, sending the rest scrambling to his cloud service, which he's happy to charge you for per month, plus per hour after you exceed his artificially created cap.
Or you can buy a handheld/console with one of those AMD APUs Lisa thinks are enough, because that's all she'll have to offer you.
All because Lisa doesn't care for discrete GPUs. I wish they'd spin Radeon off, just like I wish Nvidia would spin GeForce off. Then the ones making our discrete GPUs could finally start competing again and give us cards we deserve, with features we need, at prices appropriate to the audience, as though they really want our business, instead of waiting for us to require an upgrade because our card's soldering was too poor to last more than ten years.
Also, the same leak (why doesn’t TPU ever link sources?) says $649 USD. $100 more than the current 7900 GRE price for the same performance. Now you know why AMD discontinued the GRE.
While the RX 5700 XT was not the best card back in 2019, it marked a significant change, since it launched a new architecture for consumer cards, moving from GCN to RDNA. If I remember correctly, it landed between the RTX 2070 and the RTX 2070 Super (64 ROPs / 2560 shaders / 160 TMUs).
Remember that the Radeon VII and the RX 5700 XT had very similar gaming performance, but in the end the RX 5700 XT pulled ahead once the RDNA drivers matured, particularly in 2020.
If it ends up being 5% faster than the 7900 GRE in games and costs $500, it would be a big improvement over anything we have now! The GRE sold for $600 for the longest time, though one can still pop up here and there for $570 or $580.
So a GPU that is faster than the GRE, is more power efficient, has up to 2.5x the RT performance, and costs $100 less? Fuck yeah!!! That is a great card and a big win for consumers, considering Nvidia's 5070 is going to cost $800 and the 5070 Ti is going to cost $1000.
The way to fight that is to consistently offer a product stack with some benefit. The RX 6000 series had that: they were faster than Nvidia in raster, but they fell behind the very next gen and never caught up. Why 250? You can build a perfectly functional PC with a 100 W TDP rating.
Also, you missed the part about me not caring about it, due to the performance hit without it adding anything to the gameplay.
Oh, hi Wolf :D