Tuesday, December 24th 2024
AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance
Recent benchmark leaks have revealed that AMD's upcoming Radeon RX 9070 XT graphics card may not deliver the groundbreaking performance initially hoped for by enthusiasts. According to leaked 3DMark Time Spy results shared by hardware leaker @All_The_Watts, the RDNA 4-based GPU achieved a graphics score of 22,894 points. The benchmark results indicate that the RX 9070 XT performs only marginally better than AMD's current RX 7900 GRE, showing a mere 2% improvement. It falls significantly behind the RX 7900 XT, which maintains almost a 17% performance advantage over the new card. These findings contradict earlier speculation that suggested the RX 9070 XT would compete directly with NVIDIA's RTX 4080.
However, synthetic benchmarks tell only part of the story. The GPU's real-world gaming performance remains to be seen, and rumors indicate that the RX 9070 XT may offer significantly improved ray tracing capabilities compared to its RX 7000 series predecessors. This could be crucial for market competitiveness, particularly given the strong ray tracing performance of NVIDIA's RTX 40 series and the upcoming RTX 50 series cards. The success of the RX 9070 XT depends on how well it can differentiate itself through features like ray tracing while maintaining an attractive price-to-performance ratio in an increasingly competitive GPU market. We don't expect these scores to be the final word in the AMD RDNA 4 story; we must wait and see what AMD delivers at CES. Third-party reviews and benchmarks will deliver the final verdict on the RDNA 4 market launch.
Sources:
@All_The_Watts, @GawroskiT
204 Comments on AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance
The RX 7600, based on its name, was the successor to the RX 6600. BUT it had the specs of the RX 6650 XT and was selling at the RX 6650 XT's market price.
So it was a successor by name, but not by specs and price.
If the RX 9070 XT comes in at the current retail price of the 7700 XT, then yes.
Raster was something AMD were, and still are, good at. Fermi was released 6 months after Evergreen and was still significantly inferior in raster perf/watt compared to Evergreen. So what did Nvidia do? They pushed game companies to use unnecessary amounts of tessellation (read: absolutely fucking unnecessary amounts) by paying them directly to do so, because Fermi was good at it. It showed their garbage mentality that they did the same thing, just less obviously, with RT.
Here's the thing though - in both instances they pushed graphics with unnecessary gimmicks that didn't look any better, yet performance absolutely tanked on their own GPUs and even worse on their competitors'. The consumers were the ones to lose out in the end, while Nvidia gained relative performance over the competition.
For its time, Fermi was garbage. Maybe less so than the GeForce FX, which got its ass handed to it by the 9700 Pro, but it definitely lost the battle against Evergreen, a much better balanced architecture released half a year before it. Also remember that the 5870 overclocked significantly better. But hey, you can pay reviewers and game companies, throw around quotes like "it's 40% faster in extreme tessellation", change consumers' mentality, and still sell a ton of half-baked GPUs.
AMD pulled a few of their own shenanigans too, but it's peanuts compared to the awful, slimy crock of BS Nvidia pulled for decades.
Meanwhile, 7000 users have ReBAR for another 10-17% increase in raster and SAM for another 10-17% increase in raster. Do you know what that means? The 7000 users club always has new posts, and not one post about 7000 GPUs has complained about performance. AMD also confirmed that to us.

What you don't see is that AMD is in both consoles, and we are getting out of the (somewhat) exclusive era to bring all games to PC. When Hogwarts launched and there were so many complaints about performance, 7000 users wondered what all the noise was about. Because Ryzen and Radeon are in the consoles, having parts from the same family but faster practically guarantees better performance. What would I use as evidence? CP2077: in that game I can get up to 180 FPS running raw raster at 4K.

RT is for those that want it, but it is anathema to the PC narrative. You see, we used to be all about resolution and raster. 1440p looks better than 1080p, and 4K looks better than 1440p. The narrative now says that 1440p is the best option, but that is only because just 6 or 7 GPUs can give you high frame rates at 4K. Guess what? That includes the 7900 XTX and 7900 XT.
The 4070 is probably strong enough to take advantage of 4 extra GBs, and Nvidia probably knows it. Nvidia is using VRAM as a way to limit the life of a product that would otherwise stay good for many years. If Nvidia could offer 16 GB at $400, it could definitely offer 16 GB at $600, the 4070's MSRP. But they didn't. They put 16 GB on a card that can't use it, and they limit the VRAM capacity on a card that can. That way both products have a limited lifespan before they start losing in benchmarks.
The rest of your rambling is just your daily ranting hogwash. Seriously, every post you make rehashes the same tired anti-Nvidia falsehoods.
That's how brains work when you own AMD hardware.
The company you work for buying AMD hardware doesn't count, CPUs don't count either, and I know the people who are always crapping on AMD never even consider an AMD graphics card because they're part of the mindshare. It's not the past: there was a poll here and most people voted for rasterization over ray tracing, and when most GPUs sold are in the xx60 class and aren't capable of RT without fake frames, most people don't care about RT either. Even so, Nvidia and the tech media have pushed RT as the best thing ever, to the point that game devs are forcing RT on by default.
This is an AMD thread though; you're the one always bashing AMD in these threads.
I don't play childish epeen games; that's why my specs aren't listed. I mean seriously, does the brand of chip in a device that isn't yours really matter that much to you? Trying to put me on a team isn't going to work, because being a fanboy of a consumer brand is freaking stupid.
Please explain to me what the 7900 means to “AMD users”. I’m fascinated by people that have an emotional attachment to a manufactured commodity. Especially people that think AMD is different from Nvidia, Intel, or Volkswagen.
If I am a fanboy of anything, it's Thermalright CPU coolers; of that I am indeed guilty.
I avoid posting my specs here because I know the mindshare will only go after me even more for it. As for 7900 users, they aren't listening to the biased nonsense and don't care about gimmicky features; some just want to play games without having to replace their GPU in a year because of insufficient VRAM.
As for the rest, thanks for proving my point AGAIN. You just can’t stop talking about Nvidia in an AMD thread. Do you have all your points saved in notepad so you can just copy-paste them multiple times a day? Or do you use Onenote?
That precious GRE lineup isn't coming back (for obvious reasons), and the 7800 XT is more than enough for the mainstream desktop, so the rest literally doesn't matter. I need the additional resolution for desktop + VR + compute + recording + streaming, which is a sliver of an imaginary percentage of users, let alone of gamers who double as developers. At this point we're just waiting to see whatever groundbreaking features hit the spotlight. The 9070 is going to get a very warm welcome if it hits the shelves at 7900 GRE pricing; otherwise the adjacent SKUs get a price drop while the rest rot. Either way, I win.
The way you said it sounded like you could enable ReBAR, get 10-17% performance, and then add SAM to get another 10-17% on top of that, which is not the case. SAM is simply AMD's branding of Resizable BAR, so it's one feature and one gain, not two.