Tuesday, December 24th 2024

AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance

Recent benchmark leaks have revealed that AMD's upcoming Radeon RX 9070 XT graphics card may not deliver the groundbreaking performance initially hoped for by enthusiasts. According to leaked 3DMark Time Spy results shared by hardware leaker @All_The_Watts, the RDNA 4-based GPU achieved a graphics score of 22,894 points. The benchmark results indicate that the RX 9070 XT performs only marginally better than AMD's current RX 7900 GRE, showing a mere 2% improvement. It falls significantly behind the RX 7900 XT, which maintains almost a 17% performance advantage over the new card. These findings contradict earlier speculation that suggested the RX 9070 XT would compete directly with NVIDIA's RTX 4080.
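For context, the quoted gaps follow directly from the Time Spy graphics scores. A minimal sketch of the arithmetic, assuming approximate reference scores of roughly 22,400 for the RX 7900 GRE and 26,800 for the RX 7900 XT (typical published Time Spy graphics results; only the 22,894 figure comes from the leak itself):

```python
# Leaked and assumed 3DMark Time Spy graphics scores.
# Only the RX 9070 XT figure is from the leak; the other two are
# approximate reference values used here for illustration.
scores = {
    "RX 9070 XT (leaked)": 22894,
    "RX 7900 GRE (approx.)": 22400,
    "RX 7900 XT (approx.)": 26800,
}

def pct_gap(a: float, b: float) -> float:
    """Return how much faster score `a` is than score `b`, in percent."""
    return (a / b - 1) * 100

# ~2% lead over the 7900 GRE, ~17% deficit against the 7900 XT
print(f"9070 XT vs 7900 GRE: {pct_gap(scores['RX 9070 XT (leaked)'], scores['RX 7900 GRE (approx.)']):+.1f}%")
print(f"7900 XT vs 9070 XT:  {pct_gap(scores['RX 7900 XT (approx.)'], scores['RX 9070 XT (leaked)']):+.1f}%")
```

With those assumed reference scores, the math reproduces the roughly 2% lead over the GRE and the near-17% deficit against the RX 7900 XT cited above.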

However, synthetic benchmarks tell only part of the story. The GPU's real-world gaming performance remains to be seen, and rumors indicate that the RX 9070 XT may offer significantly improved ray tracing capabilities compared to its RX 7000 series predecessors. This could be crucial for market competitiveness, particularly given the strong ray tracing performance of NVIDIA's RTX 40 series and the upcoming RTX 50 series cards. The success of the RX 9070 XT depends on how well it can differentiate itself through features like ray tracing while maintaining an attractive price-to-performance ratio in an increasingly competitive GPU market. These scores are unlikely to be the final chapter in the AMD RDNA 4 story; we must wait and see what AMD delivers at CES, and third-party reviews and benchmarks will render the final verdict once RDNA 4 reaches the market.
Sources: @All_The_Watts, @GawroskiT

144 Comments on AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance

#1
AcE
If true, not surprised. It's still a midrange card then; it could be that it lost a lot of additional raster performance to having proper RT cores, since those cores take die space. In the end it could be a ~7800 XT with way better RT performance, which is what I expect worst case. Wouldn't be a bad card; there are only bad prices, so the pricing decides if it's good or not.

In the end, this would be like a much more balanced 7800 XT; the raster and RT performance shouldn't be too far apart.
Posted on Reply
#2
Dahita
AcEIf true, not surprised. It's still a midrange card then, could be it lost a lot of additional raster performance due to having proper RT cores, those cores take space. In the end it could be a ~7800 XT with way better RT performance, this is like what I expect worst case. Wouldn't be a bad card, there are only bad prices, so the pricing decides if it's good or not.
Well,

A mid-range card of the NEXT generation is usually up to par with the top end of the previous one. In that aspect, it is disappointing. I am part of the group of people who don't want to encourage Nvidia's greedy rampage, and this is not exactly helping :(
Posted on Reply
#3
phanbuey
Timespy <> gaming.

That's why the 4070ti outsells the 7900xt by 2.7x
Posted on Reply
#4
john_
At $500 and with at least double or - why not? - triple the RT performance of the GRE, it will be more than fine.
Posted on Reply
#5
phanbuey
john_At $500 and with at least double or - why not? - triple the RT performance compared to GRE will be more than fine.
plus an upscaler that you're not hesitant to turn on and that actually improves render quality, rather than make it look like someone rubbed vaseline on your screen (current FSR)
Posted on Reply
#6
eidairaman1
The Exiled Airman
AcEIf true, not surprised. It's still a midrange card then, could be it lost a lot of additional raster performance due to having proper RT cores, those cores take space. In the end it could be a ~7800 XT with way better RT performance, this is like what I expect worst case. Wouldn't be a bad card, there are only bad prices, so the pricing decides if it's good or not.
We've known this for about a year; the big move in late 2025 into '26 is UDNA.
Posted on Reply
#7
john_
phanbueyplus an upscaler that you're not hesitant to turn on and that actually improves render quality rather than make it look like someone rubbed vaseline on your screen
It's fine. The only possible problem I have noticed is with water. Other than that, in gaming, no one notices. If there is a problem anyway, just set a 720p/1080p resolution and let the monitor/TV do the upscaling to its native resolution. 14 years ago I was playing Borderlands at 720p, on a 1080p 32'' TV from a distance of 70 centimeters (~2 feet?), and never really thought about the resolution once while playing. And I totally enjoyed the game.
Posted on Reply
#8
Speedyblupi
john_At $500 and with at least double or - why not? - triple the RT performance compared to GRE will be more than fine.
I disagree. It wouldn't be terrible, but it wouldn't be good either; it would just be a continuation of the same painfully slow, gradual progress we've been getting for the past 4 years.
AMD released the 7800 XT with similar performance to the 6800 XT with slightly better efficiency and ray tracing, at a slightly lower price. It sold ok (I bought one), but they lost market share overall.
AMD doing the same with the "9070 XT" compared to the RX 7900 GRE isn't enough if they want to retake market share. If Nvidia release an RTX 5070 that's effectively an RTX 4070 Ti for $600, most people will still buy that instead. The 9070 XT would still be behind in path traced or fully ray-traced games, even if it has 3x the ray tracing performance of the GRE.
Posted on Reply
#9
AcE
DahitaA mid-range card of the NEXT generation usually is up to part with the top end of the previous one. In that aspect, it is disappointing. I am part of the group of people who don't want to encourage Nvidia's greedy rampage, and this is not exactly helping :(
As I said, if you use RT it's way better, maybe it's on par with 7900 XT or XTX even. I use RT a lot.. can't speak for everyone

In the end this is rumours, let's wait for proper tests.
Posted on Reply
#10
john_
SpeedyblupiI disagree. It wouldn't be terrible, but it wouldn't be good either, it would just be a continuation the same painfully slow gradual process we've been getting for the past 4 years.
AMD released the 7800 XT with similar performance to the 6800 XT with slightly better efficiency and ray tracing, at a slightly lower price. It sold ok (I bought one), but they lost market share overall.
AMD doing the same with the "9070 XT" compared to the RX 7900 GRE isn't enough if they want to retake market share.
Nvidia dictates progress. If the 5060 is the same step over the 4060 as the 4060 was over the 3060, then AMD will in fact be offering more progress in the sub-$500 market than Nvidia.
But even if those two fail to offer anything interesting under $500, Intel might get serious this time. The B580 and B570 look interesting, and any B770 might look even more interesting.
Posted on Reply
#11
AcE
john_Nvidia dictates progress.
They don't; if they make too little progress they will be overtaken, just like what happened to Intel with Ryzen. ;) Historically there's also extra performance every gen, so the 5060 will 100% be faster than the 4060.
Posted on Reply
#12
eidairaman1
The Exiled Airman
AcEThey don't, if they make low progress they will be overtaken, just what happened to Intel with Ryzen. ;) Historically, also there's extra performance every gen, so 5060 will 100% be faster than 4060.
To top it off, they are pricing themselves out of the market
Posted on Reply
#13
Neo_Morpheus
If this rumor is true and they do charge $500 or more, then it will be a mess.

Going by normal generational advances, it should match a 4080 or maybe sit a bit below one, but being the same as or slower than a current 7800 XT is not good for that price.

About RT, to each their own; I personally haven't found it worth the performance hit, and PT is even worse and present in only one game (?), which doesn't really add anything to the gameplay.

But they need to improve the raster; it can't be the same as or worse than a 7800 XT.
Posted on Reply
#14
maxfly
Rumors for Christmas, yay.
Posted on Reply
#15
john_
AcEThey don't, if they make low progress they will be overtaken, just what happened to Intel with Ryzen. ;) Historically, also there's extra performance every gen, so 5060 will 100% be faster than 4060.
What is going on in the GPU market looks different compared to what was going on in the CPU market. AMD offering more cores, even with lower IPC, was enough to convince people to go from Intel to AMD. In the GPU market people keep buying the sticker. They don't care about performance or value for money. The propaganda about AMD drivers starting a fire and burning down the house, or AMD features being crap, or games looking like B&W vector graphics without DLSS and Nvidia RT, is imprinted in people's minds. RTX 3050 sells 10 times more than RX 6600.

The 4060 wasn't faster than the 3060 when it came out. It's just that, as time passed, the 4000 series got more optimizations for newer games and also had Frame Generation to create the illusion of a faster card. And that's ignoring the 12GB 3060 model, which was, and probably still is, selling at the same or a lower price. The 5060 might be 5% faster, which will become 15% after a year or two, offering a couple of new features that conveniently won't be supported on the 4000 series to make it look like a better buy. And probably at a $329-$349 price for the 8GB model.
Posted on Reply
#16
Dirt Chip
As long as it's priced correctly, all is good.
It was customary for the new gen's mid-range 1440p card to match and go beyond the last gen's top 4K card, but nowadays all such rules are long gone.

A new, strongly priced mid-tier GPU is all we need, imo. The kind that makes the existing lineup irrelevant. Like it was in the past.
Posted on Reply
#17
Guwapo77
Damn, this is worse than I had imagined. I knew it would underperform for my needs, but this is just ridiculous. In short, this will not be a 5700 XT.
Posted on Reply
#18
DBGT
john_At $500 and with at least double or - why not? - triple the RT performance compared to GRE will be more than fine.
Are you saying that it is OK to sell a new-generation card with almost equal raster performance at the same price, only because of an RT improvement? I actually don't care that much about RT.
Posted on Reply
#19
maxfly
DBGTAre you saying that it is ok to sell a new generation card with almost equal raster performance with same price, only because of RT improvement? I am actually not caring that much of the RT.
No. The lesson here is to keep what you have until the next story is fact-based.
Posted on Reply
#20
AcE
john_In the GPU market people keep buying the sticker.
No, in fact it didn't help when RDNA 3 came out with driver issues caused by a new, more complicated architecture - dual-issue shaders being the main culprit, but other things too. As I mentioned somewhere else, if AMD streamlines their approach to architecture like Nvidia did, driver issues will become less and less frequent, and they will have very good drivers even at launch. Now go back to RDNA 2: it had proper drivers at launch because it was just a bigger RDNA 1 with RT cores added, not a heavily changed architecture like RDNA 3 was. If RDNA 4 has very good drivers from second one, it will gain nice mind share, just like RDNA 2.
john_and had also Frame Generation to create the illusion of a faster card.
Illusion or not, frame gen works - not in competitive games, where I would never use it, as it doesn't improve your relative latency, but in other games it works well and it really is like having more FPS. The buyer cares about how well the game runs, not about intrinsic technicalities like whether it is generated or real performance. "Real" is also relative and freely arguable; "real" for me is what works, not "traditional" performance. So frame gen is very much real, as long as it performs as advertised (and it usually does - I used it extensively in CP2077, for example, and in D4 because of CPU bottlenecks in the cities; it worked in both cases). Frame gen has three downsides: 1) it's not really usable for competitive games, as it doesn't make you see enemies better, since the relative latency doesn't improve; 2) it has rare image glitches - the quality is mostly very good, but not always; 3) frame gen is not (really) usable if your FPS is under 50-60 without it turned on, so you can't use it if your frames at settings X aren't high enough.
eidairaman1To top it off they are pricing themselves out of the market
If Nvidia's prices are too high, AMD can profit easily from this. Capitalism 101. What Nvidia usually does then is lower prices as well; they're quite aggressive, not Intel-like. With the RTX 40 gen Nvidia did multiple price cuts, most prominently on the 4070 Ti; the 4070 Ti Super was brought in just to counter the 7900 XT, and the 4080 Super was a price cut and a counter to the 7900 XTX. So no, Nvidia does not do whatever they want; they react a lot to competition, if there is competition. Same with the 4070 Super, which was a reaction to the 7800 XT.

edit: more talking points
Posted on Reply
#21
phanbuey
Honestly, instead of path tracing I wish they could just bake better lightmaps and effects using ML models or some other technique. Path tracing is so grainy and pixelated, for so much unnecessary performance hit, to produce what at the end of the day is a low-quality effect.

Path tracing needs to die; it's an inefficient way of doing lighting.
Posted on Reply
#22
3valatzy
SpeedyblupiI disagree. It wouldn't be terrible, but it wouldn't be good either, it would just be a continuation the same painfully slow gradual process we've been getting for the past 4 years.
AMD released the 7800 XT with similar performance to the 6800 XT with slightly better efficiency and ray tracing, at a slightly lower price. It sold ok (I bought one), but they lost market share overall.
AMD doing the same with the "9070 XT" compared to the RX 7900 GRE isn't enough if they want to retake market share. If Nvidia release an RTX 5070 that's effectively an RTX 4070 Ti for $600, most people will still buy that instead. The 9070 XT would still be behind in path traced or fully ray-traced games, even if it has 3x the ray tracing performance of the GRE.
john_Nvidia dictates progress. If 5060 is the same as 4060 compared to 3060, then AMD will in fact offering more progress at the sub $500 market than Nvidia.
But even if those two fail to offer anything interested under $500, Intel might get serious this time. B580 and B570 look interesting and any B770 might look even more interesting.
Nvidia doesn't dictate anything. The company will have problems once TSMC stops releasing new process nodes, which will inevitably happen because Moore's law has been dead for a while already.
Nvidia relies on the older 6 nm and 4 nm nodes, and this is a disaster for them.
Posted on Reply
#23
Nostras
I guessed within 5% of a 7900 XT, but this is underwhelming. XTX-level performance was never going to happen, or they wouldn't have bothered with the new naming scheme.
If it really is effectively a 7900 GRE in performance, and if they "spiritually" make it an 8700 XT by pricing it at $450, it's pretty good, albeit concerning for AMD.
Bonus points for getting consumption down (remember, the 7700 XT and 7800 XT had similar power consumption, making the former's perf/W piss-poor) and getting ray tracing in shape.
Posted on Reply
#24
Dahita
john_It's fine. The only possible problem I have noticed is with water. Other than that, in gaming, no one notices. If there is a problem anyway, just throw a 720p/1080p resolution and let the monitor/TV do the upscaling to it's native resolution. 14 years ago I was playing Borderlands at 720p, on a 1080p 32'' TV from a distance of 70 centimeters(~2 feet?) and never really thought about the resolution once while playing. And I totally enjoyed the game.
Come on, man. That's not really why we're in the market for a new graphics card.
Posted on Reply
#25
igormp
Wasn't this kinda expected? I'm assuming a 9080 should be coming eventually, which would be their "high-end" and has always been rumored to perform around or a bit better than the 7900 XTX, albeit with better RT performance and lower prices.
If the above is true, then this 9070 is one level below and is what should be competing with the likes of the 5070, at an even lower price.

However, if this is their current "flagship" product, then let's hope it can manage 7900 XTX performance in raw raster and way better RT perf, or that it has amazing pricing.
Posted on Reply