Tuesday, December 24th 2024

AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance

Recent benchmark leaks have revealed that AMD's upcoming Radeon RX 9070 XT graphics card may not deliver the groundbreaking performance initially hoped for by enthusiasts. According to leaked 3DMark Time Spy results shared by hardware leaker @All_The_Watts, the RDNA 4-based GPU achieved a graphics score of 22,894 points. The benchmark results indicate that the RX 9070 XT performs only marginally better than AMD's current RX 7900 GRE, showing a mere 2% improvement. It falls significantly behind the RX 7900 XT, which maintains almost a 17% performance advantage over the new card. These findings contradict earlier speculation that suggested the RX 9070 XT would compete directly with NVIDIA's RTX 4080.
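As a back-of-the-envelope illustration using only the percentages quoted above (the leak does not include the comparison cards' own scores, so these are implied figures, not measured results):

```python
# Implied 3DMark Time Spy graphics scores, derived only from the article's
# stated percentages; the figures for the other two cards are assumptions,
# not leaked numbers.
rx_9070_xt = 22_894                    # leaked graphics score
rx_7900_gre = rx_9070_xt / 1.02        # ~22,445 if the 9070 XT is ~2% faster
rx_7900_xt = rx_9070_xt * 1.17         # ~26,786 if the 7900 XT is ~17% faster
print(round(rx_7900_gre), round(rx_7900_xt))
```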

However, synthetic benchmarks tell only part of the story. The GPU's real-world gaming performance remains to be seen, and rumors indicate that the RX 9070 XT may offer significantly improved ray tracing capabilities compared to its RX 7000 series predecessors. This could be crucial for market competitiveness, particularly given the strong ray tracing performance of NVIDIA's RTX 40 series and the upcoming RTX 50 series cards. The success of the RX 9070 XT depends on how well it can differentiate itself through features like ray tracing while maintaining an attractive price-to-performance ratio in an increasingly competitive GPU market. We don't expect these scores to be the final word in the AMD RDNA 4 story; we will have to wait and see what AMD delivers at CES. Third-party reviews and benchmarks will deliver the final verdict on the RDNA 4 market launch.
Sources: @All_The_Watts, @GawroskiT

129 Comments on AMD Radeon RX 9070 XT Alleged Benchmark Leaks, Underwhelming Performance

#26
TheDeeGee
phanbueyPath tracing needs to die, it's an inefficient way of lighting.
It will make developers' lives a LOT easier than having to hand-craft baked lighting.

Path Tracing is the next step in taking today's stagnated graphics to a new level. Whether people like it or not, it's going to happen eventually.
Posted on Reply
#27
KLMR
For me, at this point, it's a problem of price*, not performance.
If they've lowered their costs enough, it would be a very interesting card, just like the 7xxx series or the new Arc cards are.

*The problem is the inflation of prices since the 1080, and many people trying to play unoptimized games at "4K" (UHD) with ray tracing (graphics improvements of doubtful usefulness).

We need to cool down and let them cook in their financial problems.
Nobody needs a gaming PC, an AI laptop, etc. except the influencers and "pros", the ones pushing the illusion of "4K" and 600 FPS for "competitive" games.
Posted on Reply
#28
AcE
phanbueyPath tracing is so grainy and pixelated for so much unnecessary performance hit to produce what at the end of the day is a low quality effect.
It can look quite good and do things traditional rendering can't. If it doesn't look good, it usually doesn't take much performance either, so there is that.
Posted on Reply
#29
phanbuey
TheDeeGeeIt will make the developer's life a LOT easier than having to hand craft baked lighting.

Path Tracing is the next step in bringing the current stagnated graphics to a new level, wether people like it or not, it's going to happen eventually.
Yeah, but with AI and ML models there has to be a better way that also makes developers' lives easier. It's a brute force way of rendering light, and brute force is usually the wrong way to do something. It's been 3-4 generations and it still looks quite bad - maybe in another 3 generations it will get better -- but if we're already at ML rendering techniques, why not just switch to that?

If AI pipeline rendering is good enough to fill in someone's face, then it's good enough to fill in shadows and generate what light looks like.
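As a purely illustrative aside (not from any poster here), a minimal Monte Carlo sketch of why path tracing counts as "brute force": every pixel averages many random ray samples, and cutting that sample count is exactly where the grain comes from. All functions and numbers below are hypothetical stand-ins.

```python
# Minimal, hypothetical path tracing sketch: each pixel is a Monte Carlo
# average of many random ray samples; too few samples per pixel is what
# produces the grain/noise complained about above.
import random

SAMPLES_PER_PIXEL = 64   # real path tracers may need hundreds for a clean image
MAX_BOUNCES = 4

def trace_ray(x, y, bounce=0):
    """Hypothetical stand-in: returns the light carried by one random ray."""
    if bounce >= MAX_BOUNCES:
        return 0.0
    # a real tracer would intersect geometry and sample a material's BRDF here;
    # we return noise to mimic the randomness of a single sample
    emitted = random.random() * 0.1
    reflected = 0.5 * trace_ray(x, y, bounce + 1)
    return emitted + reflected

def shade_pixel(x, y):
    # Monte Carlo estimate: average many noisy samples to approximate the pixel
    total = sum(trace_ray(x, y) for _ in range(SAMPLES_PER_PIXEL))
    return total / SAMPLES_PER_PIXEL

if __name__ == "__main__":
    # even a 4x4 "image" fires 16 * 64 primary samples, each bouncing up to
    # MAX_BOUNCES times - hence the cost, and hence the brute force label
    image = [[shade_pixel(x, y) for x in range(4)] for y in range(4)]
    print(image[0])
```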
Posted on Reply
#30
damric
What if it IS a rebranded 7900 GRE?
Posted on Reply
#31
TumbleGeorge
phanbueyYeah but with AI, and ML models there has to be a better way that also makes developers lives easier. It's a brute force way of rendering light, and most brute force things are usually the wrong way to do something. It's been 3-4 generations and it still looks like quite bad.
I refuse to pay more for hardware just to make game developers' lives easier, because that hurts my budget.
Posted on Reply
#32
JohH
It is performing like a 64 or 60 CU card, which is no surprise, since that's exactly what it is. It was long suspected to land between the 7900 GRE and the 7900 XT.

I'm sorry to anyone who had unrealistic expectations because of random people on the web making things up. But that's your own fault. The rumored 4080 performance referenced in the article was posted by a Chinese Nvidia fanboy. One can only guess at his goal in overhyping Radeon.
Posted on Reply
#33
Darmok N Jalad
Price is everything, no matter what they call it. If it launches ahead of the 5000 series, that price is just a placeholder until nvidia plays its hand. It doesn't help that rumors have the Arc B770 delayed almost a year, so there's not a whole lot of pressure from below either. I guess we also don't know if this will be a more balanced generation, as AMD has been playing catch-up in RT from the beginning.
Posted on Reply
#34
Broken Processor
If the benchmark is correct even the most hardened AMD fanboy will require a full bottle of copium to shell out 600 quid on one.
Posted on Reply
#35
gridracedriver
Disappointing without knowing the price?

Then the B580 is also disappointing: 270 mm² for low-end performance, not mid-range.
Posted on Reply
#36
Dahita
AcEAs I said, if you use RT it's way better, maybe it's on par with 7900 XT or XTX even. I use RT a lot.. can't speak for everyone

In the end this is rumours, let's wait for proper tests.
Except that was the one downside of the 7900 XT. So getting a new product that is only as good as the worst-performing area of the previous gen's top tier... is not being on par with it everywhere else.
Posted on Reply
#37
Ruru
S.T.A.R.S.
To be honest, who cares about synthetics, it's the gaming performance we care about.
Posted on Reply
#38
TPUnique
gridracedriverDisappointing without knowing the price?

Then the B580 is also disappointing, 270mmq for low-end performance, not mid range.
Yeah. If it's, say, released at 350€, that'd be quite tempting.
Posted on Reply
#39
ZoneDymo
Doesn't matter; at the end of the day, as the (sadly) late Gordon Mah Ung would say: "you are going to buy Nvidia anyway".
Posted on Reply
#40
rv8000
Seems the rumor doomers and Nvidia zealots are already kicking into full tilt.

P.S.A.: all upscalers degrade image quality, regardless of whose implementation it is.
Posted on Reply
#41
3valatzy
Broken ProcessorIf the benchmark is correct even the most hardened AMD fanboy will require a full bottle of copium to shell out 600 quid on one.
If the price is that high, the card will naturally be DOA (dead on arrival).
And since we know that AMD will not ask a normal price, we can declare it DOA already!
Posted on Reply
#42
Vayra86
john_It's fine. The only possible problem I have noticed is with water. Other than that, in gaming, no one notices. If there is a problem anyway, just throw a 720p/1080p resolution and let the monitor/TV do the upscaling to it's native resolution. 14 years ago I was playing Borderlands at 720p, on a 1080p 32'' TV from a distance of 70 centimeters(~2 feet?) and never really thought about the resolution once while playing. And I totally enjoyed the game.
It's about a stable image though; resolution doesn't really influence that. Image stability means pixels are what they are: no constant artifacting on anything that is in motion, and no heavily blurred image to hide those motion artifacts. And unfortunately that is still what FSR is, and DLSS is not much better at it either: things in motion will create artifacts. Sometimes akin to ghosting. Other times you get several trailed copies of a license plate (Cyberpunk) while driving, or the grass seems to split in front of you instead of moving with the wind.

Even if minor, those errors stand out because they are simply unnatural, uncanny, and butt ugly. Upscaling is not quite at the point of completely eliminating those issues - not even DLSS - and FSR is notably worse at it. In static imagery, it looks fine. In motion? I can't get used to it. The same thing applies to the vaseline filter that is running a below-native-res form of upscaling. It ain't better. It's a notable loss of fidelity to gain some FPS, simple as that.
Posted on Reply
#43
Noyand
phanbueyYeah but with AI, and ML models there has to be a better way that also makes developers lives easier. It's a brute force way of rendering light, and most brute force things are usually the wrong way to do something. It's been 3-4 generations and it still looks like quite bad - maybe in another 3 generations it will become better -- but if we're already at ML rendering techniques, why not just switch to that?

If AI pipeline rendering is good enough to fill in someones face, then it's good enough to fill in shadows and generate what light looks like.
Visual stability and consistency frame by frame is probably going to be an issue. Path tracing does a lot of things that are more complex than they look, and it still allows you to have some creative freedom. It's not just about what light and shadows will look like, but how light is going to interact with the properties of a specific material, and it's not uncommon to see rendering artists using settings that make no sense under the laws of physics but give the right look for the artistic direction.

VFX studios are still favouring the brute force approach since it’s still the most stable and reliable thing around.
Posted on Reply
#44
Am*
SpeedyblupiI disagree. It wouldn't be terrible, but it wouldn't be good either; it would just be a continuation of the same painfully slow, gradual process we've been getting for the past 4 years.
AMD released the 7800 XT with similar performance to the 6800 XT with slightly better efficiency and ray tracing, at a slightly lower price. It sold ok (I bought one), but they lost market share overall.
AMD doing the same with the "9070 XT" compared to the RX 7900 GRE isn't enough if they want to retake market share. If Nvidia release an RTX 5070 that's effectively an RTX 4070 Ti for $600, most people will still buy that instead. The 9070 XT would still be behind in path traced or fully ray-traced games, even if it has 3x the ray tracing performance of the GRE.
I hate to be that guy, but literally nobody cares about path tracing when it comes with that much of a performance penalty. Not even the most die hard Nvidia zealots running 4090s. Ask anyone running one if they'd rather run this game at 1080p 60FPS path traced or 4K 60FPS with RT and DLSS Quality on their 4K monitors when actually playing the game and not benchmarking.

Until low/mid-range cards like the RTX 5060 can pull off path-traced framerates of 30 FPS, or at least above slideshow levels, these path tracing benches are as worthless as those Ashes of the Singularity async compute benches that AMD used to be obsessed with back in the day: they only matter on one page for reviewers and absolutely nowhere in the real world.

If this card's RT performance is even 10-15% below Nvidia's, and its raster performance is better than Nvidia's equivalent card by at least that same amount, anyone with a brain will pick the 16GB AMD card over the already-obsolete 12GB Nvidia card (if the VRAM rumours of re-using 12GB for 5070-class GPUs and 8GB for the 5060s are to be believed), because PS5 ports without RT already need 12GB or more VRAM for native 4K. I'll take the much higher frame rate and native resolution + settings with realistic lighting but no/minimal RT over the lower frame rate and resolution path-traced one, with glistening mirror-looking floors and fuzzy RT reflections, any day.
Posted on Reply
#45
mb194dc
If it's true, it'll be one of the most disappointing product launches in a long time. Even a tweaked 6950 XT can get a 23k GPU score in Time Spy.

Probably signals the death of progress in the mid-range GPU market.
Posted on Reply
#46
Nostras
mb194dcIf its true, it'll be one of the most disappointing product launches in a long time. Even a 6950xt tweaked can get 23k gpu score on timespy.

Probably signals the death of progress in the mid range gpu market.
Did everyone forget about the RX 480, or even the 5700 XT? The former was a massive downgrade compared to the previous gen, and the latter was also notably behind.
If the price is right, it's about as far as it could be from the death of progress in the mid-range GPU market.
Unless you think a 4080 Super for $1,000 is mid-range. Fair enough, I guess.
Posted on Reply
#47
3valatzy
NoyandVisual statbility and consistency frame by frame is probably going to be an issue. Path tracing does a lot of things that are more complex than it looks, and still allow you to have some creative freedom. It’s not just about how light and shadows will look like, but how lights is going to interact with the properties of a specific material, and it’s not uncommon to see rendering artists using settings that don’t make sense when you look at the law of physics, but give the right look for the artistic direction.

VFX studios are still favouring the brute force approach since it’s still the most stable and reliable thing around.
I think that the studios should first take existing real-world objects, scan them, and put them in their game engines, because it literally makes no sense to have Minecraft running with super expensive path/ray tracing at 1080p60. It's nonsense. It's just a waste of precious computing time and resources.
mb194dcIf its true, it'll be one of the most disappointing product launches in a long time. Even a 6950xt tweaked can get 23k gpu score on timespy.

Probably signals the death of progress in the mid range gpu market.
This is because the product is not a die shrink. Historically, it's die shrinks that drive progress - you get double the shaders in the same wafer area within the same power envelope. That is what's gone now, hence we get rebrands.
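As a rough illustration of the scaling rule that comment leans on (purely a back-of-the-envelope sketch; the classic ~0.7x linear factor per full node is an assumption, not a figure from this thread, and modern nodes no longer scale this cleanly, which is part of the point):

```python
# Assumed classic "full node" shrink: each linear dimension scales by ~0.7x,
# so the area per transistor roughly halves and density roughly doubles.
linear_scale = 0.7
area_scale = linear_scale ** 2      # ~0.49x area per transistor
density_gain = 1 / area_scale       # ~2.04x transistors (or shaders) per mm^2
print(f"area per transistor: {area_scale:.2f}x, density: {density_gain:.2f}x")
```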
Posted on Reply
#48
720p low
Let's say I took a sheet of paper, drew a line down the center vertically to form two columns, wrote "Next-gen AMD GPU Rumors" at the top of the first column, and "Actual Next-gen AMD GPU Facts" at the top of the second. I'd likely need to re-sharpen the pencil after completing the first column, but would I, or anyone else, be able to write a single meaningful thing in the second column at this point? Perhaps "On the way?" Has anyone at RTG, like Sam Naffziger or Frank Azor, offered anything yet? (Assuming either is still an AMD employee.)

I well understand that it's entertaining to chew over rumors and such, but let's keep some perspective.

And, yes, you're right... I suppose I'm not much fun at parties.
Posted on Reply
#49
Dragokar
For me it comes down to price and performance, with my self-applied TDP limit of ~250 W. I have to build several PCs early next year and will wait to see the entry-level offerings from all three of them; I do hope we get some nice options thanks to Intel.
Posted on Reply
#50
TumbleGeorge
DragokarFor me it comes down to price and performance with my self applied tdp limit ~250W. I have to build several PCs early next year and wait to see the entry offers from all three of them, and do hope that we get nice offerings thanks to Intel.
I hope the UN prohibits the use of computers drawing more than 250 watts from the electrical outlet, including the case, monitor, and all other peripherals.
Posted on Reply