Thursday, January 9th 2025
AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way
Although it has only been a few days since the RDNA 4-based GPUs from Team Red hit the scene, it appears that we have already been granted a first look at the 3D Mark performance of the highest-end Radeon RX 9070 XT GPU, and to be perfectly honest, the scores seemingly live up to our expectations, although the ray tracing performance disappoints. Unsurprisingly, the thread has been erased over at Chiphell, but folks managed to take screenshots in the nick of time.
The specifics reveal that the Radeon RX 9070 XT will arrive with a massive TBP in the range of 330 watts, as revealed by a FurMark snap, which is substantially higher than previous estimates. With 16 GB of GDDR6 memory, along with base and boost clocks of 2520 and 3060 MHz, the Radeon RX 9070 XT managed to rake in an impressive 14,591 points in Time Spy Extreme and around 6,345 points in Speed Way. Needless to say, the drivers are likely far from mature, so it is not outlandish to expect a few more points to get squeezed out of the RDNA 4 GPU.

Regarding the scores we currently have, it appears that the Radeon RX 9070 XT fails to match the Radeon RX 7900 XTX in both tests, although it easily exceeds the GeForce RTX 4080 Super in the non-ray-traced Time Spy Extreme test, which is no small feat considering that it costs less than half as much as the RTX 4080 Super. In the ray-traced Speed Way test, however, the RX 9070 XT falls noticeably short of the RTX 4080 Super. Interestingly, an admin at Chiphell commented that those planning on grabbing an RTX 50 card should wait, further hinting that the GPU world has "completely changed". Considering the lack of context, the interpretation of that statement is debatable, but it does seem RDNA 4 might pack impressive price-to-performance that could give mid-range Blackwell a run for its money.
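For readers who want to put the quoted figures next to their own runs, here is a minimal arithmetic sketch in Python. Only numbers quoted above are used (the base/boost clocks and the two leaked scores); reference scores for the RX 7900 XTX or RTX 4080 Super are not in the source, so they are left for the reader to supply.

[CODE]
# Minimal arithmetic sketch. Only figures quoted in the article are used: the
# base/boost clocks and the two leaked scores. Reference scores for the
# RX 7900 XTX / RTX 4080 Super are not in the source, so supply your own.

def percent_delta(candidate: float, reference: float) -> float:
    """How far `candidate` sits above (+) or below (-) `reference`, in percent."""
    return (candidate / reference - 1.0) * 100.0

# Figures from the article
BASE_CLOCK_MHZ, BOOST_CLOCK_MHZ = 2520, 3060
TIME_SPY_EXTREME_SCORE, SPEED_WAY_SCORE = 14_591, 6_345

print(f"Boost over base clock: {percent_delta(BOOST_CLOCK_MHZ, BASE_CLOCK_MHZ):+.1f}%")

# To compare against another card, plug in a real result of your own, e.g.:
# my_tse_score = ...  # your Time Spy Extreme graphics score
# print(f"Leaked RX 9070 XT vs. mine: {percent_delta(TIME_SPY_EXTREME_SCORE, my_tse_score):+.1f}%")
[/CODE]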
Sources:
Chiphell, @0x22h
146 Comments on AMD Radeon RX 9070 XT Benchmarked in 3D Mark Time Spy Extreme and Speed Way
[INDENT]"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 16GB", but the XT model might now get a wider, 192-bit interface."[/INDENT]
So the first part of that sentence, the 'if the above leak is true' bit, refers to a 128-bit Navi44 in 8GB and 16GB flavours for the 9060 and 9060 XT respectively.
The second part of that sentence "but the XT model might now get a wider, 192-bit interface" is referring to newer information that Navi44 might actually be a 192-bit die, overruling the "if the above leak is true" first part of that sentence.
I don't read that sentence and interpret it as "16GB 192-bit 9060XT"
I interpret it as "leak says 8+16GB 128-bit cards, new rumours hint that it's actually a 192-bit design"
If Navi44 is actually a 192-bit design, then that doesn't rule out a die-harvested, cut-down version being sold as the vanilla 9060 with only 128 bits enabled, likely in 8GB configs but possibly also a 16GB config - if not for gamers then potentially for a Radeon Pro workstation variant.
The sentence should have read more like:
"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 12 GB on a 192-bit interface."
Unlike Speed Way, the CPU affects the final score in Time Spy.
guesses of a sub-300 mm² die). So who's to say N44 isn't a smidge larger than initially guessed at and ends up with a 192-bit bus?
Honestly, it wouldn't surprise me to see N33 get rebranded and crammed in down at the bottom as a 9050; that card can't cost more than a buck to make at this point.
It's too bad Nvidia is pushing gimmicks like frame generation & charging a high cost for this turd.
"If the above leak is true, we might still see an RX 9060 with 8GB memory and an RX 9060 XT with 16GB", but the XT model might now get a wider, 192-bit interface."
To my understanding of English grammar, the ", but" and the "now" are two strong indicators that the part of the sentence talking about a wider, 192-bit interface contradicts the first part talking about the 128-bit bus.
I agree that it should be worded more clearly, but DigitalTrends aren't clueless. I'm fairly certain they know that you can't have 16GB on a 192-bit bus, based on the level of technical depth some of their other articles have gone into, so I'm willing to chalk this one up to ambiguous wording rather than technical ignorance.
Anyway, it is what it is. Hopefully the smaller Navi44 die is a 192-bit product, because that would make the entry level an absolute banger for 2025 :)
I have both Nvidia and AMD at home, and I build mostly Nvidia machines at work these days; it's not any better on the Nvidia side. Faulty hardware is faulty, it doesn't matter what brand logo is on it.
0 issues with the 7900XTX for more than a year now.
If you check AMD's slide, it places the top RX 9060 model (XT) at performance higher than the RX 7700 XT, which means it will either be a cut-down Navi 48 variant with a 192-bit bus,
or Navi 44 with 64 RBs and 2,560 shaders, which would probably match or slightly exceed the 7700 XT at FHD (and maybe match it at QHD) if it has high enough clocks (at least a 3,040 MHz boost for the reference model), paired with a 128-bit bus and 16 GB of memory (at this performance level, and priced accordingly, 8 GB would be a detriment).
If the package size of Navi44 is just 29 x 29 mm, we are probably talking about a die size of 162 mm² at most, so it would be nearly impossible to house a 192-bit bus design at these dimensions.
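As a rough illustration of what the bus-width question means in practice, here is a quick peak-bandwidth sketch. The 20 Gbps GDDR6 data rate is an assumption for illustration only, not a confirmed Navi44 figure.

[CODE]
# Rough peak-bandwidth sketch for the bus widths discussed above. The 20 Gbps
# GDDR6 data rate is an illustrative assumption, not a confirmed Navi 44 figure.

def gddr6_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float = 20.0) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (per-pin data rate)."""
    return (bus_width_bits / 8) * data_rate_gbps

for bus in (128, 192):
    print(f"{bus}-bit @ 20 Gbps: {gddr6_bandwidth_gbs(bus):.0f} GB/s")
# 128-bit -> 320 GB/s, 192-bit -> 480 GB/s
[/CODE]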
Sometimes other factors, like PSUs, play a role too.
What variant of 7900 XTX was it?
After two GPU replacements, that's three GPUs in total (assuming different units and not the same one sent back). I would only be convinced that the GPU(s) had issues (by design) if they were tested in a different PC and the issue remained.
Sorry, but I've been dealing with PCs for 25 years and have had too many GPUs; these kinds of issues could be anything.
It places the RX 9070 series around RTX 5050 - RTX 5060 performance, while the RX 9060 series lands around RTX 5030 - 5010 performance levels!
1.) Two different rigs tested with different X670E motherboards.
2.) DDU and AMD Clean Utility.
3.) Windows 10 and 11 tested
4.) Memory tested with no overclock applied.
5.) BIOS updated.
6.) 12 months of AMD drivers tested.
7.) Two power supplies tested, with new cables.
8.) Multiple replaced GPUs.
9.) If you google around the net, there are many reported cases of TDRs not being fixed at all. This is an AMD driver issue with certain games and it will never be resolved.
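For anyone chasing similar TDR (Timeout Detection and Recovery) crashes, here is a minimal, read-only Python sketch for Windows that prints the documented TDR registry values. It is purely a diagnostic aid; tweaking these timeouts can mask symptoms, it does not fix a faulty card or a driver bug.

[CODE]
# Read-only diagnostic sketch (Windows only): prints the documented TDR
# (Timeout Detection and Recovery) registry values under GraphicsDrivers.
# It only reads; changing these timeouts can hide symptoms but does not fix
# a faulty card or a driver bug.
import winreg

TDR_KEY = r"SYSTEM\CurrentControlSet\Control\GraphicsDrivers"
VALUE_NAMES = ("TdrLevel", "TdrDelay", "TdrDdiDelay")

with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, TDR_KEY) as key:
    for name in VALUE_NAMES:
        try:
            value, _ = winreg.QueryValueEx(key, name)
            print(f"{name} = {value}")
        except FileNotFoundError:
            print(f"{name} not set (driver default in effect)")
[/CODE]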
Any news on what model of 7900 XTX that was?
And in what games were the drivers "crashing"? BTW, when you want to reduce clocks, you just limit power from -1% up to -10% in Adrenalin, and when that is not enough for whatever reason, then you reduce the clock limit.
It's common practice among AMD GPU users when they want to reduce clocks/power (for whatever reason).
Starting to doubt whether any of this is real. Did you just join TPU today to make your anti-AMD statement?
You can't help getting that idea when you don't have a single issue with your games, even the most demanding ones, for over a year now...
I'm actually not a fan of the way RT is used most of the time, though. It's usually just slathered over everything, making old, dry, dusty concrete/brick look like it was sprayed with glossy epoxy and buffed to a high-shine finish. Piles of dirt should prob not reflect light like they were hosed down with baby oil, ya know? I'm also pretty bummed that it is pretty much the only type of advanced lighting effect used anymore. I would like to see something else, or at least see it used more intentionally and not just applied to all things, reflective or not.

I get the feeling that the people who spend the most on PC hardware (or have their parents do it) are less worried about playing a good-looking, interesting game than they are about saying the names of the parts they have and the numbers they can get. I like hot rodding PCs too, but just going out and buying a Ferrari is not "hot rodding" imo. Getting an older system to perform much better than it should is what I enjoy doing.

Although, since all games are pretty much developed for XB/PS and constrained by their many limitations, I see little point in blowing a ton of money on a PC anymore... for a while actually. The best a PC can do is look better, but considering it takes everything a PS/XB has just to look as good as it can, there is almost nothing left for much else. That is why basically the same games just keep getting regurgitated with only slightly better graphics. Their scope/scale and depth stay the same... if they are not just reduced (which often happens towards the end of a gen as devs try to provide better visuals compared to previous games, which requires them to dumb the underlying game down even more to do so). That's why PC games from like 18-20 years ago have better enemy/NPC AI and so many more options than new games.

I don't care if people want to play games on a console, but since MS and Sony demand that the PC version of a game is the same as their console version other than visuals, they are limited by things that would not be a problem on PC. STALKER 2 is a perfect example. The A-Life (AI and offline NPC tracking) system is not anything like it should be. The spawning and de-spawning is almost as bad as Far Cry 5! It's not because they can't do it... they did it almost 20 years ago; the reason is that the Xbox cannot run that in the background. When my PC is maxed out with the visuals, I still have like 8+ threads and like 15 GB of RAM doing nothing. The Xbox uses EVERY last bit of power just to run the game at medium-high settings at 30 fps. They would have to really lower the visual quality to get it to work, and since an Xbox can't do it, they can't let my PC do it. Modders will prob have to fix it since the license terms prevent GSC Game World from doing so. It's all so lame.
end /rant; digressed