Sunday, January 12th 2025

AMD Radeon RX 9070 XT Tested in Cyberpunk 2077 and Black Myth: Wukong

The recently unveiled AMD Radeon RX 9070 XT has been the epicenter of a plethora of leaks in the past few weeks. We now have not only a rough idea of what kind of synthetic performance the highest-end RDNA 4 GPU is about to bring to the table, but also how much the card is expected to cost. Now, yet another leak seemingly sheds light on perhaps the most crucial aspect of any gaming GPU - it's in the name, gaming.

Hilariously enough, this leak, once again, has been sourced from a now-deleted forum post on Chiphell. This time around, site admin nApoleon was able to run a few games on the brand-new RDNA 4 card, including Black Myth: Wukong and Cyberpunk 2077. The results have been added as screenshots below, but here is a summary - the RX 9070 XT performs decently well, trading blows with the RTX 4070 Ti Super. In NVIDIA's favorite Cyberpunk 2077 at 4K with ray tracing, the RX 9070 XT managed 26 FPS, neck and neck with the RTX 4070 Ti, whereas the RTX 4080 Super was well ahead with 32 FPS.
At 1080p, the RX 9070 XT managed 85 FPS, whereas the RTX 4070 Ti edged past with two more frames. The RTX 4080 Super, unsurprisingly, was well ahead once again with 101 FPS. The RX 9070 XT appears far more promising in rasterization performance, as revealed by Black Myth: Wukong. At 4K, the RX 9070 XT managed around 30 FPS, beating the RTX 4070 Ti Super, which managed 28. And at 1080p, the RX 9070 XT raked in 97 FPS, coming shockingly close to the RTX 4080 Super, which managed 99 FPS. The RTX 4070 Ti Super was left behind, managing 'only' 87 FPS.
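For a rough sense of scale, here is a minimal back-of-the-envelope sketch that tabulates the leaked figures above and computes the relative deltas; the FPS values come straight from the screenshots and should be treated as unverified.

```python
# Leaked FPS figures from the Chiphell screenshots; all values are unverified.
leaked_fps = {
    "Cyberpunk 2077, 4K + RT": {"RX 9070 XT": 26, "RTX 4080 Super": 32},
    "Cyberpunk 2077, 1080p + RT": {"RX 9070 XT": 85, "RTX 4070 Ti": 87, "RTX 4080 Super": 101},
    "Black Myth: Wukong, 4K": {"RX 9070 XT": 30, "RTX 4070 Ti Super": 28},
    "Black Myth: Wukong, 1080p": {"RX 9070 XT": 97, "RTX 4070 Ti Super": 87, "RTX 4080 Super": 99},
}

for scenario, results in leaked_fps.items():
    amd = results["RX 9070 XT"]
    for card, fps in results.items():
        if card == "RX 9070 XT":
            continue
        delta = (amd / fps - 1) * 100  # positive means the RX 9070 XT is ahead
        print(f"{scenario}: RX 9070 XT vs {card}: {delta:+.1f}%")
```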

Of course, it now all boils down to how competitively Blackwell is priced, and what kind of performance it brings to the table. DLSS 4, along with all the multi-frame generation AI chicanery, appears all set to shake up the way we measure performance, making things even more complicated. Either way, for those who could not care less about ray tracing, the RDNA 4 GPU sure does look pretty enticing, if the leaked benchmarks and pricing information are to be believed.
Sources: GameGPU, VideoCardz

169 Comments on AMD Radeon RX 9070 XT Tested in Cyberpunk 2077 and Black Myth: Wukong

#76
Daven
AusWolfSee, that's the wrong thinking many people fall into. Just because you're not interested, no one else should be. Well, I, for example, don't care about CUDA, encoding, AI, and all the rest that you mentioned, but I find a $100 lower price absolutely welcome. I want to pop the GPU into my system and play games, and for that the 9070 XT seems more and more enticing with every leak.
I would guess that Luminescent would respond with 'that's what consoles are for'. But I just tried playing Risk of Rain 2 on my PC with a controller yesterday. It felt like playing drunk. Nothing beats a keyboard and mouse IMHO. Therefore, I too just want a good, affordable GPU to pop into my computer for gaming.
Jtuck9Given how popular the whole "fake frame" discussion is, perhaps that's where battle lines will be drawn. Personally, I'm not thinking about native 4K performance and not expecting anything to be competing at 5080 levels until next gen (UDNA).
That's where I have a problem with DLSS, FSR, XeSS, etc. We have so many great, affordable displays capable of 4K at 120 Hz and greater, but it seems like game developers and GPU manufacturers have just given up on rendering natively at that resolution and frame rate, yet they charge us more for GPUs and games (microtransactions) than ever before.
Posted on Reply
#77
AusWolf
DavenI would guess that Luminescent would respond with 'that's what consoles are for'. But I just tried playing Risk of Rain 2 on my PC with a controller yesterday. It felt like playing drunk. Nothing beats a keyboard and mouse IMHO. Therefore, I too just want a good, affordable GPU to pop into my computer for gaming.
Some games are fine with a controller, but most aren't, imo.

Personally, I just want a graphics card that delivers good performance at 1440 UW native, doesn't cost an arm and a leg, doesn't require a nuclear reactor in my back yard, and works well with Linux. The 9070 XT seems to tick all of these boxes. Whether a card has CUDA, AI, or more fake frames than the last generation doesn't interest me in the slightest. Especially if said card has only 12 GB VRAM for the same price.
Posted on Reply
#78
HOkay
AusWolfSome games are fine with a controller, but most aren't, imo.

Personally, I just want a graphics card that delivers good performance at 1440 UW native, doesn't cost an arm and a leg, doesn't require a nuclear reactor in my back yard, and works well with Linux. The 9070 XT seems to tick all of these boxes. Whether a card has CUDA, AI, or more fake frames than the last generation doesn't interest me in the slightest. Especially if said card has only 12 GB VRAM for the same price.
I mostly agree with this; the only thing on the above list that I do want is upscaling, because you unfortunately need it for decent frame rates on the highest settings. That Ratchet & Clank FSR4 demo looked like they'd made huge progress though, so I'm really hopeful that FSR4 has finally caught up with DLSS3 or possibly even DLSS4. I do not care about the multi-frame generation part; I'm a high-refresh snob, so I can't be dealing with latency worse than native 80-90 fps or so as a minimum.
Posted on Reply
#79
AusWolf
HOkayI mostly agree with this; the only thing on the above list that I do want is upscaling, because you unfortunately need it for decent frame rates on the highest settings. That Ratchet & Clank FSR4 demo looked like they'd made huge progress though, so I'm really hopeful that FSR4 has finally caught up with DLSS3 or possibly even DLSS4. I do not care about the multi-frame generation part; I'm a high-refresh snob, so I can't be dealing with latency worse than native 80-90 fps or so as a minimum.
Fair enough. I'm not a quick reflex gamer at all, so I'm fine with anything within my monitor's VRR range (anything above 48 Hz/FPS really), but I'd rather not rely on upscaling if I can help it.
Posted on Reply
#80
3valatzy
LuminescentThis round I might go Nvidia; that 5080 with dual decoders is a godsend for video editing, then there's the amazing support for CUDA in all Adobe products and all the plugins, optimized for CUDA first and OpenCL second.
One Nvidia RTX 5000-series decoder can do 8 streams of 4K 60 FPS and the RTX 5080 has two; that's just nuts. Nvidia killed Intel Quick Sync and all the Apple M chips.
AMD is just dead; I can't see how they will compete with this at any price. They [Nvidia] have better upscaling, better professional support, better ray tracing, better AI, so why would anyone buy AMD for being $50-100 cheaper?
I hope they can survive this so we have some competition, but this might be over.
AMD can compete in gaming. The gamers need neither of the things you listed.
AusWolfSee, that's the wrong thinking many people fall into. Just because you're not interested, no one else should be. Well, I, for example, don't care about CUDA, encoding, AI, and all the rest that you mentioned, but I find a $100 lower price absolutely welcome. I want to pop the GPU into my system and play games, and for that the 9070 XT seems more and more enticing with every leak.
A $100 lower price is nothing if you consider that Nvidia's clients / fans can pay whatever amounts are asked of them.
Look at this price table - AMD has never been as greedy as it is now.

Posted on Reply
#81
TSiAhmat
AusWolfSome games are fine with a controller, but most aren't, imo.

Personally, I just want a graphics card that delivers good performance at 1440 UW native, doesn't cost an arm and a leg, doesn't require a nuclear reactor in my back yard, and works well with Linux. The 9070 XT seems to tick all of these boxes. Whether a card has CUDA, AI, or more fake frames than the last generation doesn't interest me in the slightest. Especially if said card has only 12 GB VRAM for the same price.
Technically, Lossless Scaling now has 20x frame gen. It looks hilariously bad (I tested it at 20 -> 160 frames, because my monitor can't do more). So if you want the most fake frames, you can do it with both graphics card vendors...

Different question:

Can you stack fake frames? So DLSS 4 1 -> 5
Lossless Scaling 5 -> 99

1 "real" frame being followed up by 99 "fake" frames would be hilarious
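Purely as illustrative arithmetic for that hypothetical stack (using DLSS 4's advertised 4x multi-frame generation rather than 1 -> 5, and assuming the multipliers simply compound, which neither NVIDIA nor Lossless Scaling documents or supports), a quick sketch:

```python
# Hypothetical, unsupported stacking of frame generation; just the arithmetic.
rendered_fps = 30      # frames the game actually renders per second (assumed figure)
dlss_mfg_factor = 4    # DLSS 4 multi-frame generation outputs up to 4 frames per rendered frame
lossless_factor = 20   # Lossless Scaling's advertised 20x mode

output_fps = rendered_fps * dlss_mfg_factor * lossless_factor
generated_per_rendered = dlss_mfg_factor * lossless_factor - 1
fake_share = 1 - rendered_fps / output_fps

print(f"{rendered_fps} rendered FPS -> {output_fps} displayed FPS")
print(f"Generated frames per rendered frame: {generated_per_rendered}")
print(f"Share of displayed frames that are generated: {fake_share:.1%}")
```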
Posted on Reply
#82
AusWolf
3valatzyA $100 lower price is nothing if you consider that Nvidia's clients / fans can pay whatever amounts are asked of them.
Look at this price table - AMD has never been as greedy as it is now.
Nvidia's clients / fans can pay whatever they want. If AMD offers similar performance, more VRAM, and is a fair bit cheaper, I'm game. Their profit margins don't interest me, and I don't care what other people buy.
TSiAhmatTechnically Lossless Scaling now has 20x Frame Gen. It looks hilariously bad (tested it on 20 -> 160 frames, because my monitor can't do more) So if you want the most fake frames you can do it with both graphic card vendors...
I don't want any of that. Like I said, I'm fine with anything within my monitor's VRR range, above 48 Hz/FPS. Above that, I don't need frame generation. Below that, it doesn't work. Absolutely pointless tech.
Posted on Reply
#83
3valatzy
AusWolfNvidia's clients / fans can pay whatever they want. If AMD offers similar performance, more VRAM, and is a fair bit cheaper, I'm game. Their profit margins don't interest me, and I don't care what other people buy.
AMD will not survive just because buyers like you exist.
AMD needs working marketing with a new approach, to give buyers reasons beyond the trivial ones and win more sales.
Posted on Reply
#84
AusWolf
3valatzyAMD will not survive just because buyers like you exist.
AMD needs working marketing with a new approach, to give buyers reasons beyond the trivial ones and win more sales.
Whatever they're doing seems to be working. 11th place on TPU's popularity list, right below one of the highest-selling AMD cards of all time, is not bad for an unreleased product, imo.


Edit: Besides, you're forgetting about consoles.
Posted on Reply
#85
JustBenching
3valatzyA $100 lower price is nothing if you consider that Nvidia's clients / fans can pay whatever amounts are asked of them.
Look at this price table - AMD has never been as greedy as it is now.

The usual bashing of the market leader. People are paying $600 for 8-core AMD chips and you think "Nvidia fans will pay whatever". Sure...

If the card releases within a reasonable time frame (and not much later than its competition) with adequate stock (EU shops haven't even had listings for AMD cards for a couple of months) at $550, with raster and RT equal to or better than the 5070, it will sell like hotcakes. I can easily see it selling as much as the 5070, just like other competitive cards did in the past (RX 470, 480, 570, 580, etc.).
Posted on Reply
#86
Daven
AusWolfWhatever they're doing seems to be working. 11th place on TPU's popularity list, right below one of the highest-selling AMD cards of all time, is not bad for an unreleased product, imo.
IMHO, the doom and gloom surrounding AMD in the GPU space is partly due to enthusiasts having to spend $1000 to $2000 on an Nvidia GPU. If you can convince yourself that you have no other choice, it makes spending that kind of money on a GPU easier. That means when another company's solution becomes more appealing (Intel or AMD) there is a huge rush to downplay the advantages else you feel like you wasted your money.

In truth, AMD is in a down cycle: the current-gen consoles are at the end of their life with new ones imminent, new markets such as handhelds (Ryzen Z) and high-power SoCs (Strix Halo) are only just beginning, and the company is floundering in the discrete GPU market. New consoles are coming by the end of this year. We will know in a few weeks if RDNA4 is any good. It is too early to know about Strix Halo's popularity.

I am optimistic on all three above (consoles, handhelds and RDNA4) as well as Intel's efforts which will bring much needed competition back to the client GPU space. Data center GPUs are a whole other story which I won't get into.
Posted on Reply
#87
AusWolf
DavenIMHO, the doom and gloom surrounding AMD in the GPU space is partly due to enthusiasts having to spend $1000 to $2000 on an Nvidia GPU. If you can convince yourself that you have no other choice, it makes spending that kind of money on a GPU easier. That means when another company's solution becomes more appealing (Intel or AMD) there is a huge rush to downplay the advantages else you feel like you wasted your money.
Yeah, I'm wondering how much of Nvidia's buyer base is centered around a real need for CUDA and AI, and how much is just confirmation bias.
Posted on Reply
#88
lilhasselhoffer
AusWolfWhatever they're doing seems to be working. 11th place on TPU's popularity list, right below one of the highest-selling AMD cards of all time, is not bad for an unreleased product, imo.


Edit: Besides, you're forgetting about consoles.
So...popularity is never the gauge of something good. It was popular to have a nice shiny coat of paint on your house, and the whitest white was lead-based, so lead-based paint became popular. This is one very old example, but it serves as a dipstick.

That said, the argument can be made that most people are looking for good 1920x1080 performance...and that's generally the realm of an xx60 or x700 card. That's pretty much what these cards compete with/against for the attention of most people. In that regard, the 9070 is as much a failure as the 4070, 5070, and anything higher. For 550 USD you can purchase a console that does 4K halfway decently...and that's basically watching these cards fight for a premium market share.


All of this said, the 9070 is something I personally think may be reasonable. It looks to pull about as much power as a 3080, perform somewhere between a 4070 and a 4080, be priced similar to a 5070, and have enough VRAM, without the injected frames, to make plenty of performance available for the average high-refresh-rate 2560x1440 monitor. Being realistic here, if they actually come in closer to the $500 range, then this thing is shaping up to be the best card in the high-priced category. I just have to question why there's so much consternation over Chiphell leaks...which are usually about as often accurate as not. With only a few weeks to go, let's be real here and suggest that there's nothing which should yet drive anybody to preorder anything day one. Pretending otherwise is how people end up disappointed by something like the B580...despite it selling like toilet wine in a prison. So we are clear, the B580 is nothing special until you consider it's priced very well in the underserved budget gaming segment.
Posted on Reply
#89
Luminescent
While I understand some of you would buy AMD because it is $50 to $100 cheaper, I did that too, but life is too short and at some point second best is not enough.
Posted on Reply
#90
TSiAhmat
LuminescentWhile I understand some of you would buy AMD because it is $50 to $100 cheaper, I did that too, but life is too short and at some point second best is not enough.
If you are buying a 4070 Ti or 5070, it's not the best; it's third best at most if you ignore the 7900 XTX (and maybe the 9070 XT if it performs well).
Posted on Reply
#91
tfdsaf
LuminescentThis round I might go Nvidia; that 5080 with dual decoders is a godsend for video editing, then there's the amazing support for CUDA in all Adobe products and all the plugins, optimized for CUDA first and OpenCL second.
One Nvidia RTX 5000-series decoder can do 8 streams of 4K 60 FPS and the RTX 5080 has two; that's just nuts. Nvidia killed Intel Quick Sync and all the Apple M chips.
AMD is just dead; I can't see how they will compete with this at any price. They [Nvidia] have better upscaling, better professional support, better ray tracing, better AI, so why would anyone buy AMD for being $50-100 cheaper?
I hope they can survive this so we have some competition, but this might be over.
NO ONE cares about AI or having 8 streams running; people who buy GeForce and Radeon care about gaming performance, value, longevity, and quality perception.

No casual gaming enthusiast has even heard of CUDA; hell, even professional forumers and hardware enthusiasts know very little about CUDA, let alone have any use for it.

For real, the biggest use I've had of CUDA is using it with a chess engine on my desktop, which now calculates to depth 30 in 20 seconds instead of 30 seconds. Big deal.

That is the type of extreme use case CUDA serves. Plus, AMD has the equivalent in ROCm; yeah, it's not as fully fleshed out as CUDA, but it does 95% of the stuff at the same level as CUDA, and the remaining 5% is an extreme use case even professionals don't care that much about.

It's all propaganda and bots pushing a narrative to close off people's minds and project NVIDIA GPUs as somehow better, when they clearly are not. They are garbage value, garbage performance, garbage VRAM, garbage longevity, getting bigger, heavier, louder, hotter, etc...
Posted on Reply
#92
Vayra86
Jtuck9I was referring to AMD's competing suite (FSR 4, etc.) vs the previous gen
Look at how FSR and DLSS manage to proliferate and you know this isn't a real selling point. Yes, it exists. Yes, it appears in various games. But the version? DLSS3 frame generation isn't everywhere. FSR4 isn't going to be everywhere. Even DLSS2 isn't everywhere, and the next question is also 'when do you get it'. Cyberpunk and FSR weren't good friends for a long time - and STILL aren't - to name a pretty shocking example. And that's just looking at new releases, forget about legacy.

The bottom line is you'll still rely on raw performance for a good two-three generations at the very least. Especially in the midrange, where the raw performance is barely growing. Much like RT and all the other developments in game graphics, this takes time. Far more time than anyone has patience for. Don't early adopt this crap. Right now, it's actively used as a marketing tool. It's not primarily there to make games better, it's there to sell GPUs and pseudo next-gen graphics.
Posted on Reply
#93
Daven
LuminescentWhile I understand some of you would buy AMD because it is $50 to $100 cheaper, I did that too, but life is too short and at some point second best is not enough.
Are you talking about the best of everything (i.e. 4090 and 5090) or the best in your price range? If the latter, then yes, things like CUDA would place Nvidia at the top of your list versus the competition. But keep in mind that CUDA is exclusive to Nvidia. This thread is for people who can consider any GPU from Intel, AMD or Nvidia. Deciding you need exclusive features means there is no need for discussion, and it is therefore not appropriate for this thread.

Edit: Also see tfdsaf's response above if CUDA is optional.
Posted on Reply
#94
AusWolf
lilhasselhofferSo...popularity is never the gauge of something good.
I never said it was. I was merely countering an "AMD is dead" fear mongering post.
Posted on Reply
#95
mkppo
I suspect they might even price it more aggressively than recent generations to gain some market share, as it's the last of the RDNA cards. Plus, the B580 is half decent, so there's some downward pressure from that as well.

A lot hinges on UDNA, which should bring them back in the same direction as GCN. That was AMD's first arch after taking over from ATI, and while it got good reviews at launch and sold well, it never received the praise it deserved for being flat out better than the then-current and two future Nvidia archs (Tesla, Fermi and Kepler). Maxwell was the one that took the wind out of AMD's sails, at a time when they were also struggling financially.

Obviously something like that is wishful thinking, but if AMD can even come close to parity this time around, it'll be a big win for them. They have the resources and capabilities now to design it well and improve ROCm too, which should play well with UDNA. Time will tell.
Posted on Reply
#96
Makaveli
AusWolfSome games are fine with a controller, but most aren't, imo.
Agreed and it will depend on the game.

I play Helldivers 2, GOW, and pretty much anything third-person on PC with a PS5 controller.
Street Fighter 6, Mortal Kombat, and anything fighting with a Hori Fightpad.
All of my first-person shooters (Delta Force, Counter Strike 2, Halo Infinite, etc.) are mouse and keyboard.
Posted on Reply
#97
Jtuck9
Vayra86Look at how FSR and DLSS manage to proliferate and you know this isn't a real selling point. Yes, it exists. Yes, it appears in various games. But the version? DLSS3 frame generation isn't everywhere. FSR4 isn't going to be everywhere. Even DLSS2 isn't everywhere, and the next question is also 'when do you get it'. Cyberpunk and FSR weren't good friends for a long time - and STILL aren't - to name a pretty shocking example. And that's just looking at new releases, forget about legacy.

The bottom line is you'll still rely on raw performance for a good two-three generations at the very least. Especially in the midrange, where the raw performance is barely growing. Much like RT and all the other developments in game graphics, this takes time. Far more time than anyone has patience for. Don't early adopt this crap. Right now, it's actively used as a marketing tool. It's not primarily there to make games better, it's there to sell GPUs and pseudo next-gen graphics.
Wasn't DLSS being heavily marketed with Stalker 2? Taking the game into "playable" frame territory? Not sure if that was the case for Wolfenstein also? I believe both have FSR patches on the way, if not already. Perhaps with their new FSR4 we might have more parity in regards to support at release?! I've read comments from people saying that they believe raster performance is a last gen measuring stick...

Like I said I'm more interested in upscaling than triple digit frames.
Posted on Reply
#98
dyonoctis
AusWolfYeah, I'm wondering how much of Nvidia's buyer base is centered around a real need for CUDA and AI, and how much is just confirmation bias.
Unless Nvidia runs a survey to find out what people are doing with their GPUs, we'll never know. But I also think that it's a bad idea for a GPU maker in 2025 to ignore that aspect when they are building their product. Once you go outside of the DIY space, a PC maker has to make choices about the kind of components they are going to put in machines sold at retail, and the machine that can do the most will probably have less chance of sitting in inventory. Which makes Nvidia ubiquitous if you are shopping around for a prebuilt, or even going through a system integrator. The default option in most cases is Nvidia, and that contributes to mindshare.

And the GeForce brand is actually pretty strong even for professionals, as Puget Systems realised from their sales data. It's also something that I can see when I talk with freelancers; take a look at the kind of machines that art/media schools are using...people tend to overestimate the appeal of "RTX Quadro" for prosumers.

For all the shitty stuff that Nvidia is pulling as a company, they've been pretty good at not being caught slacking or off-guard. Unlike Intel, they don't get complacent. If RT can be criticized in gaming, OptiX (CUDA + RT) was absolutely mental for the offline 3D industry: a single GPU managed to do stuff that required a small render farm a decade ago, and now you've got teenagers doing short 3D movies in their parents' basement.


Posted on Reply
#99
Jtuck9
dyonoctisUnless Nvidia runs a survey to find out what people are doing with their GPUs, we'll never know. But I also think that it's a bad idea for a GPU maker in 2025 to ignore that aspect when they are building their product. Once you go outside of the DIY space, a PC maker has to make choices about the kind of components they are going to put in machines sold at retail, and the machine that can do the most will probably have less chance of sitting in inventory. Which makes Nvidia ubiquitous if you are shopping around for a prebuilt, or even going through a system integrator. The default option in most cases is Nvidia, and that contributes to mindshare.

And the GeForce brand is actually pretty strong even for professionals, as Puget Systems realised from their sales data. It's also something that I can see when I talk with freelancers; take a look at the kind of machines that art/media schools are using...people tend to overestimate the appeal of "RTX Quadro" for prosumers.

For all the shitty stuff that Nvidia is pulling as a company, they've been pretty good at not being caught slacking or off-guard. Unlike Intel, they don't get complacent. If RT can be criticized in gaming, OptiX (CUDA + RT) was absolutely mental for the offline 3D industry: a single GPU managed to do stuff that required a small render farm a decade ago, and now you've got teenagers doing short 3D movies in their parents' basement.


Speaking of "mindshare", I'm interested to see how something like Strix Halo will mix things up going forward. PC Gamer is speculating that UDNA might appear in a console first... I was interested myself but got swayed by the supposed RT improvement of RDNA 4 vs 3.5.
Posted on Reply
#100
Vayra86
Jtuck9Wasn't DLSS being heavily marketed with Stalker 2? Taking the game into "playable" frame territory? Not sure if that was the case for Wolfenstein also? I believe both have FSR patches on the way, if not already. Perhaps with their new FSR4 we might have more parity in regards to support at release?! I've read comments from people saying that they believe raster performance is a last gen measuring stick...

Like I said I'm more interested in upscaling than triple digit frames.
Oh sure, there are comments from people saying just about anything. I just look at the market, not the few incidental hypes that pass by.
Posted on Reply