
Sapphire Radeon RX 9070 XT Nitro+

You play with averages, not minimums. That 1% of 1% doesn't represent anything for a gamer. Stability of frame rates does, and average is second to that.

You really don't! Unless you like to stutter. Which again many people blame on the games, but is in fact an NVIDIA tactic. I know this fight isn't worth having. The people that get it already do.

Average is not stability. AVERAGE IS AN AVERAGE. Stability is the minimum. If you run 10,000 frames per second and every 10 seconds it drops to 1 fps, and the average is 1,000 fps, is that stable?
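The average-versus-minimum point is easy to make concrete. A minimal sketch with entirely made-up frame-rate samples (not review data): two captures with similar averages can have wildly different 1% lows.

```python
# Sketch (hypothetical numbers): why an average can hide stutter.
# Two captures with comparable average FPS but very different 1% lows.
import statistics

def one_percent_low(fps_samples):
    """Average of the worst 1% of per-frame FPS samples."""
    worst = sorted(fps_samples)[:max(1, len(fps_samples) // 100)]
    return statistics.mean(worst)

smooth = [100.0] * 1000                 # steady 100 fps, no hitches
stutter = [110.0] * 990 + [10.0] * 10   # mostly fast, with periodic hitches

print(statistics.mean(smooth), one_percent_low(smooth))    # 100.0 100.0
print(statistics.mean(stutter), one_percent_low(stutter))  # 109.0 10.0
```

The "stutter" capture actually averages higher, yet its 1% low is a tenth of the smooth one, which is exactly why reviews report both numbers.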

Please dude, this is starting to smell like damage control. This card needs no damage control. It's a fine product. The averages are just the individual results put together. If it placed first in every game, it would average higher; the games where it "wins" are simply outliers. If your specific purpose is to play said outlier games, then... by all means, purchase one. That's what it's here for, after all.



You're letting feelings get in the way. If we start declaring outlier results as "wins", then reviews become completely worthless.



Disagree, this card is not a disaster.

How much do Silent Hill/Elden Ring affect those averages? The former is a horrible NVIDIA-optimized POS that barely runs on anything, the second still playable at 4K on a 9070 XT. Now look at EVERYTHING else.

Exactly. It's not damage control, nor is it about emotion. It's that people don't get it because of how W1zzard skews how results are shown.
I'm not writing this just for you, I'm writing for the people that also need help understanding.
 
You really don't! Unless you like to stutter. Which again many people blame on the games, but is in fact an NVIDIA tactic. I know this fight isn't worth having. The people that get it already do.
1% is not stuttering.
 
You really don't! Unless you like to stutter. Which again many people blame on the games, but is in fact an NVIDIA tactic. I know this fight isn't worth having. The people that get it already do.



How much do Silent Hill/Elden Ring affect those averages? The former is a horrible NVIDIA-optimized POS that barely runs on anything, the second still playable at 4K on a 9070 XT. Now look at EVERYTHING else.

Exactly. It's not damage control, nor is it about emotion. It's that people don't get it because of how W1zzard skews how results are shown.
I'm not writing this just for you, I'm writing for the people that also need help understanding.

How much do Cyberpunk 2077 and Alan Wake 2 affect the averages towards AMD as well? Or, in the case of average fps, AC Mirage and RE4? It is very much about emotions, mate, or you wouldn't be outraged at the suggestion this GPU is worse than a 4080 (and it is) ;)
 
Overclockers UK pricing (but uh oh, doh, out of stock)

 
How much do Cyberpunk 2077 and Alan Wake 2 affect the averages towards AMD as well? Or, in the case of average fps, AC Mirage and RE4? It is very much about emotions, mate, or you wouldn't be outraged at the suggestion this GPU is worse than a 4080 (and it is) ;)

A little. And it doesn't always win. That's not the point. The point is you can still have a very good experience at every conceivable resolution in pretty much every single game. The *real* gap isn't large.

Again, look at Wukong. Can the 4080 hold 60? No. Can the 9070 XT? Yes. Can you be guaranteed the 4080 can OC to hit 60? No. Could it before DLSS 4? Yes. Will this happen more often? Yes.
Why doesn't the 4080 hold 60? To justify the Super, until the next thing.

Why will 1440p require more VRAM soon? To outdate the 5080. All the while the 9070 XT will have been cheap, ready for this existence. And not 12GB and already outdated. Do you not see the cycle of NVIDIA?

Every single time I write this more people wake up.

1% is not stuttering.
It's often the same root cause, generally speaking, and I think you know that. Sometimes it isn't.
 
Thanks for the review!

Sorry, but I'm going to leave some thoughts here...

However, judging by the results of this review, the particular card in question is a complete flop. Dunno why everyone is so euphoric and happy. While the card is really within a margin of error of the 7900 XTX/4080S, that's actually a good bonus, because it was rumoured to be positioned at a much lower tier. It still often has worse results in some games, dropping to 4070 levels.
Also, the 16GB seems a bit lacklustre. But I doubt it will be able to utilize more VRAM in gaming, or in anything but "AI" tasks.

I might be wrong. However, 7900 XTX performance comes at XTX prices. The stores/sellers, as much as AMD themselves, got used to the unreasonably inflated prices of the last five years. So they will keep them at all costs (no pun intended). Maybe some lucky NA inhabitants will be able to carve out a card or two for a price close to MSRP. The other countries will never see it. And this is still wishful thinking. And no way will AMD let them be lower than their own previous-gen cards, which this card is directly positioned against, as much as the current nVidia counterparts. That's why AMD waited for the nVidia 5070 series release for so long. If the card has the performance of those, it is going to cost as much as those as well.

My arguments are: here, the 4060 Ti/7800 XT have a price of $800+, the 4080 $2000+, and every single 5090 is above $3800-4000. These are official stores/distributors. Imagine what scalper prices would be.

But the really ugly thing is power consumption. Maybe lowering some clocks and undervolting will keep media playback and gaming at more sane levels. But for now, they are underwhelming. The consumption is basically that of the 4080 and the 7900 XT/XTX, but with barely any performance and efficiency gains, or rather straight up the same, or worse. There's no progress.

Also: the Pulse has a PCB of basically the same size. But this effin' Al (Aluminium, not AI :p) brick is unable to carry three 8-pin connectors. Even if the PCB were an inch longer, it would accommodate them with no issue. I understand the progress, the compact footprint, and the convenience. But the card is huge. Why not make two versions? Sapphire weeds out so many potential buyers.
Some would say, "For a card at this price, one could also get a new PSU". Or, "The adapter is already included". Or even, "These are still the same as the cable extenders that go from the PSU". That's not the point. There is no choice. I would definitely consider the card if it didn't have the stupid garbage 12V-2x6 connector. So would more people considering buying one from Sapphire, but alas! Looks like XFX and PowerColor want the money more.

Best regards!

What is less impressive is the Nitro+ premium. A $130 premium on a $600 card is an extra 22% to pay for something that has boost clocks that are likely within 3% of the $600 MSRP models like the Pulse.
A dual-slot, non-OC Nitro+ version would be amazing. Superior cooling, better features, but without the dumb reckless OC, just for the sake of a spare couple of fps at the price of hundreds of watts.

Also, Sapphire seems to have completely lost their mind with RDNA 3, asking a premium while being inferior to even the direct partners like TUL and XFX.
Overclockers UK pricing (but uh oh, doh, out of stock)

Might have a look on eBay, or at a local second-hand store....
 
Not by AMD, but by the tech circles and the fanbase. Expectations were certainly running high. People were calling this the NVIDIA killer, and even the review is titled "beating NVIDIA"... except that, in practice, it looks like it is unable to conclusively beat 2022's RTX 4080, averaging -5% in raster and -20% in RT as per this very review (of what is arguably the most advanced model of 9070 XT we'll receive). That -5% seems to already include the outlier games: there were great gains in Cyberpunk 2077 with RT off, and the historically AMD-friendly games seem to continue that trend (such as Call of Duty).

I do think it's a solid product with a far larger list of things to like than to dislike, but anyone calling this an Nvidia killer definitely hit their head somewhere. If the market situation wasn't so crazy, with the 50 series launch being completely botched, without any stock whatsoever, preorders from January going unfulfilled, 10+ week lead times, scalped prices on any few units that do show up for sale, and lukewarm progress by the competition (5090 aside, the 50 series barely even moves the needle in the overall level of performance made available stackwide), and if Navi 31 itself scaled better than it does, this card wouldn't look even half as good as it does.



Personally, I'm open to buying big UDNA once that's made available; this generation proves that AMD's still got it. Hopefully the suits will step aside and let their engineers work. Suits are what ruin AMD.
I don't think any sane person expected this to be the Nvidia killer. In any case, no single generation would affect Nvidia. Sustained excellence and missteps from Nvidia would be needed to regain parity. As for UDNA, after seeing the improvement in the compute units for RDNA 4, I hope that UDNA is just a rumour and the reality turns out to be RDNA 5.
 
How much do Cyberpunk 2077 and Alan Wake 2 affect the averages towards AMD as well? Or, in the case of average fps, AC Mirage and RE4? It is very much about emotions, mate, or you wouldn't be outraged at the suggestion this GPU is worse than a 4080 (and it is) ;)
This is a false equivalence. Cyberpunk 2077 and Resident Evil 4 are acclaimed, successful, and well-optimized games that fully leverage the potential of each architecture. Silent Hill, on the other hand, is a game that simply runs poorly on everything; the computational load compared to the visuals on screen is absurd.



To be honest, I wouldn't include in a GPU review any game where high-end GPUs struggle at 30–40 FPS or where performance is excessively high, like 300–500 FPS.
It would be nice if, in the future, we could at least vote to include games we like.
 
Oh, ok. So it was being silly then.
 
This is a false equivalence. Cyberpunk 2077 and Resident Evil 4 are acclaimed, successful, and well-optimized games that fully leverage the potential of each architecture. Silent Hill, on the other hand, is a game that simply runs poorly on everything; the computational load compared to the visuals on screen is absurd.


To be honest, I wouldn't include in a GPU review any game where high-end GPUs struggle at 30–40 FPS or where performance is excessively high, like 300–500 FPS.
It would be nice if, in the future, we could at least vote to include games we like.

By calling it a false equivalence, it really sounds like you're implying that when a game benefits AMD, it's an "acclaimed, successful, optimized title", but when it doesn't, it's just the game that's trash, particularly considering the very largest gap is shown in... probably the most successful and acclaimed game of all (CS2).

We all know Silent Hill is a problematic title, that makes it an outlier in itself.

A little. And it doesn't always win. That's not the point. The point is you can still have a very good experience at every conceivable resolution in pretty much every single game. The *real* gap isn't large.

Never said it was; as a matter of fact, I mentioned more than once that the gap was small. It's still there, but this segment as a whole is stagnant; it's really a "just buy whatever you find cheaper" situation.
 
This is a false equivalence. Cyberpunk 2077 and Resident Evil 4 are acclaimed, successful, and well-optimized games that fully leverage the potential of each architecture. Silent Hill, on the other hand, is a game that simply runs poorly on everything; the computational load compared to the visuals on screen is absurd.


To be honest, I wouldn't include in a GPU review any game where high-end GPUs struggle at 30–40 FPS or where performance is excessively high, like 300–500 FPS.
It would be nice if, in the future, we could at least vote to include games we like.

^^^ This guy gets it. Like I said, Wukong and Star Wars are really the only things indicative of where AAA gaming is headed, afaict. I don't have the suite memorized.
I get why CS:GO is there (for those players), but it shouldn't be included in a roundup or in the averages. Make an "if you e-sport, these games" section or something.
*Most* people look at that game and understand they don't need more than the frames either card can deliver.

Again, people will say not everyone plays intensive games, and it is so ridiculous. That is true. But then why are you buying a $600-1000 card for these features?

Never said it was; as a matter of fact, I mentioned more than once that the gap was small. It's still there, but this segment as a whole is stagnant; it's really a "just buy whatever you find cheaper" situation.
This is somewhat true, somewhat not. And I'll explain again why. The why is because the 4080, with less compute capability, can be relegated BELOW the 4080 Super, as it already has been, and will be more so.
It is a constant cycle. Again, you would think "Well, the 9070 XT/4080S have similar compute/RAM/RT", and that is TRUE, but one of them has a company that pushes new DLSS versions (which increase compute load) and also features such as frame gen that eat into the compute/buffer intended for the core game, and each little piece keeps pushing those cards down below those thresholds to sell the tiniest bit of advancement.

Like I said, the next humorous one will be the 5080, just like the freaking 4070 series before it with 12GB. The EXACT SAME THING that people finally called bullshit on for the 5070 will happen again.

Two short years after they tried to sell it for $900...and it can't play many things at 1440p anymore in a way I think most people deserve.

The 9070 XT is already segmented to be one tier down when cards with larger buffers release for this 'midrange' segment next gen. The 5080 will be TWO tiers down (to some people; one to me) because of that buffer.

It's really not that complicated! It's just a lot of words!
 
How much do Silent Hill/Elden Ring affect those averages? The former is a horrible NVIDIA-optimized POS that barely runs on anything, the second still playable at 4K on a 9070 XT. Now look at EVERYTHING else.
One thing I have to add: ER is actually a hard 60 fps capped game and requires third-party software to unlock the framerate (which you can't even use with the online services and anticheat). And pretty much anything above a 3070 Ti can do 60 fps even at 4K. So those results don't reflect the actual use case for most players either way. Considering their engine also has no hope of being used in any other title (Nightreign is just ER; it looks and performs the same as we saw in the network test), I don't really think it should be used for GPU reviews anymore, as it can skew the results without representing the actual performance you'd get normally. It wouldn't matter if it performed about average, but it's just a big outlier in benchmarks.
 
I want AMD to have a win here as much as anyone, but people are calling the win too early. We have learned from prior launches that we need availability and actual MSRP cards to be restocked. Hardware Unboxed is suggesting the $599.99 is a temporary early deal that's going to disappear and eventually the XT will be around $800 or more. If that happens, nothing is won.

Nothing will even be different.

AMD really should have made reference cards.
 
What the hell is going on with Alan Wake 2? That didn't strike me as an AMD title.
 
Love the bias...
  • Large price increase over MSRP
  • NVIDIA DLSS offers a better upscaling and frame generation experience
Small price increase over MSRP compared to the Gigabyte Aorus or Aero, to which Sapphire is the equivalent quality.

And better upscaling/frame gen, but at what cost? AMD Sapphire 9070 XT at $730 vs $999 for a Gigabyte 5070 Ti.
Is DLSS $269 better, 37% better?
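For what it's worth, the percentages being thrown around in this thread check out as quick arithmetic (prices taken from the posts above):

```python
# Sanity-checking the quoted premiums (prices from the thread, not verified).
nitro_premium = 130 / 600          # Nitro+ markup over the $600 MSRP models
dlss_premium = (999 - 730) / 730   # Gigabyte 5070 Ti vs Sapphire 9070 XT

print(f"{nitro_premium:.0%}")  # 22%
print(f"{dlss_premium:.0%}")   # 37%
```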
 
What the hell is going on with Alan Wake 2? That didn't strike me as an AMD title.
IDK.

Maybe an update to the game or to drivers (addressing that game)?
I was surprised to see AMD 'up there' too. Relative to the 7900 XTX, it's roughly 'where it should be', though.
 
Hell yeah, NVIDIA lost my money with this generation. I don't upgrade every gen; I skipped the 40 series, so they had to do something good with the 50. They did not. The prices alone are insane. I've been an NVIDIA user for 20 years, no more tho. Enough is enough. They just act like kings now and expect us to bow down lol. Thank god for competition!
 
By calling it a false equivalence, it really sounds like you're implying that when a game benefits AMD, it's an "acclaimed, successful, optimized title", but when it doesn't, it's just the game that's trash, particularly considering the very largest gap is shown in... probably the most successful and acclaimed game of all (CS2).

We all know Silent Hill is a problematic title, that makes it an outlier in itself.

Never said it was; as a matter of fact, I mentioned more than once that the gap was small. It's still there, but this segment as a whole is stagnant; it's really a "just buy whatever you find cheaper" situation.

- Silent Hill is very firmly in the "visuals absolutely do not justify the performance" category for me, but to your earlier point, a savvy consumer should actually look at how a card performs in the games they actually want to play and make the purchasing decision based on that information.

It would be nice to one day, somewhere over the rainbow, get "dynamic" performance charts where we could select a specific subset of tested games and the charts would dynamically re-tier the GPUs, but we're not there yet, I suppose.
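The "dynamic chart" idea is basically just recomputing a geometric mean of relative performance over whichever games you tick. A minimal sketch (all FPS numbers below are made up, not the review's data):

```python
# Sketch of dynamic re-tiering: geometric mean of FPS relative to a
# baseline GPU, over a user-selected subset of games.
# All numbers are hypothetical placeholders, not review results.
from math import prod

fps = {
    "9070 XT":  {"Cyberpunk 2077": 80, "CS2": 300, "Wukong": 62},
    "RTX 4080": {"Cyberpunk 2077": 85, "CS2": 380, "Wukong": 55},
}

def geomean_relative(fps, baseline, games):
    """Geometric mean of each GPU's FPS ratio to the baseline GPU,
    computed only over the selected games."""
    out = {}
    for gpu, results in fps.items():
        ratios = [results[g] / fps[baseline][g] for g in games]
        out[gpu] = prod(ratios) ** (1 / len(ratios))
    return out

# Drop CS2 from the average and the relative standings shift:
print(geomean_relative(fps, "RTX 4080", ["Cyberpunk 2077", "Wukong"]))
```

With these placeholder numbers, excluding the e-sports title flips the ranking, which is exactly why reviewers' game-suite choices get argued about so much.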
 
Damn, it's been a long time since I read an AMD launch review that seemingly went smoothly. It looks like the extra time was just what the doctor ordered; apparently everything actually works the way it was intended to. Hell of a job, AMD! Now if they could just get those prices down a bit more, they could actually grab a huge chunk of the market. Still, the MSRP is very competitive versus the current competition. I never thought I would be applauding a mid-range graphics card with a $600 MSRP.
 
How much can performance increase over time in CS2, for example, via software updates? Performance in FPS shooter games seems very meh.
 
Looks pretty good. The low (relatively speaking) CS2 performance is probably architectural. RDNA3 also posted lower than expected results vs RDNA2 in that game.
 
Pretty impressive numbers in these times. Imagine if they scaled even close to linearly with the architecture for a big-die part to replace the 7900 XTX: 46% faster than a 7900 XTX would put it between the 4090 and 5090. If it sold for $1000, it would be a killer.
Kinda sad they didn't try it. I would be less on the fence.
 
Kinda sad they didn't try it. I would be less on the fence.
Same feels as the RX 5700 XT era.

Can't wait to eventually see whatever RX 9080 XTX was built and tested in the lab, though, like the HBM Navi 12.
 