
Sapphire Radeon RX 6950 XT Nitro+ Pure

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,653 (3.74/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The Sapphire Radeon RX 6950 XT Pure is a new addition to Sapphire's lineup that comes with a sexy white color theme. In our review, we find out that the card not only looks great, but the quad-slot cooler achieves fantastic noise levels which make 4K gaming at 60 FPS a whisper-quiet experience.

 
380W during gaming... Jesus...
How much power does it use in gaming with the "Quiet" BIOS? @W1zzard
 
It's actually not that high. Graphics cards max out their TDP during gaming all the time, so even a high-end 3090 Ti with its 450 W TDP easily maxes it out during gaming.
 
Pretty impressive in everything but ray tracing, for a little over half the cost of the 3090 Ti. And IIRC, these are just the launch models, not the pinnacle versions of the 6950 XT (Toxic, Atomic, Liquid Devil, etc.). I wonder how long until we see those top-end models and how well they will compare to the top-end 3090 Ti models.
 
So the Quiet BIOS is 380W???
Jesus... Christ...
Have you seen any 3090 Ti reviews? It sits at or above 450 W; most OC models approach (and some exceed) 500 W.

Still, these power draw levels are pure insanity. I really wish AMD and/or their partners would publish "efficiency" BIOSes for these with lower clocks and voltages, since manual underclocking and undervolting can be a bit challenging due to AMD's aggressive boost algorithm. These cards have tons of potential - I used to be able to run my 6900 XT at a peak of ~190 W while losing only ~15% performance from stock.

Still, not that it makes up for the stupid power consumption, but I'm impressed by the overall performance gain. I was not expecting this to beat the 3090 Ti at 1440p, nor to beat the 3090 at 2160p.
 
It probably only changes the fan profile, with the power limit unchanged.
Nah, the Quiet BIOS is around 380 W and the OC one is 410 W...

BTW @W1zzard
"Where AMD does have the weaker offering than NVIDIA is ray tracing. This is due to AMD's architecture, which executes some RT operations in shaders, while NVIDIA has dedicated hardware units for it."
Both NVIDIA and AMD use shaders for parts of the RT pipeline, and both have dedicated hardware units for ray tracing. The difference is the potency of that ray tracing hardware and how much work is done in shaders versus the dedicated units. But both... well... do both. Both have the hardware, both use the shaders...
 
Display connectivity includes two standard DisplayPort 1.4 and two HDMI 2.1.
Picture disagrees. =)
 
Was considering a second reference card, but I'm going to hold out for next gen and hope for an early November/December launch.
 
So the Quiet BIOS is 380W???
Jesus... Christ...
It is not specifically a "quiet" BIOS, just the default BIOS

Picture disagrees. =)
Fixed

It probably only changes the fan profile, with the power limit unchanged.
Note the clocks here; this is an actual OC.
 
Newegg has several 6950 XTs left in stock at just $1100, the same price 6900 XTs are going for (or occasionally more than that).

*and it's gone

Have you seen any 3090 Ti reviews? It sits at or above 450 W; most OC models approach (and some exceed) 500 W.

Still, these power draw levels are pure insanity. I really wish AMD and/or their partners would publish "efficiency" BIOSes for these with lower clocks and voltages, since manual underclocking and undervolting can be a bit challenging due to AMD's aggressive boost algorithm. These cards have tons of potential - I used to be able to run my 6900 XT at a peak of ~190 W while losing only ~15% performance from stock.

Still, not that it makes up for the stupid power consumption, but I'm impressed by the overall performance gain. I was not expecting this to beat the 3090 Ti at 1440p, nor to beat the 3090 at 2160p.
How many people buy a 6900 XT tier card only to undervolt it and lose 15% performance? I mean, yeah, you got it down to 190 W, but one can argue that the use case for these cards is not to be efficient, it's to push max performance. They're halo cards, after all.
 
How many people buy a 6900 XT tier card only to undervolt it and lose 15% performance? I mean, yeah, you got it down to 190 W, but one can argue that the use case for these cards is not to be efficient, it's to push max performance. They're halo cards, after all.
I'm definitely an edge case - I'm running a 6900 XT and 5800X on a single 280mm radiator in a 13 liter case, after all (and yes, it handles their full wattage just fine, but cooling things down a bit is nice) - but there's a massive argument for less intensive underclocks and undervolts than mine, and that's quite common in my experience on higher end GPUs. Hitting 250W or even a bit lower with unnoticeable performance drops is perfectly doable on a 6900 XT - as long as you can find a combination of voltage and frequency range that suits your silicon and the boost algorithms. Savings ought to be even more significant with this higher binned and higher clocked silicon.

And remember, my 6900 XT with that 190W profile performed about the same as a 6800 non-XT (a tad below a 3080) while consuming marginally more power than a 3060. That's definitely noteworthy IMO. Top end GPUs are immensely wasteful in how they are tuned, and undervolting and underclocking is a sensible thing to do unless you desperately need all the performance they can deliver.
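If you want to sanity-check those numbers, here's a quick back-of-the-envelope sketch (assuming the reference 6900 XT's ~300 W stock board power; the other figures are the ones quoted above):

```python
# Rough perf-per-watt comparison for the undervolt described above.
# Assumption: reference RX 6900 XT board power is ~300 W at stock;
# the ~190 W / ~15% loss figures are from the post above.

stock_power_w = 300   # assumed stock board power
stock_perf = 1.00     # normalized stock performance
uv_power_w = 190      # reported undervolted peak power
uv_perf = 0.85        # ~15% performance loss

stock_eff = stock_perf / stock_power_w
uv_eff = uv_perf / uv_power_w

print(f"Stock efficiency:      {stock_eff:.5f} perf/W")
print(f"Undervolt efficiency:  {uv_eff:.5f} perf/W")
print(f"Perf-per-watt gain:    {uv_eff / stock_eff - 1:.0%}")  # ~34%
```

So even with the 15% performance hit, that profile is roughly a third more efficient than stock.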
 
Newegg has several 6950xts left in stock at just $1100, the same price 6900s are going for or occasionally above that.

*and its gone


How many people buy a 6900 XT tier card only to undervolt it and lose 15% performance? I mean, yeah, you got it down to 190 W, but one can argue that the use case for these cards is not to be efficient, it's to push max performance. They're halo cards, after all.

I literally buy high-end GPUs to undervolt them (slightly), lower the power, and get 95% of the performance at 250 W - and THEN I even limit FPS in games to 3 below the max refresh rate, so I never hear the computer spin and use even less power.
 
I'm happy to see AMD doing so well. I feel like these cards are dead on arrival, though - I mean, the RTX 4000 series is less than 6 months away and is going to smoke everything in its path, and I expect AMD will respond to that release shortly after... and then we will have another 2-3 year wait cycle. So if I were in the market for a GPU right now, I'd just wait another 6 months personally.
 
I'm happy to see AMD doing so well. I feel like these cards are dead on arrival, though - I mean, the RTX 4000 series is less than 6 months away and is going to smoke everything in its path, and I expect AMD will respond to that release shortly after... and then we will have another 2-3 year wait cycle. So if I were in the market for a GPU right now, I'd just wait another 6 months personally.
I mean, you could, but then in 6 months they get delayed by 2 months, then 2 months later they come out, and it takes 6 months for them to reliably be in stock, and.......

At some point you gotta bite the bullet and just buy a card, or you'll be waiting forever. RDNA 3 and RTX 4000 could be good, but I'll bet good money that it'll be a year from now before you can reliably buy the things. May as well get one now and enjoy it for that year.
 
I'm happy to see AMD doing so well. I feel like these cards are dead on arrival, though - I mean, the RTX 4000 series is less than 6 months away and is going to smoke everything in its path, and I expect AMD will respond to that release shortly after... and then we will have another 2-3 year wait cycle. So if I were in the market for a GPU right now, I'd just wait another 6 months personally.
6 months at the tail end of a years-long shortage is quite a long time to sell a bunch of GPUs. And if rumors are to be believed, AMD's 7000-series isn't notably further out than the RTX 4000 series. Beyond new bins and slightly more expensive memory, these cards don't cost AMD or their partners anything significant either, so if they sell poorly, then the non-50 SKUs they would otherwise have been would most likely have sold just as poorly. The addition of these new SKUs doesn't meaningfully change that, or represent a significant cost for anyone involved (except for the customer, obviously).
 
So the Quiet BIOS is 380W???
Jesus... Christ...

I mean, yes, the only way to make Navi 21 perform like a 3090 is by juicing it as haphazardly as possible... there's absolutely nothing new here, nothing we hadn't seen from the 6900 XT LC (which is this exact product, anyway).

You can get Navi 21 to draw 450 to 500 watts on the GPU core alone by trying to run Port Royal at ~2.8 GHz with the power limits lifted, and it still doesn't beat the 3090, mind you. At this point it's just down to architectural differences: RDNA 2 is better, and has always been better, at traditional raster graphics, but it's slower at ray tracing and high resolutions and doesn't have as many features. Pick your poison :D
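For rough intuition on why chasing those clocks balloons the power draw like that: dynamic power scales roughly with frequency times voltage squared. A minimal sketch, where every number is an illustrative assumption and not a measurement:

```python
# Dynamic power approximation: P ~ C * V^2 * f, so relative power scales
# with (f_new / f_old) * (V_new / V_old)^2. All values below are
# illustrative assumptions, not measured figures.

base_clock_ghz = 2.25   # assumed typical stock game clock
base_voltage_v = 1.025  # assumed stock operating voltage
base_power_w = 300      # assumed stock GPU core power

oc_clock_ghz = 2.8      # the Port Royal run mentioned above
oc_voltage_v = 1.20     # assumed voltage needed to hold that clock

scale = (oc_clock_ghz / base_clock_ghz) * (oc_voltage_v / base_voltage_v) ** 2
print(f"Estimated core power: {base_power_w * scale:.0f} W")  # ~512 W
```

Which lands right in that 450-500+ W ballpark without the silicon getting any faster per watt.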

As for me, I love to see competition; I just wish it didn't take twenty months to see a single GPU with the same performance I've had all this time, and with obvious drawbacks.
 
it's slower at... high resolutions and doesn't have as many features

What features? There are no such features; actually, the AMD Radeon has more features, like:
- Radeon Image Sharpening,
- FidelityFX,
- Video Sharpness enhancement,
- Video Color Vibrance enhancement,
- AMD Steady Video,
- AMD Fluid Motion Video,
- Radeon Enhanced Sync,
- AMD FreeSync,
etc...

As to "high resolutions", it is due to the lower memory throughput - 256-bit with GDDR6 - only up to 360-370 GB/s which is tried to be offset with large 128 MB L3 cache.

@W1zzard Can you please update this graph because it is wrong?

[attached: relative performance chart from the TechPowerUp GPU Database]

AMD Radeon RX 6950 XT Specs | TechPowerUp GPU Database

The true performance:

[attached: relative performance chart from the MSI RX 6950 XT Gaming X Trio review]

MSI Radeon RX 6950 XT Gaming X Trio Review - Relative Performance | TechPowerUp
 
What features? There are no such features; actually, the AMD Radeon has more features, like:
- Radeon Image Sharpening,
- FidelityFX,
- Video Sharpness enhancement,
- Video Color Vibrance enhancement,
- AMD Steady Video,
- AMD Fluid Motion Video,
- Radeon Enhanced Sync,
- AMD FreeSync,
etc...

As to "high resolutions", it is due to the lower memory throughput - 256-bit with GDDR6 - only up to 360-370 GB/s which is tried to be offset with large 128 L3 cache.

I'm not sure what you're trying to say, because you're trying way too hard to defend AMD here, like you have some holy duty to do so, or out of pure spite for the green team. You always seem to do this; it's arguing in bad faith.

Half of these "features", such as color vibrance and motion handling for video, have been present in NVIDIA hardware for over fifteen years at this point, if not longer. These are features so old they've fallen into disuse and obsolescence (such as the video interpolation thing), or never worked correctly (such as Enhanced Sync). That kind of talk may impress someone who doesn't understand a thing about graphics cards, but it has the complete opposite effect on someone who does.

The FidelityFX suite works practically in its entirety on NVIDIA and Intel hardware, and NV released image sharpening support for its entire product stack BEFORE AMD removed their gatekeeping of it to RDNA and Polaris only (it works even on Kepler). FreeSync? Might as well go and see who pioneered adaptive sync displays... And if you want to be judgmental, let's be judgmental: need I remind you that their promised support for Netflix DRM on Vega from 18.3.1 still hasn't shipped? Or that they've gated Radeon Super Resolution (their global FSR 1.0 pass) behind RDNA hardware? Yeah, no buddy, that's no argument.

I'm hardly an NVIDIA guy; I've been on Radeon Vanguard for 3 years now and have owned practically the entire collection of AMD HBM GPUs. I had a great time with them. I ended up with a 3090 through a very unfortunate chain of events back in 2020: the delay on the 6900 XT caused it to arrive in my country far pricier than what I had paid for my 3090, and that's the sole reason I don't have one in my PC right now. I just never felt it was worth switching, because the astronomical value of this GPU throughout the past few years made me wary of dealing with randos, and its stability has been excellent. I fully intend on going back to AMD on my next upgrade cycle, but saying they're on equal footing is simply not an argument made in good faith right now. Not yet.
 
I'm not sure what you're trying to say, because you're trying way too hard to defend AMD here, like you have some holy duty to do so, or out of pure spite for the green team. You always seem to do this; it's arguing in bad faith.

Half of these "features", such as color vibrance and motion handling for video, have been present in NVIDIA hardware for over fifteen years at this point, if not longer. These are features so old they've fallen into disuse and obsolescence (such as the video interpolation thing), or never worked correctly (such as Enhanced Sync). That kind of talk may impress someone who doesn't understand a thing about graphics cards, but it has the complete opposite effect on someone who does.

The FidelityFX suite works practically in its entirety on NVIDIA and Intel hardware, and NV released image sharpening support for its entire product stack BEFORE AMD removed their gatekeeping of it to RDNA and Polaris only (it works even on Kepler). FreeSync? Might as well go and see who pioneered adaptive sync displays... And if you want to be judgmental, let's be judgmental: need I remind you that their promised support for Netflix DRM on Vega from 18.3.1 still hasn't shipped? Or that they've gated Radeon Super Resolution (their global FSR 1.0 pass) behind RDNA hardware? Yeah, no buddy, that's no argument.

I'm hardly an NVIDIA guy; I've been on Radeon Vanguard for 3 years now and have owned practically the entire collection of AMD HBM GPUs. I had a great time with them. I ended up with a 3090 through a very unfortunate chain of events back in 2020: the delay on the 6900 XT caused it to arrive in my country far pricier than what I had paid for my 3090, and that's the sole reason I don't have one in my PC right now. I just never felt it was worth switching, because the astronomical value of this GPU throughout the past few years made me wary of dealing with randos, and its stability has been excellent. I fully intend on going back to AMD on my next upgrade cycle, but saying they're on equal footing is simply not an argument made in good faith right now.

So, are you going to point out the "more NVIDIA features" that are not present on AMD Radeon, or should I assume it is simply empty arguing?
 
So, are you going to point out the "more NVIDIA features" that are not present on AMD Radeon, or should I assume it is simply empty arguing?

I've never seen someone write so much and say so little at the same time. May I suggest you go read the RTX 3090 Ti review beyond the performance percentage charts?

I'm done here, this will just attract the attention of the moderators at this point.
 
I've never seen someone write so much and say so little at the same time. May I suggest you go read the RTX 3090 Ti review beyond the performance percentage charts?

I'm done here, this will just attract the attention of the moderators at this point.
I, for one, would like to hear what those features you're missing from AMD are, as I frankly have no idea. It's quite well known otherwise that RDNA 2 is stronger in rasterized graphics at lower resolutions, while NVIDIA is stronger at 2160p and with RT. Is anyone contesting that being generally true? But I'd honestly like to hear which features you mean.
 