# Sapphire Radeon RX 6950 XT Nitro+ Pure



## W1zzard (May 10, 2022)

The Sapphire Radeon RX 6950 XT Pure is a new addition to Sapphire's lineup that comes with a sexy white color theme. In our review, we find out that the card not only looks great, but the quad-slot cooler achieves fantastic noise levels which make 4K gaming at 60 FPS a whisper-quiet experience.



----------



## Charcharo (May 10, 2022)

380W during gaming... Jesus...
How much power does it use in gaming with the "Quiet" BIOS? @W1zzard


----------



## luches (May 10, 2022)

It's actually not that high. Graphics cards max out their TDP during gaming all the time, so even a high-end 3080 Ti with a 450 W TDP easily maxes it out during gaming.


----------



## W1zzard (May 10, 2022)

Charcharo said:


> 380W during gaming... Jesus...
> How much power does it use in gaming with the "Quiet" BIOS? @W1zzard


The 2nd BIOS is "OC BIOS" not "quiet"


----------



## Charcharo (May 10, 2022)

W1zzard said:


> The 2nd BIOS is "OC BIOS" not "quiet"



So the Quiet BIOS is 380W???
Jesus... Christ...


----------



## Denver (May 10, 2022)

Imagine if it had the same bandwidth as the RTX 3090. lol


----------



## TechLurker (May 10, 2022)

Pretty impressive in everything but ray tracing, for a little over half the cost of the 3090 Ti. And IIRC these are just the launch models, not the pinnacle versions of the 6950 XT (Toxic, Atomic, Liquid Devil, etc.). I wonder how long until we see those top-end models, and how they will compare with the top-end 3090 Ti models.


----------



## Chomiq (May 10, 2022)

Charcharo said:


> So the Quiet BIOS is 380W???
> Jesus... Christ...


It probably only changes fan profile, with power limit unchanged.


----------



## Valantar (May 10, 2022)

Charcharo said:


> So the Quiet BIOS is 380W???
> Jesus... Christ...


Have you seen any 3090 Ti reviews? It sits at or above 450W, most OC models approach (and some exceed) 500W.

Still, these power draw levels are pure insanity. I really wish AMD and/or their partners would publish "efficiency" BIOSes for these with lower clocks and voltages, as underclocking and undervolting can be a bit challenging due to AMD's aggressive boost algorithm, but they have _tons_ of potential - I used to be able to run my 6900 XT at a peak of ~190W while losing ~15% performance from stock.

Still - not that it makes up for the stupid power consumption - I'm impressed by the overall performance gain. I was not expecting this to beat the 3090 Ti at 1440p, nor to beat the 3090 at 2160p.


----------



## Charcharo (May 10, 2022)

Chomiq said:


> It probably only changes fan profile, with power limit unchanged.


Nah, the Quiet BIOS is around 380W and the OC one is 410W...

BTW @W1zzard 
"Where AMD does have the weaker offering than NVIDIA is ray tracing. This is due to AMD's architecture, which executes some RT operations in shaders, while NVIDIA has dedicated hardware units for it."
Both Nvidia and AMD use shaders for parts of the RT pipeline, and both have dedicated hardware units for ray tracing. The difference is the potency of that ray tracing hardware, and how much work is done in shaders versus the dedicated units. But both... well... do both. Both have the hardware, both use the shaders...


----------



## rusTORK (May 10, 2022)

> Display connectivity includes two standard DisplayPort 1.4 and two HDMI 2.1.


Picture disagrees. =)


----------



## jesdals (May 10, 2022)

Was considering a second reference card, but I'm going to hold out for next gen and hope for an early November/December launch.


----------



## W1zzard (May 10, 2022)

Charcharo said:


> So the Quiet BIOS is 380W???
> Jesus... Christ...


It is not specifically a "quiet" BIOS, just the default BIOS



rusTORK said:


> Picture dis-agree. =)


Fixed



Chomiq said:


> It probably only changes fan profile, with power limit unchanged.











						Sapphire Radeon RX 6950 XT Nitro+ Pure Review
					

The Sapphire Radeon RX 6950 XT Pure is a new addition to Sapphire's lineup that comes with a sexy white color theme. In our review, we find out that the card not only looks great, but the quad-slot cooler achieves fantastic noise levels which make 4K gaming at 60 FPS a whisper-quiet experience.




					www.techpowerup.com
				



Note the clocks here, this is actual OC


----------



## TheinsanegamerN (May 10, 2022)

Newegg has several 6950 XTs left in stock at just $1100, the same price 6900s are going for, or occasionally above that.

*and it's gone



Valantar said:


> Have you seen any 3090 Ti reviews? It sits at or above 450W, most OC models approach (and some exceed) 500W.
> 
> Still, these power draw levels are pure insanity. I really wish AMD and/or their partners would publish "efficiency" BIOSes for these with lower clocks and voltages, as underclocking and undervolting can be a bit challenging due to AMD's aggressive boost algorithm, but they have _tons_ of potential - I used to be able to run my 6900 XT at a peak of ~190W while losing ~15% performance from stock.
> 
> Still, not that it really stands up to the stupid power consumption, but I'm impressed by the overall performance gain. I was not expecting this to beat the 3090 Ti at 1440p, nor to beat the 3090 at 2160p.


How many people buy a 6900 XT-tier card only to undervolt it and lose 15% performance? I mean, yeah, you got it down to 190 W, but one can argue that the use case for these cards is not to be efficient, it's to push max performance. They're halo cards, after all.


----------



## Valantar (May 10, 2022)

TheinsanegamerN said:


> How many people buy a 6900xt tier card only to undervolt and lose 15% performance? I mena yeah you got it down to 190w but one can argue that the use case for these cards is not to be efficient, its to push max performance. They're halo cards, after all.


I'm definitely an edge case - I'm running a 6900 XT and 5800X on a single 280mm radiator in a 13 liter case, after all (and yes, it handles their full wattage just fine, but cooling things down a bit is nice) - but there's a massive argument for less intensive underclocks and undervolts than mine, and that's quite common in my experience on higher end GPUs. Hitting 250W or even a bit lower with unnoticeable performance drops is perfectly doable on a 6900 XT - as long as you can find a combination of voltage and frequency range that suits your silicon and the boost algorithms. Savings ought to be even more significant with this higher binned and higher clocked silicon. And remember, my 6900 XT with that 190W profile performed about the same as a 6800 non-XT (a tad below a 3080) while consuming marginally more power than a 3060. That's definitely noteworthy IMO. Top end GPUs are immensely wasteful in how they are tuned, and undervolting and underclocking is a sensible thing to do unless you _desperately_ need all the performance they can deliver.
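Those numbers work out to a sizeable efficiency win on the back of an envelope (the ~300 W stock board power and ~15% loss at ~190 W are the rough figures from the post above, not measurements):

```python
# Back-of-the-envelope perf/W comparison for a stock vs. undervolted
# 6900 XT, using the approximate figures quoted above.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative-performance points per watt of board power."""
    return relative_perf / watts

stock = perf_per_watt(100.0, 300.0)       # stock: index 100 at ~300 W
undervolted = perf_per_watt(85.0, 190.0)  # ~15% slower at ~190 W

print(f"stock:       {stock:.3f} perf/W")
print(f"undervolted: {undervolted:.3f} perf/W")
print(f"efficiency gain: {undervolted / stock - 1:.0%}")  # roughly a third more perf/W
```

Even granting the numbers are approximate, the ratio shows why a deep undervolt can be attractive despite the performance loss.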


----------



## Charcharo (May 10, 2022)

TheinsanegamerN said:


> Newegg has several 6950xts left in stock at just $1100, the same price 6900s are going for or occasionally above that.
> 
> *and its gone
> 
> ...



I literally buy high-end GPUs to undervolt them (slightly), lower the power, get 95% of the performance at 250W and THEN even use limit FPS in games to 3 under the max refresh rate and never hear the computer spin, use even lower power.


----------



## Space Lynx (May 10, 2022)

I'm happy to see AMD doing so well. I feel like these cards are dead on arrival, though: the RTX 4000 series is less than 6 months away and is going to smoke everything in its path, and I expect AMD to respond shortly after that release... and then we will have another 2-3 year wait cycle... so if I were in the market for a GPU right now, I'd just wait another 6 months, personally.


----------



## TheinsanegamerN (May 10, 2022)

CallandorWoT said:


> I'm happy to see AMD doing so well, I feel like these cards are dead on arrival though, I mean RTX 4000 series is less than 6 months away and is going to smoke everything in it's path, and AMD will respond to that release shortly after I expect... and then we will have another 2-3 year wait cycle... so if I were in the market for a GPU right now, I'd just wait another 6 months personally.


I mean you could, then in 6 months they get delayed by 2 months, then 2 months later they come out, and it takes 6 months for them to reliably be in stock, and.......

At some point you gotta bite the bullet and just buy a card, or you'll be waiting forever. RDNA 3 and RTX 4000 could be good, but I'll bet good money that it'll be a year from now before you can reliably buy the things. May as well get one now and enjoy it for that year.


----------



## Valantar (May 10, 2022)

CallandorWoT said:


> I'm happy to see AMD doing so well, I feel like these cards are dead on arrival though, I mean RTX 4000 series is less than 6 months away and is going to smoke everything in it's path, and AMD will respond to that release shortly after I expect... and then we will have another 2-3 year wait cycle... so if I were in the market for a GPU right now, I'd just wait another 6 months personally.


Six months at the tail end of a years-long shortage is quite a long time to sell a bunch of GPUs. And if rumors are to be believed, AMD's 7000 series isn't notably further out than the RTX 4000. Beyond new bins and slightly more expensive memory, it's not like these cards cost AMD or their partners anything significant; if they sell poorly, the non-50 SKUs these chips would otherwise have become would most likely have sold just as poorly. Adding these new SKUs doesn't meaningfully change that, or represent a significant cost for anyone involved (except for the customer, obviously).


----------



## Dr. Dro (May 10, 2022)

Charcharo said:


> So the Quiet BIOS is 380W???
> Jesus... Christ...



I mean, yes, the only way to make Navi 21 perform like a 3090 is by juicing it as haphazardly as possible... there's absolutely nothing new here, nothing we hadn't seen from the 6900 XT LC (which is this exact product, anyway).

You can get Navi 21 to draw 450 to 500 watts on the GPU core alone by trying to run Port Royal at ~2.8 GHz with the power limits lifted, and it still doesn't beat the 3090, mind you. At this point it's just down to architectural differences: RDNA 2 is better, and has always been better, at traditional raster graphics, but it's slower at ray tracing and high resolutions and doesn't have as many features. Pick your poison.

As for me, I love to see competition; I just wish it hadn't taken twenty months to see a single GPU with the same performance I've had all this time, with obvious drawbacks.


----------



## ARF (May 10, 2022)

Dr. Dro said:


> it's slower at... high resolutions and doesn't have as many features



What features? There are no such features; actually, the AMD Radeon has more features, like:
- Radeon Image Sharpening,
- Fidelity FX,
- Video Sharpness enhancement,
- Video Color Vibrance enhancement,
- AMD Steady Video,
- AMD Fluid Motion Video,
- Radeon Enhanced Sync,
- AMD FreeSync,
etc...

As to "high resolutions", it is due to the lower memory throughput - 256-bit with GDDR6 - only up to 360-370 GB/s which is tried to be offset with large 128 MB L3 cache.

@W1zzard Can you please update this graph because it is wrong?





AMD Radeon RX 6950 XT Specs | TechPowerUp GPU Database

The true performance:

MSI Radeon RX 6950 XT Gaming X Trio Review - Relative Performance | TechPowerUp


----------



## Dr. Dro (May 10, 2022)

ARF said:


> What features? There are no features, actually the AMD Radeon has more features like:
> - Radeon Image Sharpening,
> - Fidelity FX,
> - Video Sharpness enhancement,
> ...



I'm not sure what you're trying to say, because you're trying way too hard to defend AMD here like you have some holy duty to do so, or out of pure spite for the green team. You always seem to do this, it's arguing in bad faith.

Half of these "features" such as color vibrance and motion handling for video have been present in NVIDIA hardware for over fifteen years at this point, if not longer. These are features so old they've fallen into disuse and obsolescence (such as the video interpolation thing), or never worked correctly (such as Enhanced Sync). That kind of talk may impress someone who doesn't understand a thing about graphics cards but it has the complete opposite effect on someone who does.

The FidelityFX suite works practically in its entirety on NVIDIA and Intel hardware, and NV released image sharpening support for its entire product stack BEFORE AMD removed their gatekeeping to RDNA and Polaris only (it works even on Kepler). FreeSync? Might as well look up who pioneered adaptive-sync displays... And if you want to be judgmental, let's be judgmental: need I remind you that their promised support for Netflix DRM on Vega from 18.3.1 still hasn't shipped? Or that they've gated Radeon Super Resolution (their global FSR 1.0 pass) behind RDNA hardware? Yeah, no, buddy, that's no argument.

I'm hardly an NVIDIA guy; I've been on Radeon Vanguard for 3 years now and owned practically the entire collection of AMD HBM GPUs. I had a great time with them. I ended up with a 3090 through a very unfortunate chain of events back in 2020: the delay of the 6900 XT caused it to arrive in my country far pricier than what I had paid for my 3090 back then, and that's the sole reason I don't have one in my PC right now. I just never felt it was worth switching, given the astronomical value of this GPU throughout the past few years making me wary of dealing with randos, and given its excellent stability. I fully intend to go back to AMD on my next upgrade cycle, but saying they're on equal footing is simply not an argument vested in good faith right now. Not yet.


----------



## ARF (May 10, 2022)

Dr. Dro said:


> I'm not sure what you're trying to say, because you're trying way too hard to defend AMD here like you have some holy duty to do so, or out of pure spite for the green team. You always seem to do this, it's arguing in bad faith.
> 
> Half of these "features" such as color vibrance and motion handling for video have been present in NVIDIA hardware for over fifteen years at this point, if not longer. These are features so old they've fallen into disuse and obsolescence (such as the video interpolation thing), or never worked correctly (such as Enhanced Sync). That kind of talk may impress someone who doesn't understand a thing about graphics cards but it has the complete opposite effect on someone who does.
> 
> ...



So, are you going to point out the "more NVIDIA features" that are not present with AMD Radeon, or should I assume that it is simply empty arguing?


----------



## Dr. Dro (May 10, 2022)

ARF said:


> So, are you going to point the "more nvidia features" out that are not present with AMD Radeon or I should assume that it is simply an empty arguing?



I've never seen someone write so much and say so little at the same time. May I suggest you go read the RTX 3090 Ti's review beyond the performance percentage charts?

I'm done here, this will just attract the attention of the moderators at this point.


----------



## Valantar (May 10, 2022)

Dr. Dro said:


> I've never seen someone write so much and say so little at the same time. May I suggest you go read the RTX 3090 Ti's review beyond the performance percentage charts?
> 
> I'm done here, this will just attract the attention of the moderators at this point.


I for one would like to hear what those features you're missing from AMD are, as I frankly have no idea. It's quite well known otherwise that RDNA2 is stronger in rasterized graphics at lower resolutions while Nvidia is stronger at 2160p and with RT. Is anyone contesting that being generally true? But I'd honestly like to hear which features you mean.


----------



## dayne878 (May 10, 2022)

This card looks pretty impressive. From what I see in the games I play, the performance is somewhere between the 3080 and the 3080 Ti/3090. Considering I'm still on a 2080 Ti, I have my eye out for upgrades.

That said, the big roadblock to switching to AMD (or Intel, down the road) is the lack of G-Sync compatibility. I have a G-Sync monitor and don't have the money to buy both a new graphics card AND a new monitor when the monitor I have is working perfectly. When I eventually upgrade my monitor, I'll find one that is either FreeSync or maybe has both technologies (if that's possible), because I don't want to keep locking myself into Team Green just to take advantage of a syncing technology. I love G-Sync and it makes gaming very smooth for me, but at the cost of being locked in.


----------



## Dr. Dro (May 10, 2022)

Valantar said:


> I for one would like to hear what those features you're missing from AMD are, as I frankly have no idea. It's quite well known otherwise that RDNA2 is stronger in rasterized graphics at lower resolutions while Nvidia is stronger at 2160p and with RT. Is anyone contesting that being generally true? But I'd honestly like to hear which features you mean.



I've mentioned that before; I agree with you. As far as performance goes, I personally consider them to be on equal footing - each with its own commendable strengths.

To address your question, think of the entire breadth of functionality that having tensor processing capabilities brings, a *dedicated productivity driver suite with ISV endorsement,* a significantly larger framebuffer to make use of these things... and many other features AMD does offer, but with limitations (such as the video engine - it currently makes or breaks streaming, which is huge). I rest my case; I could write at length here, but it's just not productive to do so. It's gone off-topic - this thread is about the 6950 XT - and I'll be happy to debate this some other time.

Looking at it through the lens of someone who just plays video games, updates their graphics driver maybe once a month and does little else is a narrow view of things. The last time AMD offered anything like a Studio driver was with Vega FE, and they didn't manage to maintain it - I speak from experience, as the once-proud owner of one...


----------



## Charcharo (May 10, 2022)

@ARF 
The graph is correct. Do not use the Gaming X Trio to represent the Reference / basic 6950XT.


----------



## ppn (May 10, 2022)

I wish we could already see the release of Navi 33, but oh noes, we can't have it yet because it would obsolete the whole lineup.


----------



## ARF (May 10, 2022)

Charcharo said:


> @ARF
> The graph is correct. Do not use the Gaming X Trio to represent the Reference / basic 6950XT.



You are wrong. It is written *"Performance estimated based on architecture, shader count and clocks,"* and there is no RX 6950 XT as slow as the graph's positioning suggests.

Look at all the available models; none of them is that slow.

----------



## mechtech (May 10, 2022)

Valantar said:


> Have you seen any 3090 Ti reviews? It sits at or above 450W, most OC models approach (and some exceed) 500W.
> 
> Still, these power draw levels are pure insanity. I really wish AMD and/or their partners would publish "efficiency" BIOSes for these with lower clocks and voltages, as underclocking and undervolting can be a bit challenging due to AMD's aggressive boost algorithm, but they have _tons_ of potential - I used to be able to run my 6900 XT at a peak of ~190W while losing ~15% performance from stock.
> 
> Still, not that it really stands up to the stupid power consumption, but I'm impressed by the overall performance gain. I was not expecting this to beat the 3090 Ti at 1440p, nor to beat the 3090 at 2160p.



Indeed.  A regular/OC/default BIOS, and a low power/undervolt/quiet BIOS, at the flip of a switch would be a very nice feature.



Dr. Dro said:


> I mean, yes, the only way to make Navi 21 perform like a 3090 is by juicing it as haphazardly as possible... there's absolutely nothing new here, nothing we hadn't seen from the 6900 XT LC (which is this exact product, anyway).
> 
> You can get Navi 21 to draw 450 to 500 watts of power on the GPU core only by trying to run Port Royal at ~2.8 GHz with the power limits lifted and it still doesn't beat the 3090, mind you. At this point it's just down to architectural differences, RDNA 2 is better and has always been better at traditional raster graphics, but it's slower at raytracing, high resolutions and doesn't have as many features, pick your poison
> 
> As for me I love to see competition,* I just wish it didn't take twenty months* to see a single GPU having the same performance I've had for all this time, with obvious drawbacks.


Blame Covid  












The extra bandwidth of the 3090 Ti starts to show at 4K. I know we need a lot of performance for 4K, but I wonder if increasing the memory bandwidth would let a smaller GPU reach the same 4K performance?


----------



## eidairaman1 (May 11, 2022)

mechtech said:


> Indeed.  A regular/OC/default BIOS, and a low power/undervolt/quiet BIOS, at the flip of a switch would be a very nice feature.
> 
> 
> Blame Covid
> ...



If BIOS modding were allowed, it'd be possible; AFAIK Trixx/AB have no means of adjusting memory timings, so you could otherwise push the GDDR6 further, from 18 to 20/22 Gbps...

Sad enough that it's not allowed.


----------



## Minus Infinity (May 11, 2022)

Just makes me all the more eager to see RDNA3 cards, with massive increases in Infinity Cache, IPC and clock-speed uplifts, and hopefully greatly improved hardware RT cores. The 7700 XT will beat this in rasterisation and smash it in RT, if leaks are to be believed.


----------



## Dr. Dro (May 11, 2022)

mechtech said:


> Blame Covid



Not only Covid, but also a concurrent burst in cryptocurrency speculation as its lure reached new heights during a time when people could not work. The ongoing crypto crash is only bound to worsen as more and more people cash out and return to a productive life. The strain on the semiconductor industry as a whole was unprecedented.

Still, I feel like this GPU is a nice touch and a great way to send off a great architecture that put AMD back on the map. Same goes for the Ti: whichever way one decides to go, they will have the best experience one can currently have, and this specific Sapphire SKU looks to be the juiciest one yet.


----------



## Count Shagula (May 11, 2022)

Wasn't expecting it to be so much faster than the 6900. Ordered one this morning to replace my 3080. I'm on a 360 Hz screen and this absolutely destroys my 3080, which is running a 450 W BIOS. Excited to see how far I can push it with my custom loop. I can almost run my PC maxed out with no fans at the moment; pretty sure I'll definitely be able to with this card's lower power draw.


----------



## jigar2speed (May 11, 2022)

I am confused - your graph shows the RTX 3070 doing 48 FPS average in Elden Ring at 4K using the best CPU, yet I am getting 60 FPS fully maxed out using a 3100X at 4.3 GHz. What am I doing differently, or "wrong", to get such good performance?

PS: It never drops FPS; it's always at 60. (Maybe my MSI Afterburner is showing the wrong FPS? VRR and HDR are also enabled.)


----------



## W1zzard (May 11, 2022)

jigar2speed said:


> I am confused - Your graph shows RTX 3070 in Elden ring doing 48 FPS average using best CPU at 4K however i am doing 60 FPS full max out using 3100X at 4.3 GHZ. What i am doing different or "wrong" to get such good performance ?
> 
> PS: It never drops FPS, its always at 60. (May be my MSI afterburner is showing wrong FPS ? VRR and HDR is also enabled)


Are you running at max settings? In the open world (not in a dungeon)?


----------



## jigar2speed (May 11, 2022)

W1zzard said:


> Are you running at max settings?


Yes sir - I chose the Custom setting with all options maxed out (some don't go beyond High and some are at Maximum) at 3840 x 2160 resolution.

EDIT: The GPU is not overclocked and is power-limited to 80% (Gigabyte RTX 3070 Vision OC), but the GPU memory is overclocked by 1000 MHz.


----------



## ratirt (May 11, 2022)

This 6950 XT is basically a 6900 XT without power limits; it is hard for me to find any other difference. Having a 6900 XT Red Devil, I wonder: if PowerColor releases a 6950 XT, would I be able to flash one of my 6900 XT BIOSes onto it? I'm OK with the performance I have, but I'm curious. Having the power limit removed would have been nice.
As for the performance increase, yeah, it is there, but the power consumption is up as well. That is kinda scary.


----------



## Charcharo (May 11, 2022)

ARF said:


> You are wrong. It is written *"Performance estimated based on architecture, shader count and clocks."*
> and there is no RX 6950 XT that is so slow as in the graph positioning:
> 
> View attachment 246968
> ...



Wrong. There is a reference 6950 XT. It is on AMD's site too; you can buy it.


----------



## ARF (May 11, 2022)

Charcharo said:


> There is a reference 6950 XT. It is on AMD's site too, you can buy it.



The TPU graph DOESN'T follow a reference card. It is a guesstimate based not on real-world performance but on simulation.

AMD Radeon RX 6950 XT Specs | TechPowerUp GPU Database


----------



## Valantar (May 11, 2022)

Yeah, those database entries have been getting successively weirder recently. First the 6400 XT review placed it about the same as a 1650, but the database placed it much closer to a 1050 Ti (which has since been fixed). Now this one, again, noticeably underestimates the performance of a new AMD GPU compared to TPU's own benchmarks - even if those benchmarks are of SKUs with a couple percent faster core clocks than stock. @W1zzard @T4C Fantasy is there something off with the standard methodology behind database performance entries?


----------



## Charcharo (May 11, 2022)

ARF said:


> The TPU graph DOESN'T follow a reference card. It is a guesstimate not based on real world performance but simulations.
> 
> View attachment 247022
> AMD Radeon RX 6950 XT Specs | TechPowerUp GPU Database



Well, as per AMD it's 5% faster:

https://www.reddit.com/r/Amd/comments/umk48z

So on the graph it should be edited to 1% under the 3080 Ti, or 2% more than it is now:

97 × 1.05 = 101.85

A slight edit, but not like with the very fast Nitro Pure.
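The proposed correction is just a relative-performance rescale (the index 97 and the 5% uplift are the figures from the post above):

```python
def scale_relative_perf(baseline: float, uplift: float) -> float:
    """Apply a fractional performance uplift to a relative-performance index."""
    return baseline * (1 + uplift)

# 6900 XT reference at index 97, with AMD's claimed ~5% uplift for the 6950 XT
print(round(scale_relative_perf(97.0, 0.05), 2))  # 101.85
```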


----------



## W1zzard (May 11, 2022)

I ordered a 6950 XT reference card to figure this out



Valantar said:


> which has since been fixed


Probably because one was based on the estimation algorithm, and has been updated to use my review data now.
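For what it's worth, a pre-review estimate of this kind presumably combines shader count and clocks into a throughput ratio against a measured card of the same architecture. A toy illustration of the idea - the function, the damping exponent, and the resulting number are all hypothetical, not TPU's actual algorithm:

```python
def estimate_relative_perf(shaders: int, boost_mhz: float,
                           ref_shaders: int, ref_boost_mhz: float,
                           ref_index: float, scaling: float = 0.7) -> float:
    """Scale a measured card's relative-performance index by the raw
    throughput ratio, damped because real performance rarely scales
    1:1 with shader FLOPS."""
    throughput_ratio = (shaders * boost_mhz) / (ref_shaders * ref_boost_mhz)
    return ref_index * throughput_ratio ** scaling

# Hypothetical: estimate a 6950 XT (5120 shaders, 2310 MHz boost)
# against a measured 6900 XT (5120 shaders, 2250 MHz) at index 97
print(round(estimate_relative_perf(5120, 2310, 5120, 2250, 97.0), 1))
```

Since the 6950 XT's gains come largely from faster memory rather than shader clocks, any shaders-times-clocks formula like this would naturally undershoot, which would be consistent with the underestimate being discussed.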


----------



## Valantar (May 11, 2022)

W1zzard said:


> I ordered a 6950 XT reference card to figure this out
> 
> 
> Probably because one was based on the estimation algorithm, and has been updated to use my review data now.


Guess it might be time to update that estimation algorithm for RDNA2-based GPUs?


----------



## W1zzard (May 11, 2022)

Valantar said:


> Guess it might be time to update that estimation algorithm for RDNA2-based GPUs?


you make that sound so easy


----------



## Valantar (May 11, 2022)

W1zzard said:


> you make that sound so easy


Lol, that was not my intention. Something seeming necessary definitely doesn't mean it's easy. Too often the exact opposite, in my experience.


----------



## Jism (May 11, 2022)

Charcharo said:


> 380W during gaming... Jesus...
> How much power does it use in gaming with the "Quiet" BIOS? @W1zzard



Turn on Vsync, or
limit power usage with the simple power slider.

There are quite a few things you can do to limit it. Full power is for when you want its highest performance (state).

I like the card. It's fast and will easily last the next 3 years. I think I'll buy it.


----------



## ARF (May 11, 2022)

Jism said:


> Turn on vsync,
> Limit power usage by the simple power slider
> 
> Quite some things you can do to limit it. The full power is when you want its highest performance (state).
> ...



Radeon Chill saves power by limiting the creation of unnecessary frames.


----------



## Valantar (May 11, 2022)

ARF said:


> Radeon Chill saves power while limiting the unnecessary frames creation.
> 
> View attachment 247025


Chill is very cool (no pun intended), but has quite a few drawbacks - in games that aren't interaction heavy but need some smoothness it leads to just low frame rates generally (Divinity: Original Sin, for example), while in others it has no effect as you're always using inputs and thus it isn't clocking down. Still, it's a decent idea, and it is very handy in certain situations. It's no replacement for a low power BIOS mode, but it's certainly better than nothing. Then again I also really, really wish AMD could make their global framerate limiter work properly, rather than having it added to and removed from driver releases all willy-nilly (and no, Chill with the same frame rate as both high and low bounds is not a good substitute for a proper framerate limiter).


----------



## ratirt (May 11, 2022)

Valantar said:


> Chill is very cool (no pun intended), but has quite a few drawbacks - in games that aren't interaction heavy but need some smoothness it leads to just low frame rates generally (Divinity: Original Sin, for example), while in others it has no effect as you're always using inputs and thus it isn't clocking down. Still, it's a decent idea, and it is very handy in certain situations. It's no replacement for a low power BIOS mode, but it's certainly better than nothing. Then again I also really, really wish AMD could make their global framerate limiter work properly, rather than having it added to and removed from driver releases all willy-nilly (and no, Chill with the same frame rate as both high and low bounds is not a good substitute for a proper framerate limiter).


Why would you get low framerates in Divinity when capping with the Radeon Chill slider?


----------



## medi01 (May 11, 2022)

TechLurker said:


> Pretty impressive in all but Raytracing for a little over than half the cost of the 3090 Ti.



On the RT front, it is curious that AMD is catching up in newer titles (and that Control thing, which was said to use a different codepath for NV, is hardly indicative)


----------



## Valantar (May 11, 2022)

ratirt said:


> Why would you get low framerates in divinity when capping the radeon chill slider?


Not low overall, just uncomfortable and ... bad? D:OS is a game that in significant segments of the game has pretty low levels of interaction - shops, dialogue, in-game cutscenes, etc. have very little mouse movement or button presses, all of which cause it to drop framerates to a level where they became bothersome to me. Turn-based combat is possibly even worse, as interaction is bursty - some minor interaction when selecting an action, then none while seeing it play out - caused the framerate to fluctuate up and down in a really uncomfortable way as well, causing both very unstable framerates and causing it to never really return to 60fps even when I was doing something. The input-based framerate limiting of Chill just doesn't work well for that type of game.


----------



## HisDivineOrder (May 11, 2022)

medi01 said:


> On RT front it is curious that in newer titles AMD is catching up (and that  Control thing that was said to use different codepath for NV is hardly indicative)
> 
> View attachment 247033 View attachment 247034 View attachment 247035


Games where RT is being used to its full effect, as it will be in the future, show a major difference. Nvidia wins those. Games where RT is basically a gimmick and isn't really used to any good effect? AMD can mostly keep up. Make no mistake, AMD paid to have publishers dumb down their RT so they can have people like you making statements like you're making. I suppose it doesn't hurt that these are all the current gen consoles can manage because, well, AMD didn't take RT seriously this gen.

But if RT is the future, weaksauce RT won't be what we're all using. I doubt AMD will even care anymore, because by then they'll have a serious RT engine in their GPUs, newer consoles will be out or around the corner, and they'll probably do what they always do and EOL these cards to get people to stop harassing them for more RT performance that's making the fine wine taste like spoiled milk.


----------



## Ravenas (May 11, 2022)

My Sapphire 6900 XTXH Toxic Extreme's memory is overclocked to 2250 MHz, or 18 Gbps effective, and my GPU is at 2750 MHz. Are these XTXH chips?


----------



## ratirt (May 12, 2022)

Valantar said:


> Not low overall, just uncomfortable and ... bad? D:OS is a game with significant segments of very low interaction - shops, dialogue, in-game cutscenes, etc. involve very little mouse movement or button pressing, all of which caused Chill to drop the framerate to a level I found bothersome. Turn-based combat is possibly even worse because interaction is bursty: some minor input when selecting an action, then none while watching it play out. That made the framerate fluctuate up and down in a really uncomfortable way, producing very unstable framerates that never really returned to 60 fps even when I was doing something. The input-based framerate limiting of Chill just doesn't work well for that type of game.


You're talking about the second installment, Divinity: Original Sin 2? I ask because I have spent hundreds of hours playing the game and have literally seen no frame drops, hitches, or anything - especially relevant considering we have very similar hardware. When you use Radeon Chill in the game, do you set it to a 60 FPS limit? Try going a bit higher, like 75 or 90. It might solve your problem and still limit some FPS.

To be honest, I have noticed something similar with CS:GO: sometimes the framerate drops to 30 FPS and stays there. Not sure why it happens, but it has a few times. That is why I sometimes bump Chill up to 90 or 75.


----------



## Charcharo (May 12, 2022)

HisDivineOrder said:


> Games where RT is being used to its full effect, as it will be in the future, show a major difference. Nvidia wins those. Games where RT is basically a gimmick and isn't really used to any good effect? AMD can mostly keep up. Make no mistake, AMD paid to have publishers dumb down their RT so they can have people like you making statements like you're making. I suppose it doesn't hurt that these are all the current gen consoles can manage because, well, AMD didn't take RT seriously this gen.
> 
> But if RT is the future, weaksauce RT won't be what we're all using. I doubt AMD will even care anymore, because by then they'll have a serious RT engine in their GPUs, newer consoles will be out or around the corner, and they'll probably do what they always do and EOL these cards to get people to stop harassing them for more RT performance that's making the fine wine taste like spoiled milk.


To be fair to AMD, low VRAM amounts like 8 or 10 GB are not going to age well either.

Yes, DirectStorage (lol) and Sampler Feedback (not lol, this is serious now) will help low-VRAM GPUs... but since those are standard on consoles too, and we KNOW that beauty sells, I expect devs to reinvest any VRAM savings back into textures and models.

Though the 3090 and 3090 Ti will age well for sure.


----------



## Valantar (May 12, 2022)

HisDivineOrder said:


> I suppose it doesn't hurt that these are all the current gen consoles can manage because, well, AMD didn't take RT seriously this gen.


I think that's a ... how to put it, an unreasonably harsh take. "Didn't take RT seriously" does not seem a fitting description of a sequence of events that goes something like: "RTRT was considered unrealistic in consumer products in the near future -> Nvidia stuns people by launching it -> AMD responds with its own alternative the next generation, two years later." Even if both were most likely working on this in their R&D labs at roughly the same time (somewhat likely, at least), Nvidia's vastly larger R&D budget tells us it's highly unlikely that AMD had the resources to really prioritize RTRT before Turing. Nvidia also had no external pressure to produce this, meaning they could hold off on launching it until they deemed it ready - a luxury AMD didn't have once Nvidia moved first. Managing to put out a solution that more or less matches Nvidia's first-generation effort, even if Nvidia at the same time launched a significantly improved second-gen effort? That's relatively impressive overall, especially considering the resource differences in play. Summing that up as "AMD didn't take RT seriously" is just not a reasonable assessment of that development cycle.

That obviously doesn't change the fact that Nvidia's RT implementation is currently significantly faster - that's just facts. But that's also what you get through having massively superior resources to competitors and the first mover advantage that often brings along with it. AMD's current implementation is still a decent first-gen effort, especially considering what must have been a relatively rushed development cycle. That doesn't mean it's good enough - but neither is Ampere's RT, really. It's just better.

As for AMD paying developers to dumb down their RT implementations - something like that, or at least paying "marketing support" and providing some degree of development/engineering support aimed at optimizing RTRT for current-gen consoles (specifically: not implementing features these consoles can't handle at all, instead focusing on more scalable features that work in lighter-weight modes on the consoles), is likely happening, yes. But there's also an inherent incentive to make use of console hardware (and not exceed it by too much) just due to the sheer market force of console install bases. I don't for a second doubt that AMD will take any advantage they can get wherever they can get it - they're a corporation seeking profits, after all - but even despite their growth and success in recent years, I don't think they have the funds to throw money at external problems the way Nvidia has been doing for decades. Some? Sure. Enough to, say, contractually bar developers from implementing _additional_ RTRT modes on PC, on top of the console ones, that might make AMD look bad? Doubtful IMO. It's quite likely they're trying to put pressure on developers in this direction, but a more likely explanation is this: given that a developer is already implementing a given set of RT features, implementing more, different RT features (especially more complex ones) is an additional cost on top of that, one that only pays off for a relatively small subset of customers (PC gamers with Nvidia RTX GPUs - and for very performance-intensive features, those with an RTX 2080 or faster). At some point, the cost of those features becomes too high relative to the possible benefits to be worth the effort.


----------



## HD64G (May 12, 2022)

Dear @W1zzard, a good review from you, as expected! I hope the press driver you tested these 6X50 XT GPUs with is the one that made DX11 performance much better (22.5.2 preview). On my R5 5600 & RX 5700 combo, it took Witcher 3 from 130 to 140 FPS - and that's a game that already had the GPU at 100% utilization. Something big changed in this driver that lowered overhead considerably, methinks.

> **New Updates to RSR and DX11 Performance Optimizations**
> With the release of the new Radeon™ RX 6950, 6750 and 6650 series of products, AMD is ensuring full support for these amazing graphics cards with our powerful, enhanced software solution. Available today, AMD Software Preview Driver May 2022 is ready for download, improving performance...
> community.amd.com


----------



## Taraquin (May 13, 2022)

AMD followed the old recipe: overvolt for minor gains and trash efficiency. I wish they had kept the 1.00-1.05 V of the 6900 XT instead of the 1.2 V the 6950 XT is stuck at; the faster VRAM would have helped anyway. Almost 30% more power usage for 7-10% more performance is not worth it, I think.
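To put rough numbers on that tradeoff (treating the ~30% power and ~8% performance figures above as assumptions, with an approximate 300 W baseline for the 6900 XT):

```python
# Hypothetical perf-per-watt comparison using the ballpark figures above.
# The 300 W baseline and the +30% / +8% deltas are assumptions for
# illustration, not measured values from the review.
power_6900xt = 300.0                 # W, assumed stock board power
power_6950xt = power_6900xt * 1.30   # ~30% more power
perf_6900xt = 1.00                   # normalized performance
perf_6950xt = 1.08                   # ~8% more performance

eff_6900 = perf_6900xt / power_6900xt
eff_6950 = perf_6950xt / power_6950xt
change = (eff_6950 / eff_6900 - 1) * 100
print(f"Perf/W change: {change:.1f}%")  # roughly -17% efficiency
```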


----------



## Ravenas (May 14, 2022)

I take back my previous comments given the overall overclocking capability. The chips must be XTXH, though, because overclocking yields ~2800 MHz GCLK.

The memory overclocking is wonderful too, reaching 18.8 Gbps effective at ~2350 MHz MCLK.
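For reference, GDDR6 transfers 8 bits per pin per memory-controller clock, so the effective data rate is simply MCLK × 8. A quick sanity check of the numbers above (the 256-bit bus width is the 6950 XT's spec; the helper names are mine):

```python
# GDDR6 effective data rate: 8 bits per pin per memory clock,
# so rate (Gbps) = MCLK (MHz) * 8 / 1000.
def effective_gbps(mclk_mhz: float) -> float:
    return mclk_mhz * 8 / 1000

def bandwidth_gbs(mclk_mhz: float, bus_width_bits: int = 256) -> float:
    # Total bandwidth = per-pin rate * bus width / 8 bits per byte
    return effective_gbps(mclk_mhz) * bus_width_bits / 8

print(effective_gbps(2250))  # 18.0 Gbps - stock 6900 XTXH memory
print(effective_gbps(2350))  # 18.8 Gbps - the overclock above
print(bandwidth_gbs(2350))   # ~601.6 GB/s on a 256-bit bus
```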


----------



## medi01 (May 15, 2022)

HisDivineOrder said:


> Games where RT is being used to its full effect, as it will be in the future, show a major difference.


Lies.

For instance, WoW is a title where RT is very noticeable, yet AMD wins.

In Cyberpunk 2077, RT off quite often looks better than RT on, but NV has an edge there.


----------



## ratirt (May 16, 2022)

It is odd: the card is already available in Norway and costs $730 less than a 3090 Ti. Damn, what a price difference. It still costs a lot, but the gap is noticeable.


----------

