
Sapphire Radeon RX 6950 XT Nitro+ Pure

Chill is very cool (no pun intended), but it has quite a few drawbacks - in games that aren't interaction-heavy but still need some smoothness it generally just results in low frame rates (Divinity: Original Sin, for example), while in others it has no effect because you're always providing input and it never clocks down. Still, it's a decent idea, and it is very handy in certain situations. It's no replacement for a low-power BIOS mode, but it's certainly better than nothing. Then again, I also really, really wish AMD could make their global framerate limiter work properly, rather than having it added to and removed from driver releases all willy-nilly (and no, Chill with the same frame rate as both the high and low bound is not a good substitute for a proper framerate limiter).
Why would you get low framerates in Divinity when capping the Radeon Chill slider?
 
Pretty impressive in everything but ray tracing, for a little over half the cost of the 3090 Ti.

On the RT front it is curious that AMD is catching up in newer titles (and the Control result, which was said to use a different codepath for NV, is hardly indicative).

 
Why would you get low framerates in Divinity when capping the Radeon Chill slider?
Not low overall, just uncomfortable and ... bad? D:OS is a game where significant segments have pretty low levels of interaction - shops, dialogue, in-game cutscenes, etc. involve very little mouse movement or button pressing, all of which caused the framerate to drop to a level I found bothersome. Turn-based combat is possibly even worse, as interaction is bursty - some minor input when selecting an action, then none while watching it play out - so the framerate fluctuated up and down in a really uncomfortable way, never really returning to 60 fps even while I was doing something. The input-based framerate limiting of Chill just doesn't work well for that type of game.
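To illustrate why that kind of game clashes with Chill, here is a rough sketch of what input-driven frame limiting looks like. AMD hasn't published Chill's actual ramping heuristics, so the simple two-state switch, the 0.5 s idle threshold, and the function names below are illustrative assumptions only:

```python
# Rough illustration of input-driven frame limiting (Radeon Chill-style).
# The real ramping behaviour isn't public; the two-state switch and the
# idle threshold here are assumptions for demonstration only.
import time

FPS_MIN, FPS_MAX = 60, 144   # the Chill min/max slider bounds
IDLE_AFTER = 0.5             # seconds without input before dropping to FPS_MIN (assumed)

def target_frame_time(last_input: float, now: float) -> float:
    """Pick the frame budget based on how recently the user provided input."""
    idle = (now - last_input) > IDLE_AFTER
    return 1.0 / (FPS_MIN if idle else FPS_MAX)

def finish_frame(last_input: float, frame_start: float) -> None:
    """Sleep out whatever is left of the frame budget before presenting."""
    budget = target_frame_time(last_input, time.monotonic())
    elapsed = time.monotonic() - frame_start
    if elapsed < budget:
        time.sleep(budget - elapsed)
```

In a turn-based game the input arrives in short bursts, so logic like this keeps toggling between the two bounds - exactly the fluctuation described above. Setting the min and max sliders to the same value stops the oscillation, but it also removes the power savings, which is why it's a poor stand-in for a proper global limiter.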
 
On the RT front it is curious that AMD is catching up in newer titles (and the Control result, which was said to use a different codepath for NV, is hardly indicative).

Games where RT is being used to its full effect, as it will be in the future, show a major difference. Nvidia wins those. Games where RT is basically a gimmick and isn't really used to any good effect? AMD can mostly keep up. Make no mistake, AMD paid to have publishers dumb down their RT so they can have people like you making statements like the ones you're making. I suppose it doesn't hurt that this is all the current-gen consoles can manage because, well, AMD didn't take RT seriously this gen.

But if RT is the future, weaksauce RT won't be what we're all using. I doubt AMD will even care anymore, because by then they'll have a serious RT engine in their GPUs, newer consoles will be out or around the corner, and they'll probably do what they always do and EOL these cards to get people to stop harassing them for more RT performance that's making the fine wine taste like spoiled milk.
 
My Sapphire 6900 XTXH Toxic Extreme's memory is overclocked to 2250 MHz (18 Gbps), and my GPU is at 2750 MHz. Are these XTXH chips?
 
Not low overall, just uncomfortable and ... bad? D:OS is a game where significant segments have pretty low levels of interaction - shops, dialogue, in-game cutscenes, etc. involve very little mouse movement or button pressing, all of which caused the framerate to drop to a level I found bothersome. Turn-based combat is possibly even worse, as interaction is bursty - some minor input when selecting an action, then none while watching it play out - so the framerate fluctuated up and down in a really uncomfortable way, never really returning to 60 fps even while I was doing something. The input-based framerate limiting of Chill just doesn't work well for that type of game.
You are talking about the second installment, Divinity: Original Sin 2? Because I have spent hundreds of hours playing the game and have literally seen no frame drops, hitches or anything.
That is why I'm asking, especially since we have very similar hardware. When you use Radeon Chill in the game, do you set it to a 60 FPS limit? Try going a bit higher, like 75 or 90. It might solve your problem while still limiting FPS somewhat.

To be honest, I have noticed that with CS:GO - sometimes the framerate drops to 30 FPS and stays there. Not sure why it happens, but it did a few times. That is why I sometimes bump Radeon Chill up to 90 or 75.
 
Games where RT is being used to its full effect, as it will be in the future, show a major difference. Nvidia wins those. Games where RT is basically a gimmick and isn't really used to any good effect? AMD can mostly keep up. Make no mistake, AMD paid to have publishers dumb down their RT so they can have people like you making statements like the ones you're making. I suppose it doesn't hurt that this is all the current-gen consoles can manage because, well, AMD didn't take RT seriously this gen.

But if RT is the future, weaksauce RT won't be what we're all using. I doubt AMD will even care anymore, because by then they'll have a serious RT engine in their GPUs, newer consoles will be out or around the corner, and they'll probably do what they always do and EOL these cards to get people to stop harassing them for more RT performance that's making the fine wine taste like spoiled milk.
To be fair to AMD, low VRAM amounts like 8 or 10 GB are not going to age well either.

Yes, DirectStorage (lol) and Sampler Feedback (not lol, this is serious now) will help low-VRAM GPUs... but since those are standard on consoles too, and we KNOW that beauty sells, I expect devs to reinvest any VRAM savings back into textures and models.

Though the 3090 and 3090 Ti will age well for sure.
 
I suppose it doesn't hurt that this is all the current-gen consoles can manage because, well, AMD didn't take RT seriously this gen.
I think that's a ... how to put it, an unreasonably harsh take. "Didn't take RT seriously" does not seem like a fitting description of a sequence of events that goes roughly "RTRT was considered unrealistic in consumer products in the near future -> Nvidia stuns people by launching it -> AMD responds with their own alternative the next generation, two years later." Even if both were most likely working on this in their R&D labs at roughly the same time (somewhat likely, at least), Nvidia's vastly larger R&D budget tells us it's highly unlikely that AMD had the resources to really prioritize RTRT before Turing. Nvidia also had no external pressure to produce this, meaning they could hold off on launching it until they deemed it ready - a luxury AMD didn't have once Nvidia moved first. Managing to put out a solution that more or less matches Nvidia's first-generation effort, even if Nvidia simultaneously launched a significantly improved second-gen effort? That's relatively impressive overall, especially considering the resource differences in play. Summing that up as "AMD didn't take RT seriously" is just not a reasonable assessment of that development cycle.

That obviously doesn't change the fact that Nvidia's RT implementation is currently significantly faster - that's just facts. But that's also what you get from having massively superior resources and the first-mover advantage those often bring. AMD's current implementation is still a decent first-gen effort, especially considering what must have been a relatively rushed development cycle. That doesn't mean it's good enough - but neither is Ampere's RT, really. It's just better.

As for AMD paying developers to dumb down their RT implementations - something like that, or at least paying for "marketing support" and providing some degree of development/engineering support aimed at optimizing RTRT for current-gen consoles (specifically: not implementing features those consoles can't handle at all, and instead focusing on more scalable features that work in lighter-weight modes on the consoles), is likely happening, yes. But there's also an inherent incentive to make use of console hardware (and not exceed it by too much) simply due to the sheer market force of console install bases.

I don't for a second doubt that AMD will take any advantage they can get wherever they can get it - they're a corporation seeking profits, after all - but even with their growth and success in recent years, I don't think they have the funds to throw money at external problems the way Nvidia has for decades. Some? Sure. Enough to, say, contractually bar developers from implementing additional RTRT modes on PC, on top of the console ones, that might make AMD look bad? Doubtful IMO. They're quite likely trying to put pressure on developers in this direction, but the more likely explanation is that once a studio is already developing a given set of RT features, implementing more, different RT features (especially more complex ones) is an additional cost on top of that, and one that only pays off for a relatively small subset of customers (PC gamers with Nvidia RTX GPUs - and for very performance-intensive features, PC gamers with an RTX 2080 or faster). At some point the cost of those features becomes too high relative to the possible benefits to be worth the effort.
 
Dear @W1zzard, a good review from you as expected! I hope the press driver you tested these 6X50 XT GPUs with is the one that made DX11 performance much better (the 22.5.2 preview). In my R5 5600 & RX 5700 combo it took Witcher 3 from 130 to 140 FPS - and that's a game that already had GPU utilization at 100%. Something big was changed in this driver that lowered overhead considerably, methinks.

 
AMD followed the old recipe: overvolt for minor gains and trash the efficiency. I wish they had kept the 1.00-1.05 V the 6900 XT ran at instead of the 1.2 V the 6950 XT is stuck at. The faster VRAM would have helped anyway. Almost 30% more power draw for 7-10% more performance is not worth it, I think.
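To put a rough number on that claim (using the ratios quoted above, not measured data), relative efficiency is just the performance ratio divided by the power ratio:

```python
# Back-of-the-envelope perf/W estimate using the figures from the post above
# (assumed, not measured): ~30% more power for ~7-10% more performance.
power_ratio = 1.30    # 6950 XT power draw relative to the 6900 XT
perf_ratio = 1.085    # midpoint of the quoted 7-10% performance gain

perf_per_watt = perf_ratio / power_ratio
print(f"Relative perf/W: {perf_per_watt:.2f}")  # ~0.83, i.e. roughly a 17% efficiency regression
```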
 
I take back my previous comments given the overall overclocking capability. The chips must be XTXH, though, because overclocking yields ~2800 MHz GCLK.

The memory overclocking is wonderful, reaching 18.8 Gbps at ~2350 MHz MCLK.
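For anyone wondering how the MCLK figures in this thread map to the quoted Gbps numbers: GDDR6's effective data rate is 8x the memory clock reported by tools like GPU-Z, and total bandwidth follows from the bus width. A quick sanity check - the clocks are the ones quoted above, and the 256-bit bus is the Navi 21 spec:

```python
# Sanity check of the memory figures quoted in this thread. GDDR6's effective
# data rate is 8x the reported memory clock; bandwidth = data rate x bus width / 8.
def gddr6_stats(mclk_mhz: float, bus_bits: int = 256):
    gbps_per_pin = mclk_mhz * 8 / 1000            # effective data rate per pin
    bandwidth_gb_s = gbps_per_pin * bus_bits / 8  # total bandwidth in GB/s
    return gbps_per_pin, bandwidth_gb_s

for mclk in (2250, 2350):
    rate, bw = gddr6_stats(mclk)
    print(f"{mclk} MHz -> {rate:.1f} Gbps per pin, {bw:.0f} GB/s")
# 2250 MHz -> 18.0 Gbps, 576 GB/s; 2350 MHz -> 18.8 Gbps, 602 GB/s
```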
 
Games where RT is being used to its full effect, as it will be in the future, show a major difference.
Lies.

For instance, WoW is a title where RT is very noticeable, but AMD wins there.

In Cyberpunk 2077, RT off quite often looks better than RT on, but NV has an edge.
 
It is weird - the card is already available in Norway and costs $730 less than a 3090 Ti. Damn, what a price difference. It still costs a fair bit, but the difference in price is noticeable.
 