
Sapphire Radeon RX 6950 XT Nitro+ Pure

Joined
May 31, 2016
Messages
4,421 (1.45/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1V@2400MHz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Chill is very cool (no pun intended), but it has quite a few drawbacks: in games that aren't interaction-heavy but still need some smoothness, it just leads to generally low frame rates (Divinity: Original Sin, for example), while in others it has no effect, since you're always providing input and it never clocks down. Still, it's a decent idea, and it is very handy in certain situations. It's no replacement for a low-power BIOS mode, but it's certainly better than nothing. Then again, I also really, really wish AMD would make their global framerate limiter work properly, rather than adding it to and removing it from driver releases all willy-nilly (and no, Chill with the same frame rate as both the high and low bounds is not a good substitute for a proper framerate limiter).
Why would you get low framerates in Divinity when capping the Radeon Chill slider?
 
Joined
Jul 9, 2015
Messages
3,413 (1.01/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED laptop screen
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Pretty impressive in everything but ray tracing, for a little over half the cost of the 3090 Ti.

On the RT front, it is curious that AMD is catching up in newer titles (and that Control result, which was said to use a different codepath for NV, is hardly indicative).

(attached: three RT benchmark screenshots)
 
Joined
May 2, 2017
Messages
7,762 (2.87/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6)
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Why would you get low framerates in Divinity when capping the Radeon Chill slider?
Not low overall, just uncomfortable and... bad? D:OS is a game with significant segments of very low interaction - shops, dialogue, in-game cutscenes and so on involve very little mouse movement or button pressing, all of which caused it to drop framerates to a level I found bothersome. Turn-based combat is possibly even worse, since interaction there is bursty - some minor input when selecting an action, then none while watching it play out - which made the framerate fluctuate up and down in a really uncomfortable way, never really returning to 60 fps even when I was doing something. The input-based framerate limiting of Chill just doesn't work well for that type of game.
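To make that failure mode concrete, here is a minimal sketch of how an input-activity-driven limiter of this kind behaves - my own illustration with made-up constants, not AMD's actual algorithm:

```python
# Minimal sketch of an input-activity-driven frame limiter in the spirit of
# Radeon Chill. Names and tuning constants are illustrative assumptions,
# not AMD's actual implementation.

FPS_MIN, FPS_MAX = 60, 144    # the user-configurable low/high Chill bounds
RAMP_UP, DECAY = 40.0, 15.0   # fps change per second while active/idle (made up)

def next_fps_target(target: float, input_active: bool, dt: float) -> float:
    """Ramp toward FPS_MAX while input arrives, decay toward FPS_MIN when idle."""
    if input_active:
        return min(FPS_MAX, target + RAMP_UP * dt)
    return max(FPS_MIN, target - DECAY * dt)

# Bursty input (e.g. one click per turn in turn-based combat) never lets the
# target settle: it oscillates between the bounds instead of holding steady.
target = float(FPS_MIN)
for second in range(6):
    active = second % 3 == 0   # a brief input burst every third second
    target = next_fps_target(target, active, dt=1.0)
    print(f"t={second}s input={'yes' if active else 'no'} target={target:.0f} fps")
```

With bursty input the target never settles at either bound, which matches the fluctuation described above.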
 
Joined
Aug 23, 2013
Messages
576 (0.14/day)
On the RT front, it is curious that AMD is catching up in newer titles (and that Control result, which was said to use a different codepath for NV, is hardly indicative).

(attached: three RT benchmark screenshots)
Games where RT is being used to its full effect, as it will be in the future, show a major difference. Nvidia wins those. Games where RT is basically a gimmick and isn't really used to any good effect? AMD can mostly keep up. Make no mistake, AMD paid to have publishers dumb down their RT so they could have people like you making the statements you're making. I suppose it doesn't hurt that this is all the current-gen consoles can manage, because, well, AMD didn't take RT seriously this gen.

But if RT is the future, weaksauce RT won't be what we're all using. I doubt AMD will even care by then, because they'll have a serious RT engine in their GPUs, newer consoles will be out or around the corner, and they'll probably do what they always do and EOL these cards to get people to stop harassing them for more RT performance, which is making the fine wine taste like spoiled milk.
 
Joined
May 24, 2007
Messages
5,422 (0.86/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 GB - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 GB
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240Hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
My Sapphire 6900 XTXH Toxic Extreme's memory is overclocked to 2250 MHz, or 18 Gbps, and my GPU is at 2750 MHz. Are these 6950 XT chips XTXH?
 
Joined
May 31, 2016
Messages
4,421 (1.45/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1V@2400MHz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
Not low overall, just uncomfortable and... bad? D:OS is a game with significant segments of very low interaction - shops, dialogue, in-game cutscenes and so on involve very little mouse movement or button pressing, all of which caused it to drop framerates to a level I found bothersome. Turn-based combat is possibly even worse, since interaction there is bursty - some minor input when selecting an action, then none while watching it play out - which made the framerate fluctuate up and down in a really uncomfortable way, never really returning to 60 fps even when I was doing something. The input-based framerate limiting of Chill just doesn't work well for that type of game.
Are you talking about the second installment, Divinity: Original Sin 2? Because I have spent hundreds of hours playing that game and have literally seen no frame drops, hitches, or anything.
That is why I'm asking, especially considering we have very similar hardware. When you use Radeon Chill in that game, do you set it to a 60 FPS limit? Try going a bit higher, like 75 or 90. It might solve your problem and still limit FPS somewhat.

To be honest, I have noticed something like that with CS:GO. Sometimes the framerate drops to 30 FPS and stays there. Not sure why, but it has happened a few times. That is why I sometimes bump Radeon Chill up to 90 or 75.
 
Joined
Jan 23, 2016
Messages
96 (0.03/day)
Location
Sofia, Bulgaria
Processor Ryzen 5 5600X I Core i7 6700K
Motherboard B550 Phantom Gaming 4 I Asus Z170-A ATX
Video Card(s) RX 6900 XT PowerColor Red Devil I RTX 3080 Palit GamingPro
Storage Intel 665P 2TB I Intel 660p 2TB
Case NZXT S340 Elite I Some noname case lmao
Mouse Logitech G Pro Wired
Keyboard Wooting Two Lekker Edition
Games where RT is being used to its full effect, as it will be in the future, show a major difference. Nvidia wins those. Games where RT is basically a gimmick and isn't really used to any good effect? AMD can mostly keep up. Make no mistake, AMD paid to have publishers dumb down their RT so they could have people like you making the statements you're making. I suppose it doesn't hurt that this is all the current-gen consoles can manage, because, well, AMD didn't take RT seriously this gen.

But if RT is the future, weaksauce RT won't be what we're all using. I doubt AMD will even care by then, because they'll have a serious RT engine in their GPUs, newer consoles will be out or around the corner, and they'll probably do what they always do and EOL these cards to get people to stop harassing them for more RT performance, which is making the fine wine taste like spoiled milk.
To be fair to AMD, low VRAM amounts like 8 or 10 GB are not going to age well either.

Yes, DirectStorage (lol) and Sampler Feedback (not lol, this is serious now) will help low-VRAM GPUs... but since those are standard on consoles too, and we KNOW that beauty sells, I expect devs to reinvest any VRAM savings back into textures and models (a rough sketch of the scale of those savings follows below).

Though the 3090 and 3090 Ti will age well for sure.
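For a rough sense of the VRAM savings being discussed, here is a back-of-the-envelope sketch; the function, the 4K BC7 texture, and the mip-level choice are all illustrative assumptions on my part, not numbers from any real engine:

```python
# Back-of-the-envelope VRAM math for a single 4K BC7 texture (1 byte/texel,
# since BC7 packs a 4x4 texel block into 16 bytes). Streaming only the mips
# actually sampled - the idea behind Sampler Feedback Streaming - drops most
# of the cost, because the top mips dominate the chain.

def mip_chain_bytes(width: int, height: int, top_mip: int = 0,
                    bytes_per_texel: float = 1.0) -> float:
    """Sum the memory of mips top_mip..1x1."""
    w, h = max(width >> top_mip, 1), max(height >> top_mip, 1)
    total = 0.0
    while True:
        total += w * h * bytes_per_texel
        if w == 1 and h == 1:
            return total
        w, h = max(w // 2, 1), max(h // 2, 1)

full = mip_chain_bytes(4096, 4096)                 # whole chain resident
streamed = mip_chain_bytes(4096, 4096, top_mip=2)  # camera far enough for mip 2
print(f"full: {full / 2**20:.1f} MiB, streamed: {streamed / 2**20:.2f} MiB")
# -> full: 21.3 MiB, streamed: 1.33 MiB per texture at that distance
```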
 
Joined
May 2, 2017
Messages
7,762 (2.87/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150MHz, CO -7,-7,-20(x6)
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I suppose it doesn't hurt that this is all the current-gen consoles can manage, because, well, AMD didn't take RT seriously this gen.
I think that's a... how to put it... an unreasonably harsh take. "Didn't take RT seriously" does not seem a fitting description of a sequence of events that goes something like: RTRT in consumer products was considered unrealistic for the near future -> Nvidia stuns everyone by launching it -> AMD responds with its own alternative the next generation, two years later. Even if both were most likely working on this in their R&D labs at roughly the same time (somewhat likely, at least), Nvidia's vastly larger R&D budget tells us it's highly unlikely that AMD had the resources to really prioritize RTRT before Turing. Nvidia also had no external pressure to produce this, meaning they could hold off on launching until they deemed it ready - a luxury AMD didn't have once Nvidia moved first. Managing to put out a solution that more or less matches Nvidia's first-generation effort, even as Nvidia simultaneously launched a significantly improved second-gen effort? That's relatively impressive overall, especially considering the resource differences in play. Summing that up as "AMD didn't take RT seriously" is just not a reasonable assessment of that development cycle.

That obviously doesn't change the fact that Nvidia's RT implementation is currently significantly faster - that's just fact. But that's also what you get from having massively superior resources and the first-mover advantage they often bring. AMD's current implementation is still a decent first-gen effort, especially considering what must have been a relatively rushed development cycle. That doesn't mean it's good enough - but neither is Ampere's RT, really. It's just better.

As for AMD paying developers to dumb down their RT implementations: something like that - or at least paying for "marketing support" and providing some development/engineering help aimed at optimizing RTRT for current-gen consoles (specifically, skipping features those consoles can't handle at all in favor of more scalable features that work in lighter-weight modes on them) - is likely happening, yes. But there's also an inherent incentive to target console hardware (and not exceed it by too much) simply due to the sheer market force of console install bases.

I don't for a second doubt that AMD will take any advantage they can get wherever they can get it - they're a corporation seeking profits, after all - but even with their growth and success in recent years, I don't think they have the funds to throw money at external problems the way Nvidia has for decades. Some? Sure. Enough to, say, contractually bar developers from implementing additional RTRT modes on PC, on top of the console ones, that might make AMD look bad? Doubtful, IMO.

It's quite possible they're pressuring developers in this direction, but a more likely explanation is cost: given that a developer is already implementing one set of RT features, implementing more, different RT features (especially more complex ones) is an additional expense, and one that only pays off for a relatively small subset of customers (PC gamers with Nvidia RTX GPUs - and for very performance-intensive features, those with an RTX 2080 or faster). At some point the cost of those features becomes too high relative to the possible benefit to be worth the effort.
 
Joined
Apr 30, 2011
Messages
2,692 (0.55/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Dear @W1zzard, a good review from you, as expected! I hope the press driver you tested these 6X50 XT GPUs with is the one that improved DX11 performance so much (the 22.5.2 preview). On my R5 5600 & RX 5700 combo it took Witcher 3 from 130 to 140 FPS, roughly an 8% gain, and that's in a game that already had GPU utilization at 100%. Something big changed in this driver that lowered overhead considerably, methinks.

 
Joined
Aug 9, 2019
Messages
1,648 (0.88/day)
Processor 7800X3D 2x16GB CO
Motherboard Asrock B650m HDV
Cooling Peerless Assassin SE
Memory 2x16GB DR A-die@6000c30 tuned
Video Card(s) Asus 4070 dual OC 2610@915mv
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair D4000 Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores Superposition 8k 5267 Aida64 58.5ns
AMD followed the old recipe: overvolt for minor gains and trash the efficiency. I wish they had kept the 1.00-1.05 V the 6900 XT ran at instead of the 1.2 V the 6950 XT is stuck at - the faster VRAM would have helped anyway. Almost 30% more power usage for 7-10% more performance is not worth it, I think.
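Putting rough numbers on that trade-off (using the percentages quoted above, so treat the exact figures as assumptions):

```python
# Rough perf-per-watt comparison using the figures above (illustrative, not measured).
perf = 1.085   # midpoint of the quoted 7-10% performance gain
power = 1.30   # ~30% more power draw
ratio = perf / power
print(f"perf/W vs 6900 XT: {ratio:.2f}x, i.e. ~{(1 - ratio) * 100:.0f}% worse efficiency")
# -> about 0.83x, i.e. roughly 17% worse perf/W
```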
 
Joined
May 24, 2007
Messages
5,422 (0.86/day)
Location
Tennessee
System Name AM5
Processor AMD Ryzen R9 7950X
Motherboard Asrock X670E Taichi
Cooling EK AIO Basic 360
Memory Corsair Vengeance DDR5 5600 64 GB - XMP1 Profile
Video Card(s) AMD Reference 7900 XTX 24 GB
Storage Crucial Gen 5 1 TB, Samsung Gen 4 980 1 TB / Samsung 8TB SSD
Display(s) Samsung 34" 240Hz 4K
Case Fractal Define R7
Power Supply Seasonic PRIME PX-1300, 1300W 80+ Platinum, Full Modular
I take back my previous comments given the overall overclocking capability, although the chips must be XTXH, because overclocking yields ~2800 MHz GCLK.

The memory overclocking is wonderful, reaching 18.8 Gbps at ~2350 MHz MCLK.
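Those two numbers are consistent, for what it's worth: GDDR6's effective data rate is the tool-reported MCLK times eight, and total bandwidth follows from Navi 21's 256-bit bus. A quick sanity check:

```python
# Sanity check on the numbers above: GDDR6's effective data rate is the MCLK
# shown by monitoring tools times 8; bandwidth follows from the 256-bit bus.
mclk_mhz = 2350
bus_bits = 256
gbps_per_pin = mclk_mhz * 8 / 1000            # -> 18.8 Gbps, matching the OC above
bandwidth_gbs = gbps_per_pin * bus_bits / 8   # -> 601.6 GB/s total
print(f"{gbps_per_pin:.1f} Gbps per pin, {bandwidth_gbs:.1f} GB/s")
```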
 
Joined
Jul 9, 2015
Messages
3,413 (1.01/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED laptop screen
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Games where RT is being used to its full effect, as it will be in the future, show a major difference.
Lies.

For instance, WoW's RT is very noticeable, yet AMD wins there.

In Cyberpunk 2077, RT off quite often looks better than RT on, yet NV has the edge.
 
Joined
May 31, 2016
Messages
4,421 (1.45/day)
Location
Currently Norway
System Name Bro2
Processor Ryzen 5800X
Motherboard Gigabyte X570 Aorus Elite
Cooling Corsair h115i pro rgb
Memory 32GB G.Skill Flare X 3200 CL14 @3800MHz CL16
Video Card(s) Powercolor 6900 XT Red Devil 1.1V@2400MHz
Storage M.2 Samsung 970 Evo Plus 500GB / Samsung 860 Evo 1TB
Display(s) LG 27UD69 UHD / LG 27GN950
Case Fractal Design G
Audio Device(s) Realtek 5.1
Power Supply Seasonic 750W GOLD
Mouse Logitech G402
Keyboard Logitech slim
Software Windows 10 64 bit
It is weird - the card is already available in Norway and costs $730 less than a 3090 Ti. Damn, what a price difference. It still costs a lot, but the difference in price is noticeable.
 